WorldWideScience

Sample records for canica model-based extraction

  1. Characterization and Performance of the Cananea Near-infrared Camera (CANICA)

    Science.gov (United States)

    Devaraj, R.; Mayya, Y. D.; Carrasco, L.; Luna, A.

    2018-05-01

We present details of the characterization and imaging performance of the Cananea Near-infrared Camera (CANICA) at the 2.1 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located in Cananea, Sonora, México. CANICA has a HAWAII array with a HgCdTe detector of 1024 × 1024 pixels covering a field of view of 5.5 × 5.5 arcmin2 with a plate scale of 0.32 arcsec/pixel. The camera characterization involved measuring key detector parameters: conversion gain, dark current, readout noise, and linearity. The pixels in the detector have a full-well depth of 100,000 e‑, with the conversion gain measured to be 5.8 e‑/ADU. The time-dependent dark current was estimated to be 1.2 e‑/sec. The readout noise for the correlated double sampling (CDS) technique was measured to be 30 e‑/pixel. The detector shows 10% non-linearity close to the full-well depth. The non-linearity was corrected to within 1% for the CDS images. Full-field imaging performance was evaluated by measuring the point spread function, zeropoints, throughput, and limiting magnitude. The average zeropoint values in each filter are J = 20.52, H = 20.63, and K = 20.23. The saturation limit of the detector is about sixth magnitude in all the primary broadbands. CANICA on the 2.1 m OAGH telescope reaches background-limited magnitudes of J = 18.5, H = 17.6, and K = 16.0 for a signal-to-noise ratio of 10 with an integration time of 900 s.
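The zeropoints above translate instrumental count rates into calibrated magnitudes via the standard relation m = ZP − 2.5 log10(counts/exptime). A minimal sketch of that conversion; the 1000 ADU/s source rate is purely illustrative, and atmospheric/colour terms are ignored:

```python
import math

# Zeropoints reported for CANICA.
ZEROPOINTS = {"J": 20.52, "H": 20.63, "K": 20.23}

def instrumental_magnitude(counts, exptime_s, zeropoint):
    """m = ZP - 2.5 * log10(counts / exptime) for background-subtracted
    counts (ADU); extinction and colour terms are ignored in this sketch."""
    return zeropoint - 2.5 * math.log10(counts / exptime_s)

# Hypothetical source: 1000 ADU/s over a 900 s integration in J band.
m_j = instrumental_magnitude(1000.0 * 900.0, 900.0, ZEROPOINTS["J"])
```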

  2. Modeling and prediction of extraction profile for microwave-assisted extraction based on absorbed microwave energy.

    Science.gov (United States)

    Chan, Chung-Hung; Yusoff, Rozita; Ngoh, Gek-Cheng

    2013-09-01

A modeling technique based on absorbed microwave energy was proposed to model microwave-assisted extraction (MAE) of antioxidant compounds from cocoa (Theobroma cacao L.) leaves. By adapting a suitable extraction model on the basis of the microwave energy absorbed during extraction, the model can predict the extraction profile of MAE at various microwave irradiation powers (100-600 W) and solvent loadings (100-300 ml). Verification with experimental data confirmed that the prediction accurately captured the extraction profile of MAE (R-squared greater than 0.87). In addition, the predicted yields from the model showed good agreement with the experimental results, with less than 10% deviation observed. Furthermore, suitable extraction times that ensure high extraction yield at various MAE conditions can be estimated from the absorbed microwave energy. The estimation is feasible, as more than 85% of the active compounds can be extracted when compared with the conventional extraction technique. Copyright © 2013 Elsevier Ltd. All rights reserved.
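The abstract does not give the model's functional form; as a hedged illustration, a first-order saturation curve in absorbed energy is one common choice for such extraction kinetics. The Ymax, rate constant, and absorption fraction below are invented for the demo:

```python
import math

def absorbed_energy_kj(p_absorbed_w, t_s):
    """Energy absorbed by the extraction mixture, in kJ."""
    return p_absorbed_w * t_s / 1000.0

def predicted_yield(energy_kj, y_max, k):
    """Assumed first-order saturation form Y(E) = Ymax*(1 - exp(-k*E));
    the abstract does not state the paper's actual model equation."""
    return y_max * (1.0 - math.exp(-k * energy_kj))

# Illustrative run: 300 W irradiation, 60% of it absorbed, for 120 s.
e_kj = absorbed_energy_kj(0.6 * 300.0, 120.0)
y = predicted_yield(e_kj, y_max=50.0, k=0.05)
```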

  3. Annotation-based feature extraction from sets of SBML models.

    Science.gov (United States)

    Alm, Rebekka; Waltemath, Dagmar; Wolfien, Markus; Wolkenhauer, Olaf; Henkel, Ron

    2015-01-01

Model repositories such as BioModels Database provide computational models of biological systems for the scientific community. These models contain rich semantic annotations that link model entities to concepts in well-established bio-ontologies such as the Gene Ontology. Consequently, thematically similar models are likely to share similar annotations. Based on this assumption, we argue that semantic annotations are a suitable tool to characterize sets of models. These characteristics improve model classification, allow the identification of additional features for model retrieval tasks, and enable the comparison of sets of models. In this paper we discuss four methods for annotation-based feature extraction from model sets. We tested all methods on sets of models in SBML format which were composed from BioModels Database. To characterize each of these sets, we analyzed and extracted concepts from three frequently used ontologies, namely the Gene Ontology, ChEBI, and SBO. We find that three of the four methods are suitable for determining characteristic features for arbitrary sets of models: the selected features vary depending on, and are specific to, the chosen model set. We show that the identified features map to concepts that are higher up in the hierarchy of the ontologies than the concepts used for model annotations. Our analysis also reveals that the information content of concepts in ontologies and their usage for model annotation do not correlate. Annotation-based feature extraction enables the comparison of model sets, as opposed to existing methods for model-to-keyword comparison or model-to-model comparison.
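The underlying idea — ontology terms over-represented in a model set relative to the whole repository — can be sketched as a simple enrichment test. This is a simplified stand-in for the paper's four methods, and all term IDs and frequencies are illustrative:

```python
from collections import Counter

def characteristic_features(model_annotations, repo_freq, min_ratio=2.0):
    """Return ontology terms over-represented in a model set relative to
    repository-wide document frequencies (a simplified enrichment test;
    the paper's four extraction methods are more elaborate)."""
    n = len(model_annotations)
    counts = Counter(t for anns in model_annotations for t in set(anns))
    features = []
    for term, c in counts.items():
        set_freq = c / n
        bg_freq = repo_freq.get(term, 1e-9)   # unseen terms: tiny background
        if set_freq / bg_freq >= min_ratio:
            features.append(term)
    return sorted(features)

# Illustrative toy set: three SBML models with GO annotations.
models = [
    {"GO:0006096", "GO:0005975"},
    {"GO:0006096"},
    {"GO:0006096", "GO:0008152"},
]
repo = {"GO:0006096": 0.1, "GO:0005975": 0.3, "GO:0008152": 0.5}
features = characteristic_features(models, repo)
```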

  4. Automated extraction of knowledge for model-based diagnostics

    Science.gov (United States)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

The concept of accessing computer-aided design (CAD) databases and automatically extracting a process model is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data for the drawn system. The AKG system is capable of defining knowledge bases in the formats required by various model-based reasoning tools.

  5. Skeleton extraction based on the topology and Snakes model

    Directory of Open Access Journals (Sweden)

    Yuanxue Cai

Full Text Available A new skeleton line extraction method based on topology and flux is proposed by analyzing the distribution characteristics of the gradient vector field in the Snakes model. The distribution characteristics of the skeleton line are accurately obtained by calculating the eigenvalues of the critical points and the flux of the gradient vector field, and the skeleton lines can then be effectively extracted. The results also show that no pretreatment or binarization of the target image is needed. The skeleton lines of complex gray images, such as optical interference patterns, can be effectively extracted with this method. Compared to traditional methods, this method has advantages such as high extraction accuracy and fast processing speed. Keywords: Skeleton, Snakes model, Topology, Photoelasticity image

  6. A hybrid model based on neural networks for biomedical relation extraction.

    Science.gov (United States)

    Zhang, Yijia; Lin, Hongfei; Yang, Zhihao; Wang, Jian; Zhang, Shaowu; Sun, Yuanyuan; Yang, Liang

    2018-05-01

Biomedical relation extraction can automatically extract high-quality biomedical relations from biomedical texts, which is a vital step for the mining of biomedical knowledge hidden in the literature. Recurrent neural networks (RNNs) and convolutional neural networks (CNNs) are two major neural network models for biomedical relation extraction. Neural network-based methods for biomedical relation extraction typically focus on the sentence sequence and employ RNNs or CNNs to learn the latent features from sentence sequences separately. However, RNNs and CNNs have their own advantages for biomedical relation extraction, and combining them may improve performance. In this paper, we present a hybrid model for the extraction of biomedical relations that combines RNNs and CNNs. First, the shortest dependency path (SDP) is generated based on the dependency graph of the candidate sentence. To make full use of the SDP, we divide the SDP into a dependency word sequence and a relation sequence. Then, RNNs and CNNs are employed to automatically learn the features from the sentence sequence and the dependency sequences, respectively. Finally, the output features of the RNNs and CNNs are combined to detect and extract biomedical relations. We evaluate our hybrid model using five public protein-protein interaction (PPI) corpora and a drug-drug interaction (DDI) corpus. The experimental results suggest that the advantages of RNNs and CNNs in biomedical relation extraction are complementary. Combining RNNs and CNNs can effectively boost biomedical relation extraction performance. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Using Web-Based Knowledge Extraction Techniques to Support Cultural Modeling

    Science.gov (United States)

    Smart, Paul R.; Sieck, Winston R.; Shadbolt, Nigel R.

    The World Wide Web is a potentially valuable source of information about the cognitive characteristics of cultural groups. However, attempts to use the Web in the context of cultural modeling activities are hampered by the large-scale nature of the Web and the current dominance of natural language formats. In this paper, we outline an approach to support the exploitation of the Web for cultural modeling activities. The approach begins with the development of qualitative cultural models (which describe the beliefs, concepts and values of cultural groups), and these models are subsequently used to develop an ontology-based information extraction capability. Our approach represents an attempt to combine conventional approaches to information extraction with epidemiological perspectives of culture and network-based approaches to cultural analysis. The approach can be used, we suggest, to support the development of models providing a better understanding of the cognitive characteristics of particular cultural groups.

  8. COSMO-RS-based extractant screening for phenol extraction as model system

    NARCIS (Netherlands)

    Burghoff, B.; Goetheer, E.L.V.; Haan, A.B. de

    2008-01-01

    The focus of this investigation is the development of a fast and reliable extractant screening approach. Phenol extraction is selected as the model process. A quantum chemical conductor-like screening model for real solvents (COSMO-RS) is combined with molecular design considerations. For this

  9. High-Resolution Remote Sensing Image Building Extraction Based on Markov Model

    Science.gov (United States)

    Zhao, W.; Yan, L.; Chang, Y.; Gong, L.

    2018-04-01

As resolution increases, remote sensing images carry a greater information load and more noise, and their features have more complex geometric and textural information, which makes the extraction of building information more difficult. To solve this problem, this paper presents a high-resolution remote sensing image building extraction method based on a Markov model. The method introduces Contourlet-domain map clustering and a Markov model, captures and enhances the contour and texture information of high-resolution remote sensing image features in multiple directions, and further designs a spectral feature index that can characterize "pseudo-buildings" in the building area. Through multi-scale segmentation and extraction of image features, fine extraction from the building area down to individual buildings is realized. Experiments show that, compared with traditional pixel-level image information extraction, this method can restrain the noise of high-resolution remote sensing images, reduce the interference of non-target ground texture information, remove shadows, vegetation, and other pseudo-building information, and achieve better building extraction precision, accuracy, and completeness.

  10. TLS-Based Feature Extraction and 3D Modeling for Arch Structures

    Directory of Open Access Journals (Sweden)

    Xiangyang Xu

    2017-01-01

Full Text Available Terrestrial laser scanning (TLS) technology is one of the most efficient and accurate tools for 3D measurement, revealing surface-based characteristics of objects with the aid of computer vision and programming. Thus, it plays an increasingly important role in deformation monitoring and analysis. Automatic data extraction and efficient, accurate modeling from scattered point clouds are challenging issues during TLS data processing. This paper presents a data extraction method considering the partial and statistical distribution of the scanned point clouds, called the window-neighborhood method. Based on the extracted point clouds, 3D modeling of the boundary of an arched structure was carried out. The ideal modeling strategy should be fast, accurate, and of low complexity when applied to large amounts of data. The paper discusses fitting accuracy in four cases, comparing whole-curve and segmented fitting with polynomials and B-splines. A similar number of parameters was set for the polynomial and B-spline fits because the number of unknown parameters is essential for fitting accuracy. The uncertainties of the scanned raw point clouds and of the modeling are discussed. This process is considered a prerequisite step for 3D deformation analysis with TLS.
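The point about matching parameter counts can be illustrated with a toy comparison of whole-curve versus segmented polynomial fitting on a simulated arch profile (B-spline fitting, e.g. via SciPy, would be compared the same way; the profile shape and noise level here are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
y = np.sqrt(1.0 - 0.8 * x**2) + rng.normal(0.0, 0.002, x.size)  # noisy arch

def rms_residual(xs, ys, deg):
    """RMS residual of a least-squares polynomial fit of the given degree."""
    coeffs = np.polyfit(xs, ys, deg)
    return float(np.sqrt(np.mean((np.polyval(coeffs, xs) - ys) ** 2)))

# Whole curve with 8 parameters (degree 7)...
whole = rms_residual(x, y, 7)

# ...versus two segments with 4 parameters each (degree 3),
# so both strategies use the same number of unknowns.
mid = x.size // 2
seg = float(np.sqrt(np.mean(np.concatenate([
    (np.polyval(np.polyfit(x[:mid], y[:mid], 3), x[:mid]) - y[:mid]) ** 2,
    (np.polyval(np.polyfit(x[mid:], y[mid:], 3), x[mid:]) - y[mid:]) ** 2,
]))))
```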

  11. Open critical area model and extraction algorithm based on the net flow-axis

    International Nuclear Information System (INIS)

    Wang Le; Wang Jun-Ping; Gao Yan-Hong; Xu Dan; Li Bo-Bo; Liu Shi-Gang

    2013-01-01

In the integrated circuit manufacturing process, critical area extraction is a bottleneck for layout optimization and integrated circuit yield estimation. In this paper, we study the problem in which missing-material defects may result in open-circuit faults. Combining mathematical morphology theory, we present a new computation model and a novel extraction algorithm for the open critical area based on the net flow-axis. First, we find the net flow-axis for the different nets. Then, the net flow-edges based on the net flow-axis are obtained. Finally, we extract the open critical area by mathematical morphology. Compared with existing methods, the nets need not be divided into horizontal and vertical nets, and the experimental results show that our model and algorithm can accurately extract the size of the open critical area and obtain the location information of the open-circuit critical area. (interdisciplinary physics and related areas of science and technology)

  12. Model-based Bayesian signal extraction algorithm for peripheral nerves

    Science.gov (United States)

    Eggers, Thomas E.; Dweiri, Yazan M.; McCallum, Grant A.; Durand, Dominique M.

    2017-10-01

Objective. Multi-channel cuff electrodes have recently been investigated for extracting fascicular-level motor commands from mixed neural recordings. Such signals could provide volitional, intuitive control over a robotic prosthesis for amputee patients. Recent work has demonstrated success in extracting these signals in acute and chronic preparations using spatial filtering techniques. These extracted signals, however, had low signal-to-noise ratios, which limited their utility to binary classification. In this work a new algorithm is proposed which combines previous source localization approaches to create a model-based method which operates in real time. Approach. To validate this algorithm, a saline benchtop setup was created to allow the precise placement of artificial sources within a cuff and interference sources outside the cuff. The artificial source was taken from five seconds of chronic neural activity to replicate realistic recordings. The proposed algorithm, hybrid Bayesian signal extraction (HBSE), is then compared to previous algorithms, beamforming and a Bayesian spatial filtering method, on this test data. An example chronic neural recording is also analyzed with all three algorithms. Main results. The proposed algorithm improved the signal-to-noise and signal-to-interference ratios of extracted test signals two- to threefold, and increased the correlation coefficient between the original and recovered signals by 10-20%. These improvements translated to the chronic recording example and increased the calculated bit rate between the recovered signals and the recorded motor activity. Significance. HBSE significantly outperforms previous algorithms in extracting realistic neural signals, even in the presence of external noise sources. These results demonstrate the feasibility of extracting dynamic motor signals from a multi-fascicled intact nerve trunk, which in turn could extract motor command signals from an amputee for the end goal of

  13. A New Method of Chinese Address Extraction Based on Address Tree Model

    Directory of Open Access Journals (Sweden)

    KANG Mengjun

    2015-01-01

Full Text Available An address is a method of encoding the spatial location of an individual geographical area. In China, address planning is relatively backward due to the rapid development of cities, resulting in a large number of non-standard addresses. The spatial constraint relationships of the standard address model are analyzed in this paper, and a new method of standard address extraction based on a tree model is proposed, which regards topological relationships as the consistency criteria for spatial constraints. With this method, standard addresses can be extracted and errors can be excluded from non-standard addresses. Results indicate that a higher match rate can be obtained with this method.
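The tree-model idea — an address is standard when its components follow a containment path in a standard-address tree — can be sketched as follows; the place names and tree contents are illustrative, not from the paper:

```python
# Toy standard-address tree; each root-to-node path is a topologically
# consistent address (the place names are illustrative only).
ADDRESS_TREE = {
    "Hubei": {"Wuhan": {"Wuchang District": {}, "Hongshan District": {}}},
    "Hunan": {"Changsha": {"Yuelu District": {}}},
}

def is_standard(address_parts):
    """An address is standard when its components follow a root-to-node
    path in the tree, i.e. the spatial containment is consistent."""
    node = ADDRESS_TREE
    for part in address_parts:
        if part not in node:
            return False
        node = node[part]
    return True
```

For example, "Hubei / Wuhan / Wuchang District" follows a valid path, while "Hubei / Changsha" violates the containment relationship and is rejected.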

  14. Improved External Base Resistance Extraction for Submicrometer InP/InGaAs DHBT Models

    DEFF Research Database (Denmark)

    Johansen, Tom Keinicke; Krozer, Viktor; Nodjiadjim, Virginie

    2011-01-01

An improved direct parameter extraction method is proposed for III–V heterojunction bipolar transistor (HBT) external base resistance $R_{\rm bx}$ extraction from forward active $S$-parameters. The method is formulated taking into account the current dependence of the intrinsic base–collector cap… factor given as the ratio of the emitter to the collector area. The determination of the parameters $I_{p}$ and $X_{0}$ from experimental $S$-parameters is described. The method is applied to high-speed submicrometer InP/InGaAs DHBT devices and leads to small-signal equivalent circuit models, which…

  15. Context-Specific Metabolic Model Extraction Based on Regularized Least Squares Optimization.

    Directory of Open Access Journals (Sweden)

    Semidán Robaina Estévez

Full Text Available Genome-scale metabolic models have proven highly valuable in investigating cell physiology. Recent advances include the development of methods to extract context-specific models capable of describing metabolism under more specific scenarios (e.g., cell types). Yet, none of the existing computational approaches allows for a fully automated model extraction and determination of a flux distribution independent of user-defined parameters. Here we present RegrEx, a fully automated approach that relies solely on context-specific data and ℓ1-norm regularization to extract a context-specific model and to provide a flux distribution that maximizes its correlation to data. Moreover, the publicly available implementation of RegrEx was used to extract 11 context-specific human models using publicly available RNAseq expression profiles, Recon1 and also Recon2, the most recent human metabolic model. The comparison of the performance of RegrEx and its contending alternatives demonstrates that the proposed method extracts models for which both the structure, i.e., the reactions included, and the flux distributions are in concordance with the employed data. These findings are supported by validation and comparison of method performance on additional data not used in context-specific model extraction. Our study therefore sets the ground for applications of other regularization techniques in large-scale metabolic modeling.
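The ℓ1-regularized least-squares core of such a method can be sketched with ISTA (proximal gradient descent); this is a generic stand-in for the regularization idea, not RegrEx's actual flux-extraction formulation:

```python
import numpy as np

def regularized_ls(A, b, lam, iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with ISTA (proximal
    gradient): a gradient step followed by soft-thresholding."""
    eta = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L, L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x - eta * (A.T @ (A @ x - b))                        # gradient step
        x = np.sign(x) * np.maximum(np.abs(x) - eta * lam, 0.0)  # soft-threshold
    return x

# With A = I the minimizer is exactly the soft-thresholding of b:
# small entries are driven to zero, the rest are shrunk by lam.
x = regularized_ls(np.eye(3), np.array([3.0, 0.1, -2.0]), lam=0.5)
```

The soft-thresholding step is what produces sparse solutions, mirroring how ℓ1 regularization prunes reactions from the extracted model.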

  16. A novel airport extraction model based on saliency region detection for high spatial resolution remote sensing images

    Science.gov (United States)

    Lv, Wen; Zhang, Libao; Zhu, Yongchun

    2017-06-01

The airport is one of the most crucial traffic facilities in both military and civil fields. Automatic airport extraction in high-spatial-resolution remote sensing images has many applications, such as regional planning and military reconnaissance. Traditional airport extraction strategies are usually based on prior knowledge and locate the airport target by template matching and classification, which causes high computational complexity and large computing costs for high-spatial-resolution remote sensing images. In this paper, we propose a novel automatic airport extraction model based on saliency region detection, airport runway extraction, and adaptive threshold segmentation. In saliency region detection, we choose the frequency-tuned (FT) model for computing airport saliency using low-level color and luminance features; it is easy and fast to implement and provides full-resolution saliency maps. In airport runway extraction, the Hough transform is adopted to count the number of parallel line segments. In adaptive threshold segmentation, the Otsu threshold segmentation algorithm is applied to obtain more accurate airport regions. The experimental results demonstrate that the proposed model outperforms existing saliency analysis models and shows good performance in airport extraction.
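Of the three stages, the Otsu step is the most self-contained; a minimal NumPy version of Otsu's histogram-based threshold selection (8-bit single-channel input assumed; the toy image is invented) might look like:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: choose the grey level maximizing the between-class
    variance of the histogram (8-bit single-channel image assumed)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class-0 probability
    mu = np.cumsum(p * np.arange(256))      # class-0 cumulative mean
    mu_t = mu[-1]                           # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))

# Bimodal toy "saliency map": dark background, bright runway-like region.
img = np.concatenate([np.full(100, 10), np.full(100, 200)]).astype(np.uint8)
t = otsu_threshold(img)   # lands between the two modes
```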

  17. A Waterline Extraction Method from Remote Sensing Image Based on Quad-tree and Multiple Active Contour Model

    Directory of Open Access Journals (Sweden)

    YU Jintao

    2016-09-01

Full Text Available After analyzing the characteristics of the geodesic active contour (GAC) model, the Chan-Vese (CV) model, and the local binary fitting (LBF) model, and combining region- and edge-based active contour models with a quad-tree image segmentation method, a waterline extraction method based on a quad-tree and multiple active contour models is proposed in this paper. First, the method provides an initial contour according to the quad-tree segmentation. Second, a new signed pressure force (SPF) function based on the global image statistics of the CV model and the local image statistics of the LBF model is defined, and the edge stopping function (ESF) is then replaced by the proposed SPF function, which solves problems such as evolution stopping prematurely or evolving excessively. Finally, the selective binary and Gaussian filtering level set method is used to avoid reinitialization and regularization and to improve evolution efficiency. The experimental results show that this method can effectively extract weak edges and severely concave edges, and offers sub-pixel accuracy, high efficiency, and reliability for waterline extraction.

  18. Extracting valley-ridge lines from point-cloud-based 3D fingerprint models.

    Science.gov (United States)

    Pang, Xufang; Song, Zhan; Xie, Wuyuan

    2013-01-01

3D fingerprinting is an emerging technology with the distinct advantage of touchless operation. More importantly, 3D fingerprint models contain more biometric information than traditional 2D fingerprint images. However, current approaches to fingerprint feature detection usually must transform the 3D models to a 2D space through unwrapping or other methods, which might introduce distortions. A new approach directly extracts valley-ridge features from point-cloud-based 3D fingerprint models. It first applies the moving least-squares method to fit a local paraboloid surface and represent the local point cloud area. It then computes the local surface's curvatures and curvature tensors to facilitate detection of the potential valley and ridge points. The approach projects those points to the most likely valley-ridge lines, using statistical means such as covariance analysis and cross correlation. To finally extract the valley-ridge lines, it grows the polylines that approximate the projected feature points and removes the perturbations between the sampled points. Experiments with different 3D fingerprint models demonstrate this approach's feasibility and performance.
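The paraboloid-fitting step can be sketched with a plain least-squares quadric fit (the paper uses moving least squares, i.e. distance-weighted; uniform weights are used here for brevity) followed by the Monge-patch curvature formulas:

```python
import numpy as np

def quadric_curvatures(points):
    """Least-squares fit of z = a x^2 + b x y + c y^2 + d x + e y + f
    (uniform weights, a simplification of the paper's moving-least-squares
    step) and Gaussian/mean curvature of the patch at (0, 0)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
    a, b, c, d, e, _ = np.linalg.lstsq(A, z, rcond=None)[0]
    hx, hy, hxx, hxy, hyy = d, e, 2 * a, b, 2 * c   # Monge-patch derivatives
    w = 1.0 + hx**2 + hy**2
    gaussian = (hxx * hyy - hxy**2) / w**2
    mean = ((1 + hy**2) * hxx - 2 * hx * hy * hxy
            + (1 + hx**2) * hyy) / (2 * w**1.5)
    return gaussian, mean

# Check on a sampled paraboloid z = x^2 + y^2:
# at the origin the Gaussian curvature is 4 and the mean curvature is 2.
xs, ys = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
pts = np.column_stack([xs.ravel(), ys.ravel(), (xs**2 + ys**2).ravel()])
K, H = quadric_curvatures(pts)
```

Valley and ridge candidates would then be selected from the signs and magnitudes of the principal curvatures derived from K and H.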

  19. Academic Activities Transaction Extraction Based on Deep Belief Network

    Directory of Open Access Journals (Sweden)

    Xiangqian Wang

    2017-01-01

Full Text Available Extracting information about academic activity transactions from unstructured documents is a key problem in the analysis of the academic behavior of researchers. An academic activity transaction includes five elements: person, activity, object, attributes, and time phrases. The traditional method of information extraction is to extract shallow text features and then to recognize advanced features from the text with supervision. Since the information processing of the different levels is completed in steps, the errors generated in the various steps accumulate and affect the accuracy of the final results. However, because the Deep Belief Network (DBN) model can automatically learn advanced features from shallow text features without supervision, this model is employed to extract academic activity transactions. In addition, we use character-based features to describe the raw features of the named entities of academic activities, so as to improve the accuracy of named entity recognition. In this paper, the accuracy of academic activity extraction using character-based feature vectors to express the text features is compared with that using word-based feature vectors, and with traditional text information extraction based on Conditional Random Fields. The results show that the DBN model is more effective for the extraction of academic activity transaction information.

  20. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    Science.gov (United States)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

Approaches for the analysis and specification of business vocabularies and rules are very relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still of a mostly empirical nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business-modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.

  1. Extraction of Glycyrrhizic Acid from Glycyrrhiza uralensis Using Ultrasound and Its Process Extraction Model

    Directory of Open Access Journals (Sweden)

    Jiangqing Liao

    2016-10-01

Full Text Available This work focused on the intensification of the extraction process of glycyrrhizic acid (GA) from Glycyrrhiza uralensis using the ultrasound-assisted extraction (UAE) method. Various process parameters, such as ultrasonic power, ultrasonic frequency, extraction temperature, and extraction time, which affect the extraction yield were optimized. The results showed that all process parameters exhibited significant influences on GA extraction. The highest GA yield of 217.7 mg/g was obtained at the optimized parameters of 125 W, 55 kHz, 25 °C, and 10 min. Furthermore, the extraction kinetics model of this process was investigated based on Fick's first law as available in the literature. Kinetic parameters such as the equilibrium concentration (Ce) and the integrated influence coefficient (λ) for different ultrasonic powers, ultrasonic frequencies, and extraction temperatures were predicted. Model validations were done successfully, with average relative deviations between 0.96% and 4.36%, by plotting experimental and predicted values of the concentration of GA in the extract. This indicated that the developed extraction model could reflect the effectiveness of the extraction of GA from Glycyrrhiza uralensis and therefore serve as a guide for comprehending other UAE processes.
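A kinetic model consistent with Fick's-first-law film kinetics is C(t) = Ce·(1 − e^(−λt)); whether this matches the paper's exact formulation is an assumption, and the parameter values below are illustrative. λ can then be recovered from data by linearizing:

```python
import math

def fick_first_order(t, c_e, lam):
    """C(t) = Ce*(1 - exp(-lam*t)), the solution of dC/dt = lam*(Ce - C).
    Ce and lam play the roles of the paper's equilibrium concentration and
    integrated influence coefficient (values below are illustrative)."""
    return c_e * (1.0 - math.exp(-lam * t))

def estimate_lambda(times, concs, c_e):
    """Least-squares slope of -ln(1 - C/Ce) against t, with Ce taken
    from the plateau of the extraction curve."""
    num = sum(-t * math.log(1.0 - c / c_e) for t, c in zip(times, concs))
    den = sum(t * t for t in times)
    return num / den

# Illustrative data generated with Ce = 217.7 mg/g and lam = 0.4 / min.
ts = [1.0, 2.0, 3.0, 4.0, 5.0]
cs = [fick_first_order(t, 217.7, 0.4) for t in ts]
lam_hat = estimate_lambda(ts, cs, 217.7)
```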

  2. Intrinsic carrier mobility extraction based on a new quasi-analytical model for graphene field-effect transistors

    International Nuclear Information System (INIS)

    Wang, Shaoqing; Jin, Zhi; Muhammad, Asif; Peng, Songang; Huang, Xinnan; Zhang, Dayong; Shi, Jingyuan

    2016-01-01

The most common method of mobility extraction for graphene field-effect transistors is the one proposed by Kim. Kim's method assumes a constant mobility independent of carrier density and obtains the mobility by fitting the transfer curves. However, carrier mobility changes with carrier density, which makes Kim's method inaccurate. In this paper, a new and more accurate method is proposed that extracts mobility by fitting the output curves at a constant gate voltage. The output curves are fitted using several kinds of current–voltage models. Besides the models in the literature, we present a modified model that takes into account not only the quantum capacitance and contact resistance but also a modified drift velocity-field relationship. Compared with the other models, this new model fits our experimental data better. The dependence of the intrinsic carrier mobility on carrier density is obtained based on this model. (paper)

  3. EVALUATION OF A MATHEMATICAL MODEL FOR OIL EXTRACTION FROM OLEAGINOUS SEEDS

    Directory of Open Access Journals (Sweden)

    Giuseppe Toscano

    2007-06-01

Full Text Available Mechanical extraction from seeds represents an important process in the production of vegetable oils. The efficiency of this step can affect the economic viability of the entire vegetable oil production chain. However, the mechanical presses used for extraction are designed following criteria based more on the experience and intuition of the operators than on rigorous analyses of the physical principles involved in the process. In this study we have tested the possibility of applying a mathematical model that reproduces oil extraction from seeds to a laboratory-scale continuous press. In other words, we have compared the results of our mathematical model with those obtained from real extractions with mechanical presses on sunflower seeds. Our model is based on determining the main operating parameters of mechanical extraction, such as temperature, pressure and compression time, and on knowledge of some physical characteristics of the solid matrix of the seeds. The results obtained are interesting because they include the role of the operating parameters involved in extraction, while the application of the mathematical model studied here allows a mathematical instrument, although with potential for improvement, to be developed for optimising the sizing and the operating conditions of mechanical presses.

  4. Model-Based Extracted Water Desalination System for Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Dees, Elizabeth M. [General Electric Global Research Center, Niskayuna, NY (United States); Moore, David Roger [General Electric Global Research Center, Niskayuna, NY (United States); Li, Li [Pennsylvania State Univ., University Park, PA (United States); Kumar, Manish [Pennsylvania State Univ., University Park, PA (United States)

    2017-05-28

Over the last 1.5 years, GE Global Research and Pennsylvania State University defined a model-based, scalable, multi-stage extracted-water desalination system that yields clean water, concentrated brine, and, optionally, salt. The team explored saline brines that ranged across the expected range for extracted water from carbon sequestration reservoirs (40,000 up to 220,000 ppm total dissolved solids, TDS). In addition, the team validated the system performance at pilot scale with field-sourced water using GE’s pre-pilot and lab facilities. This project encompassed four principal tasks, in addition to Project Management and Planning: 1) identify a deep saline formation carbon sequestration site and a partner that are suitable for supplying extracted water; 2) conduct a techno-economic assessment and down-selection of pre-treatment and desalination technologies to identify a cost-effective system for extracted water recovery; 3) validate the down-selected processes at the lab/pre-pilot scale; and 4) define the scope of the pilot desalination project. Highlights from each task are described below. Deep saline formation characterization: the deep saline formations associated with the five DOE NETL 1260 Phase 1 projects were characterized with respect to their mineralogy and formation water composition. Sources of high-TDS feed water other than extracted water were explored for high-TDS desalination applications, including unconventional oil and gas and seawater reverse osmosis concentrate. Techno-economic analysis of desalination technologies: techno-economic evaluations of alternative brine concentration technologies, including humidification-dehumidification (HDH), membrane distillation (MD), forward osmosis (FO), turboexpander-freeze, solvent extraction and high-pressure reverse osmosis (HPRO), were conducted. These technologies were evaluated against conventional falling film-mechanical vapor recompression (FF-MVR) as a baseline desalination process. Furthermore, a

  5. Kinetics Extraction Modelling and Antiproliferative Activity of Clinacanthus nutans Water Extract

    Directory of Open Access Journals (Sweden)

    Farah Nadiah Mohd Fazil

    2016-01-01

    Full Text Available Clinacanthus nutans is widely grown in tropical Asia and locally known as “belalai gajah” or Sabah snake grass. It has been used as a natural product to treat skin rashes, snake bites, lesions caused by herpes, diabetes, fever, and cancer. Therefore, the objectives of this research are to determine the maximum yield and time of exhaustive flavonoid extraction using Peleg’s model and to evaluate the potential antiproliferative activity on the human lung cancer cell line (A549). The extraction process was carried out on fresh and dried leaves at 28 to 30°C with a liquid-to-solid ratio of 10 mL/g for 72 hrs. The extracts were collected intermittently and analysed using the mathematical Peleg’s model and RP-HPLC. The extract with the highest amount of flavonoids was used to evaluate the inhibitory concentration (IC50) via 2D cell culture of A549. Based on the results obtained, the predicted maximum extract density was observed at 29.20 ± 14.54 hrs of extraction (t_exhaustive). However, the exhaustive extraction time needed to acquire the maximum flavonoid content occurred approximately 10 hrs earlier. Therefore, 18 hrs of extraction time was chosen to acquire a high content of flavonoids. The best antiproliferative effect (IC50) on the A549 cell line was observed at 138.82 ± 0.60 µg/mL. In conclusion, the flavonoid content in Clinacanthus nutans water extract possesses potential antiproliferative properties against A549, suggesting an alternative approach for cancer treatment.
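Peleg's two-parameter model is straightforward to fit by nonlinear least squares; the sketch below uses synthetic extraction data (illustrative values, not the paper's measurements) to recover the equilibrium yield 1/K2 and the initial rate 1/K1:

```python
import numpy as np
from scipy.optimize import curve_fit

def peleg(t, k1, k2):
    """Peleg extraction model: C(t) = t / (k1 + k2*t).
    1/k1 is the initial extraction rate, 1/k2 the equilibrium yield."""
    return t / (k1 + k2 * t)

# Synthetic extraction curve (hypothetical flavonoid concentrations, mg/mL)
t = np.array([1, 2, 4, 8, 12, 18, 24, 36, 48, 72], dtype=float)  # hours
rng = np.random.default_rng(0)
c_true = peleg(t, 2.0, 0.25)                   # equilibrium yield = 4 mg/mL
c_obs = c_true + rng.normal(0, 0.02, t.size)   # small measurement noise

(k1, k2), _ = curve_fit(peleg, t, c_obs, p0=(1.0, 1.0))
print(f"K1={k1:.3f}, K2={k2:.3f}, equilibrium yield={1.0 / k2:.2f} mg/mL")
```

The fitted 1/k2 approximates the exhaustive-extraction yield that the abstract estimates from the model.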

  6. A Hierarchical Feature Extraction Model for Multi-Label Mechanical Patent Classification

    Directory of Open Access Journals (Sweden)

    Jie Hu

    2018-01-01

    Full Text Available Various studies have focused on feature extraction methods for automatic patent classification in recent years. However, most of these approaches rely on knowledge from experts in related domains. Here we propose a hierarchical feature extraction model (HFEM) for multi-label mechanical patent classification, which is able to capture both local features of phrases and global, temporal semantics. First, an n-gram feature extractor based on convolutional neural networks (CNNs) is designed to extract salient local lexical-level features. Next, a long-dependency feature extraction model based on the bidirectional long short-term memory (BiLSTM) neural network is proposed to capture sequential correlations from higher-level sequence representations. Then the HFEM algorithm and its hierarchical feature extraction architecture are detailed. We establish training, validation and test datasets containing 72,532, 18,133, and 2679 mechanical patent documents, respectively, and evaluate the performance of HFEM. Finally, we compare the results of the proposed HFEM with three other single neural network models, namely CNN, long short-term memory (LSTM), and BiLSTM. The experimental results indicate that our proposed HFEM outperforms the other compared models in both precision and recall.
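The CNN stage of such a model reduces, at its core, to sliding width-n windows over word embeddings, applying filters with a nonlinearity, and max-pooling over positions; a minimal numpy sketch with a toy vocabulary and random weights (not the trained HFEM):

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = {"a": 0, "rotary": 1, "pump": 2, "impeller": 3, "seal": 4}
emb = rng.normal(size=(len(vocab), 8))             # toy word embeddings, d=8

def ngram_features(tokens, filters, n=3):
    """1-D convolution over word embeddings followed by max-pooling:
    a CNN-style local (n-gram) feature extractor."""
    x = emb[[vocab[t] for t in tokens]]            # (seq_len, d)
    windows = np.stack([x[i:i + n].ravel()         # slide width-n window
                        for i in range(len(tokens) - n + 1)])
    acts = np.maximum(windows @ filters, 0.0)      # ReLU feature maps
    return acts.max(axis=0)                        # max-pool over positions

filters = rng.normal(size=(3 * 8, 16))             # 16 trigram filters
feats = ngram_features(["a", "rotary", "pump", "impeller", "seal"], filters)
print(feats.shape)
```

In HFEM these pooled local features would then feed the BiLSTM stage for sequence-level modeling.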

  7. Post-processing of Deep Web Information Extraction Based on Domain Ontology

    Directory of Open Access Journals (Sweden)

    PENG, T.

    2013-11-01

    Full Text Available Many methods are utilized to extract and process query results in the deep Web, relying on the different structures of Web pages and the various design modes of databases. However, some semantic meanings and relations are ignored. In this paper, we therefore present an approach for post-processing deep Web query results based on domain ontology, which can utilize these semantic meanings and relations. A block identification model (BIM) based on node similarity is defined to extract data blocks relevant to a specific domain after reducing noisy nodes. The feature vector of domain books is obtained by a result set extraction model (RSEM) based on the vector space model (VSM). RSEM, in combination with BIM, builds the domain ontology for books, which not only removes the dependence on Web page structures when extracting data, but also makes use of the semantic meanings in the domain ontology. After extracting the basic information of Web pages, a ranking algorithm is adopted to offer an ordered list of data records to users. Experimental results show that BIM and RSEM extract data blocks and build the domain ontology accurately. In addition, relevant data records and basic information are extracted and ranked. The precision and recall results show that our proposed method is feasible and efficient.
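The VSM similarity step underlying such block identification can be stated concretely; the sketch below scores toy page blocks against a hypothetical domain term profile (illustrative data, not the paper's) and keeps blocks above a similarity threshold:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build TF-IDF vectors (vector space model) for a small corpus."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    return [{t: tf * math.log(n / df[t]) for t, tf in Counter(doc).items()}
            for doc in docs]

def cosine(u, v):
    dot = sum(u[t] * v[t] for t in u if t in v)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

blocks = [
    "java programming language book author publisher".split(),
    "login sidebar advertisement subscribe newsletter".split(),   # noise block
    "python programming book edition author isbn".split(),
]
domain_profile = "programming book author title isbn publisher".split()
vecs = tfidf_vectors(blocks + [domain_profile])
scores = [cosine(v, vecs[-1]) for v in vecs[:-1]]
relevant = [i for i, s in enumerate(scores) if s > 0.1]
print(relevant)
```

The navigation/advertisement block shares no domain terms and scores zero, so only the content blocks survive.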

  8. Lambert W-function based exact representation for double diode model of solar cells: Comparison on fitness and parameter extraction

    International Nuclear Information System (INIS)

    Gao, Xiankun; Cui, Yan; Hu, Jianjun; Xu, Guangyin; Yu, Yongchang

    2016-01-01

    Highlights: • Lambert W-function based exact representation (LBER) is presented for the double diode model (DDM). • The fitness difference between LBER and DDM is verified by reported parameter values. • The proposed LBER can better represent the I–V and P–V characteristics of solar cells. • The parameter extraction difference between LBER and DDM is validated by two algorithms. • The parameter values extracted from LBER are more accurate than those from DDM. - Abstract: Accurate modeling and parameter extraction of solar cells play an important role in the simulation and optimization of PV systems. This paper presents a Lambert W-function based exact representation (LBER) for the traditional double diode model (DDM) of solar cells, and then compares their fitness and parameter extraction performance. Unlike existing works, the proposed LBER is rigorously derived from the DDM, and in LBER the coefficients of the Lambert W-function are not extra parameters to be extracted or arbitrary scalars, but the vectors of terminal voltage and current of the solar cell. The fitness difference between LBER and DDM is objectively validated by the reported parameter values and experimental I–V data of a solar cell and four solar modules from different technologies. The comparison results indicate that, under the same parameter values, the proposed LBER can better represent the I–V and P–V characteristics of solar cells and provides a closer representation of the actual maximum power points of all module types. Two different algorithms are used to compare the parameter extraction performance of LBER and DDM. One is our restart-based bound-constrained Nelder-Mead (rbcNM) algorithm implemented in Matlab, and the other is the reported Rcr-IJADE algorithm executed in Visual Studio. The comparison results reveal that the parameter values extracted from LBER using the two algorithms are always more accurate and robust than those from DDM, albeit more time-consuming. As an improved version of DDM, the
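For the simpler single diode model, the analogous Lambert W-function representation is explicit and easy to evaluate; the paper's LBER extends the same idea to the double diode model. A sketch with illustrative parameter values:

```python
import numpy as np
from scipy.special import lambertw

def sdm_current(v, i_ph=5.0, i_0=1e-9, n=1.3, r_s=0.05, r_sh=200.0,
                v_t=0.02585):
    """Explicit terminal current of the single-diode model via the Lambert
    W-function (photocurrent i_ph, saturation current i_0, ideality n,
    series/shunt resistances, thermal voltage v_t)."""
    a = n * v_t
    arg = (r_s * r_sh * i_0) / (a * (r_s + r_sh)) \
        * np.exp(r_sh * (r_s * (i_ph + i_0) + v) / (a * (r_s + r_sh)))
    return (r_sh * (i_ph + i_0) - v) / (r_s + r_sh) \
        - (a / r_s) * lambertw(arg).real

v = np.linspace(0.0, 0.75, 76)
i = sdm_current(v)
p = v * i
print(f"Isc ~ {i[0]:.3f} A, Pmax ~ {p.max():.3f} W")
```

Because the current is explicit in the voltage, no inner Newton iteration is needed when fitting parameters, which is the practical advantage the abstract exploits.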

  9. [Mass Transfer Kinetics Model of Ultrasonic Extraction of Pomegranate Peel Polyphenols].

    Science.gov (United States)

    Wang, Zhan-yi; Zhang, Li-hua; Wang, Yu-hai; Zhang, Yuan-hu; Ma, Li; Zheng, Dan-dan

    2015-05-01

    The dynamic mathematical model of ultrasonic extraction of polyphenols from pomegranate peel was constructed with Fick's second law as the theoretical basis. The spherical model was selected, with the mass concentration of pomegranate peel polyphenols as the index, 50% ethanol as the extraction solvent and ultrasonic extraction as the extraction method. Under different test conditions, including the liquid-to-solid ratio, extraction temperature and extraction time, a series of kinetic parameters was determined, such as the extraction rate constant (k), relative raffinate rate, surface diffusion coefficient (D_S), half-life (t_1/2) and apparent activation energy (E_a). With increasing extraction temperature, k and D_S gradually increased while t_1/2 decreased, which indicated that elevated temperature was favorable for the extraction of pomegranate peel polyphenols. The exponential equation of the relative raffinate rate showed that the established kinetic model fitted the extraction of pomegranate peel polyphenols, and the relationship between the reaction conditions and the polyphenol concentration was well reflected by the model. Based on the experimental results, a feasible and reliable kinetic model for ultrasonic extraction of polyphenols from pomegranate peel is established, which can be used for the optimization and control of scaled-up production.
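Two of the reported parameters follow from the fitted rate constant by standard first-order relations: the half-life t_1/2 = ln 2 / k, and the apparent activation energy E_a from a two-point Arrhenius fit. A small sketch with hypothetical rate constants (not the paper's values):

```python
import math

# Hypothetical rate constants fitted at two extraction temperatures (1/min)
k_303, k_323 = 0.040, 0.095        # at 303 K and 323 K
R = 8.314                          # gas constant, J/(mol K)

# Half-life of the first-order extraction step
t_half_303 = math.log(2) / k_303

# Apparent activation energy from the two-point Arrhenius form:
# ln(k2/k1) = -(Ea/R) * (1/T2 - 1/T1)
Ea = R * math.log(k_323 / k_303) / (1 / 303 - 1 / 323)
print(f"t1/2 at 303 K = {t_half_303:.1f} min, Ea = {Ea / 1000:.1f} kJ/mol")
```

As in the abstract, a larger k at higher temperature directly implies a shorter half-life and a positive apparent activation energy.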

  10. Modeling and Extraction of Parameters Based on Physical Effects in Bipolar Transistors

    Directory of Open Access Journals (Sweden)

    Agnes Nagy

    2011-01-01

    Full Text Available The rising complexity of electronic systems, the reduction of component size, and the increase in working frequencies demand ever more accurate and stable integrated circuits, which require more precise simulation programs during the design process. PSPICE, widely used to simulate the general behavior of integrated circuits, does not consider many of the physical effects found in real devices. Compact models, HICUM and MEXTRAM, have been developed over recent decades in order to eliminate this deficiency. This paper presents some physical aspects that have not been studied so far, such as the expression of the base-emitter voltage including the emitter emission coefficient effect (n), its physical explanation and simulation procedure, as well as a new extraction method for the diffusion potential V_DE(T), based on the forward-biased base-emitter capacitance, showing excellent agreement between experimental and theoretical results.
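The emission coefficient n mentioned above is conventionally extracted from the slope of the semi-logarithmic forward I-V characteristic of the base-emitter junction; a sketch on synthetic data (illustrative Is and n, not measured values):

```python
import numpy as np

V_T = 0.02585   # thermal voltage at ~300 K (V)

def diode_iv(v, i_s=1e-14, n=1.6):
    """Base-emitter diode current including the emission coefficient n."""
    return i_s * (np.exp(v / (n * V_T)) - 1.0)

# Synthetic forward-bias measurements
v = np.linspace(0.55, 0.75, 41)
i = diode_iv(v)

# On a semi-log plot, ln(I) is linear in V with slope 1/(n*V_T):
slope, intercept = np.polyfit(v, np.log(i), 1)
n_extracted = 1.0 / (slope * V_T)
i_s_extracted = np.exp(intercept)
print(f"n = {n_extracted:.3f}, Is = {i_s_extracted:.2e} A")
```

The same slope-based reasoning is what makes n observable in the base-emitter voltage expression discussed in the abstract.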

  11. Lung region extraction based on the model information and the inversed MIP method by using chest CT images

    International Nuclear Information System (INIS)

    Tomita, Toshihiro; Miguchi, Ryosuke; Okumura, Toshiaki; Yamamoto, Shinji; Matsumoto, Mitsuomi; Tateno, Yukio; Iinuma, Takeshi; Matsumoto, Toru.

    1997-01-01

    We developed a lung region extraction method based on model information and the inversed MIP method for Lung Cancer Screening CT (LSCT). The original model is composed of typical 3-D lung contour lines, a body axis, an apical point, and a convex hull. First, the body axis, the apical point, and the convex hull are automatically extracted from the input image. Next, the model is transformed by an affine transformation to fit the corresponding features of the input image. Using the same affine transformation coefficients, the typical lung contour lines are also transformed, yielding rough contour lines for the input image. Experimental results on 68 samples showed the method to be quite promising. (author)

  12. Inverse hydrochemical models of aqueous extracts tests

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, L.; Samper, J.; Montenegro, L.

    2008-10-10

    The aqueous extract test is a laboratory technique commonly used to measure the amount of soluble salts in a soil sample after adding a known mass of distilled water. Measured aqueous extract data have to be re-interpreted in order to infer the porewater chemical composition of the sample, because the porewater chemistry changes significantly due to dilution and the chemical reactions which take place during extraction. Here we present an inverse hydrochemical model to estimate porewater chemical composition from measured water content, aqueous extract, and mineralogical data. The model accounts for acid-base, redox, aqueous complexation, mineral dissolution/precipitation, gas dissolution/ex-solution, cation exchange and surface complexation reactions, all of which are assumed to take place at local equilibrium. It has been solved with INVERSE-CORE{sup 2D} and tested with bentonite samples taken from the FEBEX (Full-scale Engineered Barrier EXperiment) in situ test. The inverse model reproduces most of the measured aqueous data except bicarbonate and provides an effective, flexible and comprehensive method to estimate the porewater chemical composition of clays. The main uncertainties are related to kinetic calcite dissolution and variations in CO2(g) pressure.

  13. Novel thiosalicylate-based ionic liquids for heavy metal extractions

    Energy Technology Data Exchange (ETDEWEB)

    Leyma, Raphlin; Platzer, Sonja [Institute of Inorganic Chemistry, Faculty of Chemistry, University of Vienna, Waehringer Str. 42, A-1090 Vienna (Austria); Jirsa, Franz [Institute of Inorganic Chemistry, Faculty of Chemistry, University of Vienna, Waehringer Str. 42, A-1090 Vienna (Austria); Department of Zoology, University of Johannesburg, PO Box 524, Auckland Park, 2006, Johannesburg (South Africa); Kandioller, Wolfgang, E-mail: wolfgang.kandioller@univie.ac.at [Institute of Inorganic Chemistry, Faculty of Chemistry, University of Vienna, Waehringer Str. 42, A-1090 Vienna (Austria); Krachler, Regina; Keppler, Bernhard K. [Institute of Inorganic Chemistry, Faculty of Chemistry, University of Vienna, Waehringer Str. 42, A-1090 Vienna (Austria)

    2016-08-15

    Highlights: • Six thiosalicylate-based ammonium and phosphonium ionic liquids (ILs) were newly synthesized. • The ILs showed good extraction of cadmium, copper, and zinc. • Phosphonium ILs showed better extraction efficiencies than their ammonium counterparts. - Abstract: This study aims to develop novel ammonium and phosphonium ionic liquids (ILs) with thiosalicylate (TS) derivatives as anions and to evaluate their extraction efficiencies towards heavy metals in aqueous solutions. Six ILs were synthesized, characterized, and investigated for their extraction efficiencies for cadmium, copper, and zinc. Liquid-liquid extractions of Cu, Zn, or Cd with the ILs after 1–24 h using model solutions (pH 7; 0.1 M CaCl{sub 2}) were assessed using flame atomic absorption spectroscopy (F-AAS). The phosphonium-based ILs trihexyltetradecylphosphonium 2-(propylthio)benzoate [P{sub 66614}][PTB] and 2-(benzylthio)benzoate [P{sub 66614}][BTB] showed the best extraction efficiency for copper and cadmium, respectively, and zinc was extracted to a high degree by [P{sub 66614}][BTB] exclusively.

  14. A simple model for super critical fluid extraction of bio oils from biomass

    International Nuclear Information System (INIS)

    Patel, Rajesh N.; Bandyopadhyay, Santanu; Ganesh, Anuradda

    2011-01-01

    A simple mathematical model to characterize the supercritical extraction process has been proposed in this paper. The model is primarily based on two mass transfer mechanisms: solubility and diffusion. It assumes two distinct modes of extraction: an initial constant-rate extraction that is controlled by solubility, and a falling-rate extraction that is controlled by diffusivity. Effects of extraction parameters such as pressure and temperature on the extraction of oil have also been studied. The proposed model, when compared with existing models, shows better agreement with the experimental results. It has been applied to both a high initial oil content material (cashew nut shells) and a low initial oil content material (black pepper).
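The two-regime picture can be written as a piecewise extraction curve: a solubility-controlled linear period up to a crossover time, followed by a diffusion-controlled exponential approach to the total extractable yield. The sketch below uses illustrative parameters and a common functional form, not necessarily the authors' exact equations:

```python
import numpy as np

def extraction_yield(t, y_inf, k_sol, t_c, k_dif):
    """Two-regime supercritical-extraction curve: constant-rate period
    (solubility-controlled) up to t_c, then a falling-rate period
    (diffusion-controlled) decaying toward the total yield y_inf."""
    t = np.asarray(t, dtype=float)
    y_c = k_sol * t_c                              # yield at regime switch
    falling = y_c + (y_inf - y_c) * (1.0 - np.exp(-k_dif * (t - t_c)))
    return np.where(t < t_c, k_sol * t, falling)

t = np.linspace(0, 300, 301)                       # minutes
y = extraction_yield(t, y_inf=0.30, k_sol=0.002, t_c=60.0, k_dif=0.01)
print(f"yield at switch: {y[60]:.3f}, final: {y[-1]:.3f}")
```

The crossover time t_c is where the solute readily accessible at particle surfaces is exhausted and internal diffusion takes over.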

  15. Data Extraction Based on Page Structure Analysis

    Directory of Open Access Journals (Sweden)

    Ren Yichao

    2017-01-01

    Full Text Available The information we need is often dispersed and heterogeneously organized. In addition, because of the existence of unstructured data such as natural language and images, extracting content from local pages is extremely difficult. In light of the problems above, this article applies a method combining a page structure analysis algorithm and a page data extraction algorithm to accomplish the gathering of network data. In this way, the problem that traditional complex extraction models behave poorly when dealing with large-scale data is solved, and page data extraction efficiency is boosted to a new level. In the meantime, the article also compares the page-based DOM structure method and the HTML regularities-of-distribution method on pages and content of different types, from which a more efficient extraction method can be identified.

  16. Sparsity-based shrinkage approach for practicability improvement of H-LBP-based edge extraction

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Chenyi [School of Physics, Northeast Normal University, Changchun 130024 (China); Qiao, Shuang, E-mail: qiaos810@nenu.edu.cn [School of Physics, Northeast Normal University, Changchun 130024 (China); Sun, Jianing, E-mail: sunjn118@nenu.edu.cn [School of Mathematics and Statistics, Northeast Normal University, Changchun 130024 (China); Zhao, Ruikun; Wu, Wei [Jilin Cancer Hospital, Changchun 130021 (China)

    2016-07-21

    The local binary pattern with H function (H-LBP) technique enables fast and efficient edge extraction in digital radiography. In this paper, we reformulate the model of H-LBP and propose a novel sparsity-based shrinkage approach, in which the threshold can be adapted to the data sparsity. Using this model, we upgrade the fast H-LBP framework and apply it to real digital radiography. The experiments show that the method improved with the new shrinkage approach avoids elaborate manual tuning of parameters and possesses greater robustness in edge extraction compared with the other current methods, without increasing processing time. - Highlights: • A novel sparsity-based shrinkage approach for edge extraction in digital radiography is proposed. • The threshold of SS-LBP can be adapted to the data sparsity. • SS-LBP is a development of AH-LBP and H-LBP. • Without increasing processing time or losing processing efficiency, SS-LBP avoids elaborate manual tuning of parameters. • SS-LBP has more robust performance in edge extraction compared with existing methods.
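As a rough illustration of sparsity-adapted shrinkage (the paper's own adaptive rule is not given in the abstract; the universal threshold below is a standard stand-in), soft thresholding shrinks small noise coefficients to zero while sparse edge-like responses survive:

```python
import numpy as np

def soft_shrink(x, thr):
    """Soft-threshold shrinkage: shrink small (noise) coefficients to zero."""
    return np.sign(x) * np.maximum(np.abs(x) - thr, 0.0)

rng = np.random.default_rng(2)
signal = np.zeros(1000)
signal[::100] = 5.0                     # 10 sparse 'edge' responses
noisy = signal + rng.normal(0, 0.5, signal.size)

# Universal threshold sigma*sqrt(2 ln N), with a robust MAD noise estimate:
# one common data-driven (sparsity-adapted) choice of threshold.
sigma = np.median(np.abs(noisy - np.median(noisy))) / 0.6745
thr = sigma * np.sqrt(2 * np.log(noisy.size))
denoised = soft_shrink(noisy, thr)
print(f"nonzeros kept: {np.count_nonzero(denoised)}")
```

Because the threshold is derived from the data itself, no manual parameter tuning is needed, which is the practical point the abstract emphasizes.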

  17. Gallbladder shape extraction from ultrasound images using active contour models.

    Science.gov (United States)

    Ciecholewski, Marcin; Chochołowicz, Jakub

    2013-12-01

    Gallbladder function is routinely assessed using ultrasonographic (USG) examinations. In clinical practice, doctors very often analyse the gallbladder shape when diagnosing selected disorders, e.g. if there are turns or folds of the gallbladder, so extracting its shape from USG images using supporting software can simplify a diagnosis that is often difficult to make. The paper describes two active contour models: the edge-based model and the region-based model making use of a morphological approach, both designed for extracting the gallbladder shape from USG images. The active contour models were applied to USG images without lesions and to those showing specific disease units, namely, anatomical changes like folds and turns of the gallbladder as well as polyps and gallstones. This paper also presents modifications of the edge-based model, such as the method for removing self-crossings and loops or the method of dampening the inflation force which moves nodes if they approach the edge being determined. The user is also able to add a fragment of the approximated edge beyond which neither active contour model will move if this edge is incomplete in the USG image. The modifications of the edge-based model presented here allow more precise results to be obtained when extracting the shape of the gallbladder from USG images than if the morphological model is used. © 2013 Elsevier Ltd. Published by Elsevier Ltd. All rights reserved.

  18. An automatic rat brain extraction method based on a deformable surface model.

    Science.gov (United States)

    Li, Jiehua; Liu, Xiaofeng; Zhuo, Jiachen; Gullapalli, Rao P; Zara, Jason M

    2013-08-15

    The extraction of the brain from the skull in medical images is a necessary first step before image registration or segmentation. While pre-clinical MR imaging studies on small animals, such as rats, are increasing, fully automatic imaging processing techniques specific to small animal studies remain lacking. In this paper, we present an automatic rat brain extraction method, the Rat Brain Deformable model method (RBD), which adapts the popular human brain extraction tool (BET) through the incorporation of information on the brain geometry and MR image characteristics of the rat brain. The robustness of the method was demonstrated on T2-weighted MR images of 64 rats and compared with other brain extraction methods (BET, PCNN, PCNN-3D). The results demonstrate that RBD reliably extracts the rat brain with high accuracy (>92% volume overlap) and is robust against signal inhomogeneity in the images. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. [Research on optimal modeling strategy for licorice extraction process based on near-infrared spectroscopy technology].

    Science.gov (United States)

    Wang, Hai-Xia; Suo, Tong-Chuan; Yu, He-Shui; Li, Zheng

    2016-10-01

    The manufacture of traditional Chinese medicine (TCM) products always involves processing complex raw materials and real-time monitoring of the manufacturing process. In this study, we investigated different modeling strategies for the extraction process of licorice. Near-infrared spectra associated with the extraction time were used to determine the states of the extraction processes. Three modeling approaches, i.e., principal component analysis (PCA), partial least squares regression (PLSR) and parallel factor analysis-PLSR (PARAFAC-PLSR), were adopted for the prediction of the real-time status of the process. The overall results indicated that PCA, PLSR and PARAFAC-PLSR can effectively detect errors in the extraction procedure and predict the process trajectories, which is of great significance for the monitoring and control of extraction processes. Copyright© by the Chinese Pharmaceutical Association.
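Of the three approaches, PCA on the spectral matrix is the simplest to sketch; the toy example below builds synthetic "spectra" driven by a single latent extraction profile and shows PC1 capturing almost all of the variance (illustrative data only, not licorice NIR measurements):

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy 'NIR spectra': 30 time points x 100 wavelengths, driven by a single
# latent extraction profile plus measurement noise.
profile = 1.0 - np.exp(-0.1 * np.arange(30))               # extraction state
pure = np.exp(-0.5 * ((np.arange(100) - 40) / 8.0) ** 2)   # analyte band
X = np.outer(profile, pure) + rng.normal(0.0, 0.01, (30, 100))

# PCA via SVD of the mean-centred matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                          # principal component scores
explained = s ** 2 / np.sum(s ** 2)    # variance explained per component
print(f"PC1 explains {explained[0]:.1%} of the variance")
```

Tracking the PC1 score over time is one simple way to follow the process trajectory that the abstract describes.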

  20. Deep-depletion physics-based analytical model for scanning capacitance microscopy carrier profile extraction

    International Nuclear Information System (INIS)

    Wong, K. M.; Chim, W. K.

    2007-01-01

    An approach for fast and accurate carrier profiling using deep-depletion analytical modeling of scanning capacitance microscopy (SCM) measurements is shown for an ultrashallow p-n junction with a junction depth of less than 30 nm and a profile steepness of about 3 nm per decade change in carrier concentration. In addition, the analytical model is also used to extract the SCM dopant profiles of three other p-n junction samples with different junction depths and profile steepnesses. The deep-depletion effect arises from rapid changes in the bias applied between the sample and probe tip during SCM measurements. The extracted carrier profile from the model agrees reasonably well with the more accurate carrier profile from inverse modeling and the dopant profile from secondary ion mass spectroscopy measurements

  1. Template based rodent brain extraction and atlas mapping.

    Science.gov (United States)

    Weimin Huang; Jiaqi Zhang; Zhiping Lin; Su Huang; Yuping Duan; Zhongkang Lu

    2016-08-01

    Accurate rodent brain extraction is the basic step for many translational studies using MR imaging. This paper presents a template-based approach with multi-expert refinement for automatic rodent brain extraction. We first build a brain appearance model from learning exemplars. Together with template matching, we encode the rodent brain position into the search space to reliably locate the rodent brain and estimate a rough segmentation. From the initial mask, a level-set segmentation and mask-based template learning are further applied to the brain region. Multi-expert fusion is used to generate a new mask. Finally, we combine region growing based on the learned histogram distribution to delineate the final brain mask. A high-resolution rodent atlas is used to illustrate that the segmented low-resolution anatomic image can be mapped well to the atlas. Tested on a public data set, all brains were located reliably and we achieve a mean Jaccard similarity score of 94.99% for brain segmentation, a statistically significant improvement over two other rodent brain extraction methods.
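The template-matching step at the heart of such pipelines can be illustrated with plain normalized cross-correlation; a brute-force numpy sketch on a synthetic image (toy "brain" blob, not MR data):

```python
import numpy as np

def match_template(image, template):
    """Brute-force normalized cross-correlation; returns the (row, col)
    of the best match and its correlation score."""
    th, tw = template.shape
    tz = (template - template.mean()) / template.std()
    best_score, best_pos = -2.0, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            patch = image[r:r + th, c:c + tw]
            std = patch.std()
            if std == 0:
                continue
            score = float((((patch - patch.mean()) / std) * tz).mean())
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

rng = np.random.default_rng(4)
img = rng.normal(0.0, 0.1, (40, 40))                        # noisy background
yy, xx = np.mgrid[0:8, 0:8]
brain = np.exp(-((yy - 3.5) ** 2 + (xx - 3.5) ** 2) / 8.0)  # toy 'brain' blob
img[20:28, 11:19] += brain                                  # embed at (20, 11)
pos, score = match_template(img, brain)
print(pos, round(score, 3))
```

In the paper the matched position then seeds the level-set and region-growing refinement stages.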

  2. An improved active contour model for glacial lake extraction

    Science.gov (United States)

    Zhao, H.; Chen, F.; Zhang, M.

    2017-12-01

    The active contour model is a widely used method in visual tracking and image segmentation. Driven by an objective function, the initial curve defined in an active contour model evolves to a stable condition - the desired result in a given image. As a typical region-based active contour model, the C-V model detects weak boundaries well and is robust to noise, which shows great potential for glacial lake extraction. Glacial lakes are sensitive indicators of global climate change, so accurately delineating glacial lake boundaries is essential for evaluating the hydrologic and living environment. However, the current methods for glacial lake extraction, mainly water index methods and recognition/classification methods, are difficult to apply directly to large-scale glacial lake extraction, due to the diversity of glacial lakes and the many confounding factors in the imagery, such as image noise, shadows, snow and ice. Given the above-mentioned advantages of the C-V model and the difficulties in glacial lake extraction, we introduce the signed pressure force function to improve the C-V model and adapt it to glacial lake extraction. To assess the extraction results, three typical glacial lake development sites were selected, in the Altai mountains, the Central Himalayas and South-eastern Tibet; Landsat8 OLI imagery served as the experimental data source, with Google Earth imagery as reference data for verifying the results. The experimental results suggest that the improved active contour model we propose can effectively discriminate glacial lakes from complex backgrounds with a high Kappa coefficient (0.895), especially for small glacial lakes, which constitute weak information in the image. Our findings provide a new approach to improved accuracy where small glacial lakes predominate, and open the possibility of automated glacial lake mapping over large areas.

  3. Theoretical models for supercritical fluid extraction.

    Science.gov (United States)

    Huang, Zhen; Shi, Xiao-Han; Jiang, Wei-Juan

    2012-08-10

    For the proper design of supercritical fluid extraction processes, it is essential to have a sound knowledge of the mass transfer mechanism of the extraction process and its appropriate mathematical representation. In this paper, the advances and applications of kinetic models for describing supercritical fluid extraction from various solid matrices are presented. The theoretical models overviewed here include the hot-ball diffusion, broken and intact cell, and shrinking core models, together with some relatively simple models. Mathematical representations of these models are interpreted in detail, along with their assumptions, parameter identification and application examples. Extraction of an analyte solute from a solid matrix by means of a supercritical fluid includes the dissolution of the analyte from the solid, the analyte's diffusion in the matrix and its transport to the bulk supercritical fluid. Mechanisms involved in a mass transfer model are discussed in terms of external mass transfer resistance, internal mass transfer resistance, solute-solid interactions and axial dispersion. The correlations of the external mass transfer coefficient and the axial dispersion coefficient with certain dimensionless numbers are also discussed. Among these models, the broken and intact cell model seems to be the most relevant mathematical model, as it provides a realistic description of the plant material structure for better understanding the mass-transfer kinetics; thus it has been widely employed for modeling supercritical fluid extraction of natural materials. Copyright © 2012 Elsevier B.V. All rights reserved.
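The hot-ball diffusion model mentioned above has a closed-form series solution for the fraction of solute extracted from a spherical particle (Fick's second law with a uniform initial concentration and zero surface concentration); a sketch with illustrative D and R values:

```python
import numpy as np

def hot_ball_yield(t, D, R, n_terms=200):
    """Hot-ball model: fraction of solute extracted from a sphere of
    radius R by Fickian diffusion (series solution of Fick's second law):
    E(t) = 1 - (6/pi^2) * sum_n exp(-n^2 pi^2 D t / R^2) / n^2."""
    n = np.arange(1, n_terms + 1)[:, None]
    series = np.sum(np.exp(-n ** 2 * np.pi ** 2 * D * t / R ** 2) / n ** 2,
                    axis=0)
    return 1.0 - (6.0 / np.pi ** 2) * series

t = np.linspace(0.0, 3600.0, 100)         # seconds
E = hot_ball_yield(t, D=1e-11, R=5e-4)    # D in m^2/s, R = 0.5 mm particle
print(f"extracted after 1 h: {E[-1]:.2%}")
```

The dimensionless group D*t/R^2 controls the curve, which is why particle size reduction accelerates the diffusion-limited tail of the extraction.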

  4. Application of cone packing model in extraction: Pt.1

    International Nuclear Information System (INIS)

    Li Xingfu; Tang Bolin; Sun Pengnian

    1987-01-01

    A synergistic effect in the ternary system substrate 1 + substrate 2 + extractant is reported for the extraction of uranyl ion from mixed acetate and chloride substrates by applying the cone packing model. Based on the cone packing model, ligand packing around the uranyl equatorial plane should be neither overcrowded nor undercrowded. The sum of the ligands' Fan Angles (FA) around the uranyl equatorial plane provides a useful criterion for estimating steric crowding. Whereas the FA sums (FAS) of UO2(OAc)2(TBP)2, UO2(OAc)2TBP, UO2Cl2(TBP)2 and UO2Cl2(TBP)3 are 204, 164, 164 and 203 deg respectively, none is favourable to forming stable extracted complexes. The FAS of UO2(OAc)Cl(TBP)2 is 184 deg, indicating the most favorable packing. Therefore a synergistic effect based on the formation of the extracted complex UO2(OAc)Cl(TBP)2 in mixed acetate and chloride solution was predicted. The above speculation was confirmed by extraction of uranyl ion in the substrates of sodium acetate-acetic acid and ammonium chloride. Keeping the total concentration of substrates constant, synergistic extraction is found at a 1:3 acetate to chloride ratio, which shifts a little on changing the concentration of the uranyl ion from 3.97 x 10^-4 mol·dm^-3 to 2.06 x 10^-2 mol·dm^-3. The extracted complex was proved to be mostly UO2(OAc)Cl(TBP)2 by using the slope method and analysing the uranyl and chloride concentrations in the organic phase. Variation of the distribution coefficient of the synergistic effect with the pH of the medium was studied. A similar effect was found with other extractants and diluents.
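The fan-angle arithmetic can be reproduced from per-ligand values back-calculated from the reported sums (FA(TBP) = 40, FA(OAc) = 62 and FA(Cl) = 42 deg are inferred here, not quoted from the paper): 204 - 164 gives 40 deg per TBP, and the rest follow. The mixed complex then lands at 184 deg, inside the favorable packing window:

```python
# Per-ligand fan angles (deg) inferred from the reported sums:
# UO2(OAc)2(TBP)2 = 204 and UO2(OAc)2(TBP) = 164  =>  FA(TBP) = 40;
# FA(OAc) = (164 - 40) / 2 = 62; from UO2Cl2(TBP)2 = 164, FA(Cl) = 42.
FA = {"TBP": 40, "OAc": 62, "Cl": 42}

def fan_angle_sum(ligands):
    """Total equatorial fan angle (FAS) of a uranyl complex."""
    return sum(FA[l] for l in ligands)

complexes = {
    "UO2(OAc)2(TBP)2":  ["OAc", "OAc", "TBP", "TBP"],
    "UO2Cl2(TBP)2":     ["Cl", "Cl", "TBP", "TBP"],
    "UO2(OAc)Cl(TBP)2": ["OAc", "Cl", "TBP", "TBP"],
}
for name, ligs in complexes.items():
    print(f"{name}: FAS = {fan_angle_sum(ligs)} deg")
```

The all-acetate complex is overcrowded (204 deg) and the all-chloride one undercrowded (164 deg), which is exactly the packing argument behind the predicted synergy.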

  5. Optimization of β-cyclodextrin-based flavonol extraction from apple pomace using response surface methodology.

    Science.gov (United States)

    Parmar, Indu; Sharma, Sowmya; Rupasinghe, H P Vasantha

    2015-04-01

    The present study investigated five cyclodextrins (CDs) for the extraction of flavonols from apple pomace powder and optimized β-CD-based extraction of total flavonols using response surface methodology. A 2^3 central composite design with β-CD concentration (0-5 g/100 mL), extraction temperature (20-72 °C) and extraction time (6-48 h), with a second-order quadratic model for the total flavonol yield (mg/100 g DM), was selected to generate the response surface curves. The optimal conditions obtained were: β-CD concentration, 2.8 g/100 mL; extraction temperature, 45 °C; and extraction time, 25.6 h, which predicted the extraction of 166.6 mg total flavonols/100 g DM. The predicted amount was comparable to the experimental amount of 151.5 mg total flavonols/100 g DM obtained under the optimal β-CD-based parameters, giving a low absolute error and confirming the adequacy of the fitted model. In addition, the results from the optimized extraction conditions were similar to those obtained through a previously established solvent-based, sonication-assisted flavonol extraction procedure. To the best of our knowledge, this is the first study to optimize aqueous β-CD-based flavonol extraction, which presents an environmentally safe method for adding value to under-utilized bioresources.
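Once a second-order response surface has been fitted, locating the optimum is a stationary-point calculation; the sketch below maximizes a hypothetical quadratic model in three coded factors (the coefficients are illustrative, not the fitted apple pomace model):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical fitted second-order response surface for total flavonol
# yield as a function of coded factors x = (CD conc., temperature, time).
b0 = 160.0
b1 = np.array([4.0, 2.5, 3.0])                 # linear coefficients
B = np.array([[-6.0, 0.5, 0.2],                # quadratic + interaction terms
              [0.5, -5.0, 0.3],
              [0.2, 0.3, -4.0]])               # negative definite -> maximum

def response(x):
    return b0 + b1 @ x + x @ B @ x

res = minimize(lambda x: -response(x), x0=np.zeros(3))
print(f"optimum (coded): {np.round(res.x, 3)}, "
      f"predicted yield: {-res.fun:.1f}")
```

At the optimum the gradient b1 + 2Bx vanishes, which is the same stationary-point condition RSM software solves for the coded design factors.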

  6. Large-scale parameter extraction in electrocardiology models through Born approximation

    KAUST Repository

    He, Yuan

    2012-12-04

    One of the main objectives in electrocardiology is to extract physical properties of cardiac tissues from measured information on electrical activity of the heart. Mathematically, this is an inverse problem for reconstructing coefficients in electrocardiology models from partial knowledge of the solutions of the models. In this work, we consider such parameter extraction problems for two well-studied electrocardiology models: the bidomain model and the FitzHugh-Nagumo model. We propose a systematic reconstruction method based on the Born approximation of the original nonlinear inverse problem. We describe a two-step procedure that allows us to reconstruct not only perturbations of the unknowns, but also the backgrounds around which the linearization is performed. We show some numerical simulations under various conditions to demonstrate the performance of our method. We also introduce a parameterization strategy using eigenfunctions of the Laplacian operator to reduce the number of unknowns in the parameter extraction problem. © 2013 IOP Publishing Ltd.

  7. Patent Keyword Extraction Algorithm Based on Distributed Representation for Patent Classification

    Directory of Open Access Journals (Sweden)

    Jie Hu

    2018-02-01

    Many text mining tasks such as text retrieval, text summarization, and text comparison depend on the extraction of representative keywords from the main text. Most existing keyword extraction algorithms are based on a discrete bag-of-words representation of the text. In this paper, we propose a patent keyword extraction algorithm (PKEA) based on the distributed Skip-gram model for patent classification. We also develop a set of quantitative performance measures for keyword extraction evaluation, based on information gain and cross-validated Support Vector Machine (SVM) classification, which are valuable when human-annotated keywords are not available. We used a standard benchmark dataset and a homemade patent dataset to evaluate the performance of PKEA. Our patent dataset includes 2500 patents from five distinct technological fields related to autonomous cars (GPS systems, lidar systems, object recognition systems, radar systems, and vehicle control systems). We compared our method with Frequency, Term Frequency-Inverse Document Frequency (TF-IDF), TextRank, and Rapid Automatic Keyword Extraction (RAKE). The experimental results show that our proposed algorithm provides a promising way to extract keywords from patent texts for patent classification.
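As a toy illustration of embedding-based keyword scoring in the spirit of PKEA (not the authors' actual algorithm), one can rank a document's tokens by cosine similarity between their word vectors and a technology-field vector; the 3-d vectors and the field vector below are invented for the example:

```python
import numpy as np

# Toy 3-d vectors standing in for trained Skip-gram embeddings (assumed values).
emb = {
    "lidar":   np.array([0.9, 0.1, 0.0]),
    "sensor":  np.array([0.8, 0.3, 0.1]),
    "vehicle": np.array([0.7, 0.4, 0.0]),
    "the":     np.array([0.0, 0.0, 1.0]),
    "of":      np.array([0.1, 0.0, 0.9]),
}
field_vec = np.array([1.0, 0.0, 0.0])   # hypothetical centroid of a technology field

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def extract_keywords(tokens, k=2):
    """Rank distinct tokens by embedding similarity to the field vector."""
    scores = {t: cosine(emb[t], field_vec) for t in set(tokens)}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(extract_keywords(["the", "lidar", "sensor", "of", "the", "vehicle"]))
# -> ['lidar', 'sensor']
```

Stopword-like tokens score low here simply because their (assumed) vectors point away from the field vector, which is the intuition behind distributed-representation keyword ranking.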

  8. Feature Fusion Based Road Extraction for HJ-1-C SAR Image

    Directory of Open Access Journals (Sweden)

    Lu Ping-ping

    2014-06-01

    Road network extraction from SAR images is a key task in both military and civilian applications. To address the problems of road extraction from HJ-1-C SAR images, a road extraction algorithm is proposed based on the integration of ratio and directional information. Because of the characteristically narrow dynamic range and low signal-to-noise ratio of HJ-1-C SAR images, a nonlinear quantization and an image filtering method based on a multi-scale autoregressive model are proposed here. A road extraction algorithm based on information fusion, which considers both ratio and direction information, is also proposed. Main road directions can be extracted by applying the Radon transform. Cross interference can be suppressed, and road continuity can then be improved, by main-direction alignment and secondary road extraction. An HJ-1-C SAR image acquired over Wuhan, China was used to evaluate the proposed method. The experimental results show good performance, with a correctness of 80.5% and a quality of 70.1% when applied to a SAR image with complex content.

  9. Object feature extraction and recognition model

    International Nuclear Information System (INIS)

    Wan Min; Xiang Rujian; Wan Yongxing

    2001-01-01

    The characteristics of objects, especially flying objects, are analyzed, including spectrum, image and motion characteristics, and feature extraction is carried out. To improve the speed of object recognition, a feature database is used to simplify the data in the source database; the feature-to-object relationship maps are stored in the feature database. An object recognition model based on the feature database is presented, and the way object recognition is achieved is also explained.

  10. State space model extraction of thermohydraulic systems – Part II: A linear graph approach applied to a Brayton cycle-based power conversion unit

    International Nuclear Information System (INIS)

    Uren, Kenneth Richard; Schoor, George van

    2013-01-01

    This second paper in a two-part series presents the application of the developed state space model extraction methodology to a Brayton cycle-based PCU (power conversion unit) of a PBMR (pebble bed modular reactor). The goal is to investigate whether the state space extraction methodology can cope with larger and more complex thermohydraulic systems. In Part I the state space model extraction methodology for the purpose of control was described in detail and a state space representation was extracted for a U-tube system to illustrate the concept. In this paper a 25th-order nonlinear state space representation in terms of the different energy domains is extracted. This state space representation is solved and the responses of a number of important states are compared with results obtained from a PBMR PCU Flownex® model. Flownex® is a validated thermo-fluid simulation software package. The results show that the state space model closely resembles the dynamics of the PBMR PCU. This kind of model may be used for nonlinear MIMO (multi-input, multi-output) control strategies. However, there is still a need for linear state space models, since many control system design and analysis techniques require a linear state space model. This issue is also addressed in this paper by showing how a linear state space model can be derived from the extracted nonlinear state space model. The linearised state space model is validated by comparing it to an existing linear Simulink® model of the PBMR PCU system. - Highlights: • State space model extraction of a pebble bed modular reactor PCU (power conversion unit). • A 25th-order nonlinear time-varying state space model is obtained. • Linearisation of a nonlinear state space model for use in power output control. • Non-minimum phase characteristics that are challenging in terms of control. • Models derived are useful for MIMO control strategies

  11. Extraction or adsorption? Voltammetric assessment of protamine transfer at ionophore-based polymeric membranes.

    Science.gov (United States)

    Garada, Mohammed B; Kabagambe, Benjamin; Amemiya, Shigeru

    2015-01-01

    Cation-exchange extraction of the polypeptide protamine from water into an ionophore-based polymeric membrane has been hypothesized as the origin of a potentiometric sensor response to this important heparin antidote. Here, we apply ion-transfer voltammetry not only to confirm protamine extraction into ionophore-doped polymeric membranes but also to reveal protamine adsorption at the membrane/water interface. Protamine adsorption is thermodynamically more favorable than protamine extraction, as shown by cyclic voltammetry at plasticized poly(vinyl chloride) membranes containing dinonylnaphthalenesulfonate as a protamine-selective ionophore. Reversible adsorption of protamine at concentrations down to 0.038 μg/mL is demonstrated by stripping voltammetry. Adsorptive preconcentration of protamine at the membrane/water interface is quantitatively modeled using the Frumkin adsorption isotherm. We apply this model to ensure that stripping voltammograms are based on desorption of all protamine molecules that are transferred across the interface during a preconcentration step. In comparison to adsorption, voltammetric extraction of protamine requires ∼0.2 V more negative potentials, where a potentiometric super-Nernstian response to protamine is also observed. This agreement confirms that the potentiometric protamine response is based on protamine extraction. The voltammetrically reversible extraction of protamine results in an apparently irreversible potentiometric response because back-extraction of protamine from the membrane slows down dramatically at the mixed potential established by cation-exchange extraction of protamine. Significantly, this study demonstrates the advantages of ion-transfer voltammetry over potentiometry for quantitatively and mechanistically assessing protamine transfer at ionophore-based polymeric membranes as a foundation for reversible, selective, and sensitive detection of protamine.
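The Frumkin isotherm used above relates surface coverage θ to bulk concentration c implicitly; in one common form, βc = θ/(1−θ)·exp(−2gθ), where β is the adsorption coefficient and g captures lateral interactions between adsorbed molecules. A small sketch with assumed β and g values (not the paper's fitted constants) solves this for θ by root bracketing:

```python
import numpy as np
from scipy.optimize import brentq

# Frumkin isotherm: beta*c = theta/(1-theta) * exp(-2*g*theta)
# beta (adsorption coefficient) and g (interaction parameter) are assumed values.
beta, g = 50.0, 0.8

def coverage(c):
    """Solve the Frumkin isotherm for surface coverage theta at bulk conc. c."""
    f = lambda th: th / (1.0 - th) * np.exp(-2.0 * g * th) - beta * c
    return brentq(f, 1e-12, 1.0 - 1e-12)   # root is bracketed in (0, 1)

concs = [0.001, 0.01, 0.1]
thetas = [coverage(c) for c in concs]
print(thetas)  # coverage rises monotonically with concentration for g < 2
```

For |g| small enough (here g = 0.8) the isotherm is single-valued, so the bracketing solver always finds exactly one physically meaningful coverage.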

  12. Supercritical extraction of carqueja essential oil: experiments and modeling

    Directory of Open Access Journals (Sweden)

    R. M. F. Vargas

    2006-09-01

    Baccharis trimera is a native Brazilian plant with medicinal properties. In this work a supercritical extraction method was studied to obtain the essential oil from Baccharis trimera, popularly known as carqueja. The aim was to obtain experimental data and to compare two mathematical models used in the simulation of carqueja (Baccharis trimera) oil extraction by supercritical CO2. The two mathematical models are based on mass transfer. One of the models, proposed by Reverchon, is solved numerically and requires two parameters adjusted from the experimental data. The other model is the one proposed by Sovová; it is solved analytically and requires four adjustable parameters. Numerical results are presented and discussed for the adjusted parameters. The experimental results were obtained in a temperature range of 313.15 K to 343.15 K at 90 bar. The extraction yield of carqueja essential oil using supercritical carbon dioxide ranged between 1.72% (w/w) at 323.15 K and 2.34% (w/w) at 343.15 K and 90 bar, with a CO2 flow rate of 3.34 × 10⁻⁸ m³/s for a 0.0015 kg sample of Baccharis trimera.

  13. Highway extraction from high resolution aerial photography using a geometric active contour model

    Science.gov (United States)

    Niu, Xutong

    Highway extraction and vehicle detection are two of the most important steps in traffic-flow analysis from multi-frame aerial photographs. The traditional method of deriving traffic-flow trajectories relies on manual vehicle counting from a sequence of aerial photographs, which is tedious and time-consuming. This research presents a new framework for semi-automatic highway extraction. The basis of the new framework is an improved geometric active contour (GAC) model, which minimizes an objective function that recasts the propagation of regular curves as an optimization problem. The implementation of curve propagation is based on level set theory. By using an implicit representation of a two-dimensional curve, a level set approach can handle topological changes naturally, and the output is unaffected by different initial positions of the curve. However, the original GAC model, on which the new model is based, only incorporates boundary information into the curve propagation process. An error-producing phenomenon called leakage is inevitable wherever there is an uncertain weak edge. In this research, region-based information is added as a constraint to the original GAC model, giving the proposed method the ability to integrate both boundary and region-based information during curve propagation. Adding the region-based constraint eliminates the leakage problem. This dissertation applies the proposed augmented GAC model to the problem of highway extraction from high-resolution aerial photography. First, an optimized stopping criterion is designed and used in the implementation of the GAC model, which effectively saves processing time and computation. Second, a seed-point propagation framework is designed and implemented. This framework incorporates highway extraction, tracking, and linking into one procedure. A seed point is usually placed at an end node of highway segments close to the boundary of the

  14. Parameter extraction of different fuel cell models with transferred adaptive differential evolution

    International Nuclear Information System (INIS)

    Gong, Wenyin; Yan, Xuesong; Liu, Xiaobo; Cai, Zhihua

    2015-01-01

    To improve the design and control of FC (fuel cell) systems, it is important to extract the unknown parameters of their models. Generally, the parameter extraction problems of FC models can be formulated as nonlinear, multi-variable optimization problems. To extract the parameters of different FC models accurately and quickly, in this paper we propose a transferred adaptive DE (differential evolution) framework, in which the successful parameters of the adaptive DE used to solve previous problems are transferred to new optimization problems in similar problem domains. Based on this framework, an improved adaptive DE method (TRADE, for short) is presented as an illustration. To verify the performance of our proposal, TRADE is used to extract the unknown parameters of two types of fuel cell models, i.e., PEMFC (proton exchange membrane fuel cell) and SOFC (solid oxide fuel cell) models. The results of TRADE are also compared with those of other state-of-the-art EAs (evolutionary algorithms). Even though the modification is very simple, the results indicate that TRADE can extract the parameters of both PEMFC and SOFC models accurately and quickly. Moreover, the V–I characteristics obtained by TRADE agree well with the simulated and experimental data in all cases for both types of fuel cell models. TRADE also improves the performance of the original adaptive DE significantly in terms of both the quality of final solutions and the convergence speed in all cases. Additionally, TRADE provides better results than other EAs. - Highlights: • A framework of transferred adaptive differential evolution is proposed. • Based on the framework, an improved differential evolution (TRADE) is presented. • TRADE obtains very promising results in extracting the parameters of PEMFC and SOFC models
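To make the optimization formulation concrete: in this family of methods, FC parameter extraction minimizes the error between measured and modeled polarization curves. The sketch below uses SciPy's stock differential_evolution (not TRADE) on a deliberately simplified Tafel-plus-ohmic polarization model with assumed "true" parameters:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Simplified polarization model standing in for a full PEMFC model:
# V(i) = E0 - b*log10(i) - R*i.  The "true" parameters below are assumed.
i = np.linspace(0.05, 1.0, 40)                 # current density, A/cm^2
true = (1.0, 0.06, 0.25)                       # E0 [V], b [V/dec], R [ohm cm^2]
v_meas = true[0] - true[1] * np.log10(i) - true[2] * i

def sse(p):
    """Sum of squared errors between measured and modeled cell voltage."""
    e0, b, r = p
    return np.sum((v_meas - (e0 - b * np.log10(i) - r * i)) ** 2)

bounds = [(0.8, 1.2), (0.01, 0.2), (0.0, 1.0)]
res = differential_evolution(sse, bounds, seed=1, tol=1e-10)
print(res.x)  # should approach (1.0, 0.06, 0.25) on this noise-free data
```

On noise-free synthetic data the global minimum coincides with the true parameters, so the DE population (plus the default local polish) recovers them to high precision; real measured curves would of course leave a residual.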

  15. Model-based Extracted Water Desalination System for Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Gettings, Rachel; Dees, Elizabeth

    2017-03-23

    The focus of this research effort centered on water recovery from high Total Dissolved Solids (TDS) extracted waters (180,000 mg/L) using a combination of water recovery (partial desalination) technologies. The research goals of this project were as follows: 1. Define the scope and test location for pilot-scale implementation of the desalination system; 2. Define a scalable, multi-stage extracted-water desalination system that yields clean water, concentrated brine, and salt from saline brines; and 3. Validate overall system performance with field-sourced water using GE pre-pilot lab facilities. Conventional falling film-mechanical vapor recompression (FF-MVR) technology was established as the baseline desalination process. A quality function deployment (QFD) method was used to compare alternative high-TDS desalination technologies to the base-case FF-MVR technology, including but not limited to membrane distillation (MD), forward osmosis (FO), and high-pressure reverse osmosis (HPRO). A technoeconomic analysis of HPRO was performed comparing two cases achieving the same total pure-water recovery rate: 1. a hybrid seawater RO (SWRO) plus HPRO system, and 2. a 2× standard seawater RO system. Pre-pilot-scale tests were conducted using field production water to validate key process steps for extracted-water pretreatment. Approximately 5,000 gallons of field-produced water were processed through microfiltration, ultrafiltration, and steam-regenerable sorbent operations. Improvements in membrane materials of construction were identified as necessary next steps toward further improvement in element performance at high pressure. Several modifications showed promising results in their ability to withstand close to 5,000 psi without gross failure.

  16. Numerical simulation of the heat extraction in EGS with thermal-hydraulic-mechanical coupling method based on discrete fractures model

    International Nuclear Information System (INIS)

    Sun, Zhi-xue; Zhang, Xu; Xu, Yi; Yao, Jun; Wang, Hao-xuan; Lv, Shuhuan; Sun, Zhi-lei; Huang, Yong; Cai, Ming-yu; Huang, Xiaoxue

    2017-01-01

    The Enhanced Geothermal System (EGS) creates an artificial geothermal reservoir by hydraulic fracturing, which allows heat transmission through the fractures by the circulating fluids as they extract heat from Hot Dry Rock (HDR). The technique involves a complex thermal–hydraulic–mechanical (THM) coupling process. A numerical approach is presented in this paper to simulate and analyze the heat extraction process in EGS. The reservoir is regarded as fractured porous media consisting of rock matrix blocks and discrete fracture networks. Based on thermal non-equilibrium theory, a mathematical model of the THM coupling process in fractured rock mass is used. The proposed model is validated by comparison with several analytical solutions. An EGS case from the Cooper Basin, Australia is simulated with a 2D stochastically generated fracture model to study the characteristics of fluid flow, heat transfer and mechanical response in the geothermal reservoir. The main parameters controlling the outlet temperature of the EGS are also studied by sensitivity analysis. The results show the importance of taking the THM coupling effects into account when investigating the efficiency and performance of EGS. - Highlights: • An EGS reservoir comprising discrete fracture networks and matrix rock is modeled. • A THM coupling model is proposed for simulating heat extraction in EGS. • The numerical model is validated by comparison with several analytical solutions. • A case study is presented for understanding the main characteristics of EGS. • The THM coupling effects are shown to be significant factors in the running performance of an EGS.

  17. Ontology-Based Information Extraction for Business Intelligence

    Science.gov (United States)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  18. State space model extraction of thermohydraulic systems – Part I: A linear graph approach

    International Nuclear Information System (INIS)

    Uren, K.R.; Schoor, G. van

    2013-01-01

    Thermohydraulic simulation codes are increasingly making use of graphical design interfaces. The user can quickly and easily design a thermohydraulic system by placing symbols on the screen resembling system components, which can then be connected to form a system representation. Such system models may then be used to obtain detailed simulations of the physical system. Usually, however, simulation models of this kind are too complex and not ideal for control system design, so a need exists for automated techniques to extract lumped-parameter models useful for control system design. The goal of this first paper in a two-part series is to propose a method that utilises a graphical representation of a thermohydraulic system, and a lumped-parameter modelling approach, to extract state space models. In this methodology each physical domain of the thermohydraulic system is represented by a linear graph. These linear graphs capture the interaction between all components within and across energy domains – hydraulic, thermal and mechanical. The linear graphs are analysed using a graph-theoretic approach to derive reduced-order state space models. These models capture the dominant dynamics of the thermohydraulic system and are ideal for control system design purposes. The proposed state space model extraction method is demonstrated by considering a U-tube system. A nonlinear state space model is extracted representing both the hydraulic and thermal domain dynamics of the system. The simulated state space model is compared with a Flownex® model of the U-tube. Flownex® is a validated systems thermal-fluid simulation software package. - Highlights: • A state space model extraction methodology based on graph-theoretic concepts. • An energy-based approach to consider multi-domain systems in a common framework. • Allows extraction of transparent (white-box) state space models automatically. • Reduced-order models containing only independent state

  19. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in the ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with rapid cross-section generation, the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels, using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
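The core geometric step, generating station points perpendicular to the centerline and sampling DEM elevations at them, can be sketched as follows (a synthetic V-shaped channel stands in for the DEM, and the helper names are invented for illustration, not taken from the CMT):

```python
import numpy as np

# Synthetic DEM: a V-shaped channel running along the x-axis (assumed terrain).
dem = lambda x, y: 100.0 + 2.0 * np.abs(y)

def cross_section(p0, p1, half_width=50.0, n_sta=11):
    """Station offsets and elevations perpendicular to segment p0->p1 at p0."""
    d = np.asarray(p1, float) - np.asarray(p0, float)
    d /= np.linalg.norm(d)                      # unit direction along centerline
    normal = np.array([-d[1], d[0]])            # left-hand perpendicular
    offsets = np.linspace(-half_width, half_width, n_sta)
    pts = np.asarray(p0, float) + offsets[:, None] * normal
    z = dem(pts[:, 0], pts[:, 1])               # sample DEM at station points
    return offsets, z

# Centerline segment running down the channel (y = 0)
offsets, z = cross_section((0.0, 0.0), (10.0, 0.0))
print(z)  # lowest elevation at the channel centre (offset 0)
```

In a GIS implementation the `dem` lambda would be replaced by raster sampling, and an overlap check between successive cross-sections (the role of COCoA) would follow this step.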

  20. The complex formation-partition and partition-association models of solvent extraction of ions

    International Nuclear Information System (INIS)

    Siekierski, S.

    1976-01-01

    Two models of the extraction process have been proposed. In the first model it is assumed that the partitioning neutral species is first formed in the aqueous phase and then transferred into the organic phase. The second model is based on the assumption that equivalent amounts of cations are first transferred from the aqueous into the organic phase and then associated to form a neutral molecule. The role of the solubility parameter in extraction, and the relation between the solubility of liquid organic substances in water and the partition of complexes, have been discussed. The extraction of simple complexes and of complexes with organic ligands has been discussed using the first model. Partition coefficients have been calculated theoretically and compared with experimental values in some very simple cases. The extraction of ion pairs has been discussed using the partition-association model and the concept of single-ion partition coefficients. (author)

  1. Study on methods and techniques of aeroradiometric weak information extraction for sandstone-hosted uranium deposits based on GIS

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Hou Huiqun

    2005-01-01

    Weak information extraction is one of the important research topics in current sandstone-type uranium prospecting in China. This paper introduces the concept of aeroradiometric weak information extraction, discusses the theories of aeroradiometric weak information formation, and establishes some effective mathematical models for weak information extraction. The models for weak information extraction are implemented on a GIS software platform, and application tests of weak information extraction are completed in known uranium mineralized areas. Research results prove that prospective areas of sandstone-type uranium deposits can be rapidly delineated by extracting aeroradiometric weak information. (authors)

  2. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha; Kalogerakis, Evangelos; Guibas, Leonidas; Koltun, Vladlen

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling

  3. Sieve-based relation extraction of gene regulatory networks from biological literature.

    Science.gov (United States)

    Žitnik, Slavko; Žitnik, Marinka; Zupan, Blaž; Bajec, Marko

    2015-01-01

    Relation extraction is an essential procedure in literature mining. It focuses on extracting semantic relations between parts of text, called mentions. Biomedical literature includes an enormous amount of textual descriptions of biological entities, their interactions and the results of related experiments. To make them available in an explicit, computer-readable format, these relations were at first extracted manually and stored in databases. Manual curation was later replaced with automatic or semi-automatic tools with natural language processing capabilities. The current challenge is the development of information extraction procedures that can directly infer more complex relational structures, such as gene regulatory networks. We develop a computational approach for the extraction of gene regulatory networks from textual data. Our method is designed as a sieve-based system and uses linear-chain conditional random fields and rules for relation extraction. With this method we successfully extracted the sporulation gene regulation network of the bacterium Bacillus subtilis for the information extraction challenge at the BioNLP 2013 conference. To enable the extraction of distant relations using first-order models, we transform the data into skip-mention sequences. We infer multiple models, each of which is able to extract a different relationship type. Following the shared task, we conducted additional analysis using different system settings, reducing the reconstruction error of the bacterial sporulation network from 0.73 to 0.68, measured as the slot error rate between the predicted and the reference network. We observe that all relation extraction sieves contribute to the predictive performance of the proposed approach. Also, features constructed from mention words and their prefixes and suffixes are the most important features for higher extraction accuracy. Analysis of distances between different mention types in the text shows that our choice of transforming
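One plausible reading of the skip-mention transformation (illustrated with hypothetical gene mentions, not the authors' data) is to form offset subsequences so that mentions originally separated by k−1 other mentions become adjacent, and hence visible to a first-order (linear-chain) model:

```python
# Sketch of the skip-mention idea: build k offset subsequences, each taking
# every k-th mention, so distant mention pairs become neighbours that a
# linear-chain CRF can score with its first-order transitions.
def skip_mention_sequences(mentions, k):
    """Return the k offset subsequences of `mentions`, taking every k-th item."""
    return [mentions[start::k] for start in range(k)]

mentions = ["sigA", "spo0A", "sigF", "spoIIE", "sigE"]
print(skip_mention_sequences(mentions, 2))
# -> [['sigA', 'sigF', 'sigE'], ['spo0A', 'spoIIE']]
```

A separate model would then be trained per skip distance k, matching the abstract's statement that multiple models are inferred, each extracting a different relationship type.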

  4. Collaborative Filtering Based on Sequential Extraction of User-Item Clusters

    Science.gov (United States)

    Honda, Katsuhiro; Notsu, Akira; Ichihashi, Hidetomo

    Collaborative filtering is a computational realization of “word-of-mouth” in a network community, in which the items preferred by “neighbors” are recommended. This paper proposes a new item-selection model for extracting user-item clusters from rectangular relation matrices, in which mutual relations between users and items are denoted by the alternatives “liking or not”. A technique for sequential co-cluster extraction from rectangular relational data is given by combining a structural-balancing-based user-item clustering method with a sequential fuzzy cluster extraction approach. The technique is then applied to the collaborative filtering problem, in which some items may be shared by several user clusters.

  5. Individual Building Extraction from TerraSAR-X Images Based on Ontological Semantic Analysis

    Directory of Open Access Journals (Sweden)

    Rong Gui

    2016-08-01

    Accurate building information plays a crucial role in urban planning, human settlement and environmental management. Synthetic aperture radar (SAR) images, which deliver metric-resolution imagery, allow detailed information on urban areas to be analyzed and extracted. In this paper, we consider the problem of extracting individual buildings from SAR images based on domain ontology. By analyzing a building scattering model with different orientations and structures, a building ontology model is set up to express multiple characteristics of individual buildings. Under this semantic expression framework, an object-based SAR image segmentation method is adopted to provide homogeneous image objects, and three categories of image object features are extracted. Semantic rules are implemented by organizing image object features, and an ontological semantic description of individual building objects is formed. Finally, the building primitives are used to detect buildings among the available image objects. Experiments on TerraSAR-X images of Foshan city, China, with a spatial resolution of 1.25 m × 1.25 m, show total extraction rates above 84%. The results indicate that the ontological semantic method can accurately extract flat-roof and gable-roof buildings larger than 250 pixels with different orientations.

  6. Improved Cole parameter extraction based on the least absolute deviation method

    International Nuclear Information System (INIS)

    Yang, Yuxiang; Ni, Wenwen; Sun, Qiang; Wen, He; Teng, Zhaosheng

    2013-01-01

    The Cole function is widely used in bioimpedance spectroscopy (BIS) applications. Fitting measured BIS data to the model and then extracting the Cole parameters (R₀, R∞, α and τ) is common practice. Accurate extraction of the Cole parameters from measured BIS data is of great significance for evaluating the physiological or pathological status of biological tissue. The traditional least-squares (LS)-based curve fitting method for Cole parameter extraction is often sensitive to noise and outliers, and thus non-robust. This paper proposes an improved Cole parameter extraction based on the least absolute deviation (LAD) method. Comprehensive simulation experiments are carried out and the performance of the LAD method is compared with that of the LS method in the presence of outliers, random noise, and both disturbances together. The proposed LAD method exhibits much better robustness under all circumstances, demonstrating that it deserves consideration as a robust alternative to the LS method for Cole parameter extraction. (paper)
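A minimal sketch of LAD-based Cole fitting (with assumed parameter values and a generic derivative-free optimizer; the paper's exact procedure may differ): minimize the sum of absolute deviations of the real and imaginary parts, with τ fitted on a log scale for better conditioning:

```python
import numpy as np
from scipy.optimize import minimize

def cole(w, r0, rinf, alpha, tau):
    """Cole impedance: Z(w) = Rinf + (R0 - Rinf) / (1 + (j*w*tau)^alpha)."""
    return rinf + (r0 - rinf) / (1.0 + (1j * w * tau) ** alpha)

# Synthetic spectrum from assumed "true" parameters, with two gross outliers
# added to exercise the robustness of the LAD criterion.
w = 2 * np.pi * np.logspace(2, 6, 60)          # angular frequencies
true = (800.0, 200.0, 0.85, 2e-5)              # R0, Rinf, alpha, tau (assumed)
z = cole(w, *true)
z[10] += 150.0                                  # outlier on the real part
z[40] -= 120.0j                                 # outlier on the imaginary part

def lad(p):
    """Sum of absolute deviations, real + imaginary; tau = 10**p[3]."""
    zm = cole(w, p[0], p[1], p[2], 10.0 ** p[3])
    return np.sum(np.abs(z.real - zm.real)) + np.sum(np.abs(z.imag - zm.imag))

x0 = (780.0, 190.0, 0.80, -4.5)   # rough initial guess, e.g. from the Nyquist plot
res = minimize(lad, x0, method="Nelder-Mead",
               options={"maxiter": 10000, "xatol": 1e-9, "fatol": 1e-9})
r0, rinf, alpha, tau = res.x[0], res.x[1], res.x[2], 10.0 ** res.x[3]
print(r0, rinf, alpha, tau)
```

The two injected outliers perturb the LAD estimates far less than they would a least-squares fit, which is the property the abstract highlights.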

  7. A new approach to the extraction of single exponential diode model parameters

    Science.gov (United States)

    Ortiz-Conde, Adelmo; García-Sánchez, Francisco J.

    2018-06-01

    A new integration method is presented for the extraction of the parameters of a single-exponential diode model with series resistance from measured forward I-V characteristics. The extraction is performed using auxiliary functions, based on the integration of the data, that make it possible to isolate the effects of each of the model parameters. A differentiation method is also presented for data with a low level of experimental noise. Measured and simulated data are used to verify the applicability of both proposed methods. Physical insight into the validity of the model is also obtained through the proposed graphical determination of the parameters.
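The integration idea can be illustrated on synthetic data. For the forward-bias model I = Is·exp((V − I·Rs)/(n·Vt)), the integral P(V) = ∫I dV satisfies P/I = (Rs/2)·I + n·Vt, so a straight-line fit of P/I against I yields Rs and n·Vt, after which Is follows by substitution. This is a sketch with assumed parameter values, not the paper's data or its exact auxiliary functions:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

# Synthetic forward I-V data for I = Is*exp((V - I*Rs)/(n*Vt)) (assumed params).
Is, n, Rs, Vt = 1e-12, 1.8, 5.0, 0.02585
I = np.logspace(-6, -2, 200)                    # current sweep, A
V = Rs * I + n * Vt * np.log(I / Is)            # corresponding voltage, V

# Auxiliary function: with P(V) = integral of I dV (trapezoidal rule),
# P/I = (Rs/2)*I + n*Vt is linear in I -> slope Rs/2, intercept n*Vt.
P = cumulative_trapezoid(I, V, initial=0.0)
y = P / I

# Fit only the upper decade of currents, where the unknown constant of
# integration (the charge below the first data point) is negligible.
mask = I > 1e-3
slope, intercept = np.polyfit(I[mask], y[mask], 1)
Rs_est, nVt_est = 2.0 * slope, intercept
Is_est = np.exp(np.mean(np.log(I[mask]) - (V[mask] - Rs_est * I[mask]) / nVt_est))
print(Rs_est, nVt_est / Vt, Is_est)
```

Because integration smooths the data, this route is far less noise-sensitive than differentiating I-V curves, which is why the abstract reserves the differentiation method for low-noise data.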

  8. Box-Behnken design based statistical modeling for ultrasound-assisted extraction of corn silk polysaccharide.

    Science.gov (United States)

    Prakash Maran, J; Manikandan, S; Thirugnanasambandham, K; Vigna Nivetha, C; Dinesh, R

    2013-01-30

    In this study, the effect of ultrasound-assisted extraction (UAE) conditions on the yield of polysaccharide from corn silk was studied using a three-factor, three-level Box-Behnken response surface design. Process parameters that affect the efficiency of UAE, namely extraction temperature (40-60 °C), time (10-30 min) and solid-liquid ratio (1:10-1:30 g/ml), were investigated. The results showed that the extraction conditions have significant effects on the extraction yield of polysaccharide. The experimental data were fitted to a second-order polynomial equation using multiple regression analysis, with a high coefficient of determination (R²) of 0.994. An optimization study using Derringer's desirability function methodology was performed, and the optimal conditions based on both individual and combined independent variables (extraction temperature of 56 °C, time of 17 min and solid-liquid ratio of 1:20 g/ml) were determined, with a maximum polysaccharide yield of 6.06%, which was confirmed through validation experiments. Copyright © 2012 Elsevier Ltd. All rights reserved.
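For concreteness, a three-factor Box-Behnken design consists of the twelve edge midpoints of the coded cube plus centre points, and the second-order polynomial is fitted by ordinary least squares. The sketch below uses an assumed "true" coefficient vector (not the paper's fitted model) to generate noise-free yields and recovers it exactly:

```python
import numpy as np
from itertools import combinations

# Box-Behnken design for 3 coded factors: edge midpoints of the cube + centres.
runs = []
for i, j in combinations(range(3), 2):
    for a in (-1, 1):
        for b in (-1, 1):
            x = [0, 0, 0]
            x[i], x[j] = a, b
            runs.append(x)
runs += [[0, 0, 0]] * 3                         # replicated centre points
X = np.array(runs, float)                       # 15 runs x 3 factors

def design_matrix(X):
    """Columns: 1, x1..x3, x1^2..x3^2, x1x2, x1x3, x2x3 (full quadratic)."""
    cols = [np.ones(len(X))] + [X[:, k] for k in range(3)] \
         + [X[:, k] ** 2 for k in range(3)] \
         + [X[:, i] * X[:, j] for i, j in combinations(range(3), 2)]
    return np.column_stack(cols)

# Assumed true quadratic coefficients generating synthetic yields (%).
beta_true = np.array([6.0, 0.4, -0.2, 0.3, -0.5, -0.3, -0.4, 0.1, 0.05, -0.1])
A = design_matrix(X)
y = A @ beta_true
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
r2 = 1 - np.sum((y - A @ beta_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(np.round(beta_hat, 3), r2)
```

With real data the fit would leave residuals (here R² of about 0.994 in the abstract), and a desirability-function search over the fitted surface would then locate the optimal temperature, time and solid-liquid ratio.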

  9. A nonlinear model for frequency dispersion and DC intrinsic parameter extraction for GaN-based HEMT

    Science.gov (United States)

    Nguyen, Tung The-Lam; Kim, Sam-Dong

    2017-11-01

    We propose in this study a practical nonlinear model for AlGaN/GaN high electron mobility transistors (HEMTs) to extract the DC intrinsic transconductance (gmDC), output conductance (gdsDC), and electron mobility from the intrinsic parameter set measured at high frequencies. An excellent agreement in the I-V characteristics of the model, with a fitting error of 0.11%, enables us to successfully extract gmDC, gdsDC, and the total transconductance dispersion. For this model, we also present a reliable analysis scheme in which the frequency dispersion effect due to regional surface states in AlGaN/GaN HEMTs is taken into account under various bias conditions.

  10. Extracting the invariant model from the feedback paths of digital hearing aids

    DEFF Research Database (Denmark)

    Ma, Guilin; Gran, Fredrik; Jacobsen, Finn

    2011-01-01

    environments given a specific type of hearing aids. Based on this observation, a feedback path model that consists of an invariant model and a variant model is proposed. A common-acoustical-pole and zero model-based approach and an iterative least-square search-based approach are used to extract the invariant...... model from a set of impulse responses of the feedback paths. A hybrid approach combining the two methods is also proposed. The general properties of the three methods are studied using artificial datasets, and the methods are cross-validated using the measured feedback paths. The results show...

  11. Parameter Extraction Method for the Electrical Model of a Silicon Photomultiplier

    Science.gov (United States)

    Licciulli, Francesco; Marzocca, Cristoforo

    2016-10-01

    The availability of an effective electrical model, able to accurately reproduce the signals generated by a Silicon Photo-Multiplier coupled to the front-end electronics, is mandatory when the performance of a detection system based on this kind of detector has to be evaluated by means of reliable simulations. We propose a complete extraction procedure able to provide the whole set of parameters involved in a well-known model of the detector, including the substrate ohmic resistance. The technique achieves a very good fit between the simulation results provided by the model and the experimental data, thanks to accurate discrimination between the quenching and substrate resistances, which results in a realistic set of extracted parameters. The extraction procedure has been applied to a commercial device over a wide range of conditions in terms of the input resistance of the front-end electronics and interconnection parasitics. In all the considered situations, very good correspondence has been found between simulations and measurements, especially for the leading edge of the current pulses generated by the detector, which strongly affects the timing performance of the detection system, confirming the effectiveness of the model and the associated parameter extraction technique.

  12. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    Science.gov (United States)

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interactions between proteins or RNA production. This information might enrich current knowledge of drug reactions or the development of certain diseases. Nevertheless, the lack of explicit structure in life-science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has recently attracted community-wide efforts. Most approaches are based on statistical models and require large-scale annotated corpora to estimate model parameters precisely; such corpora are usually difficult to obtain in practice. Employing un-annotated data through semi-supervised learning is therefore a feasible solution for biomedical event extraction and is attracting growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are elaborately and automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in them are used to describe the distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold-standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system and the similarity between sentences might be precisely
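    The core step described above, assigning event annotations to un-annotated sentences by their distance to annotated ones, can be sketched as a nearest-neighbour transfer over hidden-topic vectors. This is a simplified illustration under the assumption of cosine distance over topic distributions; the topic vectors and event labels below are invented, and the paper's distance additionally uses sentence structure.

```python
import math

def cosine(u, v):
    """Cosine similarity between two hidden-topic distributions."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def transfer_annotations(annotated, unannotated):
    """For each un-annotated sentence (topic vector), copy the event
    annotation of the most similar annotated sentence.  'annotated'
    is a list of (topic_vector, event_label) pairs."""
    labels = []
    for vec in unannotated:
        best = max(annotated, key=lambda pair: cosine(vec, pair[0]))
        labels.append(best[1])
    return labels

# hypothetical 3-topic vectors for two annotated sentences
annotated = [([0.8, 0.1, 0.1], "Binding"),
             ([0.1, 0.8, 0.1], "Phosphorylation")]
pseudo = transfer_annotations(annotated, [[0.7, 0.2, 0.1], [0.0, 0.9, 0.1]])
```

    The pseudo-labelled sentences would then be pooled with the annotated corpus for training, as the framework describes.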

  13. Parameters extraction for perovskite solar cells based on Lambert W-function

    Directory of Open Access Journals (Sweden)

    Ge Junyu

    2016-01-01

    Full Text Available The behavior of a solar cell is determined by its device parameters, so it is necessary to extract these parameters to achieve the optimal working condition. Because the five-parameter model of solar cells involves an implicit current-voltage equation, it is difficult to obtain the parameters with conventional methods. In this work, an optimized method is presented to extract the device parameters from actual test data of a photovoltaic cell. Based on the Lambert W-function, an explicit formulation of the model can be deduced. The proposed technique uses a suitable method of selecting sample points, which are used to calculate the values of the model parameters. Comparison with the Quasi-Newton method verifies the accuracy and reliability of this method.
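    The explicit formulation mentioned above can be illustrated with the standard five-parameter single-diode model, I = Iph - I0·(exp((V+I·Rs)/(n·Vt)) - 1) - (V+I·Rs)/Rsh, whose Lambert W rewriting makes I an explicit function of V. The sketch below implements the principal branch of W by Newton iteration so it is self-contained; the parameter values are invented, and this is the textbook identity rather than the paper's sample-point selection procedure.

```python
import math

def lambertw0(x):
    """Principal branch of the Lambert W function for x >= 0,
    solved by Newton iteration on w*exp(w) = x."""
    w = math.log(1.0 + x)  # good starting guess for x >= 0
    for _ in range(50):
        ew = math.exp(w)
        step = (w * ew - x) / (ew * (w + 1.0))
        w -= step
        if abs(step) < 1e-14:
            break
    return w

def diode_current(v, iph, i0, rs, rsh, n, vt=0.02585):
    """Explicit current of the five-parameter single-diode model,
    obtained by rewriting the implicit I-V equation with Lambert W."""
    nvt = n * vt
    arg = (rs * i0 * rsh / (nvt * (rs + rsh))
           * math.exp(rsh * (rs * (iph + i0) + v) / (nvt * (rs + rsh))))
    return (rsh * (iph + i0) - v) / (rs + rsh) - (nvt / rs) * lambertw0(arg)

# hypothetical cell: Iph=5 A, I0=1 nA, Rs=0.1 ohm, Rsh=100 ohm, n=1.3
i = diode_current(v=0.5, iph=5.0, i0=1e-9, rs=0.1, rsh=100.0, n=1.3)
```

    Substituting the returned current back into the implicit equation gives a residual at machine precision, which is the property that makes the explicit form convenient for fitting.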

  14. Hidden discriminative features extraction for supervised high-order time series modeling.

    Science.gov (United States)

    Nguyen, Ngoc Anh Thi; Yang, Hyung-Jeong; Kim, Sunhee

    2016-11-01

    In this paper, an orthogonal Tucker-decomposition-based extraction of high-order discriminative subspaces from a tensor-based time series data structure is presented, named Tensor Discriminative Feature Extraction (TDFE). TDFE employs category information to maximize the between-class scatter and minimize the within-class scatter, extracting optimal hidden discriminative feature subspaces that are simultaneously spanned by every modality for supervised tensor modeling. In this context, the proposed tensor-decomposition method provides the following benefits: i) it reduces dimensionality while robustly mining the underlying discriminative features; ii) it yields effective, interpretable features that lead to improved classification and visualization; and iii) it reduces the processing time during the training stage and the filtering of the projection by solving the generalized eigenvalue problem at each alternation step. Two real third-order tensor structures of time series datasets (an epilepsy electroencephalogram (EEG) modeled as channel × frequency bin × time frame, and microarray data modeled as gene × sample × time) were used for the evaluation of TDFE. The experimental results corroborate the advantages of the proposed method, with average classification accuracies of 98.26% and 89.63% for the epilepsy dataset and the microarray dataset, respectively. These averages represent an improvement over matrix-based algorithms and recent tensor-based discriminant-decomposition approaches; this is especially the case considering the small number of samples used in practice. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. A linear ion optics model for extraction from a plasma ion source

    International Nuclear Information System (INIS)

    Dietrich, J.

    1987-01-01

    A linear ion optics model for ion extraction from a plasma ion source is presented, based on the paraxial equations which account for lens effects, space charge and finite source ion temperature. This model is applied to three- and four-electrode extraction systems with circular apertures. The results are compared with experimental data and numerical calculations in the literature. It is shown that the improved calculations of space charge effects and lens effects allow better agreement to be obtained than in earlier linear optics models. A principal result is that the model presented here describes the dependence of the optimum perveance on the aspect ratio in a manner similar to the nonlinear optics theory. (orig.)

  16. [Extraction of three-dimensional building information from high-resolution satellite imagery based on Barista software].

    Science.gov (United States)

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other applications. Today's commercial high-resolution satellite imagery offers the potential to extract the three-dimensional information of urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery and validated the precision of the extraction based on Barista software. It was shown that extracting three-dimensional building information from high-resolution satellite imagery with Barista software has the advantages of a low demand for professional expertise, broad applicability, simple operation, and high precision. One-pixel-level accuracy of point positioning and height determination could be achieved if the digital elevation model (DEM) and sensor orientation model were of high precision and the off-nadir view angle was favorable.

  17. Object Extraction in Cluttered Environments via a P300-Based IFCE

    Directory of Open Access Journals (Sweden)

    Xiaoqian Mao

    2017-01-01

    Full Text Available One of the fundamental issues for robot navigation is extracting an object of interest from an image. The biggest challenges are how to use a machine to model the objects in which a human is interested and how to extract them quickly and reliably under varying illumination conditions. This article develops a novel method for segmenting an object of interest in a cluttered environment by combining a P300-based brain computer interface (BCI) and an improved fuzzy color extractor (IFCE). The induced P300 potential identifies the corresponding region of interest and obtains the target of interest for the IFCE. The classification results not only represent the human's intent but also deliver the associated seed pixel and fuzzy parameters for extracting the specific objects in which the human is interested. The IFCE is then used to extract the corresponding objects. The results show that the IFCE delivers better performance than a BP network or the traditional FCE. The use of a P300-based IFCE provides a reliable solution for assisting a computer in identifying an object of interest within images taken under varying illumination intensities.
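    The fuzzy color extractor idea can be sketched as region growing from the seed pixel delivered by the BCI: each neighbouring pixel receives a fuzzy membership based on its RGB distance to the seed colour, and pixels above a cut-off are kept. This is a minimal generic sketch, not the IFCE itself; the membership function, its parameters, and the toy image are all invented for illustration.

```python
def membership(pixel, seed, spread):
    """Fuzzy membership of a pixel to the seed colour: 1 at the seed
    colour, falling linearly to 0 at an RGB distance of 'spread'."""
    d = sum((a - b) ** 2 for a, b in zip(pixel, seed)) ** 0.5
    return max(0.0, 1.0 - d / spread)

def extract_object(image, seed_xy, spread=60.0, cut=0.5):
    """Flood-fill from the seed pixel, keeping 4-connected neighbours
    whose fuzzy membership to the seed colour exceeds 'cut'."""
    h, w = len(image), len(image[0])
    seed_color = image[seed_xy[1]][seed_xy[0]]
    keep, stack = set(), [seed_xy]
    while stack:
        x, y = stack.pop()
        if (x, y) in keep or not (0 <= x < w and 0 <= y < h):
            continue
        if membership(image[y][x], seed_color, spread) < cut:
            continue
        keep.add((x, y))
        stack.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
    return keep

# toy 4x4 image: red object in the left half, blue background right
red, blue = (200, 30, 30), (20, 20, 180)
img = [[red, red, blue, blue] for _ in range(4)]
obj = extract_object(img, seed_xy=(0, 0))
```

    In the paper, the P300 classification supplies both the seed pixel and the fuzzy parameters (here the hypothetical `spread` and `cut`) per object.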

  18. A Transform-Based Feature Extraction Approach for Motor Imagery Tasks Classification

    Science.gov (United States)

    Khorshidtalab, Aida; Mesbah, Mostefa; Salami, Momoh J. E.

    2015-01-01

    In this paper, we present a new motor imagery classification method in the context of electroencephalography (EEG)-based brain-computer interface (BCI). This method uses a signal-dependent orthogonal transform, referred to as linear prediction singular value decomposition (LP-SVD), for feature extraction. The transform defines the mapping as the left singular vectors of the LP coefficient filter impulse response matrix. Using a logistic tree-based model classifier, the extracted features are classified into one of four motor imagery movements. The proposed approach was first benchmarked against two related state-of-the-art feature extraction approaches, namely discrete cosine transform (DCT)- and adaptive autoregressive (AAR)-based methods. By achieving an accuracy of 67.35%, the LP-SVD approach outperformed the other approaches by large margins (25% compared with DCT- and 6% compared with AAR-based methods). To further improve the discriminatory capability of the extracted features and reduce the computational complexity, we enlarged the extracted feature subset by incorporating two extra features, namely the Q- and Hotelling's T(2) statistics of the transformed EEG, and introduced a new EEG channel selection method. The performance of EEG classification based on the expanded feature set and channel selection method was compared with that of a number of state-of-the-art classification methods previously reported with the BCI IIIa competition data set. Our method came second with an average accuracy of 81.38%. PMID:27170898

  19. Towards an Ontology for the Global Geodynamics Project: Automated Extraction of Resource Descriptions from an XML-Based Data Model

    Science.gov (United States)

    Lumb, L. I.; Aldridge, K. D.

    2005-12-01

    Using the Earth Science Markup Language (ESML), an XML-based data model for the Global Geodynamics Project (GGP) was recently introduced [Lumb & Aldridge, Proc. HPCS 2005, Kotsireas & Stacey, eds., IEEE, 2005, 216-222]. This data model possesses several key attributes: it makes use of XML schema; supports semi-structured ASCII-format files; includes Earth Science affinities; and is on track for compliance with emerging Grid computing standards (e.g., the Global Grid Forum's Data Format Description Language, DFDL). Favorable attributes notwithstanding, metadata (i.e., data about data) was identified [Lumb & Aldridge, 2005] as a key challenge for progress in enabling the GGP for Grid computing. Even in projects of small-to-medium scale like the GGP, the manual introduction of metadata has the potential to be the rate-determining metric for progress. Fortunately, an automated approach for metadata introduction has recently emerged. Based on Gleaning Resource Descriptions from Dialects of Languages (GRDDL, http://www.w3.org/2004/01/rdxh/spec), this bottom-up approach allows for the extraction of Resource Description Format (RDF) representations from the XML-based data model (i.e., the ESML representation of GGP data) subject to rules of transformation articulated via eXtensible Stylesheet Language Transformations (XSLT). In addition to introducing relationships into the GGP data model, and thereby addressing the metadata requirement, the syntax and semantics of RDF comprise a requisite for a GGP ontology - i.e., ``the common words and concepts (the meaning) used to describe and represent an area of knowledge'' [Daconta et al., The Semantic Web, Wiley, 2003]. After briefly reviewing the XML-based model for the GGP, attention focuses on the automated extraction of an RDF representation via GRDDL with XSLT-delineated templates. This bottom-up approach, in tandem with a top-down approach based on the Protege integrated development environment for ontologies (http
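    The extraction step described above, turning an XML data description into RDF-style metadata, can be sketched in miniature: flatten an XML fragment into (subject, predicate, object) triples, the shape an RDF serialization would carry. GRDDL proper applies XSLT transforms; this stdlib sketch only illustrates the idea, and the ESML-like element and attribute names below are invented, not taken from the actual GGP schema.

```python
import xml.etree.ElementTree as ET

# hypothetical ESML-like fragment (names invented for illustration)
ESML_SNIPPET = """
<dataset id="ggp-station-01">
  <parameter name="gravity" units="nm/s^2"/>
  <parameter name="pressure" units="hPa"/>
</dataset>
"""

def xml_to_triples(xml_text):
    """Flatten an XML description into (subject, predicate, object)
    triples, the basic statement form of an RDF graph."""
    root = ET.fromstring(xml_text)
    subject = root.get("id")
    triples = []
    for child in root:
        for attr, value in child.attrib.items():
            triples.append((subject, "%s/%s" % (child.tag, attr), value))
    return triples

triples = xml_to_triples(ESML_SNIPPET)
```

    In the GRDDL setting, the flattening rules would live in an XSLT stylesheet referenced from the document rather than in code.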

  20. Zener Diode Compact Model Parameter Extraction Using Xyce-Dakota Optimization.

    Energy Technology Data Exchange (ETDEWEB)

    Buchheit, Thomas E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wilcox, Ian Zachary [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sandoval, Andrew J [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reza, Shahed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    This report presents a detailed process of compact-model parameter extraction for DC-circuit Zener diodes. Following the traditional approach to Zener diode parameter extraction, a circuit-model representation is defined and then used to capture the different operational regions of a real diode's electrical behavior. The circuit model contains 9 parameters represented by resistors and characteristic diodes as circuit-model elements. The process of initial parameter extraction, the identification of parameter values for the circuit-model elements, is presented in a way that isolates the dependencies between certain electrical parameters and highlights both the empirical nature of the extraction and the portions of the real diode's physical behavior that the parameters are intended to represent. Optimization of the parameters, a necessary part of a robust parameter extraction process, is demonstrated using a 'Xyce-Dakota' workflow, discussed in more detail in the report. Among other realizations during this systematic approach to electrical model parameter extraction, non-physical solutions are possible and can be difficult to avoid because of the interdependencies between the different parameters. The process steps described are fairly general and can be leveraged for other types of semiconductor device model extractions. Also included in the report are recommendations for experimental setups for generating optimal datasets for model extraction, and the Parameter Identification and Ranking Table (PIRT) for Zener diodes.

  1. Feature extraction for face recognition via Active Shape Model (ASM) and Active Appearance Model (AAM)

    Science.gov (United States)

    Iqtait, M.; Mohamad, F. S.; Mamat, M.

    2018-03-01

    Biometrics is a pattern recognition approach used for the automatic recognition of persons based on the characteristics and features of an individual. Face recognition with a high recognition rate is still a challenging task, usually accomplished in three phases: face detection, feature extraction, and expression classification. Precise and robust location of trait points is a complicated and difficult issue in face recognition. Cootes proposed the Multi-Resolution Active Shape Model (ASM) algorithm, which can extract a specified shape accurately and efficiently. Furthermore, as an improvement of ASM, the Active Appearance Model (AAM) algorithm was proposed to extract both the shape and the texture of a specified object simultaneously. In this paper we give more details about the two algorithms and present the results of experiments testing their performance on one dataset of faces. We found that ASM is faster and achieves more accurate trait point location than AAM, but AAM achieves a better match to the texture.

  2. 21 CFR 172.585 - Sugar beet extract flavor base.

    Science.gov (United States)

    2010-04-01

    21 Food and Drugs (2010-04-01 edition), Flavoring Agents and Related Substances, § 172.585 Sugar beet extract flavor base: Sugar beet extract flavor base may be safely used in food in accordance with the provisions of this section. (a...

  3. Fault feature extraction method based on local mean decomposition Shannon entropy and improved kernel principal component analysis model

    Directory of Open Access Journals (Sweden)

    Jinlu Sheng

    2016-07-01

    Full Text Available To effectively extract the typical features of a bearing, a new method is proposed that combines local mean decomposition Shannon entropy with an improved kernel principal component analysis model. First, features are extracted by a time-frequency domain method, local mean decomposition, and the Shannon entropy is used to process the original separated product functions so as to obtain the original features. However, the extracted features still contain superfluous information, so a nonlinear multi-feature processing technique, kernel principal component analysis, is introduced to fuse the characteristics. The kernel principal component analysis is improved by a weighting factor. The extracted features were input to a Morlet wavelet kernel support vector machine to obtain a classification model of the bearing running state, by which the running state was identified. Both test cases and an actual case were analyzed.
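    The Shannon entropy step applied to the separated product functions can be sketched as follows: normalize a component's sample energies into a probability distribution and compute H = -Σ p·log(p). The local mean decomposition itself is out of scope here, so placeholder components stand in for product functions; a flat, noise-like component spreads its energy evenly (high entropy) while an impulsive, fault-like one concentrates it (low entropy).

```python
import math

def shannon_entropy(signal):
    """Shannon entropy of a signal's normalized energy distribution:
    p_i = x_i^2 / sum(x^2),  H = -sum(p_i * log(p_i))."""
    energy = [x * x for x in signal]
    total = sum(energy)
    h = 0.0
    for e in energy:
        p = e / total
        if p > 0.0:
            h -= p * math.log(p)
    return h

# placeholder 'product functions': flat vs. impulsive components
flat = [1.0] * 64
impulse = [0.0] * 63 + [1.0]
```

    Entropy values like these, one per product function, would form the raw feature vector that the improved kernel PCA then fuses.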

  4. Modeling and optimization of a utility system containing multiple extractions steam turbines

    International Nuclear Information System (INIS)

    Luo, Xianglong; Zhang, Bingjian; Chen, Ying; Mo, Songping

    2011-01-01

    Complex turbines with multiple controlled and/or uncontrolled extractions are widely used in the processing industry and in cogeneration plants to provide steam at different levels, electric power, and driving power. To characterize their thermodynamic behavior under varying conditions, nonlinear mathematical models are developed based on energy balances, thermodynamic principles, and semi-empirical equations. First, the complex turbine is decomposed at the controlled extraction stages into several simple turbines modeled in series. The turbine hardware model (THM) concept is applied to predict the isentropic efficiency of the decomposed simple turbines, and Stodola's formulation is used to simulate the uncontrolled extraction steam parameters. The thermodynamic properties of steam and water are regressed through linearization or piecewise linearization. Second, the simulated results of the proposed model are compared with the data in the working-condition diagram provided by the manufacturer over a wide range of operations. The simulation yields small deviations from the diagram data, with a maximum modeling error of 0.87% among the seven compared operating conditions. Last, an optimization model of a utility system containing multiple extraction turbines is established and a detailed case is analyzed. Compared with the conventional operation strategy, a maximum of 5.47% of the total operation cost is saved using the proposed optimization model. -- Highlights: → We develop a complete simulation model for steam turbines with multiple extractions. → We test the simulation model using the performance data of commercial turbines. → The simulation error of electric power generation is no more than 0.87%. → We establish a utility system operational optimization model. → The optimal industrial operation scheme features a 5.47% cost saving.
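    Stodola's formulation for a stage group can be sketched in its common constant-temperature ellipse form, m = k·sqrt(p_in² - p_out²): the coefficient k is calibrated at one design point and then predicts off-design flow through the uncontrolled extraction stages. This is a generic textbook sketch rather than the paper's full model; the flow and pressure values (t/h, MPa) are invented.

```python
import math

def stodola_constant(m_design, p_in_design, p_out_design):
    """Calibrate the Stodola coefficient from one design point of a
    stage group:  m = k * sqrt(p_in^2 - p_out^2)."""
    return m_design / math.sqrt(p_in_design ** 2 - p_out_design ** 2)

def off_design_flow(k, p_in, p_out):
    """Predict mass flow through the stage group off-design."""
    return k * math.sqrt(p_in ** 2 - p_out ** 2)

# hypothetical design point: 100 t/h between 4.0 and 1.0 MPa
k = stodola_constant(m_design=100.0, p_in_design=4.0, p_out_design=1.0)
m_off = off_design_flow(k, p_in=3.5, p_out=1.0)  # reduced inlet pressure
```

    Inverting the same relation for p_out, given a measured flow, is how the uncontrolled extraction pressures are simulated under varying conditions.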

  5. Reaction Wheel Disturbance Model Extraction Software - RWDMES

    Science.gov (United States)

    Blaurock, Carl

    2009-01-01

    The RWDMES is a tool for modeling the disturbances imparted on spacecraft by spinning reaction wheels. Reaction wheels are usually the largest disturbance source on a precision pointing spacecraft, and can be the dominant source of pointing error. Accurate knowledge of the disturbance environment is critical to accurate prediction of the pointing performance. In the past, it has been difficult to extract an accurate wheel disturbance model, since the forcing mechanisms are difficult to model physically and the forcing amplitudes are filtered by the dynamics of the reaction wheel. RWDMES captures the wheel-induced disturbances using a hybrid physical/empirical model that is extracted directly from measured forcing data. The empirical models capture the tonal forces that occur at harmonics of the spin rate, and the broadband forces that arise from random effects. The empirical forcing functions are filtered by a physical model of the wheel structure that includes spin-rate-dependent moments (gyroscopic terms). The resulting hybrid model creates a highly accurate prediction of wheel-induced forces. It accounts for variation in disturbance frequency, as well as the shifts in structural amplification by the whirl modes, as the spin rate changes. This software provides a point-and-click environment for producing accurate models with minimal user effort. Where conventional approaches may take weeks to produce a model of variable quality, RWDMES can create a demonstrably high-accuracy model in two hours. The software consists of a graphical user interface (GUI) that enables the user to specify all analysis parameters, to evaluate analysis results, and to iteratively refine the model. Underlying algorithms automatically extract disturbance harmonics, initialize and tune harmonic models, and initialize and tune broadband noise models. The component steps are described in the RWDMES user's guide and include: converting time domain data to waterfall PSDs (power spectral

  6. Analysis of the flood extent extraction model and the natural flood influencing factors: A GIS-based and remote sensing analysis

    International Nuclear Information System (INIS)

    Lawal, D U; Matori, A N; Yusuf, K W; Hashim, A M; Balogun, A L

    2014-01-01

    Serious floods hit the State of Perlis in 2005, 2010, and 2011. Perlis is situated in the northern part of Peninsular Malaysia. The floods caused great damage to property and human lives. Various methods have been used in an attempt to reduce flood risk and damage to an optimum level by identifying flood-vulnerable zones. The purpose of this paper is to develop a flood extent extraction model based on the Minimum Distance Algorithm and to overlay it with the natural flood-influencing factors considered herein, in order to examine the effect of each factor on flood generation. A GIS spatial database was created from a geological map, a SPOT satellite image, and the topographical map. An attribute database was likewise created from field investigations and reports of historical flood areas in the study area. The results show a strong correlation between the flood extent extraction model and the flood factors.
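    The Minimum Distance Algorithm at the heart of the extraction model assigns each pixel to the class whose mean spectral vector is nearest in Euclidean distance. The sketch below illustrates that rule on invented band means; a real run would train the class means from flooded and dry reference polygons in the SPOT image.

```python
def min_distance_classify(pixel, class_means):
    """Assign a pixel (spectral vector) to the class with the nearest
    mean vector -- the core of minimum-distance classification."""
    def dist2(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(class_means, key=lambda name: dist2(pixel, class_means[name]))

# hypothetical 3-band class means trained from flooded / dry polygons
means = {"water": [30.0, 40.0, 20.0], "land": [90.0, 110.0, 80.0]}
label = min_distance_classify([35.0, 45.0, 25.0], means)
```

    Applying the rule to every pixel yields the binary flood-extent layer that the paper then overlays with the flood-influencing factors.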

  7. FEATURE EXTRACTION FOR EMG BASED PROSTHESES CONTROL

    Directory of Open Access Journals (Sweden)

    R. Aishwarya

    2013-01-01

    Full Text Available The control of a prosthetic limb is more effective if it is based on surface electromyogram (SEMG) signals from remnant muscles. The analysis of SEMG signals depends on a number of factors, such as amplitude as well as time- and frequency-domain properties. Time series analysis using an autoregressive (AR) model, and the mean frequency, which is tolerant to white Gaussian noise, are used as feature extraction techniques. The EMG histogram is used as another feature vector, which was seen to give a more distinct classification. The work was done with an SEMG dataset obtained from the NINAPRO database, a resource for the biorobotics community. Eight classes of hand movements (hand open, hand close, wrist extension, wrist flexion, pointing index, ulnar deviation, thumbs up, and thumb opposite to little finger) are taken into consideration and feature vectors are extracted. The feature vectors can be given to an artificial neural network for further classification in controlling the prosthetic arm, which is not dealt with in this paper.
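    The two feature extractors named above can be sketched for a single channel: AR coefficients from the Yule-Walker equations (shown for order 2, where the 2x2 system has a closed-form solution) and an amplitude histogram. The AR order, bin count, and synthetic "SEMG" signal are illustrative choices, not the paper's settings.

```python
import random

def autocorr(x, lag):
    n = len(x)
    return sum(x[i] * x[i + lag] for i in range(n - lag)) / n

def ar2_yule_walker(x):
    """Order-2 AR coefficients from the Yule-Walker equations:
    [r0 r1; r1 r0] [a1 a2]' = [r1 r2]'  (solved in closed form)."""
    r0, r1, r2 = autocorr(x, 0), autocorr(x, 1), autocorr(x, 2)
    det = r0 * r0 - r1 * r1
    a1 = (r0 * r1 - r1 * r2) / det
    a2 = (r0 * r2 - r1 * r1) / det
    return a1, a2

def histogram_feature(x, bins=8):
    """EMG histogram: counts of samples falling in equal-width bins."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in x:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    return counts

# synthetic AR(2) 'SEMG' signal: x[n] = 0.5 x[n-1] - 0.25 x[n-2] + e[n]
random.seed(42)
x = [0.0, 0.0]
for _ in range(20000):
    x.append(0.5 * x[-1] - 0.25 * x[-2] + random.gauss(0.0, 1.0))
a1, a2 = ar2_yule_walker(x)
```

    Concatenating the AR coefficients and histogram counts across channels gives the feature vector that would feed the classifier.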

  8. Theoretical justification of space-mapping-based modeling utilizing a database and on-demand parameter extraction

    DEFF Research Database (Denmark)

    Koziel, Slawomir; Bandler, John W.; Madsen, Kaj

    2006-01-01

    the surrogate, we perform parameter extraction with weighting coefficients dependent on the distance between the point of interest and base points. We provide theoretical results showing that the new methodology can assure any accuracy that is required (provided the base set is dense enough), which...

  9. CURB-BASED STREET FLOOR EXTRACTION FROM MOBILE TERRESTRIAL LIDAR POINT CLOUD

    Directory of Open Access Journals (Sweden)

    S. Ibrahim

    2012-07-01

    Full Text Available Mobile terrestrial laser scanners (MTLS) produce huge 3D point clouds describing the terrestrial surface, from which objects such as different street furniture can be generated. Extraction and modelling of the street curb and the street floor from MTLS point clouds is important for many applications, such as right-of-way asset inventory, road maintenance, and city planning. The proposed pipeline for curb and street floor extraction consists of a sequence of five steps: organizing the 3D point cloud and nearest-neighbour search; 3D density-based segmentation to segment the ground; morphological analysis to refine the ground segment; derivative-of-Gaussian filtering to detect the curb; and solving the travelling salesman problem to form a closed polygon of the curb, followed by a point-in-polygon test to extract the street floor. Two mobile laser scanning datasets of different scenes were tested with the proposed pipeline, and the extracted curb and street floor were evaluated against ground-truth data. The obtained detection rates for the extracted street floor are 95% and 96.53% for the two datasets. This study presents a novel approach to the detection and extraction of the road curb and street floor from unorganized 3D point clouds captured by MTLS, utilizing only the 3D coordinates of the point cloud.
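    The final step of the pipeline, the point-in-polygon test against the closed curb polygon, is commonly implemented by ray casting: cast a horizontal ray from the point and count edge crossings; an odd count means inside. The sketch below is a generic implementation on an invented rectangular "curb" polygon, not the paper's code, and like all ray-casting variants it leaves points exactly on an edge implementation-defined.

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: cast a horizontal ray from pt to +x and count
    crossings with polygon edges; an odd count means inside."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

# hypothetical closed curb polygon in (x, y) ground coordinates
curb = [(0, 0), (10, 0), (10, 6), (0, 6)]
```

    In the pipeline, every ground point classified in the earlier steps would be kept as street floor if this test returns True.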

  10. Predictive model for ionic liquid extraction solvents for rare earth elements

    International Nuclear Information System (INIS)

    Grabda, Mariusz; Oleszek, Sylwia; Panigrahi, Mrutyunjay; Kozak, Dmytro; Shibata, Etsuro; Nakamura, Takashi; Eckert, Franck

    2015-01-01

    The purpose of our study was to select the most effective ionic liquid extraction solvents for dysprosium(III) fluoride using a theoretical approach. The Conductor-like Screening Model for Real Solvents (COSMO-RS), based on quantum chemistry and the statistical thermodynamics of predefined DyF3-ionic liquid systems, was applied to reach this target. Chemical potentials of the salt were predicted in 4,400 different ionic liquids. On the basis of these predictions, a set of ionic-liquid ions producing a significant decrease in the chemical potentials was selected. Considering the calculated physicochemical properties (hydrophobicity, viscosity) of the ionic liquids containing these specific ions, the most effective solvents for liquid-liquid extraction of DyF3 were proposed. The obtained results indicate that the COSMO-RS approach can be applied to quickly screen the affinity of any rare earth element for a large number of ionic liquid systems before extensive experimental tests

  11. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling a simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavio...

  12. Morphological operation based dense houses extraction from DSM

    OpenAIRE

    Li, Y.; Zhu, L.; Tachibana, K.; Shimamura, H.

    2014-01-01

    This paper presents a method of reshaping and extracting markers and masks of dense houses from a DSM based on mathematical morphology (MM). Houses in a digital surface model (DSM) are almost joined together in high-density housing areas, and most segmentation methods cannot completely separate them. We propose to first label the markers of the buildings and then segment them into masks by watershed. To avoid detecting more than one marker for a house or no marker at all d...

  13. Design of guanidinium ionic liquid based microwave-assisted extraction for the efficient extraction of Praeruptorin A from Radix peucedani.

    Science.gov (United States)

    Ding, Xueqin; Li, Li; Wang, Yuzhi; Chen, Jing; Huang, Yanhua; Xu, Kaijia

    2014-12-01

    A series of novel tetramethylguanidinium ionic liquids and hexaalkylguanidinium ionic liquids have been synthesized based on 1,1,3,3-tetramethylguanidine. The structures of the ionic liquids were confirmed by ¹H NMR spectroscopy and mass spectrometry. A green guanidinium ionic liquid based microwave-assisted extraction method has been developed with these guanidinium ionic liquids for the effective extraction of Praeruptorin A from Radix peucedani. After extraction, reversed-phase high-performance liquid chromatography with UV detection was employed for the analysis of Praeruptorin A. Several significant operating parameters were systematically optimized by single-factor and L9 (3⁴) orthogonal array experiments. The amount of Praeruptorin A extracted by [1,1,3,3-tetramethylguanidine]CH₂CH(OH)COOH is the highest, reaching 11.05 ± 0.13 mg/g. Guanidinium ionic liquid based microwave-assisted extraction presents unique advantages in Praeruptorin A extraction compared with guanidinium ionic liquid based maceration extraction, guanidinium ionic liquid based heat reflux extraction and guanidinium ionic liquid based ultrasound-assisted extraction. The precision, stability, and repeatability of the process were investigated. The mechanisms of guanidinium ionic liquid based microwave-assisted extraction were researched by scanning electron microscopy and IR spectroscopy. All the results show that guanidinium ionic liquid based microwave-assisted extraction has a huge potential in the extraction of bioactive compounds from complex samples. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
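The L9 (3⁴) orthogonal-array optimization mentioned above can be sketched as follows. The design matrix is the standard Taguchi L9 array for four factors at three levels; the yields and the "pick the level with the highest mean response" analysis are illustrative, not the paper's data.

```python
# Standard L9(3^4) orthogonal array, levels coded 0..2 for four factors
# (e.g. solvent concentration, liquid/solid ratio, time, power -- names illustrative).
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]

def best_levels(yields):
    """Taguchi-style mean-response analysis: for each factor, average the
    yield over runs at each level and return the level with the highest mean."""
    best = []
    for factor in range(4):
        means = []
        for level in range(3):
            vals = [y for run, y in zip(L9, yields) if run[factor] == level]
            means.append(sum(vals) / len(vals))
        best.append(means.index(max(means)))
    return best
```

Because the array is balanced (each level appears three times per column and every pair of columns contains all nine level combinations), nine runs suffice to rank the levels of all four factors.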

  14. Research on Crowdsourcing Emergency Information Extraction Based on Events' Frame

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot completely identify emergency geographic information, and neither provides an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, designed to solve the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and the shortest path algorithm, allowing toponym pieces to be joined into a full address. AEMEI analyzes the results of the emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that the event frame technology can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that have occurred in the past and make arrangements ahead of schedule for defense and disaster reduction. The technology decreases the number of casualties and the property damage in the country and the world, which is of great significance to the state and society.
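The address-joining step of CARM can be loosely sketched as follows. This is a toy interpretation, not the paper's algorithm: recognized toponym pieces are ordered by a hypothetical administrative hierarchy (looked up in a small gazetteer) and concatenated into one full address.

```python
# Hypothetical administrative levels, ordered from coarse to fine.
LEVELS = ["province", "city", "district", "street"]

def join_address(pieces, gazetteer):
    """Order recognized toponym pieces by administrative level and join
    them into one full address string; pieces not in the gazetteer are dropped."""
    known = [p for p in pieces if p in gazetteer]
    known.sort(key=lambda p: LEVELS.index(gazetteer[p]))
    return " ".join(known)

# Toy gazetteer mapping toponyms to their administrative level (illustrative).
gaz = {"Sichuan": "province", "Chengdu": "city", "Wuhou": "district"}
```

For example, the scattered pieces `["Wuhou", "Sichuan", "Chengdu", "near"]` would be reassembled as `"Sichuan Chengdu Wuhou"`; the real CARM additionally uses a shortest-path search over candidate toponyms.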

  15. Coupling a transient solvent extraction module with the separations and safeguards performance model.

    Energy Technology Data Exchange (ETDEWEB)

    DePaoli, David W. (Oak Ridge National Laboratory, Oak Ridge, TN); Birdwell, Joseph F. (Oak Ridge National Laboratory, Oak Ridge, TN); Gauld, Ian C. (Oak Ridge National Laboratory, Oak Ridge, TN); Cipiti, Benjamin B.; de Almeida, Valmor F. (Oak Ridge National Laboratory, Oak Ridge, TN)

    2009-10-01

    A number of codes have been developed in the past for safeguards analysis, but many are dated, and no single code is able to cover all aspects of materials accountancy, process monitoring, and diversion scenario analysis. The purpose of this work was to integrate a transient solvent extraction simulation module, developed at Oak Ridge National Laboratory, with the Separations and Safeguards Performance Model (SSPM), developed at Sandia National Laboratory, as a first step toward creating a more versatile design and evaluation tool. The SSPM was designed for materials accountancy and process monitoring analyses, but previous versions of the code have included limited detail on the chemical processes, including chemical separations. The transient solvent extraction model is based on the ORNL SEPHIS code approach to consider solute build-up in a bank of contactors in the PUREX process. Combined, these capabilities yield a more robust transient separations and safeguards model for evaluating safeguards system design. This coupling and initial results are presented. In addition, some observations toward further enhancement of separations and safeguards modeling based on this effort are provided, including: items to be addressed in integrating legacy codes, additional improvements needed for a fully functional solvent extraction module, and recommendations for future integration of other chemical process modules.
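The transient build-up of solute in a counter-current bank of contactors can be sketched with a heavily simplified stage model (not the SEPHIS formulation): equal aqueous/organic hold-up per stage, a constant distribution coefficient `d`, ideal equilibration each time step, and streams advancing one stage per step.

```python
def simulate_bank(n_stages, d, aq_feed, steps):
    """Toy transient counter-current mixer-settler bank.

    Aqueous feed enters stage 0 and flows toward stage n-1; fresh solvent
    enters stage n-1 and flows toward stage 0. Each step the streams advance
    one stage and every stage equilibrates ideally (y = d * x).
    Returns the final phase profiles and a history of (product, raffinate).
    """
    aq = [0.0] * n_stages
    org = [0.0] * n_stages
    history = []
    for _ in range(steps):
        product, raffinate = org[0], aq[-1]   # streams leaving the bank
        aq = [aq_feed] + aq[:-1]              # aqueous advances 0 -> n-1
        org = org[1:] + [0.0]                 # fresh solvent enters at n-1
        for i in range(n_stages):             # ideal equilibration per stage:
            total = aq[i] + org[i]            # x + y = total, y = d * x
            aq[i] = total / (1.0 + d)
            org[i] = d * aq[i]
        history.append((product, raffinate))
    return aq, org, history
```

At steady state the exiting organic product plus aqueous raffinate must balance the feed, and with a high distribution coefficient most of the solute leaves in the organic stream; the transient approach to that steady state is what a coupled safeguards model would monitor.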

  16. Knickzone Extraction Tool (KET) – A new ArcGIS toolset for automatic extraction of knickzones from a DEM based on multi-scale stream gradients

    Directory of Open Access Journals (Sweden)

    Zahra Tuba

    2017-04-01

    Full Text Available Extraction of knickpoints or knickzones from a Digital Elevation Model (DEM) has gained immense significance owing to the increasing implications of knickzones for landform development. However, existing methods for knickzone extraction tend to be subjective or require time-intensive data processing. This paper describes the proposed Knickzone Extraction Tool (KET), a new raster-based Python script deployed in the form of an ArcGIS toolset that automates the process of knickzone extraction and is both fast and user-friendly. The KET is based on multi-scale analysis of slope gradients along a river course, where any locally steep segment (knickzone) can be extracted as an anomalously high local gradient. We also conducted a comparative analysis of the KET and other contemporary knickzone identification techniques. The relationship between knickzone distribution and morphometric characteristics is also examined through a case study of a mountainous watershed in Japan.
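The "anomalously high local gradient" criterion can be sketched in a few lines (a one-scale toy, not the KET itself): compute window-averaged gradients along a longitudinal profile and flag segments whose gradient exceeds the mean by more than k standard deviations.

```python
def knickzones(profile, window, k=1.0):
    """Flag profile indices whose window-averaged downstream gradient is
    anomalously steep (greater than mean + k * std of all local gradients).

    profile: elevations ordered downstream; window: cells per gradient window.
    """
    grads = []
    for i in range(len(profile) - window):
        grads.append((profile[i] - profile[i + window]) / window)  # drop per cell
    mean = sum(grads) / len(grads)
    std = (sum((g - mean) ** 2 for g in grads) / len(grads)) ** 0.5
    return [i for i, g in enumerate(grads) if g > mean + k * std]
```

On a synthetic profile with a gentle 0.5 m/cell slope interrupted by a 5 m/cell step, only the indices around the step are returned; the KET repeats this at multiple window sizes to make the detection scale-aware.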

  17. Modeling of Pu(IV) extraction and HNO3 speciation in nuclear fuel reprocessing

    International Nuclear Information System (INIS)

    De-Sio, S.

    2012-01-01

    The PUREX process is a solvent extraction method dedicated to the reprocessing of irradiated nuclear fuel in order to recover pure uranium and plutonium from aqueous solutions of concentrated nitric acid. Tri-n-butyl phosphate (TBP) is used as the extractant in the organic phase. The aim of this thesis work was to improve the modeling of liquid-liquid extraction media in nuclear fuel reprocessing. First, Raman and ¹⁴N NMR measurements, coupled with theoretical calculations based on simple solutions theory and BIMSA modeling, were performed in order to get a better understanding of nitric acid dissociation in binary and ternary solutions. Then, Pu(IV) speciation in TBP after extraction from low nitric acid concentrations was investigated by EXAFS and vis-NIR spectroscopies. We were able to show evidence of the extraction of Pu(IV) hydrolyzed species into the organic phase. A new structural study was conducted on An(VI)/TBP and An(IV)/TBP complexes by coupling EXAFS measurements with DFT calculations. Finally, extraction isotherm modeling was performed on the Pu(IV)/HNO₃/H₂O/TBP 30%/dodecane system (with Pu at tracer scale) by taking into account deviation from ideal behaviour in both organic and aqueous phases. The best modeling was obtained when considering three plutonium(IV) complexes in the organic phase: Pu(OH)₂(NO₃)₂(TBP)₂, Pu(NO₃)₄(TBP)₂ and Pu(NO₃)₄(TBP)₃. (author) [fr

  18. Predictive thermodynamic models for liquid--liquid extraction of single, binary and ternary lanthanides and actinides

    International Nuclear Information System (INIS)

    Hoh, Y.C.

    1977-03-01

    Chemically based thermodynamic models to predict the distribution coefficients and the separation factors for the liquid-liquid extraction of lanthanides by organophosphorus compounds were developed by assuming that the quotient of the activity coefficients of each species varies only slightly with concentration, by expressing the stoichiometric stability constants of the aqueous lanthanide or actinide complexes as degrees of formation, and by making use of the extraction mechanism and the equilibrium constant for the extraction reaction. For a single-component system, the thermodynamic model equations that predict the distribution coefficients depend on the free organic-phase concentration, the equilibrated ligand and hydrogen ion concentrations, the degree of formation, and the extraction mechanism. For a binary-component system, the thermodynamic model equation that predicts the separation factor is the same for all cases. This model equation depends on the degrees of formation of each species in the binary system and can be used in a ternary-component system to predict the separation factors of the solutes relative to each other.
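The skeleton of such a model can be written down compactly. In this minimal sketch (illustrative constants, not the thesis' equations), the distribution coefficient follows a mass-action form D = K_ex·[E]ⁿ/α, where [E] is the free extractant concentration, n the solvation number, and α a lumped "degree of formation" term for non-extractable aqueous complexes; the separation factor is then simply the ratio of two distribution coefficients.

```python
def distribution_coefficient(k_ex, free_extractant, n, alpha_aq=1.0):
    """Mass-action distribution coefficient: D = K_ex * [E]^n / alpha_aq.

    alpha_aq >= 1 lumps aqueous-phase complexation (degree of formation of
    species that cannot be extracted); alpha_aq = 1 means no side reactions.
    """
    return k_ex * free_extractant ** n / alpha_aq

def separation_factor(d1, d2):
    """Separation factor beta = D1 / D2 for two co-extracted solutes."""
    return d1 / d2
```

With two solutes sharing the same mechanism (same n, same [E]), the separation factor reduces to the ratio of their equilibrium constants times the ratio of their aqueous-side formation terms, which is why the binary model equation has the same form "for all cases".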

  19. Extracting Models in Single Molecule Experiments

    Science.gov (United States)

    Presse, Steve

    2013-03-01

    Single molecule experiments can now monitor the journey of a protein from its assembly near a ribosome to its proteolytic demise. Ideally, all single molecule data should be self-explanatory. However, data originating from single molecule experiments are particularly challenging to interpret on account of fluctuations and noise at such small scales. Realistically, basic understanding comes from models carefully extracted from the noisy data. Statistical mechanics, and maximum entropy in particular, provide a powerful framework for accomplishing this task in a principled fashion. Here I will discuss our work in extracting conformational memory from single molecule force spectroscopy experiments on large biomolecules. One clear advantage of this method is that we let the data tend towards the correct model; we do not fit the data. I will show that the dynamical model of single molecule dynamics which emerges from this analysis is often more textured and complex than one obtained by fitting the data to a preconceived model.

  20. Region-Based Building Rooftop Extraction and Change Detection

    Science.gov (United States)

    Tian, J.; Metzlaff, L.; d'Angelo, P.; Reinartz, P.

    2017-09-01

    Automatic extraction of building changes is important for many applications like disaster monitoring and city planning. Although a lot of research work is available based on 2D as well as 3D data, an improvement in accuracy and efficiency is still needed. The introduction of digital surface models (DSMs) into building change detection has strongly improved the resulting accuracy. In this paper, a post-classification approach is proposed for building change detection using satellite stereo imagery. Firstly, DSMs are generated from satellite stereo imagery and further refined by using a segmentation result obtained from the Sobel gradients of the panchromatic image. Besides the refined DSMs, the panchromatic image and the pansharpened multispectral image are used as input features for mean-shift segmentation. The DSM is used to calculate the nDSM, out of which the initial building candidate regions are extracted. The candidate mask is further refined by morphological filtering and by excluding shadow regions. Following this, all segments that overlap with a building candidate region are determined. A building-oriented segment-merging procedure is introduced to generate a final building rooftop mask. As the last step, object-based change detection is performed by directly comparing the building rooftops extracted from the pre- and after-event imagery and by fusing the change indicators with the rooftop region map. A quantitative and qualitative assessment of the proposed approach is provided by using WorldView-2 satellite data from Istanbul, Turkey.
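The initial candidate-extraction step can be sketched in a few lines (a toy on tiny grids, with an assumed 2.5 m height threshold): the nDSM is the DSM minus the terrain model, and cells above the threshold that are not flagged as shadow become building-candidate cells.

```python
def building_candidates(dsm, dtm, height_thresh=2.5, shadow=None):
    """Initial building-candidate mask from an nDSM.

    nDSM = DSM - DTM per cell; cells higher than height_thresh (and not
    inside an optional shadow mask) are marked 1. All inputs are 2D lists.
    """
    rows, cols = len(dsm), len(dsm[0])
    mask = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            h = dsm[r][c] - dtm[r][c]            # normalized height
            if h > height_thresh and not (shadow and shadow[r][c]):
                mask[r][c] = 1
    return mask
```

In the paper's pipeline this raw mask would then be cleaned by morphological filtering and intersected with the mean-shift segments before rooftop merging.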

  1. REGION-BASED BUILDING ROOFTOP EXTRACTION AND CHANGE DETECTION

    Directory of Open Access Journals (Sweden)

    J. Tian

    2017-09-01

    Full Text Available Automatic extraction of building changes is important for many applications like disaster monitoring and city planning. Although a lot of research work is available based on 2D as well as 3D data, an improvement in accuracy and efficiency is still needed. The introduction of digital surface models (DSMs) into building change detection has strongly improved the resulting accuracy. In this paper, a post-classification approach is proposed for building change detection using satellite stereo imagery. Firstly, DSMs are generated from satellite stereo imagery and further refined by using a segmentation result obtained from the Sobel gradients of the panchromatic image. Besides the refined DSMs, the panchromatic image and the pansharpened multispectral image are used as input features for mean-shift segmentation. The DSM is used to calculate the nDSM, out of which the initial building candidate regions are extracted. The candidate mask is further refined by morphological filtering and by excluding shadow regions. Following this, all segments that overlap with a building candidate region are determined. A building-oriented segment-merging procedure is introduced to generate a final building rooftop mask. As the last step, object-based change detection is performed by directly comparing the building rooftops extracted from the pre- and after-event imagery and by fusing the change indicators with the rooftop region map. A quantitative and qualitative assessment of the proposed approach is provided by using WorldView-2 satellite data from Istanbul, Turkey.

  2. Extracting Optical Fiber Background from Surface-Enhanced Raman Spectroscopy Spectra Based on Bi-Objective Optimization Modeling.

    Science.gov (United States)

    Huang, Jie; Shi, Tielin; Tang, Zirong; Zhu, Wei; Liao, Guanglan; Li, Xiaoping; Gong, Bo; Zhou, Tengyuan

    2017-08-01

    We propose a bi-objective optimization model for extracting optical fiber background from the measured surface-enhanced Raman spectroscopy (SERS) spectrum of the target sample in the application of fiber optic SERS. The model is built using curve fitting to resolve the SERS spectrum into several individual bands, and simultaneously matching some resolved bands with the measured background spectrum. The Pearson correlation coefficient is selected as the similarity index and its maximum value is pursued during the spectral matching process. An algorithm is proposed, programmed, and demonstrated successfully in extracting optical fiber background or fluorescence background from the measured SERS spectra of rhodamine 6G (R6G) and crystal violet (CV). The proposed model not only can be applied to remove optical fiber background or fluorescence background for SERS spectra, but also can be transferred to conventional Raman spectra recorded using fiber optic instrumentation.
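The spectral-matching half of the bi-objective model can be sketched directly (toy spectra, not SERS data): compute the Pearson correlation between each resolved band and the measured background spectrum, and keep the band that maximizes it.

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def best_matching_band(resolved_bands, background):
    """Index of the resolved band most similar to the measured background
    spectrum, i.e. the band with maximum Pearson correlation."""
    scores = [pearson(band, background) for band in resolved_bands]
    return scores.index(max(scores))
```

In the full model this similarity index is one objective and the curve-fitting residual is the other; the band(s) that track the background are then subtracted from the measured SERS spectrum.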

  3. A Full-Body Layered Deformable Model for Automatic Model-Based Gait Recognition

    Science.gov (United States)

    Lu, Haiping; Plataniotis, Konstantinos N.; Venetsanopoulos, Anastasios N.

    2007-12-01

    This paper proposes a full-body layered deformable model (LDM) inspired by manually labeled silhouettes for automatic model-based gait recognition from part-level gait dynamics in monocular video sequences. The LDM is defined for the fronto-parallel gait with 22 parameters describing the human body part shapes (widths and lengths) and dynamics (positions and orientations). There are four layers in the LDM and the limbs are deformable. Algorithms for LDM-based human body pose recovery are then developed to estimate the LDM parameters from both manually labeled and automatically extracted silhouettes, where the automatic silhouette extraction is through a coarse-to-fine localization and extraction procedure. The estimated LDM parameters are used for model-based gait recognition by employing the dynamic time warping for matching and adopting the combination scheme in AdaBoost.M2. While the existing model-based gait recognition approaches focus primarily on the lower limbs, the estimated LDM parameters enable us to study full-body model-based gait recognition by utilizing the dynamics of the upper limbs, the shoulders and the head as well. In the experiments, the LDM-based gait recognition is tested on gait sequences with differences in shoe-type, surface, carrying condition and time. The results demonstrate that the recognition performance benefits from not only the lower limb dynamics, but also the dynamics of the upper limbs, the shoulders and the head. In addition, the LDM can serve as an analysis tool for studying factors affecting the gait under various conditions.
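The dynamic time warping used above to match sequences of LDM parameters is a standard algorithm; a minimal sketch with absolute-difference cost (scalar sequences for brevity, whereas gait matching would use parameter vectors):

```python
def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance.

    d[i][j] holds the cheapest alignment cost of a[:i] and b[:j]; each cell
    pays |a[i-1] - b[j-1]| plus the best of the three predecessor cells.
    """
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

Because DTW allows elastic alignment, two gait parameter sequences sampled at slightly different speeds can still be compared; a sequence matched against a time-stretched copy of itself has distance zero.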

  4. Enhancement of Twins Fetal ECG Signal Extraction Based on Hybrid Blind Extraction Techniques

    Directory of Open Access Journals (Sweden)

    Ahmed Kareem Abdullah

    2017-07-01

    Full Text Available ECG machines are noninvasive systems used to measure the heartbeat signal. It is very important to monitor fetal ECG signals during pregnancy to check heart activity and to detect any problem early, before birth; therefore, the monitoring of ECG signals has clinical significance and importance. For a multi-fetal pregnancy, classical filtering algorithms are not sufficient to separate the ECG signals of the mother and the fetuses. In this paper the mixture consists of three ECG signals: the first signal is the mother ECG (M-ECG) signal, the second signal is the Fetal-1 ECG (F1-ECG) signal, and the third signal is the Fetal-2 ECG (F2-ECG) signal; these signals are extracted based on modified blind source extraction (BSE) techniques. The proposed work is based on hybridization of two BSE techniques to ensure that the extracted signals are well separated. The results demonstrate that the proposed method extracts the useful ECG signals very efficiently.

  5. Distributed Parallel Endmember Extraction of Hyperspectral Data Based on Spark

    Directory of Open Access Journals (Sweden)

    Zebin Wu

    2016-01-01

    Full Text Available Due to the increasing dimensionality and volume of remotely sensed hyperspectral data, the development of acceleration techniques for massive hyperspectral image analysis is a very important challenge. Cloud computing offers many possibilities for distributed processing of hyperspectral datasets. This paper proposes a novel distributed parallel endmember extraction method based on iterative error analysis that utilizes cloud computing principles to efficiently process massive hyperspectral data. The proposed method takes advantage of technologies including the MapReduce programming model, the Hadoop Distributed File System (HDFS), and Apache Spark to realize a distributed parallel implementation of hyperspectral endmember extraction, which significantly accelerates hyperspectral processing and provides high-throughput access to large hyperspectral data. The experimental results, obtained by extracting endmembers of hyperspectral datasets on a cloud computing platform built on a cluster, demonstrate the effectiveness and computational efficiency of the proposed method.
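The sequential core of iterative-error-analysis endmember extraction can be sketched as a greedy loop (toy pixels, no Spark distribution): repeatedly pick the pixel with the largest residual after projecting onto the span of the endmembers chosen so far. The Gram-Schmidt projection below is a simplification of the constrained unmixing a real implementation would use.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def residual(x, basis):
    """Component of x orthogonal to an orthonormal basis (Gram-Schmidt)."""
    r = list(x)
    for q in basis:
        c = dot(r, q)
        r = [a - c * b for a, b in zip(r, q)]
    return r

def extract_endmembers(pixels, n_endmembers):
    """Greedy iterative-error-analysis sketch: at each round, choose the
    pixel whose spectrum is worst reconstructed by the current endmember set."""
    basis, chosen = [], []
    for _ in range(n_endmembers):
        errs = []
        for p in pixels:
            r = residual(p, basis)
            errs.append(dot(r, r))                 # squared residual norm
        idx = max(range(len(pixels)), key=errs.__getitem__)
        chosen.append(idx)
        r = residual(pixels[idx], basis)
        norm = dot(r, r) ** 0.5
        basis.append([a / norm for a in r])        # extend the subspace
    return chosen
```

The per-pixel residual computation is embarrassingly parallel, which is exactly what the paper distributes across a Spark cluster: each partition scores its pixels, and only the per-partition maxima are reduced to pick the next endmember.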

  6. Mathematical modeling of processes of nuclear fuel extraction reprocessing

    International Nuclear Information System (INIS)

    Rozen, A.M.; Zel'venskij, M.Ya.

    1977-01-01

    A mathematical model of the extraction process in mixer-settlers is given, describing both a simple stage process and extraction complicated by chemical reactions. The extraction equilibrium is described on the basis of theoretical data on the extraction mechanism using the mass action law and contains one empirical constant. Equations for the concentration extraction constants of uranium(VI) and uranium(IV), plutonium(III, IV, VI), neptunium(IV, VI) and zirconium(IV) as functions of the solution ionic strength are given, obtained by treatment of numerous experimental data. Using the reductive re-extraction process as an example, it is shown that the calculated results of the suggested model agree well with the experimental results of McKay. The uranium and plutonium separation processes without reducers are investigated by the method of mathematical modeling. Improving the purification of the uranium extract, for example, is accompanied by a deterioration at the other end of the cascade, i.e. in the purification of the plutonium re-extract from uranium, and vice versa. A high degree of separation can be obtained only in comparatively long cascades, and the regime parameters must be precisely observed.

  7. Antithrombotic Potential of Tormentil Extract in Animal Models

    Directory of Open Access Journals (Sweden)

    Natalia Marcinczyk

    2017-08-01

    Full Text Available Potentilla species that have been investigated so far display pharmacological activity mainly due to the presence of polyphenols. Recently, it was shown that polyphenol-rich extract from the rhizome of Potentilla erecta (tormentil extract) affects the metabolism of arachidonic acid and exerts both anti-inflammatory and anti-oxidant activities, suggesting a possible effect on thrombosis. Accordingly, the aim of the study was to evaluate the effect of tormentil extract on haemostasis in a rat model of thrombosis. Lyophilized water-methanol extract from P. erecta rhizome was administered per os for 14 days in doses of 100, 200, and 400 mg/kg in a volume of 2 mL/kg in a 5% water solution of gummi arabici (VEH). In the in vivo experiment an electrically induced carotid artery thrombosis model with blood flow monitoring was used in Wistar rats. Collected blood samples were analyzed ex vivo functionally and biochemically for changes in haemostasis. Tormentil extract (400 mg/kg) significantly decreased thrombus weight and prolonged the time to carotid artery occlusion and bleeding time without changes in the blood pressure. In the ex vivo experiment tormentil extract (400 mg/kg) reduced thromboxane production and decreased t-PA activity, while total t-PA concentration, as well as total PAI-1 concentration and PAI-1 activity, remained unchanged. Furthermore, tormentil extract (400 mg/kg) decreased bradykinin concentration and shortened the time to reach maximal optical density during fibrin generation. Prothrombin time, activated partial thromboplastin time, QUICK index, fibrinogen level, and collagen-induced aggregation remained unchanged. To investigate the involvement of platelets in the antithrombotic effect of tormentil, the extract was administered per os for 2 days to mice and irreversible platelet activation after ferric chloride induced thrombosis was evaluated under intravital conditions using a confocal microscopy system. In this model tormentil

  8. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies...... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity...

  9. Use of a Latent Topic Model for Characteristic Extraction from Health Checkup Questionnaire Data.

    Science.gov (United States)

    Hatakeyama, Y; Miyano, I; Kataoka, H; Nakajima, N; Watabe, T; Yasuda, N; Okuhara, Y

    2015-01-01

    When patients complete questionnaires during health checkups, many of their responses are subjective, making topic extraction difficult. Therefore, the purpose of this study was to develop a model capable of extracting appropriate topics from subjective data in questionnaires conducted during health checkups. We employed a latent topic model to group the lifestyle habits of the study participants and represented their responses to items on health checkup questionnaires as a probability model. For the probability model, we used latent Dirichlet allocation to extract 30 topics from the questionnaires. According to the model parameters, a total of 4381 study participants were then divided into groups based on these topics. Results from laboratory tests, including blood glucose level, triglycerides, and estimated glomerular filtration rate, were compared between each group, and these results were then compared with those obtained by hierarchical clustering. Where significant (p < 0.05) differences were found, the groupings were examined further. Comparison of the latent topic model and hierarchical clustering groupings revealed that, in the latent topic model method, a small group of participants who reported having subjective signs of urinary disorder was allocated to a single group. The latent topic model is useful for extracting the characteristics of small groups from questionnaires with a large number of items. These results show that, in addition to chief complaints and history of past illness, questionnaire data obtained during medical checkups can serve as useful judgment criteria for assessing the conditions of patients.
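Once LDA has produced per-participant topic distributions, the grouping step above reduces to assigning each participant to their dominant topic. A minimal sketch (toy topic distributions, not the study's fitted parameters):

```python
def group_by_dominant_topic(theta):
    """Group participants by their highest-probability topic.

    theta[i][k] = P(topic k | participant i); returns {topic: [participant
    indices]} using the argmax topic of each row.
    """
    groups = {}
    for i, dist in enumerate(theta):
        k = dist.index(max(dist))
        groups.setdefault(k, []).append(i)
    return groups
```

Lab values (glucose, triglycerides, eGFR) can then be averaged within each returned group and compared across groups, which is the comparison the abstract describes.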

  10. Batch extraction modeling of jatropha oil using ethanol and n-hexane

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, Alessandro Araujo; Martins, Marcio Aredes [Universidade Federal de Vicosa (DEA/UFV), MG (Brazil). Dept. de Engenharia Agricola], E-mail: aredes@ufv.br; Santos, Karine Tennis dos [Universidade Federal de Vicosa (DEQ/UFV), MG (Brazil). Dept. de Quimica; Carneiro, Angelica Cassia de Oliveira [Universidade Federal de Vicosa (DFT/UFV), MG (Brazil). Dept. de Fitotecnia; Perez, Ronaldo [Universidade Federal de Vicosa (DTA/UFV), MG (Brazil). Dept. de Tecnologia de Alimentos

    2008-07-01

    Jatropha curcas (L.) has been considered a promising alternative crop for rainfall regimes from 200 to over 1500 mm per annum. The seed and the oil have many applications: as a purgative, in the treatment of skin infections and rheumatism, in the control of insects, mollusks and fungi, as a lubricant for diesel engines, in soap and paint production, and mainly for biodiesel production. New technologies should be developed to accomplish oil production at large scale, since the Brazilian Biodiesel Program stimulates oilseed production. In large-scale oil production, the oil is obtained using solvent extraction. The solvent widely used for oil extraction is n-hexane, mainly because of its low vaporization temperature and its selectivity for the lipidic fraction. However, the use of n-hexane in small-capacity plants makes the process expensive because of high operating losses. Alcohols have been exhaustively studied at pilot- and industrial-scale extraction plants. Ethanol is an efficient and advantageous extraction solvent for oilseeds, and an attractive alternative to extraction-grade n-hexane. Therefore, the objective of the present work was to model and compare the extraction kinetics of jatropha oil using ethanol and n-hexane. Extraction experiments were performed in a batch extractor at 45 °C using a solvent-to-sample ratio of 15:1 (mL solvent/g sample). Samples were taken every 15 min, and the extraction time was 2 h. The oil extraction kinetics data were fitted to models reported in the literature. For n-hexane and ethanol extraction, the fractional residual oil at 120 minutes was 0.314 and 0.0538, respectively. The models reported in the literature were suitable to describe the n-hexane extraction, especially the Duggal model; however, they were not adequate to model the ethanol extraction. (author)
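A common baseline for such kinetics data is a first-order model, R(t) = exp(-k·t), where R is the fractional residual oil. This sketch (not one of the paper's fitted models) recovers k by least squares on the log-linearized form ln R = -k·t; note that the reported ethanol residual of 0.0538 at 120 min would correspond to k ≈ 0.024 min⁻¹ under this assumption.

```python
import math

def fit_first_order(times, residual_fractions):
    """Least-squares fit of ln(R) = -k * t through the origin; returns k.

    times: sampling times; residual_fractions: residual oil fraction R(t),
    all strictly between 0 and 1.
    """
    num = sum(t * math.log(r) for t, r in zip(times, residual_fractions))
    den = sum(t * t for t in times)
    return -num / den

def residual_at(k, t):
    """First-order model prediction R(t) = exp(-k * t)."""
    return math.exp(-k * t)
```

Fitting both solvents' curves this way and comparing residuals against the data is one quick check of whether a single-rate model suffices or a two-stage (washing plus diffusion) model, like those cited in the paper, is needed.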

  11. Coupling a Transient Solvent Extraction Module with the Separations and Safeguards Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    de Almeida, Valmor F [ORNL; Birdwell Jr, Joseph F [ORNL; DePaoli, David W [ORNL; Gauld, Ian C [ORNL

    2009-10-01

    A past difficulty in safeguards design for reprocessing plants is that no code existed for analysis and evaluation of the design. A number of codes have been developed in the past, but many are dated, and no single code is able to cover all aspects of materials accountancy, process monitoring, and diversion scenario analysis. The purpose of this work was to integrate a transient solvent extraction simulation module, developed at Oak Ridge National Laboratory, with the Separations and Safeguards Performance Model (SSPM), developed at Sandia National Laboratory, as a first step toward creating a more versatile design and evaluation tool. The SSPM was designed for materials accountancy and process monitoring analyses, but previous versions of the code have included limited detail on the chemical processes, including chemical separations. The transient solvent extraction model is based on the ORNL SEPHIS code approach to consider solute build-up in a bank of contactors in the PUREX process. Combined, these capabilities yield a much more robust transient separations and safeguards model for evaluating safeguards system design. This coupling and the initial results are presented. In addition, some observations toward further enhancement of separations and safeguards modeling based on this effort are provided, including: items to be addressed in integrating legacy codes, additional improvements needed for a fully functional solvent extraction module, and recommendations for future integration of other chemical process modules.

  12. Efficient extraction of drainage networks from massive, radar-based elevation models with least cost path search

    Directory of Open Access Journals (Sweden)

    M. Metz

    2011-02-01

    Full Text Available The availability of both global and regional elevation datasets acquired by modern remote sensing technologies provides an opportunity to significantly improve the accuracy of stream mapping, especially in remote, hard to reach regions. Stream extraction from digital elevation models (DEMs) is based on computation of flow accumulation, a summary parameter that poses performance and accuracy challenges when applied to large, noisy DEMs generated by remote sensing technologies. Robust handling of DEM depressions is essential for reliable extraction of connected drainage networks from this type of data. The least-cost flow routing method implemented in GRASS GIS as the module r.watershed was redesigned to significantly improve its speed, functionality, and memory requirements and make it an efficient tool for stream mapping and watershed analysis from large DEMs. To evaluate its handling of large depressions, typical for remote sensing derived DEMs, three different methods were compared: traditional sink filling, impact reduction approach, and least-cost path search. The comparison was performed using the Shuttle Radar Topographic Mission (SRTM) and Interferometric Synthetic Aperture Radar for Elevation (IFSARE) datasets covering central Panama at 90 m and 10 m resolutions, respectively. The accuracy assessment was based on ground control points acquired by GPS and reference points digitized from Landsat imagery along segments of selected Panamanian rivers. The results demonstrate that the new implementation of the least-cost path method is significantly faster than the original version, can cope with massive datasets, and provides the most accurate results in terms of stream locations validated against reference points.
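The least-cost-path idea can be sketched with Dijkstra's algorithm on a tiny DEM (an illustrative cost function, not r.watershed's): stepping into a cell costs the uphill rise (if any) plus a constant, so flow routed through a spurious depression climbs out over the lowest lip instead of terminating in the sink.

```python
import heapq

def least_cost_path(dem, start, goal):
    """Dijkstra search over a DEM grid; step cost = max(0, rise) + 1.

    dem is a 2D list of elevations; start/goal are (row, col) tuples.
    Returns (path, total_cost)."""
    rows, cols = len(dem), len(dem[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                              # stale queue entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                step = max(0.0, dem[nr][nc] - dem[r][c]) + 1.0
                nd = d + step
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1], dist[goal]
```

On a one-row DEM with a pit (elevation 1 between lips of 4), the returned path pays only the 3-unit climb over the downstream lip, which is the behaviour that lets connected streams be traced through noisy depressions without filling them first.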

  13. Hybrid Model of Content Extraction

    DEFF Research Database (Denmark)

    Qureshi, Pir Abdul Rasool; Memon, Nasrullah

    2012-01-01

    We present a hybrid model for content extraction from HTML documents. The model operates on Document Object Model (DOM) tree of the corresponding HTML document. It evaluates each tree node and associated statistical features like link density and text distribution across the node to predict...... significance of the node towards overall content provided by the document. Once significance of the nodes is determined, the formatting characteristics like fonts, styles and the position of the nodes are evaluated to identify the nodes with similar formatting as compared to the significant nodes. The proposed...
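
    The link-density feature mentioned above can be illustrated with a minimal sketch (a hypothetical simplification using Python's stdlib parser, not the paper's DOM-based implementation): the fraction of text characters inside `<a>` tags separates navigation blocks from body text:

```python
from html.parser import HTMLParser

class LinkDensity(HTMLParser):
    """Collects total text length and text length inside <a> tags;
    link density = link text / total text, one of the statistical
    features used to judge whether a node is boilerplate or content."""
    def __init__(self):
        super().__init__()
        self.in_link = 0
        self.text_chars = 0
        self.link_chars = 0
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link += 1
    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.in_link -= 1
    def handle_data(self, data):
        n = len(data.strip())
        self.text_chars += n
        if self.in_link:
            self.link_chars += n
    def density(self):
        return self.link_chars / self.text_chars if self.text_chars else 0.0

nav = LinkDensity()
nav.feed('<ul><li><a href="/">Home</a></li><li><a href="/x">News</a></li></ul>')
body = LinkDensity()
body.feed('<p>Long article paragraph with a single <a href="#">link</a> inside.</p>')
print(nav.density(), body.density())   # navigation ~1.0, article text low
```
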

  14. Support patient search on pathology reports with interactive online learning based data extraction.

    Science.gov (United States)

    Zheng, Shuai; Lu, James J; Appin, Christina; Brat, Daniel; Wang, Fusheng

    2015-01-01

    Structural reporting enables semantic understanding and prompt retrieval of clinical findings about patients. While synoptic pathology reporting provides templates for data entries, information in pathology reports remains primarily in narrative free text form. Extracting data of interest from narrative pathology reports could significantly improve the representation of the information and enable complex structured queries. However, manual extraction is tedious and error-prone, and automated tools are often constructed with a fixed training dataset and not easily adaptable. Our goal is to extract data from pathology reports to support advanced patient search with a highly adaptable semi-automated data extraction system, which can adjust and self-improve by learning from a user's interaction with minimal human effort. We have developed an online machine learning based information extraction system called IDEAL-X. With its graphical user interface, the system's data extraction engine automatically annotates values for users to review upon loading each report text. The system analyzes users' corrections regarding these annotations with online machine learning, and incrementally enhances and refines the learning model as reports are processed. The system also takes advantage of customized controlled vocabularies, which can be adaptively refined during the online learning process to further assist the data extraction. As the accuracy of automatic annotation improves over time, the effort of human annotation is gradually reduced. After all reports are processed, a built-in query engine can be applied to conveniently define queries based on extracted structured data. We have evaluated the system with a dataset of anatomic pathology reports from 50 patients. Extracted data elements include demographic data, diagnosis, genetic marker, and procedure. The system achieves F-1 scores of around 95% for the majority of tests. Extracting data from pathology reports could enable
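
    The review-and-correct loop can be caricatured in a few lines (a toy stand-in, far simpler than IDEAL-X's actual online learner): user corrections grow a per-field vocabulary that drives annotation of later reports:

```python
class IncrementalAnnotator:
    """Toy sketch of an interactive annotate-correct loop: values confirmed
    by user corrections are added to a per-field vocabulary, so later
    reports containing the same value are annotated automatically."""
    def __init__(self, fields):
        self.vocab = {f: set() for f in fields}
    def annotate(self, report):
        # Return the first learned value found in the report, per field.
        return {f: next((v for v in vs if v in report), None)
                for f, vs in self.vocab.items()}
    def correct(self, field, value):
        self.vocab[field].add(value)   # learn from the user's correction

ann = IncrementalAnnotator(["diagnosis"])
print(ann.annotate("Final diagnosis: glioblastoma, WHO grade IV"))  # nothing learned yet
ann.correct("diagnosis", "glioblastoma")
print(ann.annotate("Consistent with glioblastoma, recurrent"))
```
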

  15. Modeling stress–strain state of rock mass under mining of complex-shape extraction pillar

    Science.gov (United States)

    Fryanov, VN; Pavlova, LD

    2018-03-01

    Based on the results of numerical modeling of stresses and strains in rock mass, geomechanical parameters of development workings adjacent to the coal face operation area are provided for multi-entry preparation and extraction of flat seams with production faces of variable length. Negative effects on the geomechanical situation during the transition from longwall to shortwall mining in a fully mechanized extraction face are identified.

  16. Alternative Bio-Based Solvents for Extraction of Fat and Oils: Solubility Prediction, Global Yield, Extraction Kinetics, Chemical Composition and Cost of Manufacturing

    Directory of Open Access Journals (Sweden)

    Anne-Gaëlle Sicaire

    2015-04-01

    Full Text Available The present study was designed to evaluate the performance of alternative bio-based solvents, especially 2-methyltetrahydrofuran, obtained from crop byproducts, for the substitution of petroleum solvents such as hexane in the extraction of fat and oils for food (edible oil) and non-food (biofuel) applications. First, solvents were selected and their performance evaluated using Hansen Solubility Parameters and COnductor-like Screening MOdel for Realistic Solvation (COSMO-RS) simulations. Experiments were performed on rapeseed oil extraction at laboratory and pilot plant scale for the determination of lipid yields, extraction kinetics, diffusion modeling, and complete lipid composition in terms of fatty acids and micronutrients (sterols, tocopherols and tocotrienols). Finally, economic and energetic evaluations of the process were conducted to estimate the cost of manufacturing using 2-methyltetrahydrofuran (MeTHF) as an alternative solvent compared to hexane as a petroleum solvent.
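
    Hansen-based solvent screening rests on the solubility parameter distance Ra between solvent and solute; a minimal sketch (the parameter values below are approximate literature values quoted from memory, shown for illustration only):

```python
import math

def hansen_distance(s1, s2):
    """Hansen solubility parameter distance:
    Ra^2 = 4*(dD1-dD2)^2 + (dP1-dP2)^2 + (dH1-dH2)^2  (units MPa^0.5).
    A smaller Ra between solvent and solute suggests a better match."""
    (d1, p1, h1), (d2, p2, h2) = s1, s2
    return math.sqrt(4 * (d1 - d2) ** 2 + (p1 - p2) ** 2 + (h1 - h2) ** 2)

# Approximate (dD, dP, dH) triplets -- illustrative only:
hexane = (14.9, 0.0, 0.0)
methf = (16.9, 5.0, 4.3)   # 2-methyltetrahydrofuran
print(round(hansen_distance(hexane, methf), 2))
```
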

  17. Optical Aperture Synthesis Object's Information Extracting Based on Wavelet Denoising

    International Nuclear Information System (INIS)

    Fan, W J; Lu, Y

    2006-01-01

    Wavelet denoising is studied to improve the extraction of an OAS (optical aperture synthesis) object's Fourier information. Translation-invariant wavelet denoising, based on Donoho's wavelet soft-threshold denoising, is investigated to remove pseudo-Gibbs artifacts from the soft-thresholded image. Extraction of the OAS object's information based on translation-invariant wavelet denoising is studied. The study shows that wavelet threshold denoising can improve the precision and repeatability of extracting the object's information from an interferogram, and that information extraction with translation-invariant wavelet denoising outperforms that with plain soft-threshold wavelet denoising.
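
    Donoho-style soft thresholding can be demonstrated with a one-level Haar transform (a pure-Python sketch; the translation-invariant scheme of the paper additionally averages over circular shifts, which is omitted here):

```python
import math, random

def haar(signal):
    """One-level Haar DWT: (approximation, detail) coefficients."""
    a = [(signal[i] + signal[i + 1]) / math.sqrt(2) for i in range(0, len(signal), 2)]
    d = [(signal[i] - signal[i + 1]) / math.sqrt(2) for i in range(0, len(signal), 2)]
    return a, d

def ihaar(a, d):
    """Inverse of the one-level Haar DWT."""
    out = []
    for ai, di in zip(a, d):
        out += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return out

def soft(x, t):
    """Donoho soft threshold: shrink a coefficient toward zero by t."""
    return math.copysign(max(abs(x) - t, 0.0), x)

random.seed(0)
clean = [1.0] * 32                                 # flat test signal
noisy = [c + random.gauss(0, 0.3) for c in clean]
a, d = haar(noisy)
t = 0.3 * math.sqrt(2 * math.log(len(noisy)))      # universal threshold
den = ihaar(a, [soft(x, t) for x in d])            # threshold detail only
mse = lambda xs: sum((x - 1.0) ** 2 for x in xs) / len(xs)
print(mse(noisy) > mse(den))                       # denoising reduces error
```
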

  18. Simulation-based Extraction of Key Material Parameters from Atomic Force Microscopy

    Science.gov (United States)

    Alsafi, Huseen; Peninngton, Gray

    Models for the atomic force microscopy (AFM) tip and sample interaction contain numerous material parameters that are often poorly known. This is especially true when dealing with novel material systems or when imaging samples that are exposed to complicated interactions with the local environment. In this work we use Monte Carlo methods to extract sample material parameters from the experimental AFM analysis of a test sample. The parameterized theoretical model that we use is based on the Virtual Environment for Dynamic AFM (VEDA) [1]. The extracted material parameters are then compared with the accepted values for our test sample. Using this procedure, we suggest a method that can be used to successfully determine unknown material properties in novel and complicated material systems. We acknowledge Fisher Endowment Grant support from the Jess and Mildred Fisher College of Science and Mathematics, Towson University.
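
    Monte Carlo parameter extraction, in its simplest form, draws random candidate parameters, scores each against the measured response, and keeps the best. The sketch below recovers a single unknown stiffness-like parameter from a synthetic resonance-style curve; the model, parameter names, and values are hypothetical, not VEDA's:

```python
import math, random

random.seed(1)

def amplitude(freq, k):
    """Hypothetical damped AFM-like amplitude response with parameter k."""
    return 1.0 / math.sqrt((k - freq ** 2) ** 2 + 0.1 * freq ** 2)

freqs = [0.5 + 0.1 * i for i in range(20)]
measured = [amplitude(f, 2.5) for f in freqs]    # synthetic "experiment", true k = 2.5

# Monte Carlo search: sample k from a prior range, keep the best fit.
best_k, best_err = None, float("inf")
for _ in range(5000):
    k = random.uniform(1.0, 5.0)
    err = sum((amplitude(f, k) - m) ** 2 for f, m in zip(freqs, measured))
    if err < best_err:
        best_k, best_err = k, err
print(round(best_k, 2))   # should land close to the true 2.5
```
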

  19. Optimization-based Method for Automated Road Network Extraction

    International Nuclear Information System (INIS)

    Xiong, D

    2001-01-01

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction.

  20. Extracting falsifiable predictions from sloppy models.

    Science.gov (United States)

    Gutenkunst, Ryan N; Casey, Fergal P; Waterfall, Joshua J; Myers, Christopher R; Sethna, James P

    2007-12-01

    Successful predictions are among the most compelling validations of any model. Extracting falsifiable predictions from nonlinear multiparameter models is complicated by the fact that such models are commonly sloppy, possessing sensitivities to different parameter combinations that range over many decades. Here we discuss how sloppiness affects the sorts of data that best constrain model predictions, makes linear uncertainty approximations dangerous, and introduces computational difficulties in Monte-Carlo uncertainty analysis. We also present a useful test problem and suggest refinements to the standards by which models are communicated.
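
    The sloppiness phenomenon is easy to reproduce on the classic two-exponential test problem: the eigenvalues of J^T J (J being the sensitivity of the model to its parameters) already differ by orders of magnitude with only two parameters. An illustrative computation, not the paper's own test problem:

```python
import math

# Two-exponential model y(t) = exp(-k1 t) + exp(-k2 t): a classic sloppy
# fitting problem. The eigenvalues of J^T J span a wide range, so some
# parameter combinations are far better constrained than others.
k1, k2 = 1.0, 0.1
ts = [0.1 * i for i in range(1, 101)]

# Sensitivities (partial derivatives of the model w.r.t. k1 and k2):
J = [(-t * math.exp(-k1 * t), -t * math.exp(-k2 * t)) for t in ts]
a = sum(j1 * j1 for j1, j2 in J)
b = sum(j1 * j2 for j1, j2 in J)
c = sum(j2 * j2 for j1, j2 in J)

# Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, c]]:
mean = (a + c) / 2
disc = math.sqrt(((a - c) / 2) ** 2 + b * b)
lam_max, lam_min = mean + disc, mean - disc
print(lam_max / lam_min)   # condition number of hundreds, with just 2 parameters
```
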

  1. Modeling of the Kinetics of Supercritical Fluid Extraction of Lipids from Microalgae with Emphasis on Extract Desorption

    Directory of Open Access Journals (Sweden)

    Helena Sovová

    2016-05-01

    Full Text Available Microalgae contain valuable biologically active lipophilic substances such as omega-3 fatty acids and carotenoids. In contrast to the recovery of vegetable oils from seeds, where extraction with supercritical CO2 is used as a mild and selective method, economically viable application of this method to similarly soluble oils from microalgae requires, in most cases, much higher pressure. This paper presents and verifies the hypothesis that this difference is caused by the high adsorption capacity of microalgae. Under the pressures usually applied in supercritical fluid extraction from plants, microalgae bind a large fraction of the extracted oil, while under extremely high CO2 pressures their adsorption capacity diminishes and the extraction rate depends on oil solubility in supercritical CO2. A mathematical model for the extraction from microalgae was derived and applied to literature data on extraction kinetics in order to determine model parameters.
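
    The hypothesized role of adsorption can be sketched with a hypothetical two-stage extraction curve (a simplification, not the paper's model): a solubility-controlled linear stage removes the freely accessible oil, after which the adsorbed fraction desorbs with first-order kinetics:

```python
import math

def extraction_yield(t, x_free, x_bound, rate_fast, k_slow):
    """Hypothetical two-stage extraction curve: a solubility-controlled
    linear stage strips the free oil x_free at rate_fast, after which the
    bound fraction x_bound desorbs with first-order rate constant k_slow."""
    t_fast = x_free / rate_fast            # time to strip the free oil
    if t <= t_fast:
        return rate_fast * t
    return x_free + x_bound * (1.0 - math.exp(-k_slow * (t - t_fast)))

# A larger bound (adsorbed) fraction flattens the curve early, mimicking
# the slow extraction observed for microalgae at moderate pressure:
weak = [extraction_yield(t, 0.8, 0.2, 0.1, 0.02) for t in (5, 10, 60)]
strong = [extraction_yield(t, 0.2, 0.8, 0.1, 0.02) for t in (5, 10, 60)]
print(weak, strong)
```
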

  2. Toxicity Thresholds Based on EDTA Extractable Nickel and Barley Root Elongation in Chinese Soils

    Directory of Open Access Journals (Sweden)

    Guangyun Zhu

    2018-04-01

    Full Text Available The uncertainty in the risk assessment of trace metal elements in soils when total metal contents are used can be decreased by assessing their availability and/or extractability when the soils have a high background value or different sources of trace metal elements. In this study, the toxicity of added water-soluble nickel (Ni) to barley root elongation was studied in 17 representative Chinese soil samples with and without artificial rainwater leaching. The extractability of added Ni in soils was estimated by three sequential extractions with ethylenediaminetetraacetic acid (EDTA). The results showed that the effective concentration of EDTA-extractable Ni (EC50), which caused 50% inhibition of barley root elongation, ranged from 46 to 1019 mg/kg in unleached soils and 24 to 1563 mg/kg in leached soils. Regression models for EDTA-extractable Ni and total Ni added to soils against soil properties indicated that EDTA-extractable Ni was significantly correlated with the total Ni added to soils and that pH was the most important control factor. Regression models for toxicity thresholds based on EDTA-extractable Ni against soil properties showed that soil citrate-dithionite extractable Fe was more important than soil pH in predicting Ni toxicity. These results can be used to accurately assess the risk of contaminated soils with high background values and/or different Ni sources.
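
    An EC50 is typically read off a log-logistic dose-response curve; a minimal sketch with hypothetical (not fitted) parameters:

```python
def root_elongation(ni, ec50, slope):
    """Two-parameter log-logistic dose-response: fraction of control root
    elongation as a function of EDTA-extractable Ni (mg/kg)."""
    return 1.0 / (1.0 + (ni / ec50) ** slope)

# Illustrative parameters, not the paper's fitted values:
ec50, slope = 300.0, 2.0
print(root_elongation(300.0, ec50, slope))       # 0.5 by definition at the EC50
print(root_elongation(50.0, ec50, slope) > 0.9)  # low doses barely inhibit
```
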

  3. [Research on modeling method to analyze Lonicerae Japonicae Flos extraction process with online MEMS-NIR based on two types of error detection theory].

    Science.gov (United States)

    Du, Chen-Zhao; Wu, Zhi-Sheng; Zhao, Na; Zhou, Zheng; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2016-10-01

    To establish a rapid quantitative analysis method for online monitoring of chlorogenic acid in the aqueous extraction solution of Lonicera Japonica Flos by using micro-electromechanical near-infrared spectroscopy (MEMS-NIR). High performance liquid chromatography (HPLC) was used as the reference method. The Kennard-Stone (K-S) algorithm was used to divide the sample sets, and partial least squares (PLS) regression was adopted to establish the multivariate analysis model between the HPLC contents and the NIR spectra. Synergy interval partial least squares (SiPLS) was used to select the modeling waveband for the PLS models. The RPD was used to evaluate the prediction performance of the models. The MDL was calculated based on the two-types-of-error detection theory, and the online analytical modeling approach for the Lonicera Japonica Flos extraction process was expressed scientifically in terms of the MDL. The results show that the model established with multiplicative scatter correction (MSC) was the best, with root mean square errors of cross validation (RMSECV), calibration (RMSEC) and prediction (RMSEP) for chlorogenic acid of 1.707, 1.489 and 2.362, respectively; the determination coefficient of the calibration model was 0.998 5, and that of the prediction was 0.988 1. The RPD value is 9.468. The MDL (0.042 15 g•L⁻¹) obtained with SiPLS is smaller than the original, which demonstrates that SiPLS is beneficial to the prediction performance of the model. This study expresses the prediction performance of the model more accurately through the two-types-of-error detection theory, further illustrating that MEMS-NIR spectroscopy can be used for online monitoring of the Lonicera Japonica Flos extraction process. Copyright© by the Chinese Pharmaceutical Association.
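
    RMSEP and RPD, the figures of merit quoted above, are straightforward to compute; a sketch with made-up reference/prediction values (not the paper's data):

```python
import math

def rmsep(obs, pred):
    """Root mean square error of prediction."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def rpd(obs, pred):
    """Ratio of performance to deviation: SD of the reference values over
    RMSEP; values above ~3 are commonly taken to indicate a model usable
    for quantitation."""
    mean = sum(obs) / len(obs)
    sd = math.sqrt(sum((o - mean) ** 2 for o in obs) / (len(obs) - 1))
    return sd / rmsep(obs, pred)

obs = [10.2, 12.5, 9.8, 14.1, 11.3, 13.0]    # reference (e.g. HPLC) values
pred = [10.0, 12.9, 9.5, 14.3, 11.0, 12.6]   # model predictions
print(round(rmsep(obs, pred), 3), round(rpd(obs, pred), 2))
```
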

  4. Extraction of Trivalent Actinides and Lanthanides from Californium Campaign Rework Solution Using TODGA-based Solvent Extraction System

    Energy Technology Data Exchange (ETDEWEB)

    Benker, Dennis [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Delmau, Laetitia Helene [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dryman, Joshua Cory [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-07-01

    This report presents the studies carried out to demonstrate the possibility of quantitatively extracting trivalent actinides and lanthanides from highly acidic solutions using a neutral ligand-based solvent extraction system. These studies stemmed from the perceived advantage of such systems over cation-exchange-based solvent extraction systems that require an extensive feed adjustment to make a low-acid feed. The targeted feed solutions are highly acidic aqueous phases obtained after the dissolution of curium targets during a californium (Cf) campaign. Results obtained with actual Cf campaign solutions, but highly diluted to be manageable in a glove box, are presented, followed by results of tests run in the hot cells with Cf campaign rework solutions. It was demonstrated that a solvent extraction system based on the tetraoctyl diglycolamide molecule is capable of quantitatively extracting trivalent actinides from highly acidic solutions. This system was validated using actual feeds from a Cf campaign.

  5. ROADS CENTRE-AXIS EXTRACTION IN AIRBORNE SAR IMAGES: AN APPROACH BASED ON ACTIVE CONTOUR MODEL WITH THE USE OF SEMI-AUTOMATIC SEEDING

    Directory of Open Access Journals (Sweden)

    R. G. Lotte

    2013-05-01

    Full Text Available Research works dealing with computational methods for road extraction have increased considerably in the last two decades. This procedure is usually performed on optical or microwave sensor (radar) imagery. Radar images offer advantages when compared to optical ones, for they allow the acquisition of scenes regardless of atmospheric and illumination conditions, besides the possibility of surveying regions where the terrain is hidden by the vegetation canopy, among others. The cartographic mapping based on these images is often accomplished manually, requiring considerable time and effort from the human interpreter. Maps for detecting new roads or updating the existing road network are among the most important cartographic products to date. There are currently many studies involving the extraction of roads by means of automatic or semi-automatic approaches. Each of them presents different solutions for different problems, making this task a scientific issue still open. One of the preliminary steps for road extraction can be the seeding of points belonging to roads, which can be done using different methods with diverse levels of automation. The identified seed points are interpolated to form the initial road network, and are hence used as an input for the extraction method proper. The present work introduces an innovative hybrid method for the extraction of road centre-axes in a synthetic aperture radar (SAR) airborne image. Initially, candidate points are seeded fully automatically using Self-Organizing Maps (SOM), followed by a pruning process based on specific metrics. The centre-axes are then detected by an open-curve active contour model (snakes). The obtained results were evaluated for quality with respect to completeness, correctness and redundancy.

  6. Immunomodulatory Effects of Kuseonwangdogo-Based Mixed Herbal Formula Extracts on a Cyclophosphamide-Induced Immunosuppression Mouse Model

    Directory of Open Access Journals (Sweden)

    Joo Wan Kim

    2018-01-01

    Full Text Available Aim. Kuseonwangdogo is a traditional Korean immunomodulatory polyherbal prescription. However, there are no systematic findings on its complex immunomodulatory effects in in vivo models. In this study, we observed the immunomodulatory effects of Kuseonwangdogo-based mixed herbal formula aqueous extracts (MHFe) on a cyclophosphamide (CPA)-induced immunosuppression mouse model. Methods. In total, 60 male 6-week-old ICR mice (10 mice/group) were selected based on body weight 24 h after the second CPA treatment and used in this experiment. Twelve hours after the end of the last (fourth) oral administration of MHFe, the animals were sacrificed. Results. Following CPA treatment, a noticeable decrease in the body, thymus, spleen, and submandibular lymph node (LN) weights; white blood cell, red blood cell, and platelet numbers; hemoglobin and hematocrit concentrations; serum interferon-γ levels; splenic tumor necrosis factor-α, interleukin (IL)-1β, and IL-10 content; and peritoneal and splenic natural killer cell activities was observed. Depletion of lymphoid cells in the thymic cortex and splenic white pulp, and submandibular LN-related atrophic changes, were also observed. However, these CPA-induced myelosuppressive signs were markedly and dose-dependently inhibited by the oral administration of 125, 250, and 500 mg/kg MHFe. Conclusion. MHFe can be a promising, potent immunomodulatory therapeutic agent for various immune disorders.

  7. Model-Based Evaluation of Higher Doses of Rifampin Using a Semimechanistic Model Incorporating Autoinduction and Saturation of Hepatic Extraction.

    Science.gov (United States)

    Chirehwa, Maxwell T; Rustomjee, Roxana; Mthiyane, Thuli; Onyebujoh, Philip; Smith, Peter; McIlleron, Helen; Denti, Paolo

    2016-01-01

    Rifampin is a key sterilizing drug in the treatment of tuberculosis (TB). It induces its own metabolism, but neither the onset nor the extent of autoinduction has been adequately described. Currently, the World Health Organization recommends a rifampin dose of 8 to 12 mg/kg of body weight, which is believed to be suboptimal, and higher doses may potentially improve treatment outcomes. However, a nonlinear increase in exposure may be observed because of saturation of hepatic extraction and hence this should be taken into consideration when a dose increase is implemented. Intensive pharmacokinetic (PK) data from 61 HIV-TB-coinfected patients in South Africa were collected at four visits, on days 1, 8, 15, and 29, after initiation of treatment. Data were analyzed by population nonlinear mixed-effects modeling. Rifampin PKs were best described by using a transit compartment absorption and a well-stirred liver model with saturation of hepatic extraction, including a first-pass effect. Autoinduction was characterized by using an exponential-maturation model: hepatic clearance almost doubled from the baseline to steady state, with a half-life of around 4.5 days. The model predicts that increases in the dose of rifampin result in more-than-linear drug exposure increases as measured by the 24-h area under the concentration-time curve. Simulations with doses of up to 35 mg/kg produced results closely in line with those of clinical trials. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
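
    The saturation mechanism can be sketched by combining the well-stirred liver model with Michaelis-Menten intrinsic clearance (illustrative parameter values, not the paper's estimates): as concentration rises, intrinsic clearance falls, hepatic extraction drops, and first-pass survival increases, which is what produces a more-than-linear exposure increase with dose:

```python
# Well-stirred liver model with Michaelis-Menten intrinsic clearance.
# All parameter values are illustrative, not fitted estimates.
Q, fu, Vmax, Km = 90.0, 0.2, 4000.0, 5.0   # flow L/h, unbound fraction, MM terms

def extraction_ratio(c):
    """Hepatic extraction ratio E at hepatic concentration c:
    CL_int = Vmax / (Km + c)  (saturable metabolism),
    E = fu*CL_int / (Q + fu*CL_int)  (well-stirred model)."""
    cl_int = Vmax / (Km + c)
    return fu * cl_int / (Q + fu * cl_int)

for c in (1.0, 10.0, 30.0):                 # rising hepatic concentration
    e = extraction_ratio(c)
    print(c, round(e, 2), round(1 - e, 2))  # first-pass survival 1 - E rises
```
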

  8. Change of the Extractability of Cadmium Added to Different Soils: Aging Effect and Modeling

    Directory of Open Access Journals (Sweden)

    Xi Zhang

    2018-03-01

    Full Text Available Ethylenediaminetetraacetic acid (EDTA) is known to be a chelating agent and has been widely used for estimating the total extractable metals in soil. The effect of aging on EDTA-extractable cadmium (Cd) was investigated in five different soils at three Cd concentrations incubated for 180 days. The EDTA-extractable Cd decreased rapidly during the first 30–60 days of incubation, followed by slower processes, and by 90 days the EDTA-extractable Cd tended to be stable. The decrease in EDTA-extractable Cd may be due to precipitation/nucleation processes, diffusion of Cd into micropores/mesopores, and occlusion within organic matter in soils. A semi-mechanistic model to predict the extractability of Cd during incubation, based on processes of Cd precipitation/nucleation, diffusion, and occlusion within organic matter, was developed and calibrated. The results showed that micropore/mesopore diffusion was the predominant, and slow, process affecting the extractability of Cd added to soils. The proportions of precipitation/nucleation and occlusion within organic matter in the non-EDTA-extractable Cd added to soils were only 0.03–21.0% and 0.41–6.95%, respectively. The measured EDTA-extractable Cd from incubated soils was in good agreement with that predicted by the semi-mechanistic model (R² = 0.829). The results also indicated that soil pH, organic matter, and incubation time were the most important factors affecting Cd aging.
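
    The approach to a stable residual extractability can be caricatured by a single first-order process (a deliberate simplification of the multi-process semi-mechanistic model described above, with hypothetical parameters):

```python
import math

def edta_extractable(t, e0, e_inf, k):
    """First-order approach to a residual extractable pool: extractability
    decays from e0 toward e_inf as Cd diffuses into micropores/mesopores.
    A simplified single-rate stand-in for a multi-process aging model."""
    return e_inf + (e0 - e_inf) * math.exp(-k * t)

# Hypothetical parameters: 95% initially extractable, 70% at equilibrium,
# rate constant k per day chosen so most of the decline occurs early:
for t in (0, 30, 90, 180):
    print(t, round(edta_extractable(t, 0.95, 0.70, 0.05), 3))
```
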

  9. Support patient search on pathology reports with interactive online learning based data extraction

    Directory of Open Access Journals (Sweden)

    Shuai Zheng

    2015-01-01

    Full Text Available Background: Structural reporting enables semantic understanding and prompt retrieval of clinical findings about patients. While synoptic pathology reporting provides templates for data entries, information in pathology reports remains primarily in narrative free text form. Extracting data of interest from narrative pathology reports could significantly improve the representation of the information and enable complex structured queries. However, manual extraction is tedious and error-prone, and automated tools are often constructed with a fixed training dataset and not easily adaptable. Our goal is to extract data from pathology reports to support advanced patient search with a highly adaptable semi-automated data extraction system, which can adjust and self-improve by learning from a user's interaction with minimal human effort. Methods: We have developed an online machine learning based information extraction system called IDEAL-X. With its graphical user interface, the system's data extraction engine automatically annotates values for users to review upon loading each report text. The system analyzes users' corrections regarding these annotations with online machine learning, and incrementally enhances and refines the learning model as reports are processed. The system also takes advantage of customized controlled vocabularies, which can be adaptively refined during the online learning process to further assist the data extraction. As the accuracy of automatic annotation improves over time, the effort of human annotation is gradually reduced. After all reports are processed, a built-in query engine can be applied to conveniently define queries based on extracted structured data. Results: We have evaluated the system with a dataset of anatomic pathology reports from 50 patients. Extracted data elements include demographic data, diagnosis, genetic marker, and procedure. The system achieves F-1 scores of around 95% for the majority of

  10. Learning-based meta-algorithm for MRI brain extraction.

    Science.gov (United States)

    Shi, Feng; Wang, Li; Gilmore, John H; Lin, Weili; Shen, Dinggang

    2011-01-01

    The multiple-segmentation-and-fusion method has been widely used for brain extraction, tissue segmentation, and region of interest (ROI) localization. However, such studies are hindered in practice by their computational complexity, mainly coming from the steps of template selection and template-to-subject nonlinear registration. In this study, we address these two issues and propose a novel learning-based meta-algorithm for MRI brain extraction. Specifically, we first use exemplars to represent the entire template library, and assign the most similar exemplar to the test subject. Second, a meta-algorithm combining two existing brain extraction algorithms (BET and BSE) is proposed to conduct multiple extractions directly on the test subject. Effective parameter settings for the meta-algorithm are learned from the training data and propagated to the subject through exemplars. We further develop a level-set based fusion method to combine multiple candidate extractions together with a closed smooth surface, for obtaining the final result. Experimental results show that, with only a small portion of subjects for training, the proposed method produces more accurate and robust brain extractions, with a Jaccard index of 0.956 +/- 0.010 on a total of 340 subjects under 6-fold cross validation, compared to BET and BSE even using their best parameter combinations.
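
    The Jaccard index used for evaluation is simply intersection over union of the binary masks:

```python
def jaccard(mask_a, mask_b):
    """Jaccard index |A ∩ B| / |A ∪ B| between two binary masks,
    the overlap measure used to score brain extractions."""
    a = {i for i, v in enumerate(mask_a) if v}
    b = {i for i, v in enumerate(mask_b) if v}
    return len(a & b) / len(a | b) if a | b else 1.0

# Tiny 1-D stand-ins for a ground-truth mask and an extraction result:
truth = [0, 1, 1, 1, 1, 0, 0, 1]
result = [0, 1, 1, 1, 0, 0, 1, 1]
print(jaccard(truth, result))   # 4 shared voxels / 6 in the union
```
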

  11. GND-PCA-based statistical modeling of diaphragm motion extracted from 4D MRI.

    Science.gov (United States)

    Swastika, Windra; Masuda, Yoshitada; Xu, Rui; Kido, Shoji; Chen, Yen-Wei; Haneishi, Hideaki

    2013-01-01

    We analyzed a statistical model of diaphragm motion using regular principal component analysis (PCA) and generalized N-dimensional PCA (GND-PCA). First, we generate 4D MRI of respiratory motion from 2D MRI using an intersection profile method. We then extract semiautomatically the diaphragm boundary from the 4D MRI to get subject-specific diaphragm motion. In order to build a general statistical model of diaphragm motion, we normalize the diaphragm motion in the time and spatial domains and evaluate the diaphragm motion model of 10 healthy subjects by applying regular PCA and GND-PCA. We also validate the results using the leave-one-out method. The results show that the first three principal components of regular PCA contain more than 98% of the total variation of diaphragm motion. However, validation using the leave-one-out method gives a mean error of up to 5.0 mm for right diaphragm motion and 3.8 mm for left diaphragm motion. Model analysis using GND-PCA provides a margin of error of about 1 mm and is able to reconstruct the diaphragm model from fewer samples.

  12. Kinetic models for supercritical CO2 extraction of oilseeds - a review

    Directory of Open Access Journals (Sweden)

    B. Nagy

    2011-01-01

    Full Text Available The supercritical fluid extraction of oilseeds has gained increasing interest for commercial application over the last few decades, particularly thanks to the technical and environmental advantages of supercritical fluid extraction technology compared to current extraction methods with organic solvents. Furthermore, CO2 as a solvent is generally recognized as safe (GRAS). At present, supercritical fluid extractions on a commercial scale are limited to decaffeination, production of soluble hops extracts, sesame seed oil production and extraction of certain petroleum products. When considering industrial application, it is essential to test the applicability of an appropriate model for supercritical fluid extraction of oilseeds for scaling up laboratory data to industrial design. The aim of this paper is to review the most significant kinetic models reported in the literature for supercritical fluid extraction.

  13. RED WINE EXTRACT OBTAINED BY MEMBRANE-BASED SUPERCRITICAL FLUID EXTRACTION: PRELIMINARY CHARACTERIZATION OF CHEMICAL PROPERTIES.

    Directory of Open Access Journals (Sweden)

    W. Silva

    Full Text Available This study aims to obtain an extract from red wine by using membrane-based supercritical fluid extraction. This technique involves the use of porous membranes as contactors during the dense gas extraction process from liquid matrices. In this work, a Cabernet Sauvignon wine extract was obtained by supercritical fluid extraction using pressurized carbon dioxide as solvent and a hollow fiber contactor as the extraction setup. The process was conducted continuously at pressures between 12 and 18 MPa and temperatures ranging from 30 to 50 ºC. Meanwhile, flow rates of feed wine and supercritical CO2 varied from 0.1 to 0.5 mL min⁻¹ and from 60 to 80 mL min⁻¹ (NCPT), respectively. From the extraction assays, the highest extraction percentage obtained from the total amount of phenolic compounds was 14% in only one extraction step, at 18 MPa and 35 ºC. A summarized chemical characterization of the obtained extract is reported in this work; one of the main compounds in this extract could be a low molecular weight organic acid with an aromatic structure and methyl and carboxyl groups. Finally, this preliminary characterization shows a remarkable ORAC value equal to 101737 ± 5324 µmol Trolox equivalents (TE) per 100 g of extract.

  14. Effect of organic bases on extraction of gadolinium carboxylates

    International Nuclear Information System (INIS)

    Sukhan, V.V.; Frankovskij, V.A.

    1982-01-01

    The effect of pyridine, 2-aminopyridine, benzylamine, antipyrine and o-phenanthroline on the extraction of gadolinium capronates and bromocapronates with chloroform is studied. Of the organic bases studied, benzylamine produces the strongest synergetic effect. In the absence of organic bases, gadolinium carboxylates solvated by three molecules of carboxylic acid are extracted into the organic phase. The possibility of extractive separation of gadolinium from comparable amounts of iron with a mixture of 1 M solutions of caproic or bromocaproic acid with 1 M benzylamine from a 0.1 M solution of tartaric acid is shown.

  15. Semi-Supervised Multi-View Ensemble Learning Based On Extracting Cross-View Correlation

    Directory of Open Access Journals (Sweden)

    ZALL, R.

    2016-05-01

    Full Text Available Correlated information between different views provides useful information for learning from multi-view data. Canonical correlation analysis (CCA) plays an important role in extracting this information. However, CCA only extracts the correlated information between paired data and cannot preserve the correlated information between within-class samples. In this paper, we propose a two-view semi-supervised learning method called semi-supervised random correlation ensemble based on spectral clustering (SS_RCE). SS_RCE uses a multi-view method based on spectral clustering which takes advantage of discriminative information in multiple views to estimate labeling information of unlabeled samples. In order to enhance the discriminative power of CCA features, we incorporate the labeling information of both unlabeled and labeled samples into CCA. Then, we use random correlation between within-class samples across views to extract diverse correlated features for training component classifiers. Furthermore, we extend a general model, namely SSMV_RCE, to construct an ensemble method to tackle semi-supervised learning in the presence of multiple views. Finally, we compare the proposed methods with existing multi-view feature extraction methods using multi-view semi-supervised ensembles. Experimental results on various multi-view data sets are presented to demonstrate the effectiveness of the proposed methods.

  16. Super-pixel extraction based on multi-channel pulse coupled neural network

    Science.gov (United States)

    Xu, GuangZhu; Hu, Song; Zhang, Liu; Zhao, JingJing; Fu, YunXia; Lei, BangJun

    2018-04-01

    Super-pixel extraction techniques group pixels into over-segmented image blocks according to the similarity among pixels. Compared with traditional pixel-based methods, super-pixel-based image description requires less computation and is easier to interpret, and it has been widely used in image processing and computer vision applications. The pulse coupled neural network (PCNN) is a biologically inspired model that stems from the phenomenon of synchronous pulse release in the visual cortex of cats. Each PCNN neuron can correspond to a pixel of an input image, and the dynamic firing pattern of each neuron encodes both the pixel's feature information and its spatial context. In this paper, a new color super-pixel extraction algorithm based on a multi-channel pulse coupled neural network (MPCNN) is proposed. The algorithm adopts the block-dividing idea of the SLIC algorithm: the image is first divided into blocks of equal size; then, within each block, the pixels adjacent to each seed with similar color are grouped together as a super-pixel; finally, post-processing assigns any pixels or pixel blocks left ungrouped. Experiments show that the proposed method can adjust the number of super-pixels and the segmentation precision via its parameters, and it shows good potential for super-pixel extraction.
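The SLIC-style block seeding the algorithm starts from can be sketched in a few lines of NumPy (an illustrative toy, not the MPCNN itself; the function name and block size are made up):

```python
import numpy as np

def block_seeds(image, block):
    """Assign each pixel an initial super-pixel label by regular blocks --
    the block-dividing seeding step shared by SLIC and the MPCNN method."""
    h, w = image.shape[:2]
    rows = np.arange(h) // block          # block row index per pixel row
    cols = np.arange(w) // block          # block column index per pixel column
    n_cols = -(-w // block)               # ceil division: blocks per row
    return rows[:, None] * n_cols + cols[None, :]

img = np.zeros((6, 8, 3))                 # tiny 6x8 RGB image
labels = block_seeds(img, 3)              # 2x3 grid of initial super-pixels
```

From these seeds, the full algorithm would then grow each label over adjacent similar-color pixels before post-processing the leftovers.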

  17. Modeling In-stream Tidal Energy Extraction and Its Potential Environmental Impacts

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhaoqing; Wang, Taiping; Copping, Andrea; Geerlofs, Simon H.

    2014-09-30

    In recent years, there has been growing interest in harnessing in-stream tidal energy in response to increasing energy demand and the need to mitigate climate change impacts. While many studies have been conducted to assess and map tidal energy resources, efforts to quantify the associated potential environmental impacts have been limited. This paper presents the development of a tidal turbine module within a three-dimensional unstructured-grid coastal ocean model and its application to assessing the potential environmental impacts of tidal energy extraction. The model is used to investigate in-stream tidal energy extraction and its impacts on estuarine hydrodynamic and biological processes in a tidally dominated estuary. A series of numerical experiments with varying numbers and configurations of turbines installed in an idealized estuary was carried out to assess the resulting changes in hydrodynamics and biological processes. Model results indicated that a large number of turbines would be required to extract the maximum tidal energy, causing a significant reduction of the volume flux. Preliminary results also indicate that tidal energy extraction increases vertical mixing and decreases the flushing rate in a stratified estuary. The tidal turbine model was applied to simulate tidal energy extraction in Puget Sound, a large fjord-like estuary on the Pacific Northwest coast.

  18. Orthogonal search-based rule extraction for modelling the decision to transfuse.

    Science.gov (United States)

    Etchells, T A; Harrison, M J

    2006-04-01

    Data from an audit relating to transfusion decisions during intermediate or major surgery were analysed to determine the strengths of certain factors in the decision-making process. The analysis, using orthogonal search-based rule extraction (OSRE) from a trained neural network, demonstrated that the risk of tissue hypoxia (ROTH) assessed using a 100-mm visual analogue scale, the haemoglobin value (Hb) and the presence or absence of on-going haemorrhage (OGH) were able to reproduce the transfusion decisions with a joint specificity of 0.96, a sensitivity of 0.93 and a positive predictive value of 0.9. The rules indicating transfusion were: 1. ROTH > 32 mm and Hb 13 mm and Hb 38 mm, Hb < 102 g·l⁻¹ and OGH; 4. Hb < 78 g·l⁻¹.

  19. Image segmentation-based robust feature extraction for color image watermarking

    Science.gov (United States)

    Li, Mianjie; Deng, Zeyu; Yuan, Xiaochen

    2018-04-01

    This paper proposes a local digital image watermarking method based on robust feature extraction. Segmentation is performed with Simple Linear Iterative Clustering (SLIC), on top of which an Image Segmentation-based Robust Feature Extraction (ISRFE) method is proposed: it adaptively extracts feature regions from the blocks segmented by SLIC, selecting the most robust feature region in every segmented image. Each feature region is decomposed into a low-frequency domain and a high-frequency domain by the Discrete Cosine Transform (DCT), and watermark images are then embedded into the coefficients of the low-frequency domain. The Distortion-Compensated Dither Modulation (DC-DM) algorithm is chosen as the quantization method for embedding. The experimental results indicate that the method performs well under various attacks and achieves a trade-off between high robustness and good image quality.
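The DC-DM embedding named above builds on plain dither modulation, which is easy to sketch (illustrative step size and coefficient values; the distortion-compensation term of full DC-DM is omitted):

```python
import numpy as np

def dm_embed(coeff, bit, step=8.0):
    """Plain dither modulation (the DM core of DC-DM): quantize a DCT
    coefficient onto one of two interleaved lattices, one per bit value."""
    dither = 0.0 if bit == 0 else step / 2.0
    return np.round((coeff - dither) / step) * step + dither

def dm_detect(coeff, step=8.0):
    """Decode a bit by deciding which lattice the coefficient is closest to."""
    d0 = abs(coeff - dm_embed(coeff, 0, step))
    d1 = abs(coeff - dm_embed(coeff, 1, step))
    return 0 if d0 <= d1 else 1

marked = dm_embed(37.3, 1)                 # embed bit 1 into a coefficient
recovered = dm_detect(marked + 1.5)        # decode after mild distortion
```

The step size trades robustness (larger lattice spacing survives bigger distortions) against image quality, which is the trade-off the paper's experiments explore.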

  20. The fitting parameters extraction of conversion model of the low dose rate effect in bipolar devices

    International Nuclear Information System (INIS)

    Bakerenkov, Alexander

    2011-01-01

    Enhanced Low Dose Rate Sensitivity (ELDRS) in bipolar devices consists in the base current degradation of NPN and PNP transistors increasing as the dose rate is decreased. As a result of almost 20 years of study, several physical models of the effect have been developed and described in detail, and accelerated test methods based on these models are used in standards. A conversion model of the effect, which describes the inverse S-shaped dependence of excess base current on dose rate, was previously proposed. This paper addresses the extraction of the conversion model's fitting parameters.

  1. Single- and two-phase flow simulation based on equivalent pore network extracted from micro-CT images of sandstone core.

    Science.gov (United States)

    Song, Rui; Liu, Jianjun; Cui, Mengmeng

    2016-01-01

    Due to the intricate structure of porous rocks, the relationships between porosity or saturation and the petrophysical transport properties classically used for reservoir evaluation and recovery strategies are either very complex or nonexistent. Pore network models extracted from natural porous media are therefore emphasized as a breakthrough for predicting fluid transport properties in complex micro-scale pore structures. This paper presents a modified method for extracting an equivalent pore network model from three-dimensional micro computed tomography images, based on the maximum ball algorithm. The partitioning of pores and throats is improved to avoid excessive memory usage during extraction. The porosity calculated from the extracted pore network model agrees well with that of the original sandstone sample. Instead of the Poiseuille's-law description used in the original work, the lattice Boltzmann method is employed to simulate single- and two-phase flow in the extracted pore network. Good agreement is obtained between the simulated relative permeability-saturation curves and the experimental results.
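The Poiseuille's-law transport rule that the lattice Boltzmann simulation replaces can be sketched for a toy series of throats (hypothetical radii, lengths and viscosity; a real pore network solves a large linear system over many connected pores):

```python
import numpy as np

def throat_conductance(radius, length, mu=1.0e-3):
    """Hagen-Poiseuille hydraulic conductance of a cylindrical throat,
    the transport rule classical pore-network models rely on."""
    return np.pi * radius**4 / (8.0 * mu * length)

def series_flow(radii, lengths, dp, mu=1.0e-3):
    """Flow rate through throats in series under a total pressure drop dp."""
    g = throat_conductance(np.asarray(radii), np.asarray(lengths), mu)
    g_eff = 1.0 / np.sum(1.0 / g)   # harmonic combination, like resistors
    return g_eff * dp

# Two throats in series: radii in metres, lengths in metres, dp in Pa
q = series_flow([1e-5, 2e-5], [1e-4, 1e-4], dp=100.0)
```

The strong fourth-power dependence on throat radius is why the pore/throat partitioning step matters so much for predicted permeability.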

  2. Integrated photooxidative extractive deep desulfurization using metal doped TiO2 and eutectic based ionic liquid

    Science.gov (United States)

    Zaid, Hayyiratul Fatimah Mohd; Kait, Chong Fai; Mutalib, Mohamed Ibrahim Abdul

    2016-11-01

    A series of metal-doped TiO2 photocatalysts, namely Fe/TiO2, Cu/TiO2 and Cu-Fe/TiO2, were synthesized and characterized for use in integrated photooxidative extractive deep desulfurization of model oil (dodecane) and diesel fuel. The order of photocatalytic activity was Cu-Fe/TiO2, followed by Cu/TiO2 and then Fe/TiO2. Cu-Fe/TiO2 was an effective photocatalyst for sulfur conversion at ambient atmospheric pressure. Hydrogen peroxide was used as the source of oxidant and a eutectic-based ionic liquid as the extractant. Sulfur conversion in the model oil reached 100%. Sulfur removal from the model oil was accomplished by two successive extractions, removing 97.06% in the first run and 2.94% in the second.

  3. A new GIS-based model for automated extraction of Sand Dune encroachment case study: Dakhla Oases, western desert of Egypt

    Directory of Open Access Journals (Sweden)

    M. Ghadiry

    2012-06-01

    Sand dune movement threatens roads, irrigation networks, water resources, urban areas, agriculture and infrastructure. The main objectives of this study are to develop a new GIS-based model for automated extraction of sand dune encroachment from remote sensing data and to assess the rate of sand dune movement. To monitor and assess dune movements in the Dakhla Oases area, multi-temporal satellite images and a GIS model developed with a Python script in ArcGIS were used. The satellite images (SPOT, 1995 and 2007) were geo-rectified using Erdas Imagine. Image subtraction was performed with the Spatial Analyst in ArcGIS; the result of the subtraction captures the sand dune movement between the two dates. Raster and vector layers of dune migration were extracted automatically with Spatial Analyst tools. The frontiers of individual dunes were measured at the different dates and the movement rates analyzed in GIS. ModelBuilder in ArcGIS was used to create a user-friendly tool; the custom model window is easy to handle by any user who wishes to adapt the model. It was found that the rate of sand dune movement ranged between 3 and 9 m per year; the majority of dunes had a movement rate between 0 and 6 m and very few between 6 and 9 m. Integrating remote sensing and GIS provided the information needed to determine the minimum, maximum, mean rate and area of sand dune migration.
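The image-subtraction core of the model can be sketched with NumPy on a toy pair of co-registered rasters (the threshold and dune positions are made up; the real model operates on geo-rectified SPOT scenes inside ArcGIS):

```python
import numpy as np

def change_mask(img_t1, img_t2, threshold):
    """Temporal image subtraction: pixels whose brightness changed by more
    than `threshold` between the two dates are flagged as dune movement."""
    diff = img_t2.astype(float) - img_t1.astype(float)
    return np.abs(diff) > threshold

t1 = np.zeros((4, 4))
t1[1, 1] = 200          # bright dune pixel at (1, 1) in the earlier scene
t2 = np.zeros((4, 4))
t2[1, 3] = 200          # the dune has shifted to (1, 3) in the later scene

moved = change_mask(t1, t2, threshold=50)
area = int(moved.sum())  # number of pixels affected by the movement
```

Multiplying the flagged pixel count by the pixel ground size, and dividing displacement by the time between scenes, gives the area and rate statistics reported in the study.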

  4. Modeling of fermentative hydrogen production from sweet sorghum extract based on modified ADM1

    DEFF Research Database (Denmark)

    Antonopoulou, Georgia; Gavala, Hariklia N.; Skiadas, Ioannis

    2012-01-01

    The Anaerobic Digestion Model 1 (ADM1) framework can be used to predict fermentative hydrogen production, since the latter is directly related to the acidogenic stage of the anaerobic digestion process. In this study, the ADM1 framework was used to simulate and predict the process of fermentative hydrogen production from the extractable sugars of sweet sorghum biomass. Kinetic parameters for sugar consumption and yield coefficients for acetic, propionic and butyric acid production were estimated using experimental data obtained from the steady states of a CSTR, and batch experiments were used for kinetic parameter validation. Since ADM1 does not account for metabolic products such as lactic acid and ethanol, which are crucial during fermentative hydrogen production, the structure of the model was modified to include lactate and ethanol among the metabolites and to improve …
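A drastically simplified sketch of the kinetic idea, first-order substrate uptake with a constant product yield coefficient, is shown below; it is not ADM1 (which tracks many more state variables and Monod-type kinetics), and all parameter names and values are invented:

```python
def ferment(s0, k, y_h2, dt=0.01, t_end=10.0):
    """Toy first-order sugar-consumption kinetics with a constant hydrogen
    yield coefficient, integrated with explicit Euler steps."""
    s, h2, t = s0, 0.0, 0.0
    while t < t_end:
        r = k * s                 # first-order uptake rate
        s -= r * dt               # substrate consumed this step
        h2 += y_h2 * r * dt       # product formed in proportion to uptake
        t += dt
    return s, h2

# Made-up values: 10 g/L sugar, rate constant 0.5 1/h, yield 0.3
s_final, h2_total = ferment(s0=10.0, k=0.5, y_h2=0.3)
```

By construction the cumulative product equals the yield coefficient times the substrate consumed, which is the mass-balance role yield coefficients play in the ADM1 structure.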

  5. Domain XML semantic integration based on extraction rules and ontology mapping

    Directory of Open Access Journals (Sweden)

    Huayu LI

    2016-08-01

    Many XML documents exist in the petroleum engineering field, but traditional XML integration solutions cannot provide semantic query, which leads to low data-use efficiency. In light of the semantic integration and query requirements for WeXML (oil & gas well XML) data, this paper proposes a semantic integration method based on extraction rules and ontology mapping. The method first defines a series of extraction rules that map elements and attributes of the WeXML Schema to classes and properties in the WeOWL ontology, respectively; second, an algorithm transforms WeXML documents into WeOWL instances. Because WeOWL provides limited semantics, ontology mappings between the two ontologies are then built to express the classes and properties of the global ontology in WeOWL terms, and semantic query based on a global domain concept model is provided. The proposed transformation rules, transformation algorithm and mapping rules were tested by constructing a WeXML data semantic integration prototype system.
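A minimal sketch of the extraction-rule idea, mapping an XML element tree to ontology-style triples, using a hypothetical WeXML-like document (the tag names and the tag-to-property rule are invented for illustration):

```python
import xml.etree.ElementTree as ET

XML = """<well id="W-17"><depth unit="m">2450</depth><status>producing</status></well>"""

def extract_triples(xml_text):
    """Apply a simple extraction rule: the root element becomes an ontology
    class instance, and each child element becomes one of its properties."""
    root = ET.fromstring(xml_text)
    subject = f"{root.tag}/{root.get('id')}"
    return [(subject, child.tag, child.text) for child in root]

triples = extract_triples(XML)
```

Real extraction rules would also carry attribute-to-property mappings and datatype information from the XML Schema into the OWL ontology.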

  6. Subject-based feature extraction by using fisher WPD-CSP in brain-computer interfaces.

    Science.gov (United States)

    Yang, Banghua; Li, Huarong; Wang, Qian; Zhang, Yunyuan

    2016-06-01

    Feature extraction of the electroencephalogram (EEG) plays a vital role in brain-computer interfaces (BCIs). In recent years, the common spatial pattern (CSP) has proven to be an effective feature extraction method, but traditional CSP requires many input channels and lacks frequency information. To remedy these defects, wavelet packet decomposition (WPD) and CSP can be combined to extract effective features; however, the WPD-CSP method gives little consideration to extracting features fitted to a specific subject. A subject-based feature extraction method using Fisher WPD-CSP is therefore proposed in this paper. The idea is to adapt Fisher WPD-CSP to each subject separately, in six steps: (1) the original EEG signals from all channels are decomposed into a series of sub-bands using WPD; (2) the average power of each obtained sub-band is computed; (3) the sub-bands with larger Fisher-distance values of average power are selected for that particular subject; (4) each selected sub-band is reconstructed and regarded as a new EEG channel; (5) all new EEG channels are fed to the CSP, which yields a six-dimensional feature vector, thus forming the subject-based feature extraction model; and (6) a probabilistic neural network (PNN) is used as the classifier and the classification accuracy is obtained. Data from six subjects were processed with the subject-based Fisher WPD-CSP, the non-subject-based Fisher WPD-CSP and plain WPD-CSP, respectively. The results show that the proposed method yields better performance (sensitivity: 88.7±0.9%; specificity: 91±1%), with classification accuracy increased by 6-12% and 14% over the non-subject-based Fisher WPD-CSP and WPD-CSP, respectively. The proposed subject-based Fisher WPD-CSP method can not only remedy the disadvantages of CSP through WPD but also discriminate …
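The CSP step at the heart of the pipeline can be sketched in NumPy (a two-channel toy with made-up variances; real use involves many channels, or the WPD-reconstructed sub-band channels described above):

```python
import numpy as np

def csp_filters(X1, X2):
    """Common spatial patterns: spatial filters maximizing the variance
    ratio between two classes, from the class covariance matrices
    (each X is channels x samples)."""
    C1 = X1 @ X1.T / X1.shape[1]
    C2 = X2 @ X2.T / X2.shape[1]
    d, U = np.linalg.eigh(C1 + C2)
    P = U / np.sqrt(d)                 # whitens the composite covariance
    w, V = np.linalg.eigh(P.T @ C1 @ P)  # eigendecompose class 1, whitened
    return (P @ V).T                   # rows are the spatial filters

rng = np.random.default_rng(7)
# Class 1 has high variance on channel 0, class 2 on channel 1
X1 = rng.normal(size=(2, 500)) * np.array([[3.0], [1.0]])
X2 = rng.normal(size=(2, 500)) * np.array([[1.0], [3.0]])

W = csp_filters(X1, X2)
v1 = (W @ X1).var(axis=1)   # filtered variances, class 1
v2 = (W @ X2).var(axis=1)   # filtered variances, class 2
```

The filtered variances (usually log-transformed) form the feature vector; here the two filters produce maximally opposite variance ratios between the classes.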

  7. Comparing deep learning and concept extraction based methods for patient phenotyping from clinical narratives.

    Science.gov (United States)

    Gehrmann, Sebastian; Dernoncourt, Franck; Li, Yeran; Carlson, Eric T; Wu, Joy T; Welt, Jonathan; Foote, John; Moseley, Edward T; Grant, David W; Tyler, Patrick D; Celi, Leo A

    2018-01-01

    In secondary analysis of electronic health records, a crucial task consists in correctly identifying the patient cohort under investigation. In many cases, the most valuable and relevant information for an accurate classification of medical conditions exists only in clinical narratives. Therefore, it is necessary to use natural language processing (NLP) techniques to extract and evaluate these narratives. The most commonly used approach to this problem relies on extracting a number of clinician-defined medical concepts from text and using machine learning techniques to identify whether a particular patient has a certain condition. However, recent advances in deep learning and NLP enable models to learn a rich representation of (medical) language. Convolutional neural networks (CNNs) for text classification can augment the existing techniques by leveraging this representation to learn which phrases in a text are relevant for a given medical condition. In this work, we compare concept extraction-based methods with CNNs and other commonly used NLP models in ten phenotyping tasks using 1,610 discharge summaries from the MIMIC-III database. We show that CNNs outperform concept extraction-based methods in almost all of the tasks, with an improvement in F1-score of up to 26 percentage points and an improvement of up to 7 percentage points in area under the ROC curve (AUC). We additionally assess the interpretability of both approaches by presenting and evaluating methods that calculate and extract the most salient phrases for a prediction. The results indicate that CNNs are a valid alternative to existing approaches in patient phenotyping and cohort identification and should be investigated further. Moreover, the deep learning approach presented in this paper can be used to assist clinicians during chart review or to support the extraction of billing codes from text by identifying and highlighting relevant phrases for various medical conditions.

  8. Process analysis and modeling of a single-step lutein extraction method for wet microalgae.

    Science.gov (United States)

    Gong, Mengyue; Wang, Yuruihan; Bassi, Amarjeet

    2017-11-01

    Lutein is a commercial carotenoid with potential health benefits. Microalgae are an alternative source for lutein production compared with the conventional approach using marigold flowers. In this study, a process analysis of a single-step simultaneous extraction, saponification, and primary purification process for free lutein production from wet microalgal biomass was carried out. The feasibility of binary solvent mixtures for wet-biomass extraction was successfully demonstrated, and the extraction kinetics of lutein from the chloroplasts of microalgae were evaluated for the first time. The effects of organic solvent type, solvent polarity, cell disruption method, and alkali and solvent usage on lutein yield were examined. A mathematical model based on Fick's second law of diffusion was fitted to the experimental data, and the resulting mass transfer coefficients were used to estimate the extraction rates. The extraction rate was found to be more significantly related to the ratio of alkali to solvent than to the ratio of alkali to biomass. The best conditions for extraction efficiency were pre-treatment with ultrasonication at a 0.5 s working cycle per second, a 0.5 h reaction at a 0.27 L/g solvent-to-biomass ratio, and 1:3 ether/ethanol (v/v) with 1.25 g KOH/L. The entire process can be completed within 1 h and yields over 8 mg/g lutein, which makes it more economical for scale-up.
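The first-term solution commonly derived from Fick's second law gives an exponential approach to the equilibrium yield; fitting its rate constant can be sketched as below (synthetic, noise-free data with made-up parameter values, not the paper's measurements):

```python
import numpy as np

def fit_rate(t, y, y_inf):
    """Fit the lumped mass-transfer rate k in y(t) = y_inf * (1 - exp(-k t)),
    the first-term approximation drawn from Fick's second law, by
    linearizing: log(1 - y/y_inf) = -k t (least-squares slope through 0)."""
    lin = np.log(1.0 - y / y_inf)
    return -np.sum(lin * t) / np.sum(t * t)

t = np.array([0.1, 0.2, 0.4, 0.8])            # hours, illustrative
y_obs = 8.0 * (1.0 - np.exp(-3.0 * t))        # synthetic yield curve, mg/g
k_hat = fit_rate(t, y_obs, y_inf=8.0)         # recovers the rate constant
```

With real data, `y_inf` would itself be estimated (or taken from an exhaustive extraction), and the fitted k compared across alkali-to-solvent ratios, as in the study.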

  9. Quantitative Structure-Relative Volatility Relationship Model for Extractive Distillation of Ethylbenzene/p-Xylene Mixtures: Application to Binary and Ternary Mixtures as Extractive Agents

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Young-Mook; Oh, Kyunghwan; You, Hwan; No, Kyoung Tai [Bioinformatics and Molecular Design Research Center, Seoul (Korea, Republic of); Jeon, Yukwon; Shul, Yong-Gun; Hwang, Sung Bo; Shin, Hyun Kil; Kim, Min Sung; Kim, Namseok; Son, Hyoungjun [Yonsei University, Seoul (Korea, Republic of); Chu, Young Hwan [Sangji University, Wonju (Korea, Republic of); Cho, Kwang-Hwi [Soongsil University, Seoul (Korea, Republic of)

    2016-04-15

    Ethylbenzene (EB) and p-xylene (PX) are important chemicals for the production of industrial materials; accordingly, their efficient separation is desired, even though the difference in their boiling points is very small. This paper describes the efforts toward the identification of high-performance extractive agents for EB and PX separation by distillation. Most high-performance extractive agents contain halogen atoms, which present health hazards and are corrosive to distillation plates. To avoid this disadvantage of extractive agents, we developed a quantitative structure-relative volatility relationship (QSRVR) model for designing safe extractive agents. We have previously developed and reported QSRVR models for single extractive agents. In this study, we introduce extended QSRVR models for binary and ternary extractive agents. The QSRVR models accurately predict the relative volatilities of binary and ternary extractive agents. The service to predict the relative volatility for binary and ternary extractive agents is freely available from the Internet at http://qsrvr.opengsi.org/.

  10. Quantitative Structure-Relative Volatility Relationship Model for Extractive Distillation of Ethylbenzene/p-Xylene Mixtures: Application to Binary and Ternary Mixtures as Extractive Agents

    International Nuclear Information System (INIS)

    Kang, Young-Mook; Oh, Kyunghwan; You, Hwan; No, Kyoung Tai; Jeon, Yukwon; Shul, Yong-Gun; Hwang, Sung Bo; Shin, Hyun Kil; Kim, Min Sung; Kim, Namseok; Son, Hyoungjun; Chu, Young Hwan; Cho, Kwang-Hwi

    2016-01-01

    Ethylbenzene (EB) and p-xylene (PX) are important chemicals for the production of industrial materials; accordingly, their efficient separation is desired, even though the difference in their boiling points is very small. This paper describes the efforts toward the identification of high-performance extractive agents for EB and PX separation by distillation. Most high-performance extractive agents contain halogen atoms, which present health hazards and are corrosive to distillation plates. To avoid this disadvantage of extractive agents, we developed a quantitative structure-relative volatility relationship (QSRVR) model for designing safe extractive agents. We have previously developed and reported QSRVR models for single extractive agents. In this study, we introduce extended QSRVR models for binary and ternary extractive agents. The QSRVR models accurately predict the relative volatilities of binary and ternary extractive agents. The service to predict the relative volatility for binary and ternary extractive agents is freely available from the Internet at http://qsrvr.opengsi.org/.

  11. Cryptographic Protocols Based on Root Extracting

    DEFF Research Database (Denmark)

    Koprowski, Maciej

    In this thesis we design new cryptographic protocols whose security is based on the hardness of root extraction, or more specifically the RSA problem. First we study the problem of root extraction in finite Abelian groups where the group order is unknown. This is a natural generalization of the … complexity of root extraction, even if the algorithm can choose the "public exponent" itself. In other words, both the standard and the strong RSA assumption are provably true w.r.t. generic algorithms. The results hold for arbitrary groups, so security w.r.t. generic attacks follows for any cryptographic … groups. In all cases, security follows from a well-defined complexity assumption (the strong root assumption), without relying on random oracles. A smooth natural number has no big prime factors. The probability that a random natural number not greater than x has all prime factors smaller than x^(1/u) …
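The trapdoor nature of root extraction modulo a composite can be sketched in a few lines: knowing the factorization of N makes e-th roots easy to compute, which is exactly what the RSA assumption says is hard without it (toy-sized primes for illustration only; real moduli are thousands of bits):

```python
def rsa_root(c, e, p, q):
    """Extract an e-th root modulo N = p*q when the factorization is known:
    the trapdoor behind the RSA and strong RSA assumptions. The exponent e
    must be invertible modulo phi(N)."""
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)          # private exponent (Python 3.8+ modular inverse)
    return pow(c, d, n)

p, q, e = 1009, 1013, 17
m = 123456
c = pow(m, e, p * q)             # "public" operation: raising to the e-th power
root = rsa_root(c, e, p, q)      # the trapdoor recovers m
```

The generic-algorithm results summarized above say that without such a trapdoor, no algorithm treating the group abstractly can extract roots efficiently.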

  12. Mathematical modelling of zirconium salicylate solvent extraction process

    International Nuclear Information System (INIS)

    Smirnova, N.S.; Evseev, A.M.; Fadeeva, V.I.; Kochetkova, S.K.

    1979-01-01

    Mathematical modelling of the equilibrium multicomponent physicochemical system arising in the extraction of zirconium salicylates by chloroform from aqueous HCl solutions at pH 0.5-4.7 is carried out. Adequate models, comprising the different molecular forms corresponding to the equilibrium phase composition, are built.

  13. Mathematical modelling of zirconium salicylate solvent extraction process

    Energy Technology Data Exchange (ETDEWEB)

    Smirnova, N S; Evseev, A M; Fadeeva, V I; Kochetkova, S K [Moskovskij Gosudarstvennyj Univ. (USSR)

    1979-11-01

    Mathematical modelling of the equilibrium multicomponent physicochemical system arising in the extraction of zirconium salicylates by chloroform from aqueous HCl solutions at pH 0.5-4.7 is carried out. Adequate models, comprising the different molecular forms corresponding to the equilibrium phase composition, are built.

  14. Three-Dimensional Precession Feature Extraction of Ballistic Targets Based on Narrowband Radar Network

    Directory of Open Access Journals (Sweden)

    Zhao Shuang

    2017-02-01

    Micro-motion is a crucial feature used in ballistic target recognition. To address the problem that single-view observations cannot extract true micro-motion parameters, we propose a novel algorithm based on a narrowband radar network to extract three-dimensional precession features. First, we construct a precession model of a cone-shaped target and, as a precondition, consider the invisibility problem of scattering centers. We then analyze in detail the micro-Doppler modulation caused by the precession. Next, we match each scattering center across the different perspectives based on the ratio of the top scattering center's micro-Doppler frequency-modulation coefficients, and extract the 3D coning vector of the target by establishing associated multi-aspect equation systems. In addition, we estimate the feature parameters by utilizing the correlation of the micro-Doppler frequency-modulation coefficients of the three scattering centers, combined with a frequency-compensation method. We then calculate the coordinates of the conical point at each moment and reconstruct the 3D spatial motion. Finally, simulation results validate the proposed algorithm.

  15. Application of composite materials based on various extractants for isolation of lanthanides(III) nitrates from multicomponent aqueous solutions

    International Nuclear Information System (INIS)

    Kopyrin, A.A.; Pyartman, A.K.; Kesnikov, V.A.; Pleshkov, M.A.; Exekov, M.H.

    1999-01-01

    In the present work we obtained samples of the aforementioned composite materials containing tributylphosphate (TBP) and trialkylmethylammonium nitrate (TAMAN). The extraction of cerium-group lanthanide(III) nitrates from multicomponent aqueous solutions by these materials was studied. Systems with sodium nitrate concentrations up to 5 mol/l, and the same systems with additions of sodium chloride or sulfate alongside the sodium nitrate, were investigated, with extraction isotherms obtained in all cases. We also compared, under identical conditions, extraction with liquid extractants against extraction with the composite materials. It was found that traditional extraction systems and systems based on composite extractants demonstrated almost the same extraction properties with respect to lanthanide(III) nitrates: isotherms observed under identical conditions and plotted in the same coordinates showed no difference within experimental error. This fact allows the same mathematical model to be used for both kinds of system. For the systems studied, a mathematical model was generated that describes the extraction process over a wide range of component concentrations, under the assumption that the ratio of activity coefficients in the organic phase stays constant. (authors)

  16. Noninvasive extraction of fetal electrocardiogram based on Support Vector Machine

    Science.gov (United States)

    Fu, Yumei; Xiang, Shihan; Chen, Tianyi; Zhou, Ping; Huang, Weiyan

    2015-10-01

    The fetal electrocardiogram (FECG) signal has important clinical value for diagnosing fetal heart disease and choosing suitable therapeutic schemes. The noninvasive extraction of the FECG from electrocardiogram (ECG) signals has therefore become an active research topic. A new method, based on the Support Vector Machine (SVM), is utilized for the extraction of the FECG from a limited amount of data. First, the theory of the SVM and the principle of SVM-based extraction are studied. Second, the transformation of the maternal electrocardiogram (MECG) component in the abdominal composite signal is verified to be nonlinear and is fitted with the SVM. The SVM is then trained, and the training results are compared with the real data to verify the effect of the training; meanwhile, the parameters of the SVM are optimized to achieve the best performance, so that the learned machine can be used to fit unknown samples. Finally, the FECG is extracted by removing the optimal estimate of the MECG component from the abdominal composite signal. The performance of the extraction is evaluated using the signal-to-noise ratio (SNR) and a visual test. The experimental results show that an FECG of good quality can be extracted: the SNR is significantly increased, as high as 9.2349 dB, and the time cost is significantly decreased, as short as 0.802 seconds. Compared with the traditional method, the noninvasive SVM-based extraction method has a simple realization, shorter processing time and better extraction quality under the same conditions.
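The extraction scheme, fit the nonlinear MECG transformation with an SVM regressor and keep the residual, can be sketched with scikit-learn's `SVR` (toy signals: the fetal component is modeled as a small independent signal, and all amplitudes and SVM parameters are invented):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 600
mecg = np.sin(np.linspace(0.0, 12.0 * np.pi, n))   # maternal reference lead
fecg = 0.2 * rng.normal(size=n)                    # toy stand-in for the fetal part
abdominal = np.tanh(1.5 * mecg) + fecg             # nonlinear MECG transform + FECG

# Fit the nonlinear MECG-to-abdominal mapping; the fetal part, being
# independent of the maternal lead, cannot be absorbed by the fit
svr = SVR(kernel="rbf", C=10.0, epsilon=0.01)
svr.fit(mecg.reshape(-1, 1), abdominal)

fecg_hat = abdominal - svr.predict(mecg.reshape(-1, 1))  # residual = FECG estimate
err = float(np.sqrt(np.mean((fecg_hat - fecg) ** 2)))
```

The residual recovers the small fetal component because the regressor only explains the part of the abdominal signal that is a function of the maternal lead.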

  17. Rule Extraction Based on Extreme Learning Machine and an Improved Ant-Miner Algorithm for Transient Stability Assessment.

    Directory of Open Access Journals (Sweden)

    Yang Li

    In order to overcome the problem of poor understandability of pattern recognition-based transient stability assessment (PRTSA) methods, a new rule extraction method based on an extreme learning machine (ELM) and an improved Ant-Miner (IAM) algorithm is presented in this paper. First, the basic principles of the ELM and the Ant-Miner algorithm are introduced. Then, based on the selected optimal feature subset, an example sample set is generated by the trained ELM-based PRTSA model. Finally, a set of classification rules is obtained by the IAM algorithm to replace the original ELM network. The novelty of this proposal is that transient stability rules are extracted, using the IAM algorithm, from an example sample set generated by the trained ELM-based model. The effectiveness of the proposed method is shown by application results on the New England 39-bus power system and a practical power system, the southern power system of Hebei province.

  18. Rule Extraction Based on Extreme Learning Machine and an Improved Ant-Miner Algorithm for Transient Stability Assessment.

    Science.gov (United States)

    Li, Yang; Li, Guoqing; Wang, Zhenhao

    2015-01-01

    In order to overcome the problem of poor understandability of pattern recognition-based transient stability assessment (PRTSA) methods, a new rule extraction method based on an extreme learning machine (ELM) and an improved Ant-Miner (IAM) algorithm is presented in this paper. First, the basic principles of the ELM and the Ant-Miner algorithm are introduced. Then, based on the selected optimal feature subset, an example sample set is generated by the trained ELM-based PRTSA model. Finally, a set of classification rules is obtained by the IAM algorithm to replace the original ELM network. The novelty of this proposal is that transient stability rules are extracted, using the IAM algorithm, from an example sample set generated by the trained ELM-based model. The effectiveness of the proposed method is shown by application results on the New England 39-bus power system and a practical power system, the southern power system of Hebei province.
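The ELM that the rule extraction starts from is simple enough to sketch in NumPy: random hidden-layer weights, a sigmoid activation, and output weights solved in closed form by least squares (a made-up two-feature classification stands in for the transient stability data):

```python
import numpy as np

rng = np.random.default_rng(42)

def elm_fit(X, y, n_hidden=100):
    """Minimal extreme learning machine: random input weights and biases,
    sigmoid hidden layer, output weights solved by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # hidden-layer activations
    beta = np.linalg.pinv(H) @ y             # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy two-class labels: stable "inside the unit circle", unstable outside
X = rng.normal(size=(300, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 1.0).astype(float)

W, b, beta = elm_fit(X, y)
acc = float(np.mean((elm_predict(X, W, b, beta) > 0.5) == (y > 0.5)))
```

In the paper's scheme, such a trained network then labels a large example sample set, from which the IAM algorithm mines human-readable if-then rules.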

  19. Artificial neural network for modeling the extraction of aromatic hydrocarbons from lube oil cuts

    Energy Technology Data Exchange (ETDEWEB)

    Mehrkesh, A.H.; Hajimirzaee, S. [Islamic Azad University, Majlesi Branch, Isfahan (Iran, Islamic Republic of); Hatamipour, M.S.; Tavakoli, T. [Department of Chemical Engineering, University of Isfahan, Isfahan (Iran, Islamic Republic of)

    2011-03-15

    An artificial neural network (ANN) approach was used to obtain a simulation model for predicting rotating disc contactor (RDC) performance during the extraction of aromatic hydrocarbons from lube oil cuts, to produce a lubricating base oil using furfural as the solvent. The field data used for training the ANN model were obtained from a lubricating oil production company. The input parameters of the ANN model were the volumetric flow rates of feed and solvent, the temperatures of feed and solvent, and the disc rotation rate. The output parameters were the volumetric flow rate of the raffinate phase and the extraction yield. In this study, a feed-forward multi-layer perceptron neural network was successfully used to capture the complex relationship between these input and output parameters. (Copyright 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  20. [Study on a moisture sorption process model and its application to traditional Chinese medicine extract powder].

    Science.gov (United States)

    Lin, Tingting; He, Yan; Xiao, Xiong; Yuan, Liang; Rao, Xiaoyong; Luo, Xiaojian

    2010-04-01

    This study characterizes the moisture sorption process of traditional Chinese medicine extract powder in order to establish a mathematical model, provide a new method for in-depth study of the moisture sorption behavior of such powders, and offer a reference for determining the production cycle and predicting product stability. The moisture absorption process was analyzed using the law of conservation of mass and Fick's first law to establish a double-exponential absorption model; the moisture absorption data were fitted and compared against five other commonly used models to evaluate the double-exponential model. The statistical analysis showed that the coefficients of determination (R2) of the double-exponential model, the Weibull distribution model and the first-order kinetic model were large, while their residual sums of squares (RSS) and AIC values were small. Considering practical application, the double-exponential model was judged more suitable for simulating the moisture absorption process of Chinese medicine extract powder, and is suitable for characterizing the moisture absorption of traditional Chinese medicine extracts.
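
    The model comparison above rests on residual sums of squares. A minimal sketch of that comparison for a double-exponential sorption model (two parallel first-order processes) versus a single first-order model follows; the uptake data and parameter values are synthetic, not the study's measurements.

```python
import math

def double_exp(t, a, k1, b, k2):
    # Two parallel first-order sorption processes (fast + slow phase)
    return a * (1 - math.exp(-k1 * t)) + b * (1 - math.exp(-k2 * t))

def first_order(t, m_inf, k):
    # Single first-order kinetic model
    return m_inf * (1 - math.exp(-k * t))

def rss(model, params, data):
    """Residual sum of squares of a model against (time, uptake) pairs."""
    return sum((m - model(t, *params)) ** 2 for t, m in data)

# Synthetic moisture-uptake data (%, illustrative) with a fast and a slow phase
data = [(t, double_exp(t, 6.0, 0.8, 4.0, 0.05)) for t in range(0, 49, 4)]

rss_double = rss(double_exp, (6.0, 0.8, 4.0, 0.05), data)
rss_first = rss(first_order, (10.0, 0.2), data)  # crude single-exponential fit
```

    With biphasic data the double-exponential model yields the smaller RSS, which is the criterion (together with R2 and AIC) the study uses to rank the candidate models.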

  1. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    OpenAIRE

    Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S.; Xian, Xuefeng; Wu, Jian; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality i...

  2. Geochemical Modeling of ILAW Lysimeter Water Extracts

    Energy Technology Data Exchange (ETDEWEB)

    Cantrell, Kirk J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-22

    Geochemical modeling results of water extracts from simulated immobilized low-activity waste (ILAW) glasses, placed in lysimeters for eight years, suggest that the secondary phase reaction network developed using product consistency test (PCT) results at 90°C may need to be modified for field conditions. For sediment samples that had been collected from near the glass samples, the impact of glass corrosion could be readily observed based upon the pH of their water extracts. For unimpacted sediments the pH ranged from 7.88 to 8.11 with an average of 8.04. Sediments that had observable impacts from glass corrosion exhibited elevated pH values (as high as 9.97). For lysimeter sediment samples that appear to have been impacted by glass corrosion to the greatest extent, saturation indices determined for analcime, calcite, and chalcedony in the 1:1 water extracts were near equilibrium and were consistent with the secondary phase reaction network developed using PCT results at 90°C. Fe(OH)3(s) also appears to be essentially at equilibrium in extracts impacted by glass corrosion, but with a solubility product (log Ksp) that is approximately 2.13 units lower than that used in the secondary phase reaction network. The solubilities of TiO2(am) and ZrO2(am) also appear to be much lower than assumed in that network. The extent to which the solubilities of TiO2(am) and ZrO2(am) were reduced could not be quantified because the concentrations of Ti and Zr in the extracts were below the estimated quantification limit. Gibbsite was consistently highly oversaturated in the extracts while dawsonite was at or near equilibrium, suggesting that dawsonite might be a more suitable phase for the secondary phase reaction network.

  3. Feature-Based and String-Based Models for Predicting RNA-Protein Interaction

    Directory of Open Access Journals (Sweden)

    Donald Adjeroh

    2018-03-01

    Full Text Available In this work, we study two approaches for the problem of RNA-Protein Interaction (RPI). In the first approach, we use a feature-based technique by combining extracted features from both sequences and secondary structures. The feature-based approach enhanced the prediction accuracy as it included much more available information about the RNA-protein pairs. In the second approach, we apply search algorithms and data structures to extract effective string patterns for prediction of RPI, using both sequence information (protein and RNA sequences) and structure information (protein and RNA secondary structures). This led to different string-based models for predicting interacting RNA-protein pairs. We show results that demonstrate the effectiveness of the proposed approaches, including comparative results against leading state-of-the-art methods.

  4. Vision-based weld pool boundary extraction and width measurement during keyhole fiber laser welding

    Science.gov (United States)

    Luo, Masiyang; Shin, Yung C.

    2015-01-01

    In keyhole fiber laser welding processes, the weld pool behavior is essential to determining welding quality. To better observe and control the welding process, accurate extraction of the weld pool boundary as well as its width is required. This work presents a weld pool edge detection technique based on an off-axial green illumination laser and a coaxial image capturing system that consists of a CMOS camera and optical filters. To accommodate differences in image quality, a fully developed edge detection algorithm is proposed based on a local maximum greyness-gradient search and linear interpolation. The extracted weld pool geometry and width are validated against actual weld width measurements and predictions from a numerical multi-phase model.

  5. Modeling of a Stacked Power Module for Parasitic Inductance Extraction

    Science.gov (United States)

    2017-09-15

    ARL-TR-8138 ● SEP 2017 ● US Army Research Laboratory. Modeling of a Stacked Power Module for Parasitic Inductance Extraction, by Steven Kaplan.

  6. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  7. Mobility of radionuclides based on sequential extraction of soils

    International Nuclear Information System (INIS)

    Salbu, B.; Oughton, D.H.; Lien, H.N.; Oestby, G.; Strand, P.

    1992-01-01

    Since 1989, core samples of soil and vegetation from semi-natural pastures have been collected at selected sites in Norway during the growing season. The activity concentrations in soil and vegetation as well as the transfer coefficients vary significantly between regions, within regions and even within sampling plot areas. In order to differentiate between mobile and inert fractions of radioactive and stable isotopes of Cs and Sr in soils, samples were extracted sequentially using agents with increasing dissolution power. The reproducibility of the sequential extraction technique is good and the data obtained seem most informative. As the distribution patterns for radioactive and stable isotopes of Cs and Sr are similar, a high degree of isotopic exchange is indicated. Based on easily leachable fractions, mobility factors are calculated. In general the mobility of 90Sr is higher than that of 137Cs. Mobility factors are not significantly influenced by seasonal variations, but a decrease in the mobile fraction in soil with time is indicated. Mobility factors should be considered useful for modelling purposes. (au)

  8. Remaining useful life estimation based on discriminating shapelet extraction

    International Nuclear Information System (INIS)

    Malinowski, Simon; Chebel-Morello, Brigitte; Zerhouni, Noureddine

    2015-01-01

    In the Prognostics and Health Management domain, estimating the remaining useful life (RUL) of critical machinery is a challenging task. Various research topics including data acquisition, fusion, diagnostics and prognostics are involved in this domain. This paper presents an approach, based on shapelet extraction, to estimate the RUL of equipment. This approach extracts, in an offline step, discriminative rul-shapelets from a history of run-to-failure data. These rul-shapelets are patterns that are selected for their correlation with the remaining useful life of the equipment. In other words, every selected rul-shapelet conveys its own information about the RUL of the equipment. In an online step, these rul-shapelets are compared to testing units and the ones that match these units are used to estimate their RULs. Therefore, RUL estimation is based on patterns that have been selected for their high correlation with the RUL. This approach is different from classical similarity-based approaches that attempt to match complete testing units (or only late instants of testing units) with training ones to estimate the RUL. The performance of our approach is evaluated on a case study on the remaining useful life estimation of turbofan engines and is compared with other similarity-based approaches. - Highlights: • A data-driven RUL estimation technique based on pattern extraction is proposed. • Patterns are extracted for their correlation with the RUL. • The proposed method shows good performance compared to other techniques
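
    The offline step above selects subsequences by how well their match distance correlates with RUL. A rough sketch of that selection criterion, using toy degradation histories rather than the paper's turbofan data and plain Pearson correlation as the scoring function, is:

```python
import math

def best_match_dist(shapelet, series):
    """Minimum Euclidean distance between a shapelet and any window of a series."""
    L = len(shapelet)
    return min(
        math.sqrt(sum((series[i + j] - shapelet[j]) ** 2 for j in range(L)))
        for i in range(len(series) - L + 1)
    )

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def select_rul_shapelet(units, ruls, length):
    """Pick the candidate subsequence whose match distance correlates best
    (in absolute value) with the training units' remaining useful life."""
    best, best_score = None, -1.0
    for series in units:
        for i in range(len(series) - length + 1):
            cand = series[i:i + length]
            dists = [best_match_dist(cand, u) for u in units]
            score = abs(pearson(dists, ruls))
            if score > best_score:
                best, best_score = cand, score
    return best, best_score

# Toy run-to-failure histories (a health indicator drifting upward) and RULs
units = [[0.0, 0.1, 0.3, 0.7, 1.2],
         [0.0, 0.2, 0.5, 0.9, 1.5],
         [0.0, 0.1, 0.2, 0.4, 0.6]]
ruls = [30.0, 20.0, 50.0]

shapelet, score = select_rul_shapelet(units, ruls, length=3)
```

    In the online step the selected rul-shapelets would be matched against a new unit, and the RULs associated with the matching shapelets used for the estimate.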

  9. Multi-criteria optimization for ultrasonic-assisted extraction of antioxidants from Pericarpium Citri Reticulatae using response surface methodology, an activity-based approach.

    Science.gov (United States)

    Zeng, Shanshan; Wang, Lu; Zhang, Lei; Qu, Haibin; Gong, Xingchu

    2013-06-01

    An activity-based approach to optimize the ultrasonic-assisted extraction of antioxidants from Pericarpium Citri Reticulatae (Chenpi in Chinese) was developed. Response surface optimization based on a quantitative composition-activity relationship model showed the relationships among product chemical composition, antioxidant activity of extract, and parameters of extraction process. Three parameters of ultrasonic-assisted extraction, including the ethanol/water ratio, Chenpi amount, and alkaline amount, were investigated to give optimum extraction conditions for antioxidants of Chenpi: ethanol/water 70:30 v/v, Chenpi amount of 10 g, and alkaline amount of 28 mg. The experimental antioxidant yield under the optimum conditions was found to be 196.5 mg/g Chenpi, and the antioxidant activity was 2023.8 μmol Trolox equivalents/g of the Chenpi powder. The results agreed well with the second-order polynomial regression model. This presented approach promised great application potentials in both food and pharmaceutical industries. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Towards a more appropriate water based extraction for the assessment of organic contaminant availability

    International Nuclear Information System (INIS)

    Hickman, Zachary A.; Reid, Brian J.

    2005-01-01

    This study correlated extractabilities of 37 d aged phenanthrene residues in four dissimilar soils with the fraction that was available for earthworm (Lumbricus rubellus) accumulation and microorganism (Pseudomonas sp.) mineralisation. Extractability was determined using two established techniques, namely (1) a water based extraction using CO2 equilibrated water and (2) an aqueous based hydroxypropyl-β-cyclodextrin (HPCD) extraction. Results showed no relationship between earthworm accumulation and phenanthrene extractability using either HPCD (r2=0.07; slope=-4.76; n=5) or the water based extraction (r2=0.31; slope=-5.34; n=5). Earthworm accumulation was overestimated by both techniques. In contrast, the fraction of phenanthrene extractable using both the HPCD technique and the water based extraction correlated strongly with microbial mineralisation. However, the slopes of these linear relationships were 0.48 (r2=0.96; n=10), and 0.99 (r2=0.88; n=10) for the water based extraction and HPCD, respectively. Thus, the HPCD extraction provided values that were numerically close to the mineralisation values, whilst the water based extraction values were approximately half the mineralisation values. It is submitted that HPCD extraction provided an appropriate method of assessing the fraction of contaminant available for microbial mineralisation in these dissimilar soils. - No significant difference was found between microbially mineralised phenanthrene and extractability using hydroxypropyl-β-cyclodextrin in four dissimilar soils; the water-only extraction removed half of this fraction

  11. Near infrared spectroscopy based monitoring of extraction processes of raw material with the help of dynamic predictive modeling

    Science.gov (United States)

    Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng

    2018-03-01

    The control of batch-to-batch quality variations remains a challenging task for pharmaceutical industries, e.g., traditional Chinese medicine (TCM) manufacturing. One difficult problem is to produce pharmaceutical products with consistent quality from raw material of large quality variations. In this paper, an integrated methodology combining the near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectra data in hand, the initial state of the process is firstly estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by the initial process inputs such as raw materials. Secondly, the quality property of the end product is predicted at the mid-course during the extraction process with a partial least squares (PLS) model. The batch-end-time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that with the help of the dynamic predictive modeling, NIRS can offer the past and future information of the process, which enables more accurate monitoring and control of process performance and product quality.

  12. Actant model of an extraction plant

    Energy Technology Data Exchange (ETDEWEB)

    Poulsen, Helle

    1999-05-01

    Facing a growing complexity of industrial plants, we recognise the need for qualitative modelling methods capturing functional and causal complexity in a human-centred way. The present paper presents actant modelling as a functional modelling method rooted in linguistics and semiotics. Actant modelling combines actant models from linguistics with multilevel flow modelling (MFM). Thus the semantics of MFM functions is developed further and given an interpretation in terms of actant functions. The present challenge is to provide coherence between seemingly different categories of knowledge. Yet the gap between functional and causal modelling methods can be bridged. Actant modelling provides an open and provisional, but in no way exhaustive or final answer as to how teleological concepts like goals and functions relate to causal concepts. As the main focus of the paper an actant model of an extraction plant is presented. It is shown how the actant model merges functional and causal knowledge in a natural way.

  13. Actant model of an extraction plant

    International Nuclear Information System (INIS)

    Poulsen, Helle

    1999-01-01

    Facing a growing complexity of industrial plants, we recognise the need for qualitative modelling methods capturing functional and causal complexity in a human-centred way. The present paper presents actant modelling as a functional modelling method rooted in linguistics and semiotics. Actant modelling combines actant models from linguistics with multilevel flow modelling (MFM). Thus the semantics of MFM functions is developed further and given an interpretation in terms of actant functions. The present challenge is to provide coherence between seemingly different categories of knowledge. Yet the gap between functional and causal modelling methods can be bridged. Actant modelling provides an open and provisional, but in no way exhaustive or final answer as to how teleological concepts like goals and functions relate to causal concepts. As the main focus of the paper an actant model of an extraction plant is presented. It is shown how the actant model merges functional and causal knowledge in a natural way

  14. Arduino-based automation of a DNA extraction system.

    Science.gov (United States)

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    There have been many studies on detecting infectious diseases with molecular genetic methods. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic beads, which is part of a portable molecular genetic test system. This DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators provide a sequence of steps in the DNA extraction process, such as transporting, mixing, and washing for the gene specimen, magnetic beads, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G-code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delay, and input-output manipulation. It drives the stepper motor with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish the motor's reference position, and hard limit checking to prevent any over-travelling. The proposed system was implemented and its functionality was investigated, especially regarding positioning accuracy and velocity profile.
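
    The host application above drives the controller through a compiled G-code sequence. A minimal host-side sketch of such an interpreter follows; the supported words (G28 homing, G1 axis move, G4 dwell, M3/M5 output on/off) and the controller interface are assumptions for illustration, not the paper's actual protocol.

```python
def run_sequence(lines, controller):
    """Parse a minimal G-code-like sequence and dispatch to a controller."""
    for line in lines:
        line = line.split(';')[0].strip()   # strip comments and whitespace
        if not line:
            continue
        words = line.split()
        code = words[0].upper()
        if code == 'G28':
            controller.home()
        elif code == 'G1':
            for w in words[1:]:
                controller.move(axis=w[0].upper(), target=float(w[1:]))
        elif code == 'G4':
            controller.dwell(float(words[1].lstrip('Pp')) / 1000.0)  # P in ms
        elif code in ('M3', 'M5'):
            controller.set_output(on=(code == 'M3'))
        else:
            raise ValueError('unsupported word: ' + code)

class LogController:
    """Stand-in for the Arduino link that records dispatched actions."""
    def __init__(self):
        self.log = []
    def home(self): self.log.append('home')
    def move(self, axis, target): self.log.append(('move', axis, target))
    def dwell(self, seconds): self.log.append(('dwell', seconds))
    def set_output(self, on): self.log.append(('out', on))

ctrl = LogController()
run_sequence(['G28        ; establish reference position',
              'G1 X12.5   ; transport syringe axis',
              'G4 P500    ; wait 500 ms for mixing',
              'M3', 'M5'], ctrl)
```

    In the real system the controller side would additionally enforce the homing sequence, acceleration profile, and hard limits described in the abstract.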

  15. Chemical name extraction based on automatic training data generation and rich feature set.

    Science.gov (United States)

    Yan, Su; Spangler, W Scott; Chen, Ying

    2013-01-01

    The automation of extracting chemical names from text has significant value to biomedical and life science research. A major barrier in this task is the difficulty of getting a sizable and good quality data to train a reliable entity extraction model. Another difficulty is the selection of informative features of chemical names, since comprehensive domain knowledge on chemistry nomenclature is required. Leveraging random text generation techniques, we explore the idea of automatically creating training sets for the task of chemical name extraction. Assuming the availability of an incomplete list of chemical names, called a dictionary, we are able to generate well-controlled, random, yet realistic chemical-like training documents. We statistically analyze the construction of chemical names based on the incomplete dictionary, and propose a series of new features, without relying on any domain knowledge. Compared to state-of-the-art models learned from manually labeled data and domain knowledge, our solution shows better or comparable results in annotating real-world data with less human effort. Moreover, we report an interesting observation about the language for chemical names. That is, both the structural and semantic components of chemical names follow a Zipfian distribution, which resembles many natural languages.
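
    The core idea above is generating random yet realistic chemical-like text from an incomplete dictionary. One way to sketch this is to chain character n-grams weighted by their frequency in the dictionary; the seed dictionary below is illustrative (a real one would hold thousands of entries), and this is not the paper's exact generation procedure.

```python
import random

def fragment_counts(dictionary, n=3):
    """Count character n-grams over a (possibly incomplete) name dictionary."""
    counts = {}
    for name in dictionary:
        for i in range(len(name) - n + 1):
            frag = name[i:i + n]
            counts[frag] = counts.get(frag, 0) + 1
    return counts

def generate_name(counts, length, rng):
    """Chain trigrams whose 2-character overlap matches, weighted by frequency."""
    frags = list(counts)
    name = rng.choices(frags, weights=[counts[f] for f in frags])[0]
    while len(name) < length:
        tail = name[-2:]
        nxt = [f for f in frags if f.startswith(tail)]
        if not nxt:
            break
        name += rng.choices(nxt, weights=[counts[f] for f in nxt])[0][2:]
    return name

# Illustrative, deliberately incomplete seed dictionary
seed = ['methanol', 'ethanol', 'methane', 'ethane', 'propanol', 'butanol']
rng = random.Random(42)
counts = fragment_counts(seed)
fake = generate_name(counts, length=8, rng=rng)
```

    Embedding such generated names into otherwise ordinary text yields labeled training documents without manual annotation, which is the labor-saving step the abstract emphasizes.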

  16. Fusion of Pixel-based and Object-based Features for Road Centerline Extraction from High-resolution Satellite Imagery

    Directory of Open Access Journals (Sweden)

    CAO Yungang

    2016-10-01

    Full Text Available A novel approach for road centerline extraction from high spatial resolution satellite imagery is proposed by fusing both pixel-based and object-based features. Firstly, texture and shape features are extracted at the pixel level, and spectral features are extracted at the object level based on multi-scale image segmentation maps. Then, the extracted multiple features are fused in the framework of Dempster-Shafer evidence theory to roughly identify the road network regions. Finally, an automatic noise removing algorithm combined with a tensor voting strategy is presented to accurately extract the road centerline. Experimental results using high-resolution satellite imagery with different scenes and spatial resolutions showed that the proposed approach compares favorably with traditional methods, particularly in eliminating salt noise and the conglutination phenomenon.
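
    The fusion step above uses Dempster-Shafer evidence theory. A minimal sketch of Dempster's rule of combination over a two-hypothesis frame ({road, nonroad}, plus the ignorance set) follows; the mass values standing in for pixel-level and object-level evidence are illustrative.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination over the frame {'road', 'nonroad'},
    with mass also assignable to the ignorance set 'either'."""
    frame = ('road', 'nonroad', 'either')
    def meet(a, b):
        # Set intersection of two hypotheses; None means the empty set
        if a == 'either':
            return b
        if b == 'either':
            return a
        return a if a == b else None
    combined = {h: 0.0 for h in frame}
    conflict = 0.0
    for a in frame:
        for b in frame:
            inter = meet(a, b)
            if inter is None:
                conflict += m1[a] * m2[b]
            else:
                combined[inter] += m1[a] * m2[b]
    k = 1.0 - conflict   # normalization by the non-conflicting mass
    return {h: v / k for h, v in combined.items()}

# Illustrative evidence: pixel-level texture/shape vs object-level spectra
pixel_mass = {'road': 0.6, 'nonroad': 0.1, 'either': 0.3}
object_mass = {'road': 0.5, 'nonroad': 0.2, 'either': 0.3}
fused = dempster_combine(pixel_mass, object_mass)
```

    When two independent sources both lean toward "road", the combined belief in "road" exceeds either source alone, which is what lets the fused map identify road regions more reliably than any single feature.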

  17. Recent advances in automated system model extraction (SME)

    International Nuclear Information System (INIS)

    Narayanan, Nithin; Bloomsburgh, John; He Yie; Mao Jianhua; Patil, Mahesh B; Akkaraju, Sandeep

    2006-01-01

    In this paper we present two different techniques for automated extraction of system models from FEA models. We discuss two different algorithms: for (i) automated N-DOF SME for electrostatically actuated MEMS and (ii) automated N-DOF SME for MEMS inertial sensors. We will present case studies for the two different algorithms presented

  18. Simulation of flux during electro-membrane extraction based on the Nernst-Planck equation.

    Science.gov (United States)

    Gjelstad, Astrid; Rasmussen, Knut Einar; Pedersen-Bjergaard, Stig

    2007-12-07

    The present work has for the first time described and verified a theoretical model of the analytical extraction process electro-membrane extraction (EME), where target analytes are extracted from an aqueous sample, through a thin layer of 2-nitrophenyl octylether immobilized as a supported liquid membrane (SLM) in the pores in the wall of a porous hollow fibre, and into an acceptor solution present inside the lumen of the hollow fibre by the application of an electrical potential difference. The mathematical model was based on the Nernst-Planck equation, and described the flux over the SLM. The model demonstrated that the magnitude of the electrical potential difference, the ion balance of the system, and the absolute temperature influenced the flux of analyte across the SLM. These conclusions were verified by experimental data with five basic drugs. The flux was strongly dependent of the potential difference over the SLM, and increased potential difference resulted in an increase in the flux. The ion balance, defined as the sum of ions in the donor solution divided by the sum of ions in the acceptor solution, was shown to influence the flux, and high ionic concentration in the acceptor solution relative to the sample solution was advantageous for high flux. Different temperatures also led to changes in the flux in the EME system.
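
    The flux model above follows from the Nernst-Planck equation. Under a constant-field (Goldman-type) assumption, the steady-state flux across the SLM can be evaluated numerically as sketched below; the diffusion coefficient, membrane thickness, and concentrations are illustrative values, not the paper's measured parameters.

```python
import math

F = 96485.0   # Faraday constant, C/mol
R = 8.314     # gas constant, J/(mol K)

def eme_flux(D, h, z, dphi, c_donor, c_acceptor, T):
    """Steady-state flux across a liquid membrane from the Nernst-Planck
    equation under a constant-field assumption.
    D: diffusion coefficient (m^2/s), h: membrane thickness (m),
    z: analyte charge, dphi: potential difference (V),
    c in mol/m^3, T in K."""
    v = z * F * dphi / (R * T)   # dimensionless electrical driving force
    if abs(v) < 1e-12:           # pure diffusion limit
        return D / h * (c_donor - c_acceptor)
    return (D / h) * (v / (1 - math.exp(-v))) * (c_donor - c_acceptor * math.exp(-v))

# Illustrative numbers for a singly charged basic drug cation in a 200-um SLM
flux_low = eme_flux(D=1e-11, h=2e-4, z=1, dphi=5.0,
                    c_donor=1.0, c_acceptor=0.0, T=298.0)
flux_high = eme_flux(D=1e-11, h=2e-4, z=1, dphi=50.0,
                     c_donor=1.0, c_acceptor=0.0, T=298.0)
```

    Consistent with the experimental findings, the flux grows with the applied potential difference, and the temperature enters through the dimensionless driving force v.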

  19. Model-based sensor diagnosis

    International Nuclear Information System (INIS)

    Milgram, J.; Dormoy, J.L.

    1994-09-01

    Running a nuclear power plant involves monitoring data provided by the installation's sensors. Operators and computerized systems then use these data to establish a diagnostic of the plant. However, the instrumentation system is complex, and is not immune to faults and failures. This paper presents a system for detecting sensor failures using a topological description of the installation and a set of component models. This model of the plant implicitly contains relations between sensor data. These relations must always be checked if all the components are functioning correctly. The failure detection task thus consists of checking these constraints. The constraints are extracted in two stages. Firstly, a qualitative model of their existence is built using structural analysis. Secondly, the models are formally handled according to the results of the structural analysis, in order to establish the constraints on the sensor data. This work constitutes an initial step in extending model-based diagnosis, as the information on which it is based is suspect. This work will be followed by surveillance of the detection system. When the instrumentation is assumed to be sound, the unverified constraints indicate errors on the plant model. (authors). 8 refs., 4 figs

  20. The Surface Extraction from TIN based Search-space Minimization (SETSM) algorithm

    Science.gov (United States)

    Noh, Myoung-Jong; Howat, Ian M.

    2017-07-01

    Digital Elevation Models (DEMs) provide critical information for a wide range of scientific, navigational and engineering activities. Submeter resolution, stereoscopic satellite imagery with high geometric and radiometric quality, and wide spatial coverage are becoming increasingly accessible for generating stereo-photogrammetric DEMs. However, low contrast and repeatedly-textured surfaces, such as snow and glacial ice at high latitudes, and mountainous terrains challenge existing stereo-photogrammetric DEM generation techniques, particularly without a-priori information such as existing seed DEMs or the manual setting of terrain-specific parameters. To utilize these data for fully-automatic DEM extraction at a large scale, we developed the Surface Extraction from TIN-based Search-space Minimization (SETSM) algorithm. SETSM is fully automatic (i.e. no search parameter settings are needed) and uses only the sensor model Rational Polynomial Coefficients (RPCs). SETSM adopts a hierarchical, combined image- and object-space matching strategy utilizing weighted normalized cross-correlation with both original distorted and geometrically corrected images for overcoming ambiguities caused by foreshortening and occlusions. In addition, SETSM optimally minimizes search-spaces to extract optimal matches over problematic terrains by iteratively updating object surfaces within a Triangulated Irregular Network, and utilizes a geometric-constrained blunder and outlier detection in object space. We prove the ability of SETSM to mitigate typical stereo-photogrammetric matching problems over a range of challenging terrains. SETSM is the primary DEM generation software for the US National Science Foundation's ArcticDEM project.
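
    SETSM's matching strategy relies on weighted normalized cross-correlation. A minimal sketch of that similarity measure on flattened image patches follows; the patch values and weight mask are illustrative, and this is only the scoring primitive, not SETSM's full hierarchical search.

```python
import math

def weighted_ncc(a, b, w):
    """Weighted normalized cross-correlation of two equally sized patches.
    a, b: intensity lists; w: non-negative weights (e.g. emphasizing the
    window centre). Returns a value in [-1, 1]."""
    W = sum(w)
    ma = sum(wi * ai for wi, ai in zip(w, a)) / W
    mb = sum(wi * bi for wi, bi in zip(w, b)) / W
    cov = sum(wi * (ai - ma) * (bi - mb) for wi, ai, bi in zip(w, a, b))
    va = sum(wi * (ai - ma) ** 2 for wi, ai in zip(w, a))
    vb = sum(wi * (bi - mb) ** 2 for wi, bi in zip(w, b))
    return cov / math.sqrt(va * vb) if va and vb else 0.0

# A patch correlates perfectly with a gain/offset-transformed copy of itself,
# which is why NCC tolerates radiometric differences between stereo images.
patch = [10, 20, 30, 40, 50, 60, 70, 80, 90]
transformed = [2 * p + 5 for p in patch]
weights = [1, 1, 1, 1, 2, 1, 1, 1, 1]   # extra weight on the centre pixel
score = weighted_ncc(patch, transformed, weights)
```

    Invariance to gain and offset is what makes NCC-based matching usable on low-contrast surfaces such as snow and glacial ice, where absolute intensities differ between the stereo images.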

  1. Airport object extraction based on visual attention mechanism and parallel line detection

    Science.gov (United States)

    Lv, Jing; Lv, Wen; Zhang, Libao

    2017-10-01

    Target extraction is one of the important aspects of remote sensing image analysis and processing, with wide applications in image compression, target tracking, target recognition and change detection. Among different targets, airports have attracted more and more attention due to their significance in military and civilian applications. In this paper, we propose a novel and reliable airport object extraction model combining a visual attention mechanism and a parallel line detection algorithm. First, a novel saliency analysis model for remote sensing images with airport regions is proposed to carry out statistical saliency feature analysis. The proposed model can precisely extract the most salient region and effectively suppress background interference. Then, prior geometric knowledge is analyzed and airport runways, which contain two parallel lines of similar length, are detected efficiently. Finally, we use an improved Otsu threshold segmentation method to segment and extract the airport regions from the saliency map of the remote sensing images. The experimental results demonstrate that the proposed model outperforms existing saliency analysis models and shows good performance in airport detection.
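
    The final segmentation step above is based on Otsu thresholding. A minimal sketch of standard (unimproved) Otsu's method on a grey-level histogram follows; the bimodal toy histogram stands in for a saliency map's intensity distribution and uses only 16 grey levels for brevity.

```python
def otsu_threshold(hist):
    """Otsu's method: choose the grey level that maximizes the between-class
    variance of the background/foreground split. hist[i] = pixel count."""
    total = sum(hist)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = 0.0    # background weight (pixels at or below threshold)
    sum_b = 0.0  # background intensity sum
    for t, h in enumerate(hist):
        w_b += h
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * h
        m_b = sum_b / w_b                    # background mean
        m_f = (total_sum - sum_b) / w_f      # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy histogram: dark background around level 2,
# bright salient (airport) region around level 12
hist = [0, 5, 20, 5, 0, 0, 0, 0, 0, 0, 0, 4, 15, 4, 0, 0]
t = otsu_threshold(hist)
```

    Pixels above the returned threshold would be kept as the candidate airport region; the paper's improved variant refines this basic rule.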

  2. Selective electromembrane extraction at low voltages based on analyte polarity and charge

    DEFF Research Database (Denmark)

    Domínguez, Noelia Cabaleiro; Gjelstad, Astrid; Nadal, Andrea Molina

    2012-01-01

    Electromembrane extraction (EME) at low voltage (0-15 V) of 29 different basic model drug substances was investigated. The drug substances with logP...

  3. Artificial neural network modeling and optimization of ultrahigh pressure extraction of green tea polyphenols.

    Science.gov (United States)

    Xi, Jun; Xue, Yujing; Xu, Yinxiang; Shen, Yuhong

    2013-11-01

    In this study, the ultrahigh pressure extraction of green tea polyphenols was modeled and optimized by a three-layer artificial neural network. A feed-forward neural network trained with an error back-propagation algorithm was used to evaluate the effects of pressure, liquid/solid ratio and ethanol concentration on the total phenolic content of green tea extracts. The neural network coupled with genetic algorithms was also used to optimize the conditions needed to obtain the highest yield of tea polyphenols. The optimal architecture of the artificial neural network model was a feed-forward neural network with three input neurons, one hidden layer with eight neurons and one output layer with a single neuron. The trained network gave a minimum MSE of 0.03 and a maximum R2 of 0.9571, which implied good agreement between the predicted and actual values and confirmed good generalization of the network. Based on the combination of the neural network and genetic algorithms, the optimum extraction conditions for the highest yield of green tea polyphenols were determined as follows: 498.8 MPa for pressure, 20.8 mL/g for liquid/solid ratio and 53.6% for ethanol concentration. The total phenolic content actually measured under the optimum predicted extraction conditions was 582.4 ± 0.63 mg/g DW, which matched the predicted value (597.2 mg/g DW) well. This suggests that the artificial neural network model described in this work is an efficient quantitative tool to predict the extraction efficiency of green tea polyphenols. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  4. Potentialities and limits of QSPR and molecular modeling in the design of the extraction solvents used in hydrometallurgy

    International Nuclear Information System (INIS)

    Cote, G.; Chagnes, A.

    2010-01-01

    Due to new challenges, new extraction solvents based on innovative extractants are needed in hydrometallurgy for specific tasks. Thus, the aim of the present paper is to discuss the potentialities and limits of QSPR and molecular modeling for identifying new extractants. QSPR methods may have useful applications in a problem as complex as the design of ligands for metal separation. Nevertheless, the degree of reliability of the predictions is still limited and, in the present state of the art, these techniques are likely more useful for optimization within a given family of extractants than for building new reagents in silico. The molecular modeling techniques provide binding energies between target metals and given ligands, as well as optimized chemical structures of the formed complexes. Thus, in principle, the information which can be deduced from molecular modeling computations is richer than that provided by QSPR methods. Nevertheless, an effort should be made to establish more tangible links between the calculated binding energies and the physical parameters used by hydrometallurgists, such as the complexation constants in aqueous phase (βMAn) or, better, the extraction constants (Kex). (author)

  5. Modelling dental implant extraction by pullout and torque procedures.

    Science.gov (United States)

    Rittel, D; Dorogoy, A; Shemtov-Yona, K

    2017-07-01

    Dental implant extraction, achieved either by applying torque or pullout force, is used to estimate the bone-implant interfacial strength. A detailed description of the mechanical and physical aspects of the extraction process is still missing from the literature. This paper presents 3D nonlinear dynamic finite element simulations of the extraction of a commercial implant from the mandible bone. Emphasis is put on the typical load-displacement and torque-angle relationships for various types of cortical and trabecular bone strengths. The simulations also study the influence of the osseointegration level on those relationships. This is done by simulating implant extraction right after insertion, when interfacial frictional contact exists between the implant and bone, and long after insertion, assuming that the implant is fully bonded to the bone. The model does not include a separate representation of the interfacial layer, for which available data is limited. The obtained relationships show that the higher the strength of the trabecular bone, the higher the peak extraction force, while for application of torque it is the cortical bone that might dictate the peak torque value. Information on the relative strength contrast of the cortical and trabecular components, as well as the progressive nature of the damage evolution, can be revealed from the obtained relations. It is shown that full osseointegration might multiply the peak and average load values by a factor of 3-12, although the calculated work of extraction varies only by a factor of 1.5. From a quantitative point of view, it is suggested that, as an alternative to reporting peak load or torque values, an average value derived from the extraction work be used to better characterize the bone-implant interfacial strength. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Modeling of the overall kinetic extraction from Maytenus aquifolia using compressed CO2

    Directory of Open Access Journals (Sweden)

    M. Minozzo

    2012-12-01

    Full Text Available In Brazil, the species Maytenus aquifolia and Maytenus ilicifolia are widely used in popular medicine in the form of teas for the treatment of stomach illnesses and ulcers. Despite the great interest in the therapeutic properties of Maytenus aquifolia and the fact that it is an abundant native plant growing in Brazil, there is a lack of information in the literature concerning its extraction at high pressures. In this context, this work focuses on the mathematical modelling of the packed-bed extraction of Maytenus aquifolia with compressed CO2. Three mathematical models were used to represent the experimental data. The experiments were performed in a laboratory-scale unit, evaluating the effects of temperature (293 to 323 K), pressure (100 to 250 bar), and extraction time on the yield of the extracts. Results show that the extraction temperature and solvent density exerted a pronounced effect on yield. The mathematical model of Sovová was the most suitable to represent the experimental extraction data of M. aquifolia.
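    As a rough illustration of the kind of kinetic model compared in such studies, the sketch below implements simple first-order extraction kinetics. Sovová's model is more elaborate (it adds a solubility-controlled fast period before the diffusion-controlled tail), and the yield and rate values here are hypothetical.

```python
import math

def extraction_yield(t, e_inf, k):
    """First-order extraction kinetics: e(t) = e_inf * (1 - exp(-k t)).

    A minimal stand-in for the empirical models compared in such work;
    Sovova's broken-curve model prepends a solubility-controlled fast
    period to this diffusion-controlled tail.
    """
    return e_inf * (1.0 - math.exp(-k * t))

# Hypothetical run: 12 % asymptotic yield, rate constant 0.02 min^-1,
# sampled every 30 min over a 4-hour extraction.
curve = [extraction_yield(t, 12.0, 0.02) for t in range(0, 241, 30)]
```

    The asymptote e_inf and rate constant k would be fitted to the measured overall extraction curves at each temperature/pressure condition.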

  7. Liquid-phase extraction coupled with metal-organic frameworks-based dispersive solid phase extraction of herbicides in peanuts.

    Science.gov (United States)

    Li, Na; Wang, Zhibing; Zhang, Liyuan; Nian, Li; Lei, Lei; Yang, Xiao; Zhang, Hanqi; Yu, Aimin

    2014-10-01

    Liquid-phase extraction coupled with metal-organic frameworks-based dispersive solid-phase extraction was developed and applied to the extraction of pesticides from high-fat matrices. The herbicides were ultrasonically extracted from peanut using ethyl acetate as the extraction solvent. The separation of the analytes from a large amount of co-extracted fat was achieved by dispersive solid-phase extraction using MIL-101(Cr) as the sorbent. In this step, the analytes were adsorbed on MIL-101(Cr) and the fat remained in the bulk. The herbicides were separated and determined by high-performance liquid chromatography. The experimental parameters, including the type and volume of extraction solvent, ultrasonication time, volume of hexane and eluting solvent, amount of MIL-101(Cr), and dispersive solid-phase extraction time, were optimized. The limits of detection for the herbicides range from 0.98 to 1.9 μg/kg. The recoveries of the herbicides are in the range of 89.5-102.7%, and relative standard deviations are equal to or lower than 7.0%. The proposed method is simple, effective, and suitable for treating samples with a high fat content. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. A distribution ratio model of strontium by crown ether extraction from simulated HLLW

    International Nuclear Information System (INIS)

    Chen Jing; Wang Qiuping; Wang Jianchen; Song Chongli

    1995-01-01

    An empirical distribution ratio model for strontium extraction by dicyclohexano-18-crown-6-n-octanol from simulated high-level waste is established. The experimental points for the model are designed by the uniform experimental design method. The regression of the strontium distribution ratio model is carried out by the complex optimization method. The model is verified with experimental distribution ratio data under different extraction conditions. The results show that the relative deviations between the calculated data and the experimental ones are within ±10%, with a mean relative deviation of 4.4%. The empirical model, together with an iteration program, can be used for strontium extraction process calculations.
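    The empirical fitting step can be illustrated with an ordinary least-squares regression of log D against log [L]. The concentrations and distribution ratios below are synthetic (generated from an assumed power law D = 15.5·[L]), not the paper's measurements.

```python
import math

# Hypothetical measurements: crown-ether concentration [L] (mol/L) vs.
# the strontium distribution ratio D, generated from D = 15.5 * [L].
conc = [0.02, 0.05, 0.10, 0.20, 0.40]
D    = [15.5 * c for c in conc]

# Least-squares regression of log D on log [L]: the slope gives the
# apparent ligand dependence n, the intercept the constant c of the
# empirical model D = c * [L]^n.
x = [math.log(v) for v in conc]
y = [math.log(v) for v in D]
m = len(x)
xbar = sum(x) / m
ybar = sum(y) / m
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = ybar - slope * xbar
c_fit = math.exp(intercept)
```

    A multi-variable version of the same regression (over acidity, ligand, and nitrate concentrations) is the usual shape of such empirical distribution ratio models.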

  9. Novel mathematic models for quantitative transitivity of quality-markers in extraction process of the Buyanghuanwu decoction.

    Science.gov (United States)

    Zhang, Yu-Tian; Xiao, Mei-Feng; Deng, Kai-Wen; Yang, Yan-Tao; Zhou, Yi-Qun; Zhou, Jin; He, Fu-Yuan; Liu, Wen-Long

    2018-06-01

    Nowadays, researching and formulating an efficient extraction system for Chinese herbal medicine poses a great challenge for quality management; to address it, the transitivity of quality markers (Q-markers) in quantitative analysis of TCM was recently proposed by Prof. Liu. In order to improve the quality of extraction from raw medicinal materials for clinical preparations, a series of integrated mathematical models for the transitivity of Q-markers in quantitative analysis of TCM was established. Buyanghuanwu decoction (BYHWD) is a commonly used TCM prescription for preventing and treating ischemic heart and brain diseases. In this paper, we selected BYHWD as the extraction experimental subject to study the quantitative transitivity of TCM. Based on Fick's law and the Noyes-Whitney equation, novel kinetic models were established for the extraction of active components. The kinetic equations of the extraction models were fitted, and the inherent parameters of the medicinal material and the Q-marker quantitative transfer coefficients were then calculated; these served as indexes to evaluate the transitivity of Q-markers in quantitative analysis of the extraction process of BYHWD. HPLC was applied to screen and analyze the potential Q-markers in the extraction process. Fick's law and the Noyes-Whitney equation were adopted for mathematically modeling the extraction process. Kinetic parameters were fitted and calculated with SPSS 20.0 software. The transfer efficiency was described and evaluated via the potential Q-marker transfer trajectory using the transitivity availability AUC, extraction ratio P, and decomposition ratio D, respectively. Q-markers were identified with AUC, P, and D. Astragaloside IV, laetrile, paeoniflorin, and ferulic acid were studied as potential Q-markers from BYHWD. The relative technological parameters were presented by the mathematic models, which could adequately illustrate the inherent properties of the raw materials.
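    The Noyes-Whitney building block mentioned above has a simple closed-form solution, sketched below; the rate constant and saturation concentration are illustrative placeholders, not the fitted parameters of the BYHWD markers.

```python
import math

def noyes_whitney(c0, cs, k, t):
    """Noyes-Whitney dissolution kinetics, dC/dt = k (Cs - C), solved
    analytically: C(t) = Cs - (Cs - C0) * exp(-k t).

    A minimal sketch of the kinetic building block the paper combines
    with Fick's law; the symbols here are illustrative, not the paper's
    fitted parameters.
    """
    return cs - (cs - c0) * math.exp(-k * t)

# Hypothetical extraction run: a marker's solution concentration rises
# from 0 toward the saturation concentration Cs = 1.0 (k = 0.1 min^-1).
profile = [noyes_whitney(0.0, 1.0, 0.1, t) for t in (0, 10, 30, 60)]
```

    Fitting k (and Cs) per marker, then comparing markers across extraction steps, is what yields the transfer coefficients used as transitivity indexes.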

  10. Knocking on wood: base metal complexes as catalysts for selective oxidation of lignin models and extracts.

    Science.gov (United States)

    Hanson, Susan K; Baker, R Tom

    2015-07-21

    This work began as part of a biomass conversion catalysis project with UC Santa Barbara funded by the first NSF Chemical Bonding Center, CATSB. Recognizing that catalytic aerobic oxidation of diol C-C bonds could potentially be used to break down lignocellulose, we began to synthesize oxovanadium complexes and explore their fundamental reactivity. Of course there were theories regarding the oxidation mechanism, but our mechanistic studies soon revealed a number of surprises of the type that keep all chemists coming back to the bench! We realized that these reactions were also exciting in that they actually used the oxygen-on-every-carbon property of biomass-derived molecules to control the selectivity of the oxidation. When we found that these oxovanadium complexes tended to convert sugars predominantly to formic acid and carbon dioxide, we replaced one of the OH groups with an ether and entered the dark world of lignin chemistry. In this Account, we summarize results from our collaboration and from our individual labs. In particular, we show that oxidation selectivity (C-C vs C-O bond cleavage) of lignin models using air and vanadium complexes depends on the ancillary ligands, the reaction solvent, and the substrate structure (i.e., phenolic vs non-phenolic). Selected vanadium complexes in the presence of added base serve as effective alcohol oxidation catalysts via a novel base-assisted dehydrogenation pathway. In contrast, copper catalysts effect direct C-C bond cleavage of these lignin models, presumably through a radical pathway. The most active vanadium catalyst exhibits unique activity for the depolymerization of organosolv lignin. After Weckhuysen's excellent 2010 review on lignin valorization, the number of catalysis studies and approaches on both lignin models and extracts has expanded rapidly. Today we are seeing new start-ups and lignin production facilities sprouting up across the globe as we all work to prove wrong the old pulp and paper chemist

  11. Summary of water body extraction methods based on ZY-3 satellite

    Science.gov (United States)

    Zhu, Yu; Sun, Li Jian; Zhang, Chuan Yin

    2017-12-01

    Extracting water bodies from remote sensing images is one of the main means of obtaining water information. Affected by its spectral characteristics, many methods cannot be applied to ZY-3 satellite imagery. To address this problem, we summarize the extraction methods applicable to ZY-3 and analyze the results of existing methods. According to the characteristics of these results, the method combining a water index (WI) with a single-band threshold and the method of texture filtering based on probability statistics are explored. In addition, the advantages and disadvantages of all methods are compared, which provides a reference for research on water extraction from images. The conclusions are as follows. 1) NIR has higher water sensitivity; consequently, when the surface reflectance in the study area is not very similar to that of water, the single-band threshold method or multi-band operations can obtain the desired effect. 2) Compared with the water index and HIS optimal index methods, the rule-based object extraction method, which takes into account not only the spectral information of the water but also space and texture feature constraints, can obtain a better extraction effect, yet the image segmentation process is time-consuming and the definition of the rules requires certain expertise. 3) The combination of spectral relationships and a water index can eliminate the interference of shadows to a certain extent. When there are few small water bodies, or small water bodies are not considered, texture filtering based on probability statistics can effectively reduce noise in the result and avoid confusing shadows or paddy fields with water to a certain extent.
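    The water index plus single-band threshold idea can be sketched with McFeeters' NDWI; the reflectance values and the 0.0 threshold below are illustrative, not tuned to ZY-3 imagery.

```python
def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters' form):
    NDWI = (Green - NIR) / (Green + NIR).  Water absorbs NIR strongly,
    so it scores high; vegetation reflects NIR and scores low.
    """
    return (green - nir) / (green + nir)

def is_water(green, nir, threshold=0.0):
    """Single-threshold classification rule; the 0.0 cut-off is a common
    illustrative default, not a value tuned for ZY-3."""
    return ndwi(green, nir) > threshold

# Hypothetical surface reflectances (green band, NIR band)
water_score      = ndwi(0.12, 0.03)   # NIR strongly absorbed by water
vegetation_score = ndwi(0.08, 0.40)   # NIR strongly reflected by leaves
```

    Per the paper's conclusions, this spectral rule would then be combined with spectral relationships or texture filtering to suppress shadow and paddy-field confusion.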

  12. Features Extraction of Flotation Froth Images and BP Neural Network Soft-Sensor Model of Concentrate Grade Optimized by Shuffled Cuckoo Searching Algorithm

    Directory of Open Access Journals (Sweden)

    Jie-sheng Wang

    2014-01-01

    Full Text Available To meet the forecasting targets for key technology indicators in the flotation process, a BP neural network soft-sensor model based on feature extraction from flotation froth images and optimized by the shuffled cuckoo search algorithm is proposed. Based on digital image processing techniques, the color features in HSI color space, the visual features based on the gray-level co-occurrence matrix, and the shape characteristics based on geometric theory are extracted from the flotation froth images, respectively, as the input variables of the proposed soft-sensor model. Then the isometric mapping method is used to reduce the input dimension, the network size, and the learning time of the BP neural network. Finally, the shuffled cuckoo search algorithm is adopted to optimize the BP neural network soft-sensor model. Simulation results show that the model has better generalization and prediction accuracy.
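    The gray-level co-occurrence matrix features can be sketched as follows; the 4-level toy images stand in for quantized froth images, and only two of the classic Haralick features (contrast and energy) are computed.

```python
def glcm_features(image, dx=1, dy=0, levels=4):
    """Gray-level co-occurrence matrix for one pixel offset, normalized
    to probabilities, plus two classic Haralick texture features
    (contrast, energy) of the kind used as soft-sensor inputs.
    """
    rows, cols = len(image), len(image[0])
    P = [[0.0] * levels for _ in range(levels)]
    n = 0
    for i in range(rows):
        for j in range(cols):
            i2, j2 = i + dy, j + dx
            if 0 <= i2 < rows and 0 <= j2 < cols:
                P[image[i][j]][image[i2][j2]] += 1
                n += 1
    P = [[v / n for v in row] for row in P]
    contrast = sum(P[a][b] * (a - b) ** 2
                   for a in range(levels) for b in range(levels))
    energy = sum(v * v for row in P for v in row)
    return contrast, energy

uniform = [[2, 2], [2, 2]]   # perfectly flat texture
checker = [[0, 3], [3, 0]]   # maximal local variation
```

    On real froth images, several offsets (dx, dy) and more gray levels are used, and the resulting feature vector is compressed by isometric mapping before entering the BP network.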

  13. A MISO-ARX-Based Method for Single-Trial Evoked Potential Extraction

    Directory of Open Access Journals (Sweden)

    Nannan Yu

    2017-01-01

    Full Text Available In this paper, we propose a novel method for solving the single-trial evoked potential (EP) estimation problem. In this method, the single-trial EP is considered a complex containing many components, which may originate from different functional brain sites; these components can be distinguished by their respective latencies and amplitudes and are extracted simultaneously by multiple-input single-output autoregressive modeling with exogenous input (MISO-ARX). The extraction proceeds in three stages: first, we use a reference EP as a template and decompose it into a set of components, which serve as subtemplates for the remaining steps. Then, a dictionary is constructed from these subtemplates, and EPs are preliminarily extracted by sparse coding in order to roughly estimate the latency of each component. Finally, the single-trial measurement is parametrically modeled by MISO-ARX, with spontaneous electroencephalographic activity characterized as an autoregressive model driven by white noise and each component of the EP modeled by autoregressive-moving-average filtering of the subtemplates. Once optimized, all components of the EP can be extracted. Compared with ARX, our method is better at tracking specific components of the EP complex, as each component is modeled individually in MISO-ARX. We provide exhaustive experimental results to show the effectiveness and feasibility of our method.

  14. A Novel Approach for Protein-Named Entity Recognition and Protein-Protein Interaction Extraction

    Directory of Open Access Journals (Sweden)

    Meijing Li

    2015-01-01

    Full Text Available Many researchers focus on developing protein-named entity recognition (Protein-NER) or PPI extraction systems. However, research on these two topics has not been well integrated, and the Protein-NER components of existing PPI extraction systems still need improvement. In this paper, we developed a protein-protein interaction extraction system named PPIMiner based on Support Vector Machines (SVM) and parsing trees. PPIMiner consists of three main models: a natural language processing (NLP) model, a Protein-NER model, and a PPI discovery model. The Protein-NER model, named ProNER, identifies protein names using two methods: a dictionary-based method and a machine learning-based method. ProNER is capable of identifying more proteins than the dictionary-based Protein-NER models in other existing systems. The PPIs finally discovered via the PPI discovery model are represented in detail: we show the protein interaction types and the occurrence frequencies through two different methods. In the experiments, the results show that the performance achieved by our ProNER and PPI discovery model is better than that of other existing tools. PPIMiner applies this protein-named entity recognition approach and the parsing tree based PPI extraction method to improve the performance of PPI extraction. We also provide an easy-to-use interface to access the PPI database and an online system for PPI extraction and Protein-NER.
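    The dictionary-based half of a Protein-NER model, followed by naive co-mention PPI candidate generation, can be sketched as below; the lexicon and sentence are illustrative, not PPIMiner's actual dictionary or pipeline.

```python
import re

# A minimal sketch of the dictionary-based half of a Protein-NER model;
# the names and the sentence below are illustrative, not PPIMiner's
# lexicon.  The machine-learning half and the parsing-tree PPI step are
# omitted here.
protein_dict = {"p53", "MDM2", "BRCA1"}

def find_proteins(sentence):
    """Return dictionary proteins mentioned in a sentence, matched on
    word boundaries so 'p53' does not fire inside 'p530'."""
    found = []
    for name in protein_dict:
        if re.search(r"\b" + re.escape(name) + r"\b", sentence):
            found.append(name)
    return sorted(found)

def candidate_interactions(sentence):
    """Naive PPI candidate generation: any co-mentioned protein pair.
    A real system would then classify each pair using parse-tree
    features instead of accepting every co-mention."""
    names = find_proteins(sentence)
    return [(a, b) for i, a in enumerate(names) for b in names[i + 1:]]

pairs = candidate_interactions("MDM2 binds and ubiquitinates p53.")
```
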

  15. Uniform competency-based local feature extraction for remote sensing images

    Science.gov (United States)

    Sedaghat, Amin; Mohammadi, Nazila

    2018-01-01

    Local feature detectors are widely used in many photogrammetry and remote sensing applications. The quantity and distribution of the local features play a critical role in the quality of the image matching process, particularly for multi-sensor high-resolution remote sensing image registration. However, conventional local feature detectors cannot extract desirable matched features, either in terms of the number of correct matches or the spatial and scale distribution, in multi-sensor remote sensing images. To address this problem, this paper proposes a novel method for uniform and robust local feature extraction for remote sensing images, based on a novel competency criterion and scale and location distribution constraints. The proposed method, called uniform competency (UC) local feature extraction, can be easily applied to any local feature detector for various kinds of applications. The proposed competency criterion is based on a weighted ranking process using three quality measures, including robustness, spatial saliency, and scale parameters, performed in a multi-layer gridding schema. For evaluation, five state-of-the-art local feature detector approaches, namely scale-invariant feature transform (SIFT), speeded up robust features (SURF), scale-invariant feature operator (SFOP), maximally stable extremal region (MSER), and hessian-affine, are used. The proposed UC-based feature extraction algorithms were successfully applied to match various synthetic and real satellite image pairs, and the results demonstrate its capability to increase matching performance and to improve the spatial distribution. The code to carry out the UC feature extraction is available from https://www.researchgate.net/publication/317956777_UC-Feature_Extraction.

  16. MODELING AND SIMULATION OF A BENZENE RECOVERY PROCESS BY EXTRACTIVE DISTILLATION

    Directory of Open Access Journals (Sweden)

    L. B. Brondani

    2015-03-01

    Full Text Available Extractive distillation processes with N-formylmorpholine (NFM) are used industrially to separate benzene from six-carbon non-aromatics. In the process studied in this work, the stream of interest consists of nearly 20 different hydrocarbons. A new set of NRTL parameters was correlated based on literature experimental data. Both vapor-liquid equilibrium and infinite dilution activity coefficient data were taken into account; missing parameters were estimated with the UNIFAC group contribution model. The extractive distillation process was simulated using ASPEN Plus®. Very good agreement with plant data was obtained. The influences of the main operational parameters, solvent-to-feed ratio and solvent temperature, were studied. Theoretical optimum operating values were obtained and can be implemented to improve the industrial process. Extreme static sensitivity with respect to reboiler heat was observed, indicating that this can be a source of instabilities.
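    The binary NRTL activity-coefficient equations at the core of such a simulation can be sketched directly; the τ and α values below are illustrative placeholders, not the benzene/NFM parameters correlated in the paper.

```python
import math

def nrtl_gamma(x1, tau12, tau21, alpha=0.3):
    """Binary NRTL activity coefficients (gamma1, gamma2).

    Standard two-parameter NRTL form with non-randomness factor alpha;
    the parameter values used below are illustrative placeholders, not
    the benzene/NFM set correlated in the paper.
    """
    x2 = 1.0 - x1
    G12 = math.exp(-alpha * tau12)
    G21 = math.exp(-alpha * tau21)
    ln_g1 = x2 ** 2 * (tau21 * (G21 / (x1 + x2 * G21)) ** 2
                       + tau12 * G12 / (x2 + x1 * G12) ** 2)
    ln_g2 = x1 ** 2 * (tau12 * (G12 / (x2 + x1 * G12)) ** 2
                       + tau21 * G21 / (x1 + x2 * G21) ** 2)
    return math.exp(ln_g1), math.exp(ln_g2)

# Activity coefficient of component 1 near infinite dilution and pure
g1_dilute, _ = nrtl_gamma(1e-9, tau12=1.8, tau21=0.9)
g1_pure, _   = nrtl_gamma(1.0, tau12=1.8, tau21=0.9)
```

    Infinite-dilution activity coefficients, used alongside VLE data for the correlation, follow from the same expressions in the limit x1 → 0.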

  17. A MULTI-AGENT BASED SOCIAL CRM FRAMEWORK FOR EXTRACTING AND ANALYSING OPINIONS

    Directory of Open Access Journals (Sweden)

    ABDELAZIZ EL FAZZIKI

    2017-08-01

    Full Text Available Social media provide a wide space for people from around the world to communicate, share knowledge, and exchange personal experiences. They are increasingly becoming an important data source for opinion mining and sentiment analysis, thanks to shared comments and reviews about products and services, and companies are showing growing interest in harnessing their potential to support the design of marketing strategies. Despite the importance of sentiment analysis in decision making, there is a lack of social intelligence integration at the level of customer relationship management systems. Thus, social customer relationship management (SCRM) systems have become an interesting research area. However, they need deep analytic techniques to transform the large amounts of data ("Big Data") into actionable insights. Such systems also require advanced modelling and data processing methods, and must consider the emerging paradigm of proactive systems. In this paper, we propose an agent-based social framework that extracts and consolidates the reviews expressed via social media, in order to help enterprises learn more about customers' opinions toward a particular product or service. To illustrate our approach, we present a case study of Twitter reviews that we use to extract opinions and sentiment about a set of products using the SentiGem API. Data extraction, analysis, and storage are performed using a framework based on Hadoop MapReduce and HBase.

  18. A new extraction method of loess shoulder-line based on Marr-Hildreth operator and terrain mask.

    Directory of Open Access Journals (Sweden)

    Sheng Jiang

    Full Text Available Loess shoulder-lines are significant structural lines that divide the complicated loess landform into loess interfluves and gully-slope lands. Existing extraction algorithms for shoulder-lines are mainly based on local maxima of terrain features. These algorithms are sensitive to noise on the complicated loess surface, and their extraction parameters are difficult to determine, so the extraction results are often inaccurate. This paper presents a new extraction approach for loess shoulder-lines, in which the Marr-Hildreth edge operator is employed to construct initial shoulder-lines. A terrain mask confining the boundary of the shoulder-lines is then proposed, based on slope-degree classification and morphological methods, which avoids interference from non-valley areas and modifies the initial loess shoulder-lines. A case study is conducted in Yijun, located in the northern Shaanxi Loess Plateau of China. A Digital Elevation Model with a grid size of 5 m is used as the original data. To obtain optimal scale parameters, the Euclidean distance offset percentages between the shoulder-lines extracted by the Marr-Hildreth operator and the manual delineations are calculated. The experimental results show that the new method achieves the highest extraction accuracy when σ = 5 in the Gaussian smoothing. According to the accuracy assessment, the average extraction accuracy is about 88.5%, which indicates that the proposed method is applicable to the extraction of loess shoulder-lines in loess hilly and gully areas.
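    The Marr-Hildreth operator (Gaussian smoothing followed by a Laplacian, with edges at the zero-crossings) can be sketched in one dimension on a synthetic elevation profile; the 2-D DEM case works analogously, with the terrain mask then pruning non-valley responses.

```python
import math

def gaussian_kernel(sigma, radius):
    k = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def marr_hildreth_1d(signal, sigma=2.0):
    """1-D sketch of the Marr-Hildreth operator: Gaussian smoothing,
    then a discrete Laplacian; edges sit at zero-crossings of the
    result.  Indices returned are positions in the Laplacian array
    (signal index = returned index + 1)."""
    r = int(3 * sigma)
    k = gaussian_kernel(sigma, r)
    n = len(signal)
    smooth = [sum(k[j + r] * signal[min(max(i + j, 0), n - 1)]
                  for j in range(-r, r + 1)) for i in range(n)]
    lap = [smooth[i - 1] - 2 * smooth[i] + smooth[i + 1]
           for i in range(1, n - 1)]
    return [i for i in range(len(lap) - 1) if lap[i] * lap[i + 1] < 0]

# Hypothetical elevation profile: flat interfluve dropping to a gully floor
profile = [100.0] * 20 + [60.0] * 20
edges = marr_hildreth_1d(profile, sigma=2.0)
```

    The single zero-crossing lands at the break in slope, which is the behaviour the paper exploits to seed the initial shoulder-lines before masking.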

  19. Physically based model for extracting dual permeability parameters using non-Newtonian fluids

    Science.gov (United States)

    Abou Najm, M. R.; Basset, C.; Stewart, R. D.; Hauswirth, S.

    2017-12-01

    Dual permeability models are effective for assessing flow and transport in structured soils with two dominant pore structures. The major challenge for those models remains the ability to determine appropriate and unique parameters through affordable, simple, and non-destructive methods. This study investigates the use of water and a non-Newtonian fluid in saturated flow experiments to derive the physically based parameters required for improved flow predictions using dual permeability models. We assess the ability of these two fluids to accurately estimate the representative pore sizes in dual-domain soils by determining the effective pore sizes of macropores and micropores. We developed two sub-models that solve for the effective macropore size assuming either cylindrical (e.g., biological pores) or planar (e.g., shrinkage cracks and fissures) pore geometries, with the micropores assumed to be represented by a single effective radius. Furthermore, the model solves for the percent contribution to flow (wi) corresponding to the representative macropores and micropores. A user-friendly solver was developed to numerically solve the system of equations, given that relevant non-Newtonian viscosity models lack forms conducive to analytical integration. The proposed dual-permeability model is a unique attempt to derive physically based parameters capturing the two hydraulic conductivities, and it may therefore be useful in reducing parameter uncertainty and improving hydrologic model predictions.

  20. Tritium control in fusion reactor materials: A model for Tritium Extracting System

    International Nuclear Information System (INIS)

    Zucchetti, Massimo; Utili, Marco; Nicolotti, Iuri; Ying, Alice; Franza, Fabrizio; Abdou, Mohamed

    2015-01-01

    Highlights: • Modeling work has been performed to address these issues in view of its utilization for the TES (Tritium Extraction System), in the case of the HCPB TBM and for a molecular sieve as the adsorbent material. • A computational model has been set up and tested in this paper. • The results of experimental measurements of fundamental parameters, such as mass transfer coefficients, have been implemented in the model. • The model proves capable of describing the extraction process of gaseous tritium compounds and estimating the breakthrough curves of the two main gaseous tritium species (H2 and HT). - Abstract: In fusion reactors, tritium is bred by lithium isotopes inside the blanket and then extracted. However, tritium can contaminate the reactor structures and can eventually be released into the environment. Tritium in reactor components should therefore be kept under close control throughout the fusion reactor lifetime, bearing in mind the risk of accidents, the need for maintenance, and the detritiation of dismantled reactor components before their re-use or disposal. Modeling work has been performed to address these issues in view of its utilization for the TES (Tritium Extraction System), in the case of the HCPB TBM and for a molecular sieve as the adsorbent material. A computational model has been set up and tested. The results of experimental measurements of fundamental parameters, such as mass transfer coefficients, have been implemented in the model. The model proves capable of describing the extraction process of gaseous tritium compounds and estimating the breakthrough curves of the two main gaseous tritium species (H2 and HT).

  1. Tritium control in fusion reactor materials: A model for Tritium Extracting System

    Energy Technology Data Exchange (ETDEWEB)

    Zucchetti, Massimo [DENERG, Politecnico di Torino (Italy); Utili, Marco, E-mail: marco.utili@enea.it [ENEA UTIS – C.R. Brasimone, Bacino del Brasimone, Camugnano, BO (Italy); Nicolotti, Iuri [DENERG, Politecnico di Torino (Italy); Ying, Alice [University of California Los Angeles (UCLA), Los Angeles, CA (United States); Franza, Fabrizio [Karlsruhe Institute of Technology, Karlsruhe (Germany); Abdou, Mohamed [University of California Los Angeles (UCLA), Los Angeles, CA (United States)

    2015-10-15

    Highlights: • Modeling work has been performed to address these issues in view of its utilization for the TES (Tritium Extraction System), in the case of the HCPB TBM and for a molecular sieve as the adsorbent material. • A computational model has been set up and tested in this paper. • The results of experimental measurements of fundamental parameters, such as mass transfer coefficients, have been implemented in the model. • The model proves capable of describing the extraction process of gaseous tritium compounds and estimating the breakthrough curves of the two main gaseous tritium species (H2 and HT). - Abstract: In fusion reactors, tritium is bred by lithium isotopes inside the blanket and then extracted. However, tritium can contaminate the reactor structures and can eventually be released into the environment. Tritium in reactor components should therefore be kept under close control throughout the fusion reactor lifetime, bearing in mind the risk of accidents, the need for maintenance, and the detritiation of dismantled reactor components before their re-use or disposal. Modeling work has been performed to address these issues in view of its utilization for the TES (Tritium Extraction System), in the case of the HCPB TBM and for a molecular sieve as the adsorbent material. A computational model has been set up and tested. The results of experimental measurements of fundamental parameters, such as mass transfer coefficients, have been implemented in the model. The model proves capable of describing the extraction process of gaseous tritium compounds and estimating the breakthrough curves of the two main gaseous tritium species (H2 and HT).

  2. Bearing Degradation Process Prediction Based on the Support Vector Machine and Markov Model

    Directory of Open Access Journals (Sweden)

    Shaojiang Dong

    2014-01-01

    Full Text Available Predicting the degradation process of bearings before they reach the failure threshold is extremely important in industry. This paper proposes a novel method based on the support vector machine (SVM) and the Markov model to achieve this goal. Firstly, the features are extracted by time-domain and time-frequency-domain methods. However, the extracted original features are still high-dimensional and include superfluous information, so the nonlinear multi-feature fusion technique LTSA is used to merge the features and reduce the dimension. Then, based on the extracted features, the SVM model is used to predict the bearing degradation process, and Cao's method is used to determine the embedding dimension of the SVM model. After the bearing degradation process is predicted by the SVM model, the Markov model is used to improve the prediction accuracy. The proposed method was validated by two bearing run-to-failure experiments, and the results proved the effectiveness of the methodology.
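    The Markov correction step relies on a transition matrix estimated from observed degradation states; a minimal sketch, with a hypothetical discretized run-to-failure sequence, is:

```python
from collections import Counter

def transition_matrix(states, n_states):
    """Maximum-likelihood Markov transition probabilities estimated
    from an observed state sequence; a sketch of the Markov step that
    refines the SVM's predicted degradation state."""
    counts = Counter(zip(states, states[1:]))
    P = [[0.0] * n_states for _ in range(n_states)]
    for i in range(n_states):
        row_total = sum(counts[(i, j)] for j in range(n_states))
        if row_total:
            for j in range(n_states):
                P[i][j] = counts[(i, j)] / row_total
    return P

# Hypothetical run-to-failure sequence: 0 = healthy ... 2 = near failure
seq = [0, 0, 0, 1, 1, 0, 1, 1, 2, 2, 2]
P = transition_matrix(seq, 3)
most_likely_next = max(range(3), key=lambda j: P[1][j])  # from state 1
```

    In the paper's scheme, the SVM's raw prediction is then reweighted by these transition probabilities rather than taken at face value.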

  3. Field-Based Supercritical Fluid Extraction of Hydrocarbons at Industrially Contaminated Sites

    Directory of Open Access Journals (Sweden)

    Peggy Rigou

    2002-01-01

    Full Text Available Examination of organic pollutants in groundwaters should also consider the source of the pollution, which is often a solid matrix such as soil, landfill waste, or sediment. This premise should be viewed alongside the growing trend towards field-based characterisation of contaminated sites for reasons of speed and cost. Field-based methods for the extraction of organic compounds from solid samples are generally cumbersome, time-consuming, or inefficient. This paper describes the development of a field-based supercritical fluid extraction (SFE) system for the recovery of organic contaminants (benzene, toluene, ethylbenzene, and xylene) and polynuclear aromatic hydrocarbons from soils. A simple, compact, and robust SFE system has been constructed and was found to offer the same extraction efficiency as a well-established laboratory SFE system. Extraction optimisation was statistically evaluated using a factorial analysis procedure. Under optimised conditions, the device yielded recovery efficiencies of >70% with RSD values of 4% against the standard EPA Soxhlet method, compared with a mean recovery efficiency of 48% for a commercially available field-extraction kit. The device will next be evaluated with real samples prior to field deployment.

  4. Tie Points Extraction for SAR Images Based on Differential Constraints

    Science.gov (United States)

    Xiong, X.; Jin, G.; Xu, Q.; Zhang, H.

    2018-04-01

    Automatically extracting tie points (TPs) from large-size synthetic aperture radar (SAR) images is still challenging because the efficiency and correct-match ratio of image matching need to be improved. This paper proposes an automatic TP extraction method based on differential constraints for large-size SAR images obtained from approximately parallel tracks, between which the relative geometric distortions are small in the azimuth direction and large in the range direction. Image pyramids are built first, and then the corresponding layers of the pyramids are matched from top to bottom. In this process, similarity is measured by the normalized cross correlation (NCC) algorithm, computed over a rectangular window whose long side is parallel to the azimuth direction. False matches are removed by the differential-constrained random sample consensus (DC-RANSAC) algorithm, which applies strong constraints in the azimuth direction and weak constraints in the range direction. Matching points in the lower pyramid images are predicted with a local bilinear transformation model in the range direction. Experiments performed on ENVISAT ASAR and Chinese airborne SAR images validated the efficiency, correct-match ratio, and accuracy of the proposed method.
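    The NCC similarity measure is straightforward to sketch; the example below is 1-D for brevity (the paper scores a 2-D window elongated along azimuth), and the signals are synthetic.

```python
import math

def ncc(a, b):
    """Zero-normalized cross correlation of two equal-length windows;
    invariant to gain and offset changes between the images."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)
                    * sum((y - mb) ** 2 for y in b))
    return num / den

def best_offset(signal, template):
    """Slide the template over the signal and return the offset with
    the highest NCC score."""
    scores = [ncc(signal[i:i + len(template)], template)
              for i in range(len(signal) - len(template) + 1)]
    return max(range(len(scores)), key=scores.__getitem__)

# Hypothetical backscatter profiles: the template is the patch at
# offset 4, under a gain/offset change that NCC is insensitive to.
signal = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8]
template = [s * 2.0 + 1.0 for s in signal[4:9]]
```

    On real image pairs, the candidate matches found this way are then filtered by DC-RANSAC before the TPs are accepted.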

  5. New approaches in agent-based modeling of complex financial systems

    Science.gov (United States)

    Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei

    2017-12-01

    Agent-based modeling is a powerful simulation technique for understanding the collective behavior and microscopic interactions in complex financial systems. Recently, it has been suggested that the key parameters of agent-based models be determined from empirical data rather than set artificially. We first review several agent-based models and the new approaches to determining the key model parameters from historical market data. Based on the agents' behaviors with heterogeneous personal preferences and interactions, these models succeed in explaining the microscopic origin of the temporal and spatial correlations of financial markets. We then present a novel paradigm combining big-data analysis with agent-based modeling. Specifically, from internet query and stock market data, we extract the information driving forces and develop an agent-based model to simulate the dynamic behaviors of complex financial systems.

  6. Chinese License Plates Recognition Method Based on A Robust and Efficient Feature Extraction and BPNN Algorithm

    Science.gov (United States)

    Zhang, Ming; Xie, Fei; Zhao, Jing; Sun, Rui; Zhang, Lei; Zhang, Yue

    2018-04-01

    The prosperity of license plate recognition technology has made a great contribution to the development of Intelligent Transport Systems (ITS). In this paper, a robust and efficient license plate recognition method is proposed based on a combined feature extraction model and the BPNN (Back Propagation Neural Network) algorithm. First, a candidate-region method for license plate detection and segmentation is developed. Second, a new feature extraction model is designed that combines three sets of features. Third, a license plate classification and recognition method using the combined feature model and the BPNN algorithm is presented. Finally, the experimental results indicate that both license plate segmentation and recognition can be achieved effectively by the proposed algorithm. Compared with three traditional methods, the recognition accuracy of the proposed method has increased to 95.7% and the processing time has decreased to 51.4 ms.
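
    The back-propagation update at the core of a BPNN can be sketched as below. This is a minimal one-hidden-layer illustration with sigmoid activations and squared error; the layer sizes and learning rate are assumptions for illustration, not the paper's network or feature model.

```python
import numpy as np

# One forward pass and one back-propagation weight update for a tiny
# fully-connected network: input -> sigmoid hidden layer -> sigmoid output.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bp_step(X, y, W1, W2, lr=0.1):
    h = sigmoid(X @ W1)                          # hidden activations
    out = sigmoid(h @ W2)                        # network output
    delta_out = (out - y) * out * (1 - out)      # output-layer error signal
    dW2 = h.T @ delta_out
    delta_h = (delta_out @ W2.T) * h * (1 - h)   # back-propagated error
    dW1 = X.T @ delta_h
    return W1 - lr * dW1, W2 - lr * dW2

def sse(X, y, W1, W2):
    out = sigmoid(sigmoid(X @ W1) @ W2)
    return float(((out - y) ** 2).sum())
```

One such step on a toy batch should reduce the squared error, which is the training loop's invariant.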

  7. EPR-based material modelling of soils

    Science.gov (United States)

    Faramarzi, Asaad; Alani, Amir M.

    2013-04-01

    In the past few decades, as a result of rapid developments in computational software and hardware, alternative computer-aided pattern recognition approaches have been introduced for modelling many engineering problems, including constitutive modelling of materials. The main idea behind pattern recognition systems is that they learn adaptively from experience and extract various discriminants, each appropriate for its purpose. In this work an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR). EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using a genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some well-known conventional material models, and it is shown that EPR-based models can provide a better prediction of the behaviour of soils. The main benefits of EPR-based material models are that they provide a unified approach to constitutive modelling of all materials (i.e., all aspects of material behaviour can be implemented within the unified environment of an EPR model) and that they do not require any arbitrary choice of constitutive (mathematical) model. In EPR-based material models there are no material parameters to be identified, and because the model is trained directly on experimental data, EPR-based material models are the shortest route from experimental research (data) to numerical modelling. Another advantage of EPR-based constitutive models is that, as more experimental data become available, the quality of the EPR prediction can be improved by learning from the additional data, so the EPR model can become more effective and robust. The developed EPR-based material models can be incorporated in finite element (FE) analysis.
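
    The least-squares half of EPR can be sketched as below; the genetic-algorithm search over candidate polynomial structures is omitted, and the fixed monomial basis here is an assumption for illustration only.

```python
import numpy as np

# Fit the coefficients of a fixed polynomial structure by least squares --
# the role the least-squares step plays inside EPR once the genetic algorithm
# has proposed a candidate structure.
def fit_poly(x, y, degree=2):
    X = np.vander(x, degree + 1, increasing=True)  # columns: 1, x, x^2, ...
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef
```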

  8. Glycerol-based deep eutectic solvents as extractants for the separation of MEK and ethanol via liquid-liquid extraction

    NARCIS (Netherlands)

    Rodriguez, N.R.; Ferré Güell, J.; Kroon, M.C.

    2016-01-01

    Four different glycerol-based deep eutectic solvents (DESs) were tested as extracting agents for the separation of the azeotropic mixture {methyl ethyl ketone + ethanol} via liquid-liquid extraction. The selected DESs for this work were: glycerol/choline chloride with molar ratios (4:1) and (2:1),

  9. An Application for Data Preprocessing and Models Extractions in Web Usage Mining

    Directory of Open Access Journals (Sweden)

    Claudia Elena DINUCA

    2011-11-01

    Full Text Available Web servers worldwide generate a vast amount of information on web users’ browsing activities. Several researchers have studied these so-called clickstream or web access log data to better understand and characterize web users. The goal of this application is to analyze user behaviour by mining enriched web access log data. With the continued growth and proliferation of e-commerce, Web services, and Web-based information systems, the volumes of clickstream and user data collected by Web-based organizations in their daily operations have reached astronomical proportions. This information can be exploited in various ways, such as enhancing the effectiveness of websites or developing directed web marketing campaigns. The discovered patterns are usually represented as collections of pages, objects, or resources that are frequently accessed by groups of users with common needs or interests. In this paper we focus on how the application for data preprocessing and for extracting different data models from web log data was implemented, using association rules as a data mining technique to extract potentially useful knowledge from web usage data. We find navigation patterns in different data models by analysing the log files of the website. I implemented the application in Java using the NetBeans IDE. For exemplification, I used the log file data from a commercial web site, www.nice-layouts.com.
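
    The preprocessing step described above can be sketched for the standard Common Log Format; the field layout here is the generic CLF, assumed for illustration rather than taken from the paper (whose application is written in Java).

```python
import re

# Parse one Common Log Format line into the (client IP, timestamp, requested
# URL) triple that log-preprocessing pipelines feed to pattern mining.
LOG_RE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST) (\S+) [^"]*"')

def parse_log_line(line):
    m = LOG_RE.match(line)
    return m.groups() if m else None
```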

  10. A window-based time series feature extraction method.

    Science.gov (United States)

    Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife

    2017-10-01

    This study proposes a robust similarity score-based time series feature extraction method termed Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and has significantly low computational complexity, rendering it useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and to three precordial leads from a publicly available electrocardiogram (ECG) dataset. This is followed by comparing WTC in terms of predictive accuracy and computational complexity with the shapelet transform and the fast shapelet transform (an accelerated variant of the shapelet transform). The results indicate that WTC achieves slightly higher classification performance with significantly lower execution time than its shapelet-based alternatives. With respect to its interpretable features, WTC has the potential to enable medical experts to explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.
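
    A minimal sketch of the window-based idea, assuming simple per-window summary statistics; the actual WTC features and similarity scores are more involved than this illustration.

```python
# Slide a non-overlapping window over a series and emit cheap, interpretable
# per-window descriptors (min, max, mean) -- the kind of low-cost features
# that keep window-based extraction fast on densely sampled data.
def window_features(series, width):
    feats = []
    for i in range(0, len(series) - width + 1, width):
        w = series[i:i + width]
        feats.append((min(w), max(w), sum(w) / width))
    return feats
```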

  11. A statistical method for model extraction and model selection applied to the temperature scaling of the L–H transition

    International Nuclear Information System (INIS)

    Peluso, E; Gelfusa, M; Gaudio, P; Murari, A

    2014-01-01

    Access to the H mode of confinement in tokamaks is characterized by an abrupt transition, which has been the subject of continuous investigation for decades. Various theoretical models have been developed and multi-machine databases of experimental data have been collected. In this paper, a new methodology is reviewed for the investigation of the scaling laws for the temperature threshold to access the H mode. The approach is based on symbolic regression via genetic programming and first allows the extraction of the most statistically reliable models from the available experimental data. Nonlinear fitting is then applied to the mathematical expressions found by symbolic regression; this second step makes it easy to compare the quality of the data-driven scalings with the most widely accepted theoretical models. The application of a complete set of statistical indicators shows that the data-driven scaling laws are qualitatively better than the theoretical models. The main limitation of the theoretical models is that they are all expressed as power laws, which are too rigid to fit the available experimental data and to extrapolate to ITER. The proposed method is completely general and can be applied to the extraction of scaling laws from any experimental database of sufficient statistical relevance. (paper)
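
    The rigid power-law form that the paper argues against can itself be fitted with a single least-squares step in log space; the variable names below are illustrative, not the engineering parameters used in the actual scalings.

```python
import numpy as np

# Fit T = C * x1^a * x2^b by ordinary least squares on
# log T = log C + a*log x1 + b*log x2.
def fit_power_law(x1, x2, t):
    A = np.column_stack([np.ones(len(t)), np.log(x1), np.log(x2)])
    sol, *_ = np.linalg.lstsq(A, np.log(t), rcond=None)
    return float(np.exp(sol[0])), float(sol[1]), float(sol[2])
```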

  12. Mathematical model for optimizing the design extraction pressure of a condensation turbine with district heat extraction. Mathematisches Modell zur Optimierung des Auslegungsentnahmedruckes an einer Kondensationsturbine mit Fernwaermeauskopplung

    Energy Technology Data Exchange (ETDEWEB)

    Grkovic, V [Novi Sad Univ. (Yugoslavia)

    1991-11-01

    A mathematical calculation model is explained which enables optimization of the design pressure at the steam extraction point of a condensation extraction turbine. The results obtained show that the additional thermodynamic losses which occur during turbine operation when the heat load varies can be reduced to a minimum by optimization of the extraction pressure. The optimal pressures at the extraction point, as well as the size of the economic effect, depend on the selected technical design of the turbine, its maximum heat output and the basic heat load factor. (orig.).

  13. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using historical data to estimate model errors is a frontier research topic. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. A new approach is thereby proposed in the present paper to estimate model errors based on EM. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it can realize the combination of statistics and dynamics to a certain extent.
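
    The prediction model referred to above, the classic Lorenz (1963) system, can be stepped numerically as in this sketch; a standard fourth-order Runge-Kutta step with the classic parameter values is assumed here, independently of the paper's experimental setup.

```python
# One RK4 step of the Lorenz-63 equations
#   dx/dt = sigma*(y - x),  dy/dt = x*(rho - z) - y,  dz/dt = x*y - beta*z
def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    def f(s):
        x, y, z = s
        return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

    def add(s, k, h):  # s + h*k, componentwise
        return tuple(si + h * ki for si, ki in zip(s, k))

    k1 = f(state)
    k2 = f(add(state, k1, dt / 2))
    k3 = f(add(state, k2, dt / 2))
    k4 = f(add(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))
```

A fixed point of the equations (where all derivatives vanish) should be left unchanged by the step, which gives a quick sanity check.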

  14. Localized Segment Based Processing for Automatic Building Extraction from LiDAR Data

    Science.gov (United States)

    Parida, G.; Rajan, K. S.

    2017-05-01

    Current methods for object segmentation, extraction, and classification of aerial LiDAR data are manual and tedious. This work proposes a technique for object segmentation from LiDAR data. A bottom-up geometric rule-based approach was used initially to devise a way to segment buildings out of the LiDAR datasets. For curved wall surfaces, comparison of localized surface normals was used to segment buildings. The algorithm has been applied to both synthetic datasets and a real-world dataset of Vaihingen, Germany. Preliminary results show successful segmentation of building objects from a given scene for the synthetic datasets and promising results for the real-world data. The advantage of the proposed work is that it depends on no data other than LiDAR. It is an unsupervised method of building segmentation and thus requires no model training, as in supervised techniques. It focuses on extracting the walls of the buildings to construct the footprint, rather than focusing on the roof; this focus on extracting walls to reconstruct buildings from a LiDAR scene is the crux of the proposed method. The current segmentation approach can be used to get 2D footprints of the buildings, with further scope to generate 3D models. Thus, the proposed method can be used as a tool to obtain footprints of buildings in urban landscapes, helping urban planning and the smart cities endeavour.
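
    The localized surface-normal comparison can be illustrated with the usual cross-product estimate of a normal from three neighbouring points; this is a generic sketch, not the paper's implementation.

```python
# Estimate the unit surface normal of the plane through three 3-D points
# p, q, r via the cross product of the edge vectors q-p and r-p. Nearby
# normals can then be compared (e.g. by dot product) to group wall points.
def surface_normal(p, q, r):
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = sum(c * c for c in n) ** 0.5
    return [c / norm for c in n]
```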

  16. Empirical modeling of solvent extraction of uranium from sulphuric acid medium using PC-88A and TOPO as extractants

    International Nuclear Information System (INIS)

    Biswas, S.; Roy, S.B.; Pathak, P.N.; Manchanda, V.K.; Singh, D.K.

    2011-01-01

    Extraction behavior of uranium (VI) from sulphuric acid medium using PC-88A (H₂A₂, dimer form) and a mixture of PC-88A + TOPO in n-dodecane has been investigated. The extraction data have been used to develop a mathematical model correlating percentage extraction (%E) with PC-88A and TOPO concentration. It can be used to predict the steady-state concentrations of metal ion under the conditions of the present work. (author)

  17. Extraction and representation of common feature from uncertain facial expressions with cloud model.

    Science.gov (United States)

    Wang, Shuliang; Chi, Hehua; Yuan, Hanning; Geng, Jing

    2017-12-01

    Human facial expressions are a key ingredient in conveying an individual's innate emotions in communication. However, the variation of facial expressions affects the reliable identification of human emotions. In this paper, we present a cloud model to extract facial features for representing human emotion. First, the uncertainties in facial expression are analyzed in the context of the cloud model. The feature extraction and representation algorithm is established under cloud generators. With the forward cloud generator, facial expression images can be regenerated as many times as needed to visually represent the three extracted features, each of which plays a different role. The effectiveness of the computing model is tested on the Japanese Female Facial Expression database. Three common features are extracted from seven facial expression images. Finally, conclusions and remarks are given.
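
    A forward cloud generator of the kind mentioned above can be sketched as follows, assuming the standard (Ex, En, He) parameterization of the cloud model; the notation follows common cloud-model convention rather than the paper itself.

```python
import random

# Forward cloud generator: for each "cloud drop", first draw a perturbed
# entropy En' ~ N(En, He^2), then draw the drop x ~ N(Ex, En'^2). He (the
# hyper-entropy) controls how uncertain the spread itself is.
def forward_cloud(ex, en, he, n, seed=0):
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        en_prime = abs(rng.gauss(en, he))  # abs: std deviation must be >= 0
        drops.append(rng.gauss(ex, en_prime))
    return drops
```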

  18. Mode extraction on wind turbine blades via phase-based video motion estimation

    Science.gov (United States)

    Sarrafi, Aral; Poozesh, Peyman; Niezrecki, Christopher; Mao, Zhu

    2017-04-01

    In recent years, image processing techniques have been applied more often for structural dynamics identification, characterization, and structural health monitoring. Although it is a non-contact and full-field measurement method, image processing still has a long way to go to outperform other conventional sensing instruments (e.g., accelerometers, strain gauges, laser vibrometers). However, the technologies associated with image processing are developing rapidly and gaining more attention in a variety of engineering applications, including structural dynamics identification and modal analysis. Among numerous motion estimation and image-processing methods, phase-based video motion estimation is considered one of the most efficient in terms of computational cost and noise robustness. In this paper, phase-based video motion estimation is adopted for structural dynamics characterization on a 2.3-meter-long Skystream wind turbine blade, and the modal parameters (natural frequencies, operating deflection shapes) are extracted. Phase-based video processing as adopted in this paper provides reliable full-field 2-D motion information, which is beneficial for manufacturing certification and model updating at the design stage. The phase-based video motion estimation approach is demonstrated by processing data on a full-scale commercial structure (i.e. a wind turbine blade) with complex geometry and properties, and the results obtained correlate well with the modal parameters extracted from accelerometer measurements, especially for the first four bending modes, which have significant importance in blade characterization.
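
    The phase-based idea can be illustrated in one dimension: a circular shift between two signals shows up in the phase of their cross-spectrum and can be recovered from the cross-correlation peak. This is a simplified analogue for intuition only, not the video-processing pipeline used in the paper.

```python
import numpy as np

# Recover an integer circular shift d (b[n] = a[n-d]) from the phase of the
# cross-spectrum conj(A)*B: its inverse FFT is the circular cross-correlation,
# which peaks at lag d.
def estimate_shift(a, b):
    A, B = np.fft.fft(a), np.fft.fft(b)
    corr = np.fft.ifft(np.conj(A) * B).real
    return int(np.argmax(corr))
```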

  19. Statistical model of planning technological indicators for oil extraction

    Energy Technology Data Exchange (ETDEWEB)

    Galeyev, R G; Lavushchenko, V P; Sheshnev, A S

    1979-01-01

    The efficiency of the process of oil extraction is determined by the effect of a number of interrelated technological indicators. The interrelationships of the indicators were represented analytically by an econometric model consisting of a system of linear regression equations. The basic advantage of these models is that they can capture various significant interrelationships, which makes it possible to combine all calculations into a single, logically consistent, balanced system. The developed model of the technological process of oil extraction significantly facilitates the calculation and planning of its basic indicators with regard to system and balance requirements, and makes it possible to purposefully generate new variants. In this case, because of the optimal distribution of the volumes of geological-technical measures, a decrease in the total outlays for their implementation is achieved. For the Berezovskiy field, this saving was R 150,000.

  20. Response Surface Optimization of Rotenone Using Natural Alcohol-Based Deep Eutectic Solvent as Additive in the Extraction Medium Cocktail

    Directory of Open Access Journals (Sweden)

    Zetty Shafiqa Othman

    2017-01-01

    Full Text Available Rotenone is a biopesticide with potent effects on aquatic life and insect pests. In Asia, it can be isolated from the roots of Derris species (Derris elliptica and Derris malaccensis). A previous study revealed that an alcohol-based deep eutectic solvent (DES) extracts a high yield of rotenone (an isoflavonoid) with efficiency comparable to a binary ionic liquid solvent system ([BMIM]OTf) and an organic solvent (acetone). Therefore, this study analyzes the optimum parameters (solvent ratio, extraction time, and agitation rate) for extracting the highest yield of rotenone at much lower cost and in a more environmentally friendly manner, using response surface methodology (RSM) based on a central composite rotatable design (CCRD). By using RSM, linear polynomial equations were obtained for predicting the concentration and yield of rotenone extracted. The verification experiment confirmed the validity of both predicted models. The results revealed that the optimum conditions for solvent ratio, extraction time, and agitation rate were 2 : 8 (DES : acetonitrile), 19.34 hours, and 199.32 rpm, respectively. At the optimum conditions, the rotenone extraction process using the DES binary solvent system gave a 3.5-fold increase in rotenone concentration, to 0.49 ± 0.07 mg/ml, with a yield of 0.35 ± 0.06% (w/w), compared to the control extract (acetonitrile only). In fact, the rotenone concentration and yield were significantly influenced by the binary solvent ratio and extraction time (P < 0.05) but not by agitation rate. For that reason, the optimal extraction condition using an alcohol-based deep eutectic solvent (DES) as a green additive in the extraction medium cocktail increases the potential for enhancing the rotenone concentration and yield extracted.

  1. Silica-based ionic liquid coating for 96-blade system for extraction of aminoacids from complex matrixes

    International Nuclear Information System (INIS)

    Mousavi, Fatemeh; Pawliszyn, Janusz

    2013-01-01

    Graphical abstract: -- Highlights: •Silica-based 1-vinyl-3-octadecylimidazolium bromide ionic liquid was synthesized and characterized. •The synthesized polymer was immobilized on the stainless steel blade using polyacrylonitrile glue. •SiImC₁₈-PAN 96-blade SPME was applied as an extraction phase for extraction of highly polar compounds in grape matrix. •This system provides high extraction efficiency and reproducibility for up to 50 extractions from tartaric buffer and 20 extractions from grape pulp. -- Abstract: 1-Vinyl-3-octadecylimidazolium bromide ionic liquid [C₁₈VIm]Br was prepared and used for the modification of mercaptopropyl-functionalized silica (Si-MPS) through surface radical chain-transfer addition. The synthesized octadecylimidazolium-modified silica (SiImC₁₈) was characterized by thermogravimetric analysis (TGA), infrared spectroscopy (IR), ¹³C NMR and ²⁹Si NMR spectroscopy and used as an extraction phase for the automated 96-blade solid phase microextraction (SPME) system with thin-film geometry using polyacrylonitrile (PAN) glue. The new proposed extraction phase was applied for extraction of aminoacids from grape pulp, and an LC–MS–MS method was developed for separation of model compounds. Extraction efficiency, reusability, linearity, limit of detection, limit of quantitation and matrix effect were evaluated. The whole process of sample preparation for the proposed method requires 270 min for 96 samples simultaneously (60 min preconditioning, 90 min extraction, 60 min desorption and 60 min for the carryover step) using the 96-blade SPME system. Inter-blade and intra-blade reproducibility were in the respective ranges of 5–13 and 3–10% relative standard deviation (RSD) for all model compounds. Limits of detection and quantitation of the proposed SPME-LC–MS/MS system for analysis of analytes were found to range from 0.1 to 1.0 and 0.5 to 3.0 μg L⁻¹, respectively. Standard addition calibration was applied for quantitative

  2. Silica-based ionic liquid coating for 96-blade system for extraction of aminoacids from complex matrixes

    Energy Technology Data Exchange (ETDEWEB)

    Mousavi, Fatemeh; Pawliszyn, Janusz, E-mail: janusz@uwaterloo.ca

    2013-11-25

    Graphical abstract: -- Highlights: •Silica-based 1-vinyl-3-octadecylimidazolium bromide ionic liquid was synthesized and characterized. •The synthesized polymer was immobilized on the stainless steel blade using polyacrylonitrile glue. •SiImC₁₈-PAN 96-blade SPME was applied as an extraction phase for extraction of highly polar compounds in grape matrix. •This system provides high extraction efficiency and reproducibility for up to 50 extractions from tartaric buffer and 20 extractions from grape pulp. -- Abstract: 1-Vinyl-3-octadecylimidazolium bromide ionic liquid [C₁₈VIm]Br was prepared and used for the modification of mercaptopropyl-functionalized silica (Si-MPS) through surface radical chain-transfer addition. The synthesized octadecylimidazolium-modified silica (SiImC₁₈) was characterized by thermogravimetric analysis (TGA), infrared spectroscopy (IR), ¹³C NMR and ²⁹Si NMR spectroscopy and used as an extraction phase for the automated 96-blade solid phase microextraction (SPME) system with thin-film geometry using polyacrylonitrile (PAN) glue. The new proposed extraction phase was applied for extraction of aminoacids from grape pulp, and an LC–MS–MS method was developed for separation of model compounds. Extraction efficiency, reusability, linearity, limit of detection, limit of quantitation and matrix effect were evaluated. The whole process of sample preparation for the proposed method requires 270 min for 96 samples simultaneously (60 min preconditioning, 90 min extraction, 60 min desorption and 60 min for the carryover step) using the 96-blade SPME system. Inter-blade and intra-blade reproducibility were in the respective ranges of 5–13 and 3–10% relative standard deviation (RSD) for all model compounds. Limits of detection and quantitation of the proposed SPME-LC–MS/MS system for analysis of analytes were found to range from 0.1 to 1.0 and 0.5 to 3.0 μg L⁻¹, respectively. Standard addition calibration was

  3. Extracting Date/Time Expressions in Super-Function Based Japanese-English Machine Translation

    Science.gov (United States)

    Sasayama, Manabu; Kuroiwa, Shingo; Ren, Fuji

    Super-Function Based Machine Translation (SFBMT), which is a type of Example-Based Machine Translation, has a feature that makes it possible to expand the coverage of examples by changing nouns into variables; however, there were problems extracting entire date/time expressions containing parts of speech other than nouns, because only nouns/numbers were changed into variables. We describe a method for extracting date/time expressions for SFBMT. SFBMT uses noun determination rules to extract nouns and a bilingual dictionary to obtain the correspondence of the extracted nouns between the source and target languages. In this method, we add a rule to extract date/time expressions and then extract date/time expressions from a Japanese-English bilingual corpus. The evaluation results show that the precision of this method for Japanese sentences is 96.7%, with a recall of 98.2%, and the precision for English sentences is 94.7%, with a recall of 92.7%.
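
    A rule-based date/time extractor in this spirit can be sketched with regular expressions; the patterns below are illustrative examples for English text, not the actual rules used in SFBMT.

```python
import re

# Match either a clock time (10:30, 10:30:15) or an English month-day date
# with an optional year (March 3 / March 3, 2018). All groups are
# non-capturing so findall returns whole expressions.
DATE_TIME = re.compile(
    r'\b(?:\d{1,2}:\d{2}(?::\d{2})?'
    r'|(?:January|February|March|April|May|June|July|August'
    r'|September|October|November|December)\s+\d{1,2}(?:,\s*\d{4})?)\b')

def extract_datetimes(text):
    return DATE_TIME.findall(text)
```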

  4. Derivation of groundwater flow-paths based on semi-automatic extraction of lineaments from remote sensing data

    OpenAIRE

    U. Mallast; R. Gloaguen; S. Geyer; T. Rödiger; C. Siebert

    2011-01-01

    In this paper we present a semi-automatic method to infer groundwater flow-paths based on the extraction of lineaments from digital elevation models. This method is especially adequate in remote and inaccessible areas where in-situ data are scarce. The combined method of linear filtering and object-based classification provides a lineament map with a high degree of accuracy. Subsequently, lineaments are differentiated into geological and morphological lineaments using auxili...

  5. Autonomous celestial navigation based on Earth ultraviolet radiance and fast gradient statistic feature extraction

    Science.gov (United States)

    Lu, Shan; Zhang, Hanmo

    2016-01-01

    To meet the requirement of autonomous orbit determination, this paper proposes a fast curve fitting method based on Earth ultraviolet features to obtain an accurate Earth vector direction and thereby achieve high-precision autonomous navigation. First, combining the stable characteristics of Earth ultraviolet radiance with atmospheric radiation transmission model software, the paper simulates the Earth ultraviolet radiation model at different times and chooses the proper observation band. Then a fast, improved edge extraction method combining the Sobel operator and local binary patterns (LBP) is utilized, which can both eliminate noise efficiently and extract Earth ultraviolet limb features accurately. Earth centroid locations on simulated images are estimated via least squares fitting using part of the limb edges. Taking advantage of the estimated Earth vector direction and Earth distance, an Extended Kalman Filter (EKF) is applied to realize the autonomous navigation. Experimental results indicate the proposed method can achieve sub-pixel Earth centroid location estimation and greatly enhance autonomous celestial navigation precision.
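
    The Sobel half of the edge-extraction step can be sketched as a plain 3x3 convolution returning the gradient magnitude for interior pixels; this is an illustrative implementation without the LBP part.

```python
# Apply the 3x3 Sobel kernels to a grayscale image (list of rows) and return
# sqrt(gx^2 + gy^2) for every interior pixel; strong magnitudes mark edges
# such as a planetary limb.
def sobel_magnitude(img):
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    h, w = len(img), len(img[0])
    out = [[0.0] * (w - 2) for _ in range(h - 2)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = sum(gx_k[a][b] * img[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            gy = sum(gy_k[a][b] * img[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            out[i - 1][j - 1] = (gx * gx + gy * gy) ** 0.5
    return out
```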

  6. Integrating fuzzy object based image analysis and ant colony optimization for road extraction from remotely sensed images

    Science.gov (United States)

    Maboudi, Mehdi; Amini, Jalal; Malihi, Shirin; Hahn, Michael

    2018-04-01

    An updated road network, as a crucial part of the transportation database, plays an important role in various applications. Thus, increasing the automation of road extraction approaches from remote sensing images has been the subject of extensive research. In this paper, we propose an object-based road extraction approach from very high resolution satellite images. Based on object-based image analysis, our approach incorporates various spatial, spectral, and textural object descriptors, the capabilities of the fuzzy logic system for handling the uncertainties in road modelling, and the effectiveness and suitability of the ant colony algorithm for optimization of network-related problems. Four VHR optical satellite images acquired by the Worldview-2 and IKONOS satellites are used to evaluate the proposed approach. Evaluation of the extracted road networks shows that the average completeness, correctness, and quality of the results reach 89%, 93% and 83% respectively, indicating that the proposed approach is applicable for urban road extraction. We also analyzed the sensitivity of our algorithm to different ant colony optimization parameter values. Comparison of the achieved results with the results of four state-of-the-art algorithms and quantifying the robustness of the fuzzy rule set demonstrate that the proposed approach is both efficient and transferable to other comparable images.

  7. A Probabilistic Approach for Breast Boundary Extraction in Mammograms

    Directory of Open Access Journals (Sweden)

    Hamed Habibi Aghdam

    2013-01-01

    Full Text Available The extraction of the breast boundary is crucial for further analysis of mammograms. Methods to extract the breast boundary can be classified into two categories: those based on image processing techniques and those based on models. The former use image transformation techniques such as thresholding, morphological operations, and region growing. In the second category, the boundary is extracted using more advanced techniques, such as the active contour model. The problem with thresholding methods is that it is hard to find the optimal threshold value automatically using histogram information. On the other hand, active contour models require defining a starting point close to the actual boundary to be able to successfully extract the boundary. In this paper, we propose a probabilistic approach to address the aforementioned problems. In our approach we use local binary patterns to describe the texture around each pixel. In addition, the smoothness of the boundary is handled by using a new probability model. Experimental results show that the proposed method achieves 38% and 50% improvement over the results obtained by the active contour model and threshold-based methods respectively, and it increases the stability of the boundary extraction process up to 86%.
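
    The basic 8-neighbour local binary pattern used to describe the texture around a pixel can be sketched as follows; the neighbour ordering is one common convention, assumed here for illustration.

```python
# Compute the classic LBP code of the pixel at (i, j): each of the 8
# neighbours contributes one bit, set when the neighbour is >= the centre
# value. The resulting 0..255 code is a compact local texture descriptor.
def lbp_code(img, i, j):
    center = img[i][j]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),   # clockwise from top-left
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (di, dj) in enumerate(offsets):
        if img[i + di][j + dj] >= center:
            code |= 1 << bit
    return code
```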

  8. Pixel extraction based integral imaging with controllable viewing direction

    International Nuclear Information System (INIS)

    Ji, Chao-Chao; Deng, Huan; Wang, Qiong-Hua

    2012-01-01

    We propose pixel extraction based integral imaging with a controllable viewing direction, which provides viewers with three-dimensional (3D) images within a very small viewing angle. The viewing angle and viewing direction of the reconstructed 3D images are controlled by the pixels extracted from an elemental image array. Theoretical analysis and a 3D display experiment of the viewing-direction-controllable integral imaging are carried out, and the experimental results verify the correctness of the theory. A 3D display based on this integral imaging can protect the viewer’s privacy and has great potential for a television showing multiple 3D programs at the same time. (paper)

  9. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground with a sounded wind profile in order to derive the cloud base height. The method is independent of cloud type, making it efficient for both low boundary layer and high clouds, and the use of thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. Like all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, so upper cloud layers are not detected. Nevertheless, the information derived from this method can complement space-borne cloud top measurements when deep convective clouds are present. Unlike techniques such as the LCL calculation, this method is not limited to boundary layer clouds and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. The method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds would then not be observed).

  10. Hierarchical graph-based segmentation for extracting road networks from high-resolution satellite images

    Science.gov (United States)

    Alshehhi, Rasha; Marpu, Prashanth Reddy

    2017-04-01

    Extraction of road networks in urban areas from remotely sensed imagery plays an important role in many urban applications (e.g. road navigation, geometric correction of urban remote sensing images, updating geographic information systems, etc.). It is normally difficult to accurately differentiate roads from the background due to the complex geometry of the buildings and the acquisition geometry of the sensor. In this paper, we present a new method for extracting roads from high-resolution imagery based on hierarchical graph-based image segmentation. The proposed method consists of: 1. extracting features (e.g., using Gabor and morphological filtering) to enhance the contrast between road and non-road pixels; 2. graph-based segmentation, consisting of (i) constructing a graph representation of the image based on an initial segmentation and (ii) hierarchical merging and splitting of image segments based on color and shape features; and 3. post-processing to remove irregularities in the extracted road segments. Experiments are conducted on three challenging datasets of high-resolution images to demonstrate the proposed method and compare it with other similar approaches. The results demonstrate the validity and superior performance of the proposed method for road extraction in urban areas.
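
    The hierarchical merging step (2.ii) can be illustrated with a minimal union-find sketch that greedily merges adjacent segments by mean-colour similarity. The segment ids, feature vectors, and threshold below are hypothetical, and the paper's shape features and the splitting step are omitted:

```python
import numpy as np

def merge_segments(means, edges, thresh):
    """Greedy colour-based merging over a region adjacency graph.

    means  : dict {segment_id: mean feature vector, e.g. RGB}
    edges  : iterable of (id_a, id_b) adjacency pairs
    thresh : merge two regions when their mean-colour distance is below this
    Returns a dict mapping each original id to its merged-cluster root.
    """
    parent = {s: s for s in means}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    # most similar adjacent regions are considered first
    scored = sorted(edges, key=lambda e: np.linalg.norm(
        np.asarray(means[e[0]], float) - np.asarray(means[e[1]], float)))
    size = {s: 1 for s in means}
    acc = {s: np.asarray(means[s], dtype=float) for s in means}
    for a, b in scored:
        ra, rb = find(a), find(b)
        if ra == rb:
            continue
        ma, mb = acc[ra] / size[ra], acc[rb] / size[rb]
        if np.linalg.norm(ma - mb) < thresh:
            parent[rb] = ra           # union: attach rb under ra
            acc[ra] += acc[rb]        # keep running mean of the cluster
            size[ra] += size[rb]
    return {s: find(s) for s in means}
```

    With three segments where only the first two have similar colours, the first pair merges and the third stays separate.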

  11. AUTOMATIC SHAPE-BASED TARGET EXTRACTION FOR CLOSE-RANGE PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    X. Guo

    2016-06-01

    To achieve precise identification and location of artificial coded targets in natural scenes, a novel design of circle-based coded target and a corresponding coarse-to-fine extraction algorithm are presented. The designed target separates the target box and the coding box completely and is rotation invariant. Based on the original target, templates are prepared by three geometric transformations and used as the input of shape-based template matching. Finally, region growing and parity check methods are used to extract the coded targets as final results. No human involvement is required except for the preparation of templates and the adjustment of thresholds at the beginning, which is conducive to the automation of close-range photogrammetry. The experimental results show that the proposed recognition method for the designed coded target is robust and accurate.

  12. Modeling and analysis of power extraction circuits for passive UHF RFID applications

    International Nuclear Information System (INIS)

    Fan Bo; Dai Yujie; Zhang Xiaoxing; Lue Yingjie

    2009-01-01

    Modeling and analysis of far-field power extraction circuits for passive UHF radio frequency identification (RFID) applications are presented. A mathematical model is derived to predict the complex nonlinear performance of a UHF voltage multiplier using Schottky diodes. To reduce the complexity of the proposed model, a simple linear approximation for the Schottky diode is introduced. Measurement results show considerable agreement with the values calculated by the proposed model. With the derived model, optimization of the stage number of the voltage multiplier to achieve maximum power conversion efficiency is discussed. Furthermore, according to the Bode-Fano criterion and the proposed model, a limit on the maximum power-up range of passive UHF RFID power extraction circuits is also studied.
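
    Under a linear diode approximation, the multiplier output reduces to a simple first-order estimate. The sketch below is the textbook formula for an ideal N-stage voltage multiplier (each stage contributes twice the peak input amplitude minus two diode drops); it illustrates the kind of simplification involved, not the paper's full nonlinear Schottky model:

```python
def multiplier_output(v_peak, v_diode, n_stages):
    """First-order DC output of an ideal N-stage voltage multiplier.

    v_peak   : peak amplitude of the RF input (V)
    v_diode  : forward voltage drop per diode (V)
    n_stages : number of doubler stages
    Returns the unloaded output voltage estimate in volts.
    """
    return 2.0 * n_stages * (v_peak - v_diode)
```

    For example, a 3-stage multiplier with a 0.5 V input peak and 0.2 V Schottky drops yields roughly 1.8 V, which shows why low-drop Schottky diodes matter at the small input amplitudes of passive tags.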

  13. A numerical study of EGS heat extraction process based on a thermal non-equilibrium model for heat transfer in subsurface porous heat reservoir

    Science.gov (United States)

    Chen, Jiliang; Jiang, Fangming

    2016-02-01

    With a previously developed numerical model, we perform a detailed study of the heat extraction process in an enhanced or engineered geothermal system (EGS). The model treats the EGS subsurface heat reservoir as an equivalent porous medium while considering local thermal non-equilibrium between the rock matrix and the fluid flowing in the fractured rock mass. The local thermal non-equilibrium model highlights the temperature-difference heat exchange process occurring in EGS reservoirs, enabling a better understanding of the heat extraction process involved. The simulation results unravel the mechanism by which preferential or short-circuit flow forms in homogeneously fractured reservoirs of different permeability values. EGS performance, e.g. production temperature and lifetime, is found to be tightly related to the flow pattern in the reservoir. Thermal compensation from rocks surrounding the reservoir contributes little heat to the heat transmission fluid if the operation time of an EGS is shorter than 15 years. We also find that the local thermal equilibrium model generally overestimates EGS performance, and that for an EGS with better heat exchange conditions in the heat reservoir, the heat extraction process behaves more like a local thermal equilibrium process.

  14. One-step extraction of polar drugs from plasma by Parallel Artificial Liquid Membrane Extraction

    DEFF Research Database (Denmark)

    Pilařová, Veronika; Sultani, Mumtaz; Ask, Kristine Skoglund

    2017-01-01

    The new microextraction technique named parallel artificial liquid membrane extraction (PALME) was introduced as an alternative approach to liquid-liquid extraction of charged analytes from aqueous samples. The concept is based on extraction of analytes across a supported liquid membrane sustained in the pores of a thin polymeric membrane, a well-known extraction principle also used in hollow fiber liquid-phase microextraction (HF-LPME). However, the new PALME technique offers a more user-friendly setup in which the supported liquid membrane is incorporated in a 96-well plate system. Thus, high [...] A PALME method for extraction of polar basic drugs was developed in the present work. The basic drugs hydralazine, ephedrine, metaraminol, salbutamol, and cimetidine were used as model analytes and were extracted from alkalized human plasma into an aqueous solution via the supported liquid membrane. The extraction [...]

  15. Research on the method of extracting DEM based on GBInSAR

    Science.gov (United States)

    Yue, Jianping; Yue, Shun; Qiu, Zhiwei; Wang, Xueqin; Guo, Leping

    2016-05-01

    Precise topographic information plays a very important role in geology, hydrology, natural resources surveys, and deformation monitoring. DEM extraction based on synthetic aperture radar interferometry (InSAR) obtains the three-dimensional elevation of a target area from the phase information of radar image data, and offers large coverage, high precision, and all-weather capability. By moving the ground-based radar system up and down along its track, a spatial baseline is formed, and the DEM of the target area can then be derived from image data acquired at different angles. Three-dimensional laser scanning can quickly, efficiently, and accurately obtain a DEM of the target area, which can be used to verify the accuracy of the DEM extracted by ground-based InSAR (GBInSAR). However, research on GBInSAR-based DEM extraction remains limited. To address the lack of theory and the low accuracy of current GBInSAR DEM extraction, this article analyzes its principles in depth, extracts the DEM of a target area from GBInSAR data, compares it with the DEM obtained from three-dimensional laser scan data, and performs statistical analysis and a normal distribution test. The results show that the DEM obtained by GBInSAR is broadly consistent with the DEM obtained by three-dimensional laser scanning and that its accuracy is high, with the differences between the two approximately normally distributed. This indicates that extracting the DEM of a target area with GBInSAR is feasible and provides a foundation for the promotion and application of GBInSAR.

  16. TIE POINTS EXTRACTION FOR SAR IMAGES BASED ON DIFFERENTIAL CONSTRAINTS

    Directory of Open Access Journals (Sweden)

    X. Xiong

    2018-04-01

    Automatically extracting tie points (TPs) from large-size synthetic aperture radar (SAR) images is still challenging, because the efficiency and correct ratio of the image matching need to be improved. This paper proposes an automatic TP extraction method based on differential constraints for large-size SAR images obtained from approximately parallel tracks, between which the relative geometric distortions are small in the azimuth direction and large in the range direction. Image pyramids are built first, and then the corresponding layers of the pyramids are matched from top to bottom. In this process, similarity is measured by the normalized cross correlation (NCC) algorithm, calculated over a rectangular window whose long side is parallel to the azimuth direction. False matches are removed by the differential constrained random sample consensus (DC-RANSAC) algorithm, which applies strong constraints in the azimuth direction and weak constraints in the range direction. Matching points in the lower pyramid images are predicted with a local bilinear transformation model in the range direction. Experiments performed on ENVISAT ASAR and Chinese airborne SAR images validated the efficiency, correct ratio, and accuracy of the proposed method.
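
    The NCC similarity measure used in the pyramid matching can be sketched as follows. This is a generic zero-mean NCC over two equally sized windows; the authors' rectangular-window search strategy is not reproduced:

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Zero-mean normalized cross correlation of two equally sized patches.

    Returns a value in [-1, 1]; 1 indicates a perfect linear match,
    -1 a perfectly inverted one.
    """
    a = np.asarray(patch_a, float).ravel()
    b = np.asarray(patch_b, float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    # flat (zero-variance) patches carry no correlation information
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```

    A candidate TP is typically accepted when the NCC score at the best offset exceeds a threshold, before the outlier removal stage.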

  17. Determination of species activities in organic phase. Modelling of liquid-liquid extraction systems using UNIQUAC and UNIFAC models; Determination des activites des especes en phase organique. Application d'UNIQUAC et UNIFAC a la modelisation des systemes d'extraction liquide-liquide

    Energy Technology Data Exchange (ETDEWEB)

    Rat, B. [CEA Saclay, 91 - Gif-sur-Yvette (France). Dept. de Recherche en Retraitement et en Vitrification]|[Paris-6 Univ., 75 (France)

    1998-12-31

    The aim of nuclear fuel reprocessing is to separate the reusable elements, uranium and plutonium, from the other elements, fission products and minor actinides. The PUREX process uses liquid-liquid extraction as the separation method. Numerical codes for modelling the extraction operations of the PUREX process use a semi-empirical model to represent the partition of species. To improve the precision and predictive nature of these models, we looked for a theoretical tool to quantify medium effects, especially in the organic phase, for which few models are available. The Sergeivskii-Dannus model quantifies deviations from ideality in an organic phase equilibrated with an aqueous phase, but with parameters that depend on the extractant/diluent ratio. We therefore investigated the UNIQUAC and UNIFAC models, which estimate activity coefficients in non-electrolytic phases while taking into account the mutual interactions of molecules and their morphology. UNIFAC is based on UNIQUAC, but molecules are treated as assemblies of structural groups. Before applying these models to extraction systems, we investigated their ability to describe simple binary and ternary systems. UNIQUAC was applied to TBP/diluent mixtures and estimates activity coefficients for diluents whose interactions with TBP are very different in nature and strength. Group contribution (UNIFAC) applied to TBP/alkane mixtures represents the effect of lengthening the alkane chain but not the effect of branching. UNIQUAC fails to describe TBP/diluent/water/non-extractable-salt systems in the case of strong TBP-diluent interactions. To obtain a correct description of these systems, we used the Chem-UNIFAC model, in which the UNIFAC equation is supplemented with chemical equilibria allowing explicitly for complex formation, and group contribution is used to describe the complexes. Chem-UNIFAC thus provides a model that can take the effect of the diluent into account.

  18. Asymptotic analysis of an ion extraction model

    International Nuclear Information System (INIS)

    Ben Abdallah, N.; Mas-Gallic, S.; Raviart, P.A.

    1993-01-01

    A simple model for ion extraction from a plasma is analyzed. The order of magnitude of the plasma parameters leads to a singular perturbation problem for a semilinear elliptic equation. We first prove existence of solutions for the perturbed problem, and uniqueness under certain conditions. We then prove the convergence of these solutions, as the parameters go to zero, towards the solution of a Child-Langmuir problem.
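
    The limiting Child-Langmuir problem referenced above has, in the classical planar one-dimensional case, the space-charge-limited current density J = (4*eps0/9) * sqrt(2q/m) * V^(3/2) / d^2. The sketch below evaluates this textbook limit law numerically; it illustrates the limit only, not the paper's semilinear elliptic model:

```python
import math

EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
Q_E = 1.602176634e-19     # elementary charge, C

def child_langmuir_j(voltage, gap, mass):
    """Child-Langmuir space-charge-limited current density (A/m^2).

    voltage : potential difference across the gap (V)
    gap     : electrode separation (m)
    mass    : mass of the extracted charge carrier (kg)
    """
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * Q_E / mass) \
        * voltage ** 1.5 / gap ** 2
```

    The characteristic V^(3/2) scaling means that quadrupling the extraction voltage raises the limited current density by a factor of eight.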

  19. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    Science.gov (United States)

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a formula medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis. The multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored with PC score, DModX, and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. Applying the MSPC model to the actual production process can effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. The established process monitoring method could serve as a reference for applying process analytical technology to the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
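
    The Hotelling T2 charting step can be sketched as below, assuming PCA scores of the NIR spectra are already available. The spectral preprocessing, PCA decomposition, and the computation of control limits (done on the normal batches in the paper) are omitted:

```python
import numpy as np

def t2_chart(train_scores, new_scores):
    """Hotelling T^2 values of new PCA score vectors.

    train_scores : (n, k) scores of in-control (normal) batches
    new_scores   : (m, k) scores of batches to monitor
    Returns an array of m T^2 values; points exceeding the control
    limit derived from the training batches would signal a deviation.
    """
    mu = train_scores.mean(axis=0)
    cov = np.atleast_2d(np.cov(train_scores, rowvar=False))
    inv = np.linalg.inv(cov)
    d = new_scores - mu
    # T^2_i = d_i' * inv(cov) * d_i for each new observation
    return np.einsum('ij,jk,ik->i', d, inv, d)
```

    A point at the training mean scores T2 = 0; the further a batch drifts from normal operation in score space, the larger its T2 value.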

  20. Hamming Code Based Watermarking Scheme for 3D Model Verification

    Directory of Open Access Journals (Sweden)

    Jen-Tse Wang

    2014-01-01

    Due to the explosive growth of the Internet and the maturing of 3D hardware techniques, protecting 3D objects is becoming an increasingly important issue. In this paper, a public Hamming-code-based fragile watermarking technique is proposed for 3D object verification. An adaptive watermark is generated from each cover model using the Hamming code technique, and a simple least significant bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the Hamming-code-based watermark can be verified by Hamming code checking without embedding any verification information. Experimental results show that 100% of the vertices of the cover model can be watermarked, extracted, and verified, and that the proposed method improves security while achieving low distortion of the stego object.
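
    The Hamming-code verification idea can be illustrated with a plain Hamming(7,4) encoder and syndrome check. This is a textbook sketch of the code itself, not the authors' adaptive watermark generation or LSB embedding:

```python
def hamming74_encode(bits):
    """Encode 4 data bits into a 7-bit Hamming(7,4) codeword.

    Bit positions are 1..7 with parity bits at positions 1, 2, 4.
    """
    d1, d2, d3, d4 = bits
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_check(word):
    """Syndrome of a 7-bit word: 0 means it verifies; a nonzero value
    is the 1-based position of a single flipped bit."""
    s1 = word[0] ^ word[2] ^ word[4] ^ word[6]
    s2 = word[1] ^ word[2] ^ word[5] ^ word[6]
    s3 = word[3] ^ word[4] ^ word[5] ^ word[6]
    return s1 * 1 + s2 * 2 + s3 * 4
```

    This is what makes verification possible without side information: a tampered vertex changes some bit, and the nonzero syndrome both flags and localizes the change.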

  1. A green deep eutectic solvent-based aqueous two-phase system for protein extracting

    International Nuclear Information System (INIS)

    Xu, Kaijia; Wang, Yuzhi; Huang, Yanhua; Li, Na; Wen, Qian

    2015-01-01

    Highlights: • A strategy for protein purification with a deep eutectic solvent (DES)-based aqueous two-phase system. • Choline chloride-glycerol DES was selected as the extraction solvent. • Bovine serum albumin and trypsin were used as the analytes. • An aggregation phenomenon was detected in the mechanism research. - Abstract: As a new type of green solvent, deep eutectic solvents (DESs) have been applied to the extraction of proteins with an aqueous two-phase system (ATPS) in this work. Four kinds of choline chloride (ChCl)-based DESs were synthesized to extract bovine serum albumin (BSA), and ChCl-glycerol was selected as the most suitable extraction solvent. Single-factor experiments were performed to investigate the factors affecting the extraction process, including the amount of DES, the concentration of salt, the mass of protein, the shaking time, the temperature, and the pH value. Experimental results show that 98.16% of the BSA could be extracted into the DES-rich phase in a single-step extraction under the optimized conditions. A high extraction efficiency of 94.36% was achieved when the conditions were applied to the extraction of trypsin (Try). Precision, repeatability, and stability experiments were conducted, and the relative standard deviations (RSD) of the extraction efficiency were 0.4246% (n = 3), 1.6057% (n = 3), and 1.6132% (n = 3), respectively. The conformation of BSA was not changed during the extraction process, according to UV-vis, FT-IR, and CD spectra of BSA. Conductivity measurements, dynamic light scattering (DLS), and transmission electron microscopy (TEM) were used to explore the mechanism of the extraction. It turned out that the formation of DES-protein aggregates plays a significant role in the separation process. All the results suggest that ChCl-based DES-ATPS have the potential to provide new possibilities for the separation of proteins.

  2. Drawing-Based Procedural Modeling of Chinese Architectures.

    Science.gov (United States)

    Fei Hou; Yue Qi; Hong Qin

    2012-01-01

    This paper presents a novel modeling framework to build 3D models of Chinese architectures from elevation drawings. Our algorithm integrates automatic drawing recognition with powerful procedural modeling to extract production rules from an elevation drawing. First, unlike previous symbol-based floor plan recognition, small horizontal repetitive regions of the elevation drawing are clustered in a bottom-up manner, based on the novel concept of repetitive pattern trees, to form architectural components with maximum repetition, which collectively serve as building blocks for 3D model generation. Second, to discover the global architectural structure and the interdependencies of its components, the components are structured into a shape tree in a top-down subdivision manner and recognized hierarchically at each level of the shape tree based on Markov Random Fields (MRFs). Third, shape grammar rules can be derived to construct a 3D semantic model and its possible variations with the help of a 3D component repository. The salient contribution lies in the novel integration of procedural modeling with elevation drawings, with a unique application to Chinese architectures.

  3. Feasibility of bio-based lactate esters as extractant for biobutanol recovery: (Liquid + liquid) equilibria

    International Nuclear Information System (INIS)

    Zheng, Shaohua; Cheng, Hongye; Chen, Lifang; Qi, Zhiwen

    2016-01-01

    Highlights: • Lactate esters were studied as solvent to remove butanol from aqueous media. • (Liquid + liquid) equilibrium data were measured at T = 298.15 K and 1 atm. • Selectivity and 1-butanol partition coefficient were calculated. • COSMO-based study of separation efficiency on solvent structure was conducted. - Abstract: As bio-based solvents, lactate esters can be used as extractant for removing 1-butanol from the aqueous fermentation broths. In order to evaluate the separation efficiency of butyl lactate and 2-ethylhexyl lactate for the extraction of 1-butanol from its mixture with water, the (liquid + liquid) equilibrium for the ternary systems {water (1) + 1-butanol (2) + lactate ester (3)} were measured at T = 298.15 K. The 1-butanol partition coefficient varied in the range of 4.46 to 10.29, and the solvent selectivity within 32.12 to 108.18. For the separation of low-concentration butanol from fermentation broths, butyl lactate exhibits higher partition coefficient and lower selectivity than 2-ethylhexyl lactate. The NRTL model was employed to correlate the experimental data, and the COSMO-RS theory was utilized to predict the (liquid + liquid) equilibria and to analyze the influence of lactate esters on extraction efficiency.

  4. Inference-based procedural modeling of solids

    KAUST Repository

    Biggers, Keith

    2011-11-01

    As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment. © 2011 Elsevier Ltd. All rights reserved.

  5. An Internet of Things Approach for Extracting Featured Data Using AIS Database: An Application Based on the Viewpoint of Connected Ships

    Directory of Open Access Journals (Sweden)

    Wei He

    2017-09-01

    The Automatic Identification System (AIS), a major source of navigational data, is widely used in connected-ship applications for maritime situation awareness and the evaluation of maritime transportation. Efficiently extracting featured data from an AIS database is a challenging and time-consuming task for maritime administrators and researchers. In this paper, a novel approach is proposed to extract massive featured data from an AIS database. An Evidential Reasoning rule based methodology is proposed to emulate the manual procedure of extracting routes from the AIS database. First, the frequency distributions of ship dynamic attributes, such as the mean and variance of Speed over Ground and Course over Ground, are obtained from verified AIS data samples. Subsequently, the correlations between the attributes and the belief degrees of the categories are established based on likelihood modeling. The attributes are thereby characterized as pieces of evidence that can be combined with the Evidential Reasoning rule. In addition, the weight coefficients are trained in a nonlinear optimization model to extract the AIS data more accurately. A real-life case study was conducted at an intersection waterway of the Yangtze River in Wuhan, China. The results show that the proposed methodology is able to extract the data very precisely.
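
    The likelihood-to-belief step can be illustrated with a simple product-and-normalize combination of independent likelihood pieces. This is a Bayesian-style simplification of the weighted Evidential Reasoning rule used in the paper (evidence weights and reliabilities are omitted), and the category names and numbers below are hypothetical:

```python
def combine_likelihoods(priors, likelihood_pieces):
    """Combine independent likelihood evidence into belief degrees.

    priors            : dict {category: prior probability}
    likelihood_pieces : list of dicts {category: likelihood of the
                        observed attribute value under that category}
    Returns normalized belief degrees over the categories.
    """
    belief = dict(priors)
    for piece in likelihood_pieces:
        for c in belief:
            belief[c] *= piece.get(c, 0.0)
    total = sum(belief.values())
    # normalize so the belief degrees sum to 1 (all zero -> no support)
    return {c: (v / total if total > 0 else 0.0)
            for c, v in belief.items()}
```

    Each dynamic attribute (mean speed, course variance, etc.) contributes one likelihood piece; categories consistently supported by the attributes accumulate high belief.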

  6. Hybrid modelling framework by using mathematics-based and information-based methods

    International Nuclear Information System (INIS)

    Ghaboussi, J; Kim, J; Elnashai, A

    2010-01-01

    Mathematics-based computational mechanics involves idealization in going from the observed behaviour of a system to mathematical equations representing the underlying mechanics of that behaviour. Idealization may lead to mathematical models that exclude certain aspects of the complex behaviour that may be significant. An alternative approach is data-centric modelling, which constitutes a fundamental shift from mathematical equations to data that contain the required information about the underlying mechanics. However, purely data-centric methods often fail for infrequent events and large state changes. In this article, a new hybrid modelling framework is proposed to improve accuracy in the simulation of real-world systems. In the hybrid framework, a mathematical model is complemented by information-based components, whose role is to model the aspects that the mathematical model leaves out. The missing aspects are extracted and identified through Autoprogressive Algorithms. The proposed hybrid modelling framework has a wide range of potential applications for natural and engineered systems. Its potential is illustrated by modelling the highly pinched hysteretic behaviour of beam-to-column connections in steel frames.

  7. Driver drowsiness classification using fuzzy wavelet-packet-based feature-extraction algorithm.

    Science.gov (United States)

    Khushaba, Rami N; Kodagoda, Sarath; Lal, Sara; Dissanayake, Gamini

    2011-01-01

    Driver drowsiness and loss of vigilance are a major cause of road accidents. Monitoring physiological signals while driving provides the possibility of detecting and warning of drowsiness and fatigue. The aim of this paper is to maximize the amount of drowsiness-related information extracted from a set of electroencephalogram (EEG), electrooculogram (EOG), and electrocardiogram (ECG) signals during a simulated driving test. Specifically, we develop an efficient fuzzy mutual-information (MI)-based wavelet packet transform (FMIWPT) feature-extraction method for classifying the driver drowsiness state into one of several predefined drowsiness levels. The proposed method estimates the required MI using a novel approach based on fuzzy memberships, providing an accurate information-content estimation measure. The quality of the extracted features was assessed on datasets collected from 31 drivers in a simulation test. The experimental results proved the significance of FMIWPT in extracting features that highly correlate with the different drowsiness levels, achieving a classification accuracy of 95%-97% on average across all subjects.

  8. Csf Based Non-Ground Points Extraction from LIDAR Data

    Science.gov (United States)

    Shen, A.; Zhang, W.; Shi, H.

    2017-09-01

    Region growing is a classical method of point cloud segmentation. Based on the idea of collecting pixels with similar properties to form regions, region growing is widely used in many fields such as medicine, forestry, and remote sensing. The algorithm has two core problems: the selection of seed points and the setting of the growth constraints, of which the selection of seed points is the foundation. In this paper, we propose a CSF (Cloth Simulation Filtering) based method to extract non-ground seed points effectively. The experiments show that this method obtains a better group of seed points than traditional methods, making it a new attempt at seed point extraction.

  9. Statistical properties of compartmental model parameters extracted from dynamic positron emission tomography experiments

    International Nuclear Information System (INIS)

    Mazoyer, B.M.; Huesman, R.H.; Budinger, T.F.; Knittel, B.L.

    1986-01-01

    Over the past years a major focus of research in physiologic studies employing tracers has been the computer implementation of mathematical methods of kinetic modeling for extracting the desired physiological parameters from tomographically derived data. A study is reported of factors that affect the statistical properties of compartmental model parameters extracted from dynamic positron emission tomography (PET) experiments

  10. Extracting conceptual models from user stories with Visual Narrator

    NARCIS (Netherlands)

    Lucassen, Garm; Robeer, Marcel; Dalpiaz, Fabiano; van der Werf, Jan Martijn E. M.; Brinkkemper, Sjaak

    2017-01-01

    Extracting conceptual models from natural language requirements can help identify dependencies, redundancies, and conflicts between requirements via a holistic and easy-to-understand view that is generated from lengthy textual specifications. Unfortunately, existing approaches never gained traction

  11. Modelling long-term oil price and extraction with a Hubbert approach: The LOPEX model

    International Nuclear Information System (INIS)

    Rehrl, Tobias; Friedrich, Rainer

    2006-01-01

    The LOPEX (Long-term Oil Price and EXtraction) model generates long-term scenarios of future world oil supply and the corresponding price paths up to the year 2100. To determine oil production in non-OPEC countries, the model uses Hubbert curves, which reflect the logistic nature of the discovery process and the associated constraint on the temporal availability of oil. Extraction paths and the world oil price path are both derived endogenously from OPEC's intertemporally optimal cartel behaviour, with OPEC facing both the price-dependent production of the non-OPEC competitive fringe and price-dependent world oil demand. World oil demand is modelled with a constant price elasticity function and refers to a scenario from ACROPOLIS-POLES. LOPEX results indicate a significantly higher oil price from around 2020 onwards compared to the reference scenario, with a stagnating market share of at most 50% being optimal for OPEC.
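
    A Hubbert production cycle of the kind used for non-OPEC supply can be sketched as the derivative of a logistic cumulative-extraction curve. The parameterization below is a standard illustrative form, not LOPEX's calibrated per-country specification:

```python
import math

def hubbert_production(t, urr, t_peak, width):
    """Annual production from a logistic (Hubbert) cycle.

    t      : year
    urr    : ultimately recoverable resource (total area under the curve)
    t_peak : year of peak production
    width  : steepness parameter in years (smaller -> sharper peak)
    Production is the time derivative of cumulative extraction
    Q(t) = urr / (1 + exp(-(t - t_peak) / width)).
    """
    x = math.exp(-(t - t_peak) / width)
    return urr * x / (width * (1.0 + x) ** 2)
```

    The curve is symmetric about the peak year, with maximum annual production urr / (4 * width); this built-in decline after the peak is what constrains non-OPEC supply in the model.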

  12. Gravitational wave extraction based on Cauchy-characteristic extraction and characteristic evolution

    Energy Technology Data Exchange (ETDEWEB)

    Babiuc, Maria [Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, PA 15260 (United States); Szilagyi, Bela [Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, PA 15260 (United States); Max-Planck-Institut fuer Gravitationsphysik, Albert-Einstein-Institut, Am Muehlenberg 1, D-14476 Golm (Germany); Hawke, Ian [Max-Planck-Institut fuer Gravitationsphysik, Albert-Einstein-Institut, Am Muehlenberg 1, D-14476 Golm (Germany); School of Mathematics, University of Southampton, Southampton SO17 1BJ (United Kingdom); Zlochower, Yosef [Department of Physics and Astronomy, and Center for Gravitational Wave Astronomy, University of Texas at Brownsville, Brownsville, TX 78520 (United States)

    2005-12-07

    We implement a code to find the gravitational news at future null infinity by using data from a Cauchy code as boundary data for a characteristic code. This technique of Cauchy-characteristic extraction (CCE) allows for the unambiguous extraction of gravitational waves from numerical simulations. We first test the technique on non-radiative spacetimes: Minkowski spacetime, perturbations of Minkowski spacetime and static black hole spacetimes in various gauges. We show the convergence and limitations of the algorithm and illustrate its success in cases where other wave extraction methods fail. We further apply our techniques to a standard radiative test case for wave extraction, a linearized Teukolsky wave, presenting our results in comparison to the Zerilli technique, and we argue for the advantages of our method of extraction.

  13. Extraction of multi-scale landslide morphological features based on local Gi* using airborne LiDAR-derived DEM

    Science.gov (United States)

    Shi, Wenzhong; Deng, Susu; Xu, Wenbing

    2018-02-01

    For automatic landslide detection, landslide morphological features should be quantitatively expressed and extracted. High-resolution Digital Elevation Models (DEMs) derived from airborne Light Detection and Ranging (LiDAR) data allow fine-scale morphological features to be extracted, but noise in DEMs influences morphological feature extraction, and the multi-scale nature of landslide features should be considered. This paper proposes a method to extract landslide morphological features characterized by homogeneous spatial patterns. Both profile and tangential curvature are utilized to quantify land surface morphology, and a local Gi* statistic is calculated for each cell to identify significant clustering of similar morphometric values. The method was tested on both synthetic surfaces simulating natural terrain and airborne LiDAR data acquired over an area dominated by shallow debris slides and flows. The test results on the synthetic data indicate that the concave and convex morphologies of the simulated terrain features at different scales and degrees of distinctness could be recognized using the proposed method, even when random noise was added to the synthetic data. In the test area, cells with large local Gi* values were extracted at a specified significance level from the profile and tangential curvature images generated from the LiDAR-derived 1-m DEM. The morphologies of landslide main scarps, source areas and trails were clearly indicated, and the morphological features were represented by clusters of extracted cells. A comparison with a morphological feature extraction method based on curvature thresholds proved the proposed method's robustness to DEM noise. When verified against a landslide inventory, the morphological features of almost all recent and historical (> 10 years) landslides were extracted. This finding indicates that the proposed method can facilitate landslide detection, although the cell clusters extracted from curvature images should
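
    The local Gi* statistic used above can be sketched for a small curvature grid as follows. This is a minimal implementation of the standard Getis-Ord Gi* with binary square-neighbourhood weights, not the authors' multi-scale code:

```python
import math

def local_gi_star(grid, i, j, radius=1):
    """Getis-Ord Gi* for cell (i, j) of a 2-D list `grid`, using
    binary weights over a square neighbourhood (the focal cell is
    included, as Gi* requires)."""
    n_rows, n_cols = len(grid), len(grid[0])
    values = [v for row in grid for v in row]
    n = len(values)
    mean = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - mean ** 2)
    w_sum, wx_sum = 0, 0.0            # neighbourhood weight and value sums
    for di in range(-radius, radius + 1):
        for dj in range(-radius, radius + 1):
            r, c = i + di, j + dj
            if 0 <= r < n_rows and 0 <= c < n_cols:
                w_sum += 1
                wx_sum += grid[r][c]
    denom = s * math.sqrt((n * w_sum - w_sum ** 2) / (n - 1))
    return (wx_sum - mean * w_sum) / denom

# A small ridge of high curvature in an otherwise flat field scores
# strongly positive at its centre, i.e. a significant cluster.
field = [[0.0] * 5 for _ in range(5)]
field[2][1] = field[2][2] = field[2][3] = 5.0
centre_score = local_gi_star(field, 2, 2)
```

    Cells whose Gi* exceeds the critical value for the chosen significance level would then be kept as candidate morphological-feature cells.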

  14. Considering extraction constraints in long-term oil price modelling

    Energy Technology Data Exchange (ETDEWEB)

    Rehrl, Tobias; Friedrich, Rainer; Voss, Alfred

    2005-12-15

    Apart from divergence about the remaining global oil resources, the peak-oil discussion can be reduced to a dispute about the rate at which these resources can be supplied. On the one hand, it is problematic to project oil supply trends without explicitly taking both prices and supply costs into account. On the other hand, supply cost estimates are themselves heavily dependent on the underlying extraction rates and are actually only valid within a certain business-as-usual extraction-rate scenario (which is itself what is to be determined). In fact, even after enhanced recovery technologies have been applied, the rate at which an oil field can be exploited is quite restricted. Above a certain level, a further increase in the extraction rate can be achieved only at much higher marginal cost and at the risk of losses in the overall recoverable amount of the oil reservoir. This inflexibility in extraction can in principle be overcome by access to new oil fields, which indicates why the discovery trend may roughly shape the long-term oil production curve, at least for price-taking suppliers. The long-term oil discovery trend itself can be described as a logistic process with the two opposed effects of learning and depletion, which leads to the well-known Hubbert curve. Several attempts have been made to incorporate economic variables econometrically into the Hubbert model. With this work we follow a somewhat inverse approach and integrate Hubbert curves into our Long-term Oil Price and EXtraction model LOPEX. In LOPEX we assume that non-OPEC oil production - as long as the oil can be profitably discovered and extracted - is restricted to follow self-regulative discovery trends described by Hubbert curves. Non-OPEC production in LOPEX therefore consists of those Hubbert cycles that are profitable, depending on supply cost and price. Endogenous and exogenous technical progress is additionally integrated in different ways. LOPEX determines extraction and price

  15. Considering extraction constraints in long-term oil price modelling

    International Nuclear Information System (INIS)

    Rehrl, Tobias; Friedrich, Rainer; Voss, Alfred

    2005-01-01

    Apart from divergence about the remaining global oil resources, the peak-oil discussion can be reduced to a dispute about the rate at which these resources can be supplied. On the one hand, it is problematic to project oil supply trends without explicitly taking both prices and supply costs into account. On the other hand, supply cost estimates are themselves heavily dependent on the underlying extraction rates and are actually only valid within a certain business-as-usual extraction-rate scenario (which is itself what is to be determined). In fact, even after enhanced recovery technologies have been applied, the rate at which an oil field can be exploited is quite restricted. Above a certain level, a further increase in the extraction rate can be achieved only at much higher marginal cost and at the risk of losses in the overall recoverable amount of the oil reservoir. This inflexibility in extraction can in principle be overcome by access to new oil fields, which indicates why the discovery trend may roughly shape the long-term oil production curve, at least for price-taking suppliers. The long-term oil discovery trend itself can be described as a logistic process with the two opposed effects of learning and depletion, which leads to the well-known Hubbert curve. Several attempts have been made to incorporate economic variables econometrically into the Hubbert model. With this work we follow a somewhat inverse approach and integrate Hubbert curves into our Long-term Oil Price and EXtraction model LOPEX. In LOPEX we assume that non-OPEC oil production - as long as the oil can be profitably discovered and extracted - is restricted to follow self-regulative discovery trends described by Hubbert curves. Non-OPEC production in LOPEX therefore consists of those Hubbert cycles that are profitable, depending on supply cost and price. Endogenous and exogenous technical progress is additionally integrated in different ways. LOPEX determines extraction and price

  16. Alkylsulfate-based ionic liquids in the liquid–liquid extraction of aromatic hydrocarbons

    International Nuclear Information System (INIS)

    García, Silvia; Larriba, Marcos; García, Julián; Torrecilla, José S.; Rodríguez, Francisco

    2012-01-01

    Highlights: ► Values of α2,1 for the four R-SO4 ionic liquids are higher than those of sulfolane. ► Values of D2 for all the ionic liquids are lower than those of sulfolane. ► Values of D2 for [emim][C2H5SO4] are the highest among the R-SO4 ionic liquids. - Abstract: The (liquid + liquid) equilibrium (LLE) data for the extraction of toluene from heptane with different ionic liquids (ILs) based on the alkylsulfate anion (R-SO4) were determined at T = 313.2 K and atmospheric pressure. The effect of more complex R-SO4 anions on extraction capacity and selectivity in the liquid–liquid extraction of toluene from heptane was studied. The ternary systems were formed by {heptane + toluene + 1,3-dimethylimidazolium methylsulfate ([mmim][CH3SO4]), 1-ethyl-3-methylimidazolium hydrogensulfate ([emim][HSO4]), 1-ethyl-3-methylimidazolium methylsulfate ([emim][CH3SO4]), or 1-ethyl-3-methylimidazolium ethylsulfate ([emim][C2H5SO4])}. The quality of the experimental LLE data was ascertained by applying the Othmer–Tobias correlation. The phase diagrams for the ternary systems were plotted, and the tie lines correlated with the NRTL model compare satisfactorily with the experimental data.

  17. Supercritical carbon dioxide extraction of pigments from Bixa orellana seeds (experiments and modeling)

    Directory of Open Access Journals (Sweden)

    B. P. Nobre

    2006-06-01

    Full Text Available Supercritical CO2 extraction of the pigments from Bixa orellana seeds was carried out in a flow apparatus at a pressure of 200 bar and a temperature of 40 ºC at two fluid flow rates (0.67 g/min and 1.12 g/min). The efficiency of the extraction was low (only about 1% of the pigment was extracted), and the increase in flow rate led to a decrease in pigment recovery. A large increase in recovery (from 1% to 45%) was achieved using supercritical carbon dioxide with 5 mol% ethanol as the extraction fluid at pressures of 200 and 300 bar and temperatures of 40 and 60 ºC. Although the increase in temperature and pressure led to an increase in recovery, changes in flow rate did not seem to affect it. Furthermore, two plug-flow models were applied to describe the supercritical extraction of the pigments from annatto seeds. Mass transfer coefficients were determined and compared well with those obtained by other researchers with similar models for the supercritical extraction of solutes from plant materials.
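
    The plug-flow models themselves are not specified in the abstract. As a hedged stand-in, a simple first-order mass-transfer model shows how a lumped coefficient can be fitted to yield-versus-time data; the 45% asymptotic recovery comes from the abstract, while the rate constant and sampling times below are invented for illustration:

```python
import math

def extraction_yield(t, y_max, k):
    """First-order approach to the maximum recoverable yield:
    y(t) = y_max * (1 - exp(-k * t)), with k lumping mass transfer."""
    return y_max * (1.0 - math.exp(-k * t))

def fit_k(times, yields, y_max):
    """Least-squares estimate of k from the linearised form
    -ln(1 - y / y_max) = k * t (regression through the origin)."""
    zs = [-math.log(1.0 - y / y_max) for y in yields]
    return sum(t * z for t, z in zip(times, zs)) / sum(t * t for t in times)

# 45% maximum recovery (abstract); k = 0.08 1/min and the times are
# hypothetical values, used only to demonstrate the fit.
times = [10.0, 20.0, 40.0]
yields = [extraction_yield(t, 45.0, 0.08) for t in times]
k_est = fit_k(times, yields, 45.0)
```

    A real plug-flow model would additionally resolve the solute concentration along the extractor bed; this sketch only captures the overall yield curve.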

  18. Novel extraction induced by microemulsion breaking: a model study for Hg extraction from Brazilian gasoline.

    Science.gov (United States)

    Vicentino, Priscila O; Cassella, Ricardo J

    2017-01-01

    This paper proposes a novel approach for the extraction of Hg from Brazilian gasoline samples: extraction induced by microemulsion breaking (EIMB). In this approach, a microemulsion is formed by mixing the sample with n-propanol and HCl. Afterwards, the microemulsion is destabilized by the addition of water and the two phases are separated: (i) the top phase, containing the residual gasoline and (ii) the bottom phase, containing the extracted analyte in a medium containing water, n-propanol and the ethanol originally present in the gasoline sample. The bottom phase is then collected and the Hg is measured by cold vapor atomic absorption spectrometry (CV-AAS). This model study used Brazilian gasoline samples spiked with Hg (organometallic compound) to optimize the process. Under the optimum extraction conditions, the microemulsion was prepared by mixing 8.7mL of sample with 1.2mL of n-propanol and 0.1mL of a 10molL -1 HCl solution. Emulsion breaking was induced by adding 300µL of deionized water and the bottom phase was collected for the measurement of Hg. Six samples of Brazilian gasoline were spiked with Hg in the organometallic form and recovery percentages in the range of 88-109% were observed. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing.

    Science.gov (United States)

    Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S; Xian, Xuefeng; Wu, Jian; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.
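
    A minimal sketch of the two ingredients described above - rank-based question selection and constraint-based pruning - might look like this (the fact strings and confidence scores are hypothetical, and the paper's actual algorithms are considerably more involved):

```python
def select_for_crowdsourcing(candidates, k):
    """Rank-based selection: crowdsource the facts whose automatic
    confidence is least decisive (closest to 0.5)."""
    ranked = sorted(candidates, key=lambda f: abs(candidates[f] - 0.5))
    return ranked[:k]

def prune_by_constraints(candidates, exclusive_pairs, verified_true):
    """Semantic constraints prune questions: once one fact of a
    mutually exclusive pair is verified true, its partner is inferred
    false and need not be asked."""
    inferred_false = set()
    for a, b in exclusive_pairs:
        if a in verified_true:
            inferred_false.add(b)
        if b in verified_true:
            inferred_false.add(a)
    return {f: c for f, c in candidates.items() if f not in inferred_false}

# Hypothetical extracted facts with extractor confidences
facts = {"bornIn(e1, Paris)": 0.55,
         "bornIn(e1, Rome)": 0.90,
         "type(e1, City)": 0.10}
ask_first = select_for_crowdsourcing(facts, 1)
```

    Here the fact nearest to 0.5 is asked first, and verifying it lets the mutual-exclusion constraint answer its competitor for free.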

  20. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Chunhua Li

    2017-01-01

    Full Text Available Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.

  1. An in vivo model to study the anti-malarial capacity of plant extracts

    Directory of Open Access Journals (Sweden)

    Misael Chinchilla

    1998-03-01

    Full Text Available An in vivo model to study the antimalarial effect of plant extracts is described. White mice (25-30 g body weight) are treated subcutaneously with 0.6 ml of the diluted extract starting seven days before P. berghei infection; treatment continues until death or for 30 days. Simultaneously, 0.2 ml of the extract is applied per os starting three days before infection. In a test of the model, treated and non-treated animals differed in body weight, survival time, haematocrit, parasitemia development, and the spleen and liver weights of recently dead or killed mice.

  2. Level Sets and Voronoi based Feature Extraction from any Imagery

    DEFF Research Database (Denmark)

    Sharma, O.; Anton, François; Mioc, Darka

    2012-01-01

    Polygon features are of interest in many GEOProcessing applications like shoreline mapping, boundary delineation, change detection, etc. This paper presents a unique new GPU-based methodology to automate feature extraction, combining level-set or mean-shift-based segmentation with Voronoi diagrams.

  3. Building Contour Extraction Based on LiDAR Point Cloud

    Directory of Open Access Journals (Sweden)

    Zhang Xu-Qing

    2017-01-01

    Full Text Available This paper presents a new method for extracting building contour lines from LiDAR point cloud data. Edge points between building test points are detected, and least-squares fitting is applied to obtain the building edge lines; each edge line's slope is weighted according to the line's length, and the weighted means of the positive and negative slopes are then computed. Under the assumption that adjacent building edges are perpendicular, regularization is applied to extract a perpendicular edge skeleton. Experiments show that the extracted building edges are accurate and that the method is applicable in complex urban areas.
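
    The least-squares fit and the length-weighted slope average can be sketched as follows (a simplified illustration assuming non-vertical edges given as (x, y) point lists; the perpendicular-regularization step is omitted):

```python
def fit_slope(points):
    """Least-squares slope of (x, y) edge points (assumes the edge is
    not vertical; vertical edges would need the symmetric x-on-y fit)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den

def length_weighted_slope(edges):
    """Average the fitted slope of each edge line, weighting each
    slope by the edge's endpoint-to-endpoint length."""
    def length(pts):
        (x0, y0), (x1, y1) = pts[0], pts[-1]
        return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    total = sum(length(e) for e in edges)
    return sum(fit_slope(e) * length(e) for e in edges) / total

# Two noisy edge segments of a roughly 45-degree building side
edge_a = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
edge_b = [(0.0, 0.0), (1.0, 1.2)]
slope = length_weighted_slope([edge_a, edge_b])
```

    Longer edges dominate the average, so short noisy fragments perturb the estimated building orientation less.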

  4. Automatic Target Recognition in Synthetic Aperture Sonar Images Based on Geometrical Feature Extraction

    Directory of Open Access Journals (Sweden)

    J. Del Rio Vera

    2009-01-01

    Full Text Available This paper presents a new supervised classification approach for automated target recognition (ATR) in SAS images. The recognition procedure starts with a novel segmentation stage based on the Hilbert transform. A number of geometrical features are then extracted and used to classify observed objects against a previously compiled database of target and non-target features. The proposed approach has been tested on a set of 1528 simulated images created by the NURC SIGMAS sonar model, achieving up to 95% classification accuracy.

  5. Solvent extraction of gold using ionic liquid based process

    Science.gov (United States)

    Makertihartha, I. G. B. N.; Zunita, Megawati; Rizki, Z.; Dharmawijaya, P. T.

    2017-01-01

    For decades, researchers and mineral processing industries have used solvent extraction technology for metal ion separation. Solvent extraction has been used for the purification of precious metals such as Au and Pd, and of base metals such as Cu, Zn and Cd. The process uses organic compounds as solvents, which have several undesirable properties: they are toxic, volatile, flammable, consumed in excess, difficult to recycle, and give low reusability and low Au recovery; together with the problems related to the disposal of spent extractants and diluents, the costs associated with these processes are also relatively high. Therefore, much research has gone into the development of safe and environmentally friendly processes for Au separation. Ionic liquids (ILs) are a potential alternative for gold extraction because they possess several desirable properties, such as the ability to operate at temperatures up to 300°C, good solvent properties for a wide range of metal ions, high selectivity, low vapor pressure, stability up to 200°C, easy preparation, environmental friendliness (they are commonly called "green solvents"), and relatively low cost. This review focuses on ILs that have potential as solvents for the extraction of Au from minerals/metal alloys at various conditions (pH, temperature, and pressure). The performance of IL-based Au extraction is studied in depth, i.e. the relationship between IL structure and the capability to separate Au from aggregates of metal ions. The optimal extraction conditions for obtaining a high percentage of Au in mineral processing are also investigated.

  6. Application of keyword extraction on MOOC resources

    Directory of Open Access Journals (Sweden)

    Zhuoxuan Jiang

    2017-03-01

    Full Text Available Purpose – Recent years have witnessed the rapid development of massive open online courses (MOOCs). With more and more courses being produced by instructors and participated in by learners all over the world, unprecedented massive educational resources are aggregated. The educational resources include videos, subtitles, lecture notes, quizzes, etc., on the teaching side, and forum contents, Wiki, logs of learning behavior, logs of homework, etc., on the learning side. However, the data are both unstructured and diverse. To facilitate knowledge management and mining on MOOCs, extracting keywords from the resources is important. This paper aims to adapt state-of-the-art techniques to MOOC settings and evaluate their effectiveness on real data. In terms of practice, this paper also tries to answer, for the first time, the questions of to what extent the MOOC resources can support keyword extraction models, and how much human effort is required to make the models work well. Design/methodology/approach – Based on which side generates the data, i.e. instructors or learners, the data are classified as teaching resources and learning resources, respectively. The approach used on teaching resources is based on machine learning models with labels, while the approach used on learning resources is based on a graph model without labels. Findings – From the teaching resources, the authors' methods can accurately extract keywords with only 10 per cent labeled data. The authors find that resources of various forms, e.g. subtitles and PPTs, should be considered separately, because the models perform differently on them. From the learning resources, the keywords extracted from MOOC forums are not as domain-specific as those extracted from teaching resources, but they reflect the topics that are actively discussed in the forums, so instructors can draw feedback from them. The authors implement two
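
    The unlabelled graph model used on the learning resources is not named in the abstract; a TextRank-style sketch conveys the idea - words that co-occur in the same forum post are linked, and PageRank-style scores rank keyword candidates (the posts below are invented):

```python
from collections import defaultdict
from itertools import combinations

def keyword_scores(texts, iterations=30, d=0.85):
    """TextRank-style keyword ranking: words co-occurring in the same
    text are linked in an undirected graph, and each word's score
    follows a damped PageRank-like recursion over its neighbours."""
    graph = defaultdict(set)
    for words in texts:
        for a, b in combinations(sorted(set(words)), 2):
            graph[a].add(b)
            graph[b].add(a)
    score = {w: 1.0 for w in graph}
    for _ in range(iterations):
        score = {w: (1 - d) + d * sum(score[v] / len(graph[v])
                                      for v in graph[w])
                 for w in graph}
    return score

# Invented forum posts; "gradient" co-occurs with every other term
posts = [["gradient", "descent"],
         ["gradient", "learning"],
         ["gradient", "rate"]]
scores = keyword_scores(posts)
```

    Highly connected words accumulate score, so the hub term of a discussion surfaces as its keyword without any labels.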

  7. Javanese Character Feature Extraction Based on Shape Energy

    Directory of Open Access Journals (Sweden)

    Galih Hendra Wibowo

    2017-07-01

    Full Text Available Javanese script is part of Indonesia's cultural heritage, especially in Java. However, the number of Javanese people able to read the script has decreased, so conservation efforts are needed in the form of a system that can recognize the characters. One solution to this problem lies in Optical Character Recognition (OCR), where one of the hardest parts is feature extraction: distinguishing each character. Shape Energy is a feature extraction method whose basic idea is that a character can be distinguished simply through its skeleton. Building on this idea, the feature extraction is developed, component by component, into an angular histogram with various multiples of a base angle. The performance of this method and of its baseline was then measured on a Javanese character dataset of 240 samples with 19 labels, obtained from various images, using K-Nearest Neighbors as the classification method. Cross-validation accuracy reached 80.83% for the angular histogram with a 20-degree angle, 23% better than Shape Energy. In addition, the method recognizes rotated characters, with the lowest accuracy of 86% at 180-degree rotation and the highest of 96.97% at 90-degree rotation. It can be concluded that this method improves on Shape Energy for recognizing Javanese characters and is robust to rotation.
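
    The angular-histogram idea can be sketched by binning local stroke directions along a skeleton into multiples of a base angle (a simplified stand-in for the paper's Shape-Energy-based features; the skeleton here is a toy horizontal stroke):

```python
import math

def angular_histogram(skeleton, bin_deg=20):
    """Normalised histogram of local stroke directions along a
    skeleton (a list of (x, y) points), binned every `bin_deg` degrees."""
    bins = [0] * (360 // bin_deg)
    for (x0, y0), (x1, y1) in zip(skeleton, skeleton[1:]):
        angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 360
        bins[int(angle // bin_deg) % len(bins)] += 1
    total = sum(bins) or 1
    return [b / total for b in bins]

# A toy horizontal stroke: all mass falls into the 0-degree bin
hist = angular_histogram([(0, 0), (1, 0), (2, 0), (3, 0)])
```

    Rotating a character cyclically shifts such a histogram rather than destroying it, which is consistent with the rotation robustness reported above.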

  8. Novel Fluorinated Tensioactive Extractant Combined with Flotation for Decontamination of Extractant Residual during Solvent Extraction

    Science.gov (United States)

    Wu, Xue; Chang, Zhidong; Liu, Yao; Choe, Chol Ryong

    2017-12-01

    Solvent extraction is widely used in the chemical industry. Due to their amphiphilic character, a large amount of extractant remains in the water phase, which causes not only loss of reagent but also secondary contamination of the water phase. Novel fluorinated extractants with ultra-low solubility in water are regarded as an effective way to reduce extractant loss in the aqueous phase; however, trace amounts of extractant still remain in the water. Based on the highly tensioactive character of fluorinated solvents, flotation was applied to separate the fluorinated extractant remaining in the raffinate. Surface tension measurements show that the surface tension of the solution decreased markedly with the addition of the fluorinated extractant tris(2,2,3,3,4,4,5,5-octafluoropentyl) phosphate (FTAP). After flotation, as much as 70% of the FTAP dissolved in water could be removed, which proves the feasibility of this key idea. The effects of operation time, gas velocity, pH and salinity of the bulk solution on flotation performance are discussed. The optimum operating parameters were determined as a gas velocity of 12 ml/min, an operating time of 15 min, a pH of 8.7, and an NaCl volume concentration of 1.5%. Moreover, the adsorption of FTAP on the bubble surface was simulated with an ANSYS VOF model using the SIMPLE algorithm. The dynamic mechanism of flotation was also investigated theoretically, which can be considered a supplement to the experimental results.

  9. Combination of poroelasticity theory and constant strain rate test in modelling land subsidence due to groundwater extraction

    Science.gov (United States)

    Pham, Tien Hung; Rühaak, Wolfram; Sass, Ingo

    2017-04-01

    Extensive groundwater extraction leads to a drawdown of the groundwater table. Consequently, soil effective stress increases and can cause land subsidence. Analysis of land subsidence generally requires a numerical model based on poroelasticity theory, first proposed by Biot (1941). In their review of regional land subsidence accompanying groundwater extraction, Galloway and Burbey (2011) stated that more research and application is needed in coupling stress-dependent land subsidence processes. In the geotechnical field, the constant-rate-of-strain (CRS) test was first introduced in 1969 (Smith and Wahls 1969) and was standardized in 1982 through designation D4186-82 of the American Society for Testing and Materials. From the readings of CRS tests, the stress-dependent parameters of the poroelasticity model can be calculated. So far, no research has linked poroelasticity theory with CRS tests in modelling land subsidence due to groundwater extraction. One-dimensional CRS tests using a conventional compression cell and three-dimensional CRS tests using a Rowe cell were performed. The tests were also modelled using the finite element method with mixed elements. A back-analysis technique is used to find suitable values of the hydraulic conductivity and bulk modulus, which depend on the stress or void ratio. Finally, the obtained results are used in land subsidence models. Biot, M. A. (1941). "General theory of three-dimensional consolidation." Journal of Applied Physics 12(2): 155-164. Galloway, D. L. and T. J. Burbey (2011). "Review: Regional land subsidence accompanying groundwater extraction." Hydrogeology Journal 19(8): 1459-1486. Smith, R. E. and H. E. Wahls (1969). "Consolidation under constant rates of strain." Journal of Soil Mechanics & Foundations Div.
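
    As a hedged illustration of the consolidation physics being modelled, a one-dimensional Terzaghi-type finite-difference scheme (a drastic simplification of the coupled Biot model, with invented parameter values) can be written as:

```python
def consolidate(c_v, dz, dt, steps, u0):
    """Explicit finite differences for Terzaghi's 1-D consolidation
    equation du/dt = c_v * d2u/dz2, with drained (u = 0) boundaries.
    c_v is the coefficient of consolidation, u the excess pore pressure."""
    r = c_v * dt / dz ** 2
    assert r <= 0.5, "explicit scheme stability limit exceeded"
    u = list(u0)
    for _ in range(steps):
        nxt = u[:]
        for i in range(1, len(u) - 1):
            nxt[i] = u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
        nxt[0] = nxt[-1] = 0.0
        u = nxt
    return u

# An excess pore pressure of 100 (arbitrary units) dissipating toward
# the two drained boundaries; parameter values are illustrative only.
profile = consolidate(c_v=1e-6, dz=0.1, dt=1000.0, steps=200,
                      u0=[0.0] + [100.0] * 9 + [0.0])
```

    In the coupled poroelastic model, the pressure decay computed here would feed back into the effective stress and hence the settlement; CRS test readings supply the stress-dependent coefficients.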

  10. Clear-sky classification procedures and models using a world-wide data-base

    International Nuclear Information System (INIS)

    Younes, S.; Muneer, T.

    2007-01-01

    Clear-sky data need to be extracted from all-sky measured solar-irradiance datasets, often by using algorithms that rely on other measured meteorological parameters. Current procedures for clear-sky data extraction have been examined and compared with each other to determine their reliability and location dependency. New clear-sky determination algorithms are proposed that are based on a combination of clearness index, diffuse ratio, cloud cover and Linke turbidity limits. Various researchers have proposed clear-sky irradiance models that rely on synoptic parameters; four of these models, MRM, PRM, YRM and REST2, have been compared for six worldwide locations. Based on a previously developed comprehensive accuracy-scoring method, the models MRM, REST2 and YRM were found to be of satisfactory performance, in decreasing order. The so-called Page radiation model (PRM) was found to underestimate solar radiation, even though local turbidity data were provided for its operation.
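
    A minimal sketch of a clearness-index/diffuse-ratio clear-sky filter follows; the thresholds are illustrative placeholders rather than the limits fitted in the paper, and the cloud-cover and Linke turbidity tests are omitted:

```python
def is_clear_sky(ghi, dhi, extraterrestrial, kt_min=0.65, kd_max=0.3):
    """Flag a record as clear sky when the clearness index
    kt = GHI / G_extraterrestrial is high and the diffuse ratio
    kd = DHI / GHI is low.  Threshold values are illustrative only."""
    if ghi <= 0 or extraterrestrial <= 0:
        return False
    kt = ghi / extraterrestrial
    kd = dhi / ghi
    return kt >= kt_min and kd <= kd_max

# A bright, beam-dominated record vs. an overcast one (W/m2)
clear = is_clear_sky(800.0, 100.0, 1000.0)
cloudy = is_clear_sky(400.0, 300.0, 1000.0)
```

    The full procedure would apply all four criteria jointly, since a high clearness index alone can also occur under broken cloud with strong reflection.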

  11. Advancing Affect Modeling via Preference Learning and Unsupervised Feature Extraction

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez

    … difficulties, ordinal reports such as rankings and ratings can yield more reliable affect annotations than alternative tools. This thesis explores preference learning methods to automatically learn computational models from ordinal annotations of affect. In particular, an extensive collection of training strategies (error functions and training algorithms) for artificial neural networks is examined across synthetic and psycho-physiological datasets, and compared against support vector machines and Cohen's method. Results reveal the best training strategies for neural networks and suggest their superiority over the other examined methods. The second challenge addressed in this thesis refers to the extraction of relevant information from physiological modalities. Deep learning is proposed as an automatic approach to extract input features for models of affect from physiological signals. Experiments …

  12. MICHIGAN SOIL VAPOR EXTRACTION REMEDIATION (MISER) MODEL: A COMPUTER PROGRAM TO MODEL SOIL VAPOR EXTRACTION AND BIOVENTING OF ORGANIC MATERIALS IN UNSATURATED GEOLOGICAL MATERIAL

    Science.gov (United States)

    This report describes the formulation, numerical development, and use of a multiphase, multicomponent, biodegradation model designed to simulate physical, chemical, and biological interactions occurring primarily in field scale soil vapor extraction (SVE) and bioventing (B...

  13. A Financial Data Mining Model for Extracting Customer Behavior

    Directory of Open Access Journals (Sweden)

    Mark K.Y. Mak

    2011-08-01

    Full Text Available Facing variation and chaotic behavior among customers, the lack of sufficient information is a challenge to many business organizations. Human analysts who lack an understanding of the hidden patterns in business data can miss corporate business opportunities. To embrace all business opportunities and enhance competitiveness, the discovery of hidden knowledge, unexpected patterns and useful rules from large databases has provided a feasible solution for several decades. While a wide range of financial analysis products exists in the financial market, customizing the investment portfolio for the customer is still a challenge to many financial institutions. This paper aims at developing an intelligent Financial Data Mining Model (FDMM) for extracting customer behavior in the financial industry, so as to increase the availability of decision-support data and hence customer satisfaction. The proposed financial model first clusters the customers into several sectors and then finds the correlation among these sectors. Better customer segmentation increases the ability to identify targeted customers, so extracting useful rules for specific clusters can provide insight into customers' buying behavior and its marketing implications. To validate the feasibility of the proposed model, a simple dataset was collected from a financial company in Hong Kong. The simulation experiments show that the proposed method not only improves the workflow of a financial company, but also deepens understanding of investment behavior. Thus, a corporation is able to customize the most suitable products and services for customers on the basis of the rules extracted.
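
    The clustering stage of such a model can be sketched with plain k-means over customer feature vectors; the features and data below are invented for illustration, and the paper's actual mining model is richer:

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def centroid(cluster):
    """Component-wise mean of a non-empty cluster."""
    n = len(cluster)
    return tuple(sum(xs) / n for xs in zip(*cluster))

def k_means(points, k, iters=20, seed=0):
    """Plain k-means: returns (centroids, clusters)."""
    centroids = random.Random(seed).sample(points, k)
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[nearest].append(p)
        centroids = [centroid(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Two obvious customer segments, e.g. (monthly volume, product count):
customers = [(1, 1), (1, 2), (2, 1), (10, 10), (10, 11), (11, 10)]
segments = k_means(customers, 2)[1]
```

    Rule extraction would then run separately within each segment, which is what makes the rules cluster-specific.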

  14. A Novel Model-Based Driving Behavior Recognition System Using Motion Sensors

    Directory of Open Access Journals (Sweden)

    Minglin Wu

    2016-10-01

    Full Text Available In this article, a novel driving behavior recognition system based on a specific physical model and motion sensory data is developed to promote traffic safety. Based on the theory of rigid-body kinematics, we build a specific physical model to reveal how the data change during the vehicle's motion. In this work, we adopt a nine-axis motion sensor comprising a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer, and apply a Kalman filter for noise elimination and an adaptive time window for data extraction. Based on feature extraction guided by the physical model, various classifiers are built to recognize different driving behaviors. Using the system, normal driving behaviors (accelerating, braking, lane changing and turning performed with caution) and aggressive driving behaviors (the same maneuvers performed suddenly) can be classified with a high accuracy of 93.25%. Compared with traditional driving behavior recognition methods that use machine learning only, the proposed system possesses a solid theoretical basis, performs better and has good prospects.
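
    The adaptive time window for data extraction can be illustrated with a simple threshold-based scheme over a single accelerometer channel (an invented simplification, not the authors' exact method):

```python
def window_features(signal, threshold=1.5, min_len=5):
    """Adaptive-window extraction: a window opens while |value| stays
    above the threshold and closes when it drops back; each window of
    at least `min_len` samples yields (length, mean, peak) features."""
    windows, current = [], []
    for v in signal:
        if abs(v) >= threshold:
            current.append(v)
        else:
            if len(current) >= min_len:
                windows.append(current)
            current = []
    if len(current) >= min_len:
        windows.append(current)
    return [(len(w), sum(w) / len(w), max(w, key=abs)) for w in windows]

# A braking-like burst in an otherwise quiet accelerometer channel
accel = [0, 0, 2, 2.5, 3, 2.5, 2, 0, 0]
features = window_features(accel)
```

    Because the window length adapts to the maneuver, a long gentle lane change and a short sudden one produce distinguishable (length, mean, peak) tuples for the downstream classifiers.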

  15. Idioms-based Business Rule Extraction

    NARCIS (Netherlands)

    R Smit (Rob)

    2011-01-01

    This thesis studies the extraction of embedded business rules, using the idioms of the used framework to identify them. Embedded business rules exist as source code in the software system, and knowledge about them may get lost. Extraction of those business rules could make them accessible.

  16. Performance Modeling of Mimosa pudica Extract as a Sensitizer for Solar Energy Conversion

    Directory of Open Access Journals (Sweden)

    M. B. Shitta

    2016-01-01

    Full Text Available An organic material is proposed as a sustainable sensitizer and a replacement for the synthetic sensitizer in dye-sensitized solar cell technology. Using the liquid extract from the leaf of the plant Mimosa pudica (M. pudica) as a sensitizer, the performance characteristics of the extract of M. pudica are investigated. The photo-anode of each solar cell sample is passivated with a self-assembled monolayer (SAM) from a set of four materials: alumina, formic acid, gelatine, and oxidized starch. Three sets of five samples of an M. pudica-based solar cell are produced, with the fifth sample used as the control experiment. Each of the solar cell samples has an active area of 0.3848 cm2. A two-dimensional finite volume method (FVM) is used to model the transport of ions within the monolayer of the solar cell. The performance of the experimentally fabricated solar cells compares qualitatively with those obtained from the literature and the simulated solar cells. The highest efficiency of 3% is obtained from the use of the extract as a sensitizer. It is anticipated that the comparison of the performance characteristics, together with further research on the concentration of the M. pudica extract, will enhance the development of a reliable and competitive organic solar cell. It is also recommended that further research be carried out on the concentration of the extract and the electrolyte used in this study for a possible improvement in cell performance.

  17. A rapid extraction of landslide disaster information research based on GF-1 image

    Science.gov (United States)

    Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na

    2015-08-01

    In recent years, landslide disasters have occurred frequently because of seismic activity, bringing great harm to people's lives and attracting high attention from the state and extensive concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high-resolution remote sensing imagery can effectively improve the accuracy of information extraction with its rich texture and geometry information. It is therefore feasible to extract information on earthquake-triggered landslides with serious surface damage and large scale. Taking Wenchuan County as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the Estimation of Scale Parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution imagery and selecting spectral, texture, geometric and landform features of the image, extraction rules can be established to extract landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high-resolution remote sensing, and it provides important technical support for post-disaster emergency investigation and disaster assessment.

  18. MODELING OF SUPERCRITICAL FLUID EXTRACTION KINETIC OF FLAXSEED OIL BY DIFFUSION CONTROL METHOD

    Directory of Open Access Journals (Sweden)

    Emir Zafer HOŞGÜN

    2013-06-01

    Full Text Available In this study, flaxseed oil was extracted by supercritical carbon dioxide extraction, and the extraction kinetics were modelled using a diffusion-controlled method. The effect of process parameters, such as pressure (20, 35, 55 MPa), temperature (323 and 343 K), and CO2 flow rate (1 and 3 L CO2/min), on the extraction yield and effective diffusivity (De) was investigated. The effective diffusion coefficient varied between 2.4 x 10-12 and 10.8 x 10-12 m2/s for the entire range of experiments and increased with pressure and flow rate. The model fitted the experimental data well (ADD varied between 2.35 and 7.48%).
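    Diffusion-controlled extraction models of this kind are commonly built on Crank's series solution for diffusion out of a sphere. The sketch below uses that standard form with an assumed De of the same order of magnitude as reported and an assumed particle radius; it is an illustration of the model family, not the paper's fitted model:

```python
import math

# Extracted fraction E(t)/E_inf for diffusion out of spherical particles
# (Crank's series solution). De and R below are assumed, illustrative values.

def yield_fraction(De, radius, t, terms=50):
    """Fraction of extractable oil recovered after time t (seconds)."""
    s = sum(math.exp(-n * n * math.pi ** 2 * De * t / radius ** 2) / n ** 2
            for n in range(1, terms + 1))
    return 1.0 - (6.0 / math.pi ** 2) * s

De = 5e-12   # effective diffusivity, m^2/s (same order as the study's range)
R = 2.5e-4   # particle radius, m (assumed)
curve = [yield_fraction(De, R, t) for t in (0, 600, 1800, 3600)]
```

    Fitting De then amounts to minimizing the deviation between such a curve and the measured yield profile.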

  19. Model-based failure detection for cylindrical shells from noisy vibration measurements.

    Science.gov (United States)

    Candy, J V; Fisher, K A; Guidry, B L; Chambers, D H

    2014-12-01

    Model-based processing is a theoretically sound methodology to address difficult objectives in complex physical problems involving multi-channel sensor measurement systems. It incorporates analytical models of both the physical phenomenology (complex vibrating structures, noisy operating environment, etc.) and the measurement processes (sensor networks, including noise) into the processor to extract the desired information. In this paper, a model-based methodology is developed to accomplish the task of online failure monitoring of a vibrating cylindrical shell externally excited by controlled excitations. A model-based processor is formulated to monitor system performance and detect potential failure conditions. The objective of this paper is to develop a real-time, model-based monitoring scheme for online diagnostics in a representative structural vibrational system based on controlled experimental data.

  20. Ionic liquid-based microwave-assisted extraction of rutin from Chinese medicinal plants.

    Science.gov (United States)

    Zeng, Huan; Wang, Yuzhi; Kong, Jinhuan; Nie, Chan; Yuan, Ya

    2010-12-15

    An ionic liquid-based microwave-assisted extraction (ILMAE) method has been developed for the effective extraction of rutin from Chinese medicinal plants, including Saururus chinensis (Lour.) Bail. (S. chinensis) and Flos Sophorae. A series of 1-butyl-3-methylimidazolium ionic liquids with different anions were investigated. The results indicated that the characteristics of the anions have remarkable effects on the extraction efficiency of rutin and that, among the investigated ionic liquids, 1-butyl-3-methylimidazolium bromide ([bmim]Br) aqueous solution was the best. In addition, the ILMAE procedures for the two medicinal herbs were optimized by means of a series of single-factor experiments and an L9 (3^4) orthogonal design. Compared with the optimal ionic liquid-based heating extraction (ILHE), marinated extraction (ILME) and ultrasonic-assisted extraction (ILUAE), the optimized ILMAE approach gained higher extraction efficiency, 4.879 mg/g in S. chinensis (RSD 1.33%) and 171.82 mg/g in Flos Sophorae (RSD 1.47%), within the shortest extraction time. Reversed-phase high performance liquid chromatography (RP-HPLC) with ultraviolet detection was employed for the analysis of rutin in the Chinese medicinal plants. Under the optimum conditions, the average recoveries of rutin from S. chinensis and Flos Sophorae were 101.23% and 99.62%, respectively, with RSD lower than 3%. The developed approach is linear at rutin concentrations from 42 to 252 mg/L, with a regression coefficient (r) of 0.99917. Moreover, the extraction mechanism of ILMAE and the microstructures and chemical structures of the two researched samples before and after extraction were also investigated. With the help of LC-MS, it was further demonstrated that the two researched herbs do contain the active ingredient rutin and that the ionic liquids do not influence the structure of rutin. Copyright © 2010 Elsevier B.V. All rights reserved.
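    The L9 (3^4) orthogonal design mentioned above is a standard Taguchi array: nine runs covering four factors at three levels such that every pair of columns contains every level combination exactly once. The sketch below reproduces the standard table and checks that property; the paper's actual factor-to-column assignment is not given here:

```python
from itertools import combinations, product

# Standard Taguchi L9(3^4) array: 9 runs, 4 three-level factors (levels 0-2).
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]

def is_orthogonal(array):
    """Every pair of columns contains each ordered level pair exactly once."""
    ncols = len(array[0])
    for i, j in combinations(range(ncols), 2):
        pairs = [(row[i], row[j]) for row in array]
        if sorted(pairs) != sorted(product(range(3), repeat=2)):
            return False
    return True
```

    This is why 9 runs suffice to rank the main effects of four factors that would otherwise need 3^4 = 81 full-factorial runs.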

  1. Pyridinium ionic liquid-based liquid-solid extraction of inorganic and organic iodine from Laminaria.

    Science.gov (United States)

    Peng, Li-Qing; Yu, Wen-Yan; Xu, Jing-Jing; Cao, Jun

    2018-01-15

    A simple, green and effective extraction method, namely pyridinium ionic liquid (IL)-based liquid-solid extraction (LSE), was first designed to extract the main inorganic and organic iodine compounds (I-, monoiodo-tyrosine (MIT) and diiodo-tyrosine (DIT)). The optimal extraction conditions were as follows: ultrasonic intensity 100 W, IL ([EPy]Br) concentration 200 mM, extraction time 30 min, liquid/solid ratio 10 mL/g, and pH 6.5. The morphologies of Laminaria were studied by scanning electron microscopy and transmission electron microscopy. The recoveries of I-, MIT and DIT from Laminaria were in the range of 88% to 94%, and limits of detection were in the range of 59.40 to 283.6 ng/g. The proposed method was applied to the extraction and determination of iodine compounds in three Laminaria samples. The results showed that IL-based LSE could be a promising method for rapid extraction of bioactive iodine from complex food matrices. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Accelerated H-LBP-based edge extraction method for digital radiography

    Energy Technology Data Exchange (ETDEWEB)

    Qiao, Shuang; Zhao, Chen-yi; Huang, Ji-peng [School of Physics, Northeast Normal University, Changchun 130024 (China); Sun, Jia-ning, E-mail: sunjn118@nenu.edu.cn [School of Mathematics and Statistics, Northeast Normal University, Changchun 130024 (China)

    2015-01-11

    With the goal of achieving real-time and efficient edge extraction for digital radiography, an accelerated H-LBP-based edge extraction method (AH-LBP) is presented in this paper by improving the existing framework of local binary pattern with the H function (H-LBP). Since the proposed method avoids computationally expensive operations with no loss of quality, it possesses much lower computational complexity than H-LBP. Experimental results on real radiographs show the desirable performance of our method. - Highlights: • An accelerated H-LBP method for edge extraction on digital radiography is proposed. • The novel AH-LBP relies on numerical analysis of the existing H-LBP method. • Aiming at acceleration, H-LBP is reformulated as direct binary processing. • AH-LBP provides the same edge extraction result as H-LBP does. • AH-LBP has low computational complexity satisfying real-time requirements.
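    For background, the plain 8-neighbour local binary pattern on which H-LBP builds can be computed as below. This shows only the basic operator on a toy image; the H-function weighting and the accelerated reformulation proposed in the paper are not reproduced:

```python
# Basic 8-neighbour LBP code: each neighbour >= the centre pixel sets one bit.
# The 3x3 toy image is illustrative; real inputs would be radiograph arrays.

def lbp_code(img, y, x):
    """8-bit LBP code of interior pixel (y, x) of a 2-D list-of-lists image."""
    c = img[y][x]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy][x + dx] >= c:
            code |= 1 << bit
    return code

img = [
    [10, 10, 10],
    [10, 50, 200],   # a bright edge pixel to the right of the centre
    [10, 10, 10],
]
code = lbp_code(img, 1, 1)
```

    Edge pixels produce characteristic sparse codes like this one (only the right-neighbour bit set), which is the texture cue that LBP-family edge extractors threshold.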

  3. Using computer-extracted image features for modeling of error-making patterns in detection of mammographic masses among radiology residents.

    Science.gov (United States)

    Zhang, Jing; Lo, Joseph Y; Kuzmiak, Cherie M; Ghate, Sujata V; Yoon, Sora C; Mazurowski, Maciej A

    2014-09-01

    Mammography is the most widely accepted and utilized screening modality for early breast cancer detection. Providing high quality mammography education to radiology trainees is essential, since excellent interpretation skills are needed to ensure the highest benefit of screening mammography for patients. The authors have previously proposed a computer-aided education system based on trainee models. Those models relate human-assessed image characteristics to trainee error. In this study, the authors propose to build trainee models that utilize features automatically extracted from images using computer vision algorithms to predict the likelihood of a trainee missing each mass. This computer vision-based approach to trainee modeling will allow for automatically searching large databases of mammograms in order to identify challenging cases for each trainee. The authors' algorithm for predicting the likelihood of missing a mass consists of three steps. First, a mammogram is segmented into air, pectoral muscle, fatty tissue, dense tissue, and mass using automated segmentation algorithms. Second, 43 features are extracted using computer vision algorithms for each abnormality identified by experts. Third, error-making models (classifiers) are applied to predict the likelihood of trainees missing the abnormality based on the extracted features. The models are developed individually for each trainee using his/her previous reading data. The authors evaluated the predictive performance of the proposed algorithm using data from a reader study in which 10 subjects (7 residents and 3 novices) and 3 experts read 100 mammographic cases. Receiver operating characteristic (ROC) methodology was applied for the evaluation. The average area under the ROC curve (AUC) of the error-making models for the task of predicting which masses will be detected and which will be missed was 0.607 (95% CI, 0.564-0.650), a value statistically significantly different from 0.5.
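    The AUC evaluation used above can be reproduced with the rank (Mann-Whitney) form of the statistic: the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. The scores below are illustrative, not the study's data:

```python
# AUC as the Mann-Whitney statistic: P(score_pos > score_neg), ties count 1/2.
# Scores are hypothetical model outputs, not data from the reader study.

def auc(pos_scores, neg_scores):
    """Area under the ROC curve from two score lists."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

missed = [0.9, 0.7, 0.6]   # predicted miss-likelihoods for missed masses
caught = [0.2, 0.4, 0.7]   # predicted miss-likelihoods for detected masses
value = auc(missed, caught)
```

    An AUC of 0.5 corresponds to chance, which is why the study reports its 0.607 as significantly different from 0.5 rather than from 0.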

  4. Targeting Colorectal Cancer Proliferation, Stemness and Metastatic Potential Using Brassicaceae Extracts Enriched in Isothiocyanates: A 3D Cell Model-Based Study

    Science.gov (United States)

    Pereira, Lucília P.; Silva, Patrícia; Duarte, Marlene; Rodrigues, Liliana; Duarte, Catarina M. M.; Albuquerque, Cristina; Serra, Ana Teresa

    2017-01-01

    Colorectal cancer (CRC) recurrence is often attributable to circulating tumor cells and/or cancer stem cells (CSCs) that resist conventional therapies and foster tumor progression. Isothiocyanates (ITCs) derived from Brassicaceae vegetables have demonstrated anticancer effects in CRC; however, little is known about their effect on CSCs and tumor initiation properties. Here we examined the effect of ITC-enriched Brassicaceae extracts derived from watercress and broccoli on cell proliferation, CSC phenotype and metastasis using a previously developed three-dimensional HT29 cell model with CSC-like traits. Both extracts were phytochemically characterized and their antiproliferative effect in HT29 monolayers was explored. Next, we performed cell proliferation assays and flow cytometry analysis in HT29 spheroids treated with watercress and broccoli extracts and their respective main ITCs, phenethyl isothiocyanate (PEITC) and sulforaphane (SFN). Soft agar assays and relative quantitative expression analysis of stemness markers and Wnt/β-catenin signaling players were performed to evaluate the effect of these phytochemicals on stemness and metastasis. Our results showed that both Brassicaceae extracts and ITCs exert antiproliferative effects in HT29 spheroids, arresting the cell cycle at G2/M, possibly due to ITC-induced DNA damage. Colony formation and expression of the LGR5 and CD133 cancer stemness markers were significantly reduced. Only watercress extract and PEITC decreased ALDH1 activity in a dose-dependent manner, as well as β-catenin expression. Our research provides new insights into CRC therapy using ITC-enriched Brassicaceae extracts, especially watercress extract, to target CSCs and circulating tumor cells by impairing cell proliferation, ALDH1-mediated chemo-resistance, anoikis evasion, self-renewal and metastatic potential. PMID:28394276

  5. Targeting Colorectal Cancer Proliferation, Stemness and Metastatic Potential Using Brassicaceae Extracts Enriched in Isothiocyanates: A 3D Cell Model-Based Study

    Directory of Open Access Journals (Sweden)

    Lucília P. Pereira

    2017-04-01

    Full Text Available Colorectal cancer (CRC) recurrence is often attributable to circulating tumor cells and/or cancer stem cells (CSCs) that resist conventional therapies and foster tumor progression. Isothiocyanates (ITCs) derived from Brassicaceae vegetables have demonstrated anticancer effects in CRC; however, little is known about their effect on CSCs and tumor initiation properties. Here we examined the effect of ITC-enriched Brassicaceae extracts derived from watercress and broccoli on cell proliferation, CSC phenotype and metastasis using a previously developed three-dimensional HT29 cell model with CSC-like traits. Both extracts were phytochemically characterized and their antiproliferative effect in HT29 monolayers was explored. Next, we performed cell proliferation assays and flow cytometry analysis in HT29 spheroids treated with watercress and broccoli extracts and their respective main ITCs, phenethyl isothiocyanate (PEITC) and sulforaphane (SFN). Soft agar assays and relative quantitative expression analysis of stemness markers and Wnt/β-catenin signaling players were performed to evaluate the effect of these phytochemicals on stemness and metastasis. Our results showed that both Brassicaceae extracts and ITCs exert antiproliferative effects in HT29 spheroids, arresting the cell cycle at G2/M, possibly due to ITC-induced DNA damage. Colony formation and expression of the LGR5 and CD133 cancer stemness markers were significantly reduced. Only watercress extract and PEITC decreased ALDH1 activity in a dose-dependent manner, as well as β-catenin expression. Our research provides new insights into CRC therapy using ITC-enriched Brassicaceae extracts, especially watercress extract, to target CSCs and circulating tumor cells by impairing cell proliferation, ALDH1-mediated chemo-resistance, anoikis evasion, self-renewal and metastatic potential.

  6. Improved workflow modelling using role activity diagram-based modelling with application to a radiology service case study.

    Science.gov (United States)

    Shukla, Nagesh; Keast, John E; Ceglarek, Darek

    2014-10-01

    The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Anticonvulsant activity of Aloe vera leaf extract in acute and chronic models of epilepsy in mice.

    Science.gov (United States)

    Rathor, Naveen; Arora, Tarun; Manocha, Sachin; Patil, Amol N; Mediratta, Pramod K; Sharma, Krishna K

    2014-03-01

    The effect of Aloe vera in epilepsy has not yet been explored. This study examined the effect of an aqueous extract of Aloe vera leaf powder in three acute models and one chronic model of epilepsy in mice. In the acute study, the extract was administered at doses of 100, 200 and 400 mg/kg p.o.; the 400 mg/kg dose was chosen for chronic administration. Oxidative stress parameters, viz. malondialdehyde (MDA) and reduced glutathione (GSH), were also estimated in the brains of kindled animals. In the acute study, the extract dose-dependently and significantly decreased the duration of tonic hind limb extension in the maximal electroshock seizure model, increased the seizure threshold current in the increasing-current electroshock seizure model, and increased the latency to onset and decreased the duration of clonic convulsions in the pentylenetetrazole (PTZ) model as compared with the control group. In the chronic study, the extract prevented the progression of kindling in PTZ-kindled mice. The extract at 400 mg/kg p.o. also reduced brain levels of MDA and increased GSH levels as compared to the PTZ-kindled untreated group. The results showed that the aqueous Aloe vera leaf extract possessed significant anticonvulsant and antioxidant activity. © 2013 Royal Pharmaceutical Society.

  8. Text extraction method for historical Tibetan document images based on block projections

    Science.gov (United States)

    Duan, Li-juan; Zhang, Xi-qun; Ma, Long-long; Wu, Jian

    2017-11-01

    Text extraction is an important initial step in digitizing historical documents. In this paper, we present a text extraction method for historical Tibetan document images based on block projections. The task of text extraction is treated as a text area detection and location problem. The images are divided equally into blocks, and the blocks are filtered using the categories of connected components and corner point density. By analyzing the filtered blocks' projections, the approximate text areas can be located and the text regions extracted. Experiments on a dataset of historical Tibetan documents demonstrate the effectiveness of the proposed method.
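    The projection-analysis step reduces to run detection on a projection profile. The sketch below locates text rows in a toy binary image; the block division, connected-component and corner-density filtering described in the abstract are omitted:

```python
# Row-wise projection profile: text lines appear as runs of rows with ink.
# The toy page and the min_ink threshold are illustrative.

def text_rows(binary, min_ink=1):
    """Return (start, end) row ranges whose projection has >= min_ink pixels."""
    proj = [sum(row) for row in binary]    # horizontal projection profile
    ranges, start = [], None
    for i, v in enumerate(proj + [0]):     # sentinel 0 closes a trailing run
        if v >= min_ink and start is None:
            start = i
        elif v < min_ink and start is not None:
            ranges.append((start, i - 1))
            start = None
    return ranges

page = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],   # text line 1
    [1, 1, 1, 1],
    [0, 0, 0, 0],   # inter-line gap
    [0, 1, 0, 0],   # text line 2
]
rows = text_rows(page)
```

    A vertical (column-wise) projection applied inside each detected row range then localizes the text areas in two dimensions.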

  9. Caustic-Side Solvent-Extraction Modeling for Hanford Interim Pretreatment System

    International Nuclear Information System (INIS)

    Moyer, B.A.; Birdwell, J.F.; Delmau, L. H.; McFarlane, J.

    2008-01-01

    The purpose of this work is to examine the applicability of the Caustic-Side Solvent Extraction (CSSX) process for the removal of cesium from Hanford tank-waste supernatant solutions in support of the Hanford Interim Pretreatment System (IPS). The Hanford waste types are more challenging than those at the Savannah River Site (SRS) in that they contain significantly higher levels of potassium, the chief competing ion in the extraction of cesium. It was confirmed by use of the CSSX model that the higher levels of potassium depress the cesium distribution ratio (DCs), as validated by measurement of DCs values for four of eight specified Hanford waste-simulant compositions. The model predictions were good to an apparent standard error of ±11%. It is concluded from batch distribution experiments, physical-property measurements, equilibrium modeling, flowsheet calculations, and contactor sizing that the CSSX process as currently employed for cesium removal from alkaline salt waste at the SRS is capable of treating similar Hanford tank feeds. For the most challenging waste composition, 41 stages would be required to provide a cesium decontamination factor (DF) of 5000 and a concentration factor (CF) of 5. Commercial contacting equipment with rotor diameters of 10 in. for extraction and 5 in. for stripping should have the capacity to meet throughput requirements, but testing will be required to confirm that the needed efficiency and hydraulic performance are actually obtainable. Markedly improved flowsheet performance was calculated for a new solvent formulation employing the more soluble cesium extractant BEHBCalixC6 used with alternative scrub and strip solutions, respectively 0.1 M NaOH and 10 mM boric acid. The improved system can meet minimum requirements (DF = 5000 and CF = 5) with 17 stages or more ambitious goals (DF = 40,000 and CF = 15) with 19 stages. Potential benefits of further research and development are identified that would lead to reduced costs, greater
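    The stage counts quoted above come from detailed flowsheet calculations, but the underlying counter-current staging arithmetic can be sketched with one standard form of the Kremser relation, assuming a constant per-stage extraction factor E. The value E = 1.7 and the function names below are illustrative, not taken from the report:

```python
# Kremser-style sizing sketch for a counter-current extraction cascade.
# E is the (assumed constant) per-stage extraction factor, E != 1.

def decontamination_factor(E, stages):
    """DF achieved by a counter-current cascade of the given stage count."""
    return (E ** (stages + 1) - 1.0) / (E - 1.0)

def stages_for_df(E, target_df):
    """Smallest stage count achieving the target decontamination factor."""
    n = 1
    while decontamination_factor(E, n) < target_df:
        n += 1
    return n

E = 1.7                         # assumed per-stage extraction factor
n = stages_for_df(E, 5000)      # stages needed for DF = 5000
```

    Because DF grows roughly as E^(N+1), modest gains in the distribution ratio (as with the improved extractant formulation) cut the required stage count sharply, which is the qualitative effect the report describes.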

  10. Feature Extraction from 3D Point Cloud Data Based on Discrete Curves

    Directory of Open Access Journals (Sweden)

    Yi An

    2013-01-01

    Full Text Available Reliable feature extraction from 3D point cloud data is an important problem in many application domains, such as reverse engineering, object recognition, industrial inspection, and autonomous navigation. In this paper, a novel method is proposed for extracting geometric features from 3D point cloud data based on discrete curves. We extract discrete curves from the 3D point cloud data and study the behaviors of chord lengths, angle variations, and principal curvatures at the geometric features in the discrete curves. Then, the corresponding similarity indicators are defined. Based on the similarity indicators, the geometric features can be extracted from the discrete curves, which are also the geometric features of the 3D point cloud data. The threshold values of the similarity indicators are taken from [0,1], which characterizes relative relationships and makes threshold setting easier and more reasonable. The experimental results demonstrate that the proposed method is efficient and reliable.
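    The chord-length and angle-variation behaviors described above can be computed directly on a discrete curve (here a 2-D polyline for simplicity; the paper works on curves extracted from 3-D clouds). The similarity indicators and thresholds of the paper are not reproduced:

```python
import math

# Chord lengths and turning angles along a polyline: a sharp geometric
# feature (corner) shows up as a large turning angle. Data is illustrative.

def chord_lengths(points):
    """Length of each segment between consecutive vertices."""
    return [math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)]

def turning_angles(points):
    """Turning angle (radians) at each non-endpoint vertex."""
    angles = []
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[i - 1], points[i], points[i + 1]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        d = abs(a2 - a1)
        angles.append(min(d, 2 * math.pi - d))   # wrap to [0, pi]
    return angles

# L-shaped polyline: the corner at (2, 0) should dominate the angle profile.
curve = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
angles = turning_angles(curve)
```

    Thresholding such normalized indicators on [0, 1] then selects the feature vertices, as the abstract describes.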

  11. Interactions Between Flavonoid-Rich Extracts and Sodium Caseinate Modulate Protein Functionality and Flavonoid Bioaccessibility in Model Food Systems.

    Science.gov (United States)

    Elegbede, Jennifer L; Li, Min; Jones, Owen G; Campanella, Osvaldo H; Ferruzzi, Mario G

    2018-05-01

    With growing interest in formulating new food products with added protein and flavonoid-rich ingredients for health benefits, direct interactions between these ingredient classes become critical insofar as they may impact protein functionality, product quality, and flavonoid bioavailability. In this study, sodium caseinate (SCN)-based model products (foams and emulsions) were formulated with grape seed extract (GSE, rich in galloylated flavonoids) and green tea extract (GTE, rich in nongalloylated flavonoids), respectively, to assess changes in the functional properties of SCN and impacts on flavonoid bioaccessibility. Experiments with pure flavonoids suggested that galloylated flavonoids reduced the air-water interfacial tension of 0.01% SCN dispersions more significantly than nongalloylated flavonoids at high concentrations (>50 μg/mL). This observation was supported by changes in the stability of 5% SCN foam, which showed that foam stability was increased at high levels of GSE (≥50 μg/mL, P < 0.05) but was not affected by GTE. However, the flavonoid extracts had modest effects on SCN emulsions. In addition, galloylated flavonoids had higher bioaccessibility in both SCN foam and emulsion. These results suggest that SCN-flavonoid binding interactions can modulate protein functionality, leading to differences in the performance and flavonoid bioaccessibility of protein-based products. As information on the beneficial health effects of flavonoids expands, it is likely that usage of these ingredients in consumer foods will increase. However, the levels necessary to provide such benefits may exceed those that begin to impact the functionality of macronutrients such as proteins. Flavonoid inclusion within protein matrices may modulate protein functionality in a food system and modify critical consumer traits or delivery of these beneficial plant-derived components. The product matrices utilized in this study offer relevant model systems to evaluate how fortification with flavonoid

  12. A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis

    Science.gov (United States)

    Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui

    2015-07-01

    Auscultation of heart sound (HS) signals has served as an important primary approach to diagnosing cardiovascular diseases (CVDs) for centuries. Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet most existing HS feature extraction methods adopt acoustic or time-frequency features which exhibit a poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling this bottleneck problem, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological occurrences at the heart valves. Adopting the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS from five types of abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
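    The Shannon-envelope step can be illustrated directly: per-sample Shannon energy, -x^2*log(x^2), emphasises medium-intensity components (typical of murmurs) over both low-level noise and high peaks. The frame below is illustrative, and the DWT sub-band decomposition the paper combines it with is omitted:

```python
import math

# Shannon energy of a normalized heart-sound frame: zero at x = 0 and |x| = 1,
# maximal for medium amplitudes. Input samples must lie in [-1, 1].

def shannon_energy(samples):
    """Per-sample Shannon energy -x^2 * log(x^2)."""
    return [0.0 if x == 0 else -(x * x) * math.log(x * x) for x in samples]

frame = [0.0, 0.1, 0.5, 1.0, -0.5]
energy = shannon_energy(frame)
```

    Smoothing this sequence yields the envelope from which morphological murmur features (onset, duration, shape) can be read off.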

  13. Affinity extraction of emerging contaminants from water based on bovine serum albumin as a binding agent.

    Science.gov (United States)

    Papastavros, Efthimia; Remmers, Rachael A; Snow, Daniel D; Cassada, David A; Hage, David S

    2018-03-01

    Affinity sorbents using bovine serum albumin as a binding agent were developed and tested for the extraction of environmental contaminants from water. Computer simulations based on a countercurrent distribution model were also used to study the behavior of these sorbents. Several model drugs, pesticides, and hormones of interest as emerging contaminants were considered in this work, with carbamazepine being used as a representative analyte when coupling the albumin column on-line with liquid chromatography and tandem mass spectrometry. The albumin column was found to be capable of extracting carbamazepine from aqueous solutions that contained trace levels of this analyte. Further studies of the bovine serum albumin sorbent indicated that it had higher retention under aqueous conditions than a traditional C18 support for most of the tested emerging contaminants. Potential advantages of using these protein-based sorbents included the low cost of bovine serum albumin and its ability to bind a relatively wide range of drugs and related compounds. It was also shown how simulations could be used to describe the elution behavior of the model compounds on the bovine serum albumin sorbents as an aid in optimizing the retention and selectivity of these supports for use with liquid chromatography or methods such as liquid chromatography with tandem mass spectrometry. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
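    A countercurrent distribution model of the kind mentioned above can be sketched with a Craig-type transfer scheme: at each transfer, the mobile-phase fraction p of the solute in every tube moves one tube forward. The value of p and the tube counts below are illustrative; the study's simulation parameters are not given here:

```python
# Craig countercurrent distribution sketch: all solute starts in tube 0;
# each transfer moves the mobile-phase fraction p one tube forward.
# Solute pushed past the last tube would be collected (zero in this run).

def craig_distribution(p, transfers, tubes):
    """Solute fraction per tube after the given number of transfers."""
    amounts = [1.0] + [0.0] * (tubes - 1)
    for _ in range(transfers):
        moved = [a * p for a in amounts]                 # mobile-phase portion
        amounts = [a - m for a, m in zip(amounts, moved)]
        for i in range(tubes - 1, 0, -1):                # shift forward
            amounts[i] += moved[i - 1]
    return amounts

profile = craig_distribution(p=0.5, transfers=4, tubes=5)
```

    For p = 0.5 the profile after n transfers is the binomial distribution, so the four-transfer run peaks in the middle tube; differing partition ratios translate into separated peaks, which is how the simulations describe elution behavior.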

  14. MICHIGAN SOIL VAPOR EXTRACTION REMEDIATION (MISER) MODEL: A COMPUTER PROGRAM TO MODEL SOIL VAPOR EXTRACTION AND BIOVENTING OF ORGANIC CHEMICALS IN UNSATURATED GEOLOGICAL MATERIAL

    Science.gov (United States)

    Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...

  15. Numerical Investigation on the Heat Extraction Capacity of Dual Horizontal Wells in Enhanced Geothermal Systems Based on the 3-D THM Model

    Directory of Open Access Journals (Sweden)

    Zhixue Sun

    2018-01-01

    Full Text Available The Enhanced Geothermal System (EGS) constructs an artificial thermal reservoir by hydraulic fracturing to extract heat economically from hot dry rock. As the core element of the EGS heat recovery process, mass and heat transfer of the working fluid occurs mainly in fractures. Since the directions of natural and induced fractures are generally perpendicular to the minimum principal stress in the formation, horizontal well production, as an effective stimulation approach, can increase the contact area with the thermal reservoir significantly. In this paper, the thermal reservoir is developed by a dual horizontal well system and treated as a fractured porous medium composed of matrix rock and a discrete fracture network. Using local thermal non-equilibrium theory, a coupled THM mathematical model and an idealized 3D numerical model are established for the EGS heat extraction process. EGS heat extraction capacity is evaluated in terms of thermal recovery lifespan, average outlet temperature, heat production, electricity generation, energy efficiency and thermal recovery rate. The results show that, with certain reservoir and production parameters, the heat production, electricity generation and thermal recovery lifespan of the dual horizontal well system can achieve commercial goals, but the energy efficiency and overall thermal recovery rate remain at low levels. Finally, this paper puts forward a series of optimizations to improve heat extraction capacity, covering production conditions and thermal reservoir construction design.

  16. An investigation of paper based microfluidic devices for size based separation and extraction applications.

    Science.gov (United States)

    Zhong, Z W; Wu, R G; Wang, Z P; Tan, H L

    2015-09-01

    Conventional microfluidic devices are typically complex and expensive, requiring pneumatic control systems or highly precise pumps to control the flow. This work investigates paper based microfluidic devices as an alternative. Size based separation and extraction experiments were able to separate free dye from a mixed protein and dye solution. Experimental results showed that pure fluorescein isothiocyanate could be separated from a solution of fluorescein isothiocyanate mixed with fluorescein isothiocyanate labeled bovine serum albumin. Spectrophotometer readings clearly show that the extracted tartrazine sample contained no detectable Blue-BSA: its absorbance at 590 nm, the wavelength corresponding to Blue-BSA, was 0.000. These results demonstrate that paper based microfluidic devices, which are inexpensive and easy to implement, can potentially replace their conventional counterparts through simple geometric designs and capillary action. These findings will help in future developments of paper based microfluidic devices. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    Science.gov (United States)

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract exposure-relevant information from the structured elements in the DICOM metadata of these files. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
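
The extraction step can be pictured as a walk over the structured report tree. A minimal sketch, with a nested dict standing in for the DICOM metadata (a real implementation would read the file with a DICOM library such as pydicom; all names below are illustrative, not the study's code):

```python
# Hedged sketch: a nested dict stands in for the structured elements of a
# DICOM dose report; dose-relevant items are gathered recursively.

def collect_dose_items(node, wanted=("CTDIvol", "DLP"), found=None):
    """Recursively gather dose-relevant (name, value) pairs from a metadata tree."""
    if found is None:
        found = []
    if isinstance(node, dict):
        if node.get("ConceptName") in wanted:
            found.append((node["ConceptName"], node["Value"]))
        for child in node.get("Content", []):
            collect_dose_items(child, wanted, found)
    return found

report = {"ConceptName": "CT Dose Report", "Content": [
    {"ConceptName": "Acquisition", "Content": [
        {"ConceptName": "CTDIvol", "Value": 12.4},
        {"ConceptName": "DLP", "Value": 430.0},
    ]},
]}
print(collect_dose_items(report))  # [('CTDIvol', 12.4), ('DLP', 430.0)]
```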

  18. A green deep eutectic solvent-based aqueous two-phase system for protein extracting.

    Science.gov (United States)

    Xu, Kaijia; Wang, Yuzhi; Huang, Yanhua; Li, Na; Wen, Qian

    2015-03-15

    As a new type of green solvent, deep eutectic solvents (DESs) have been applied in this work to the extraction of proteins with an aqueous two-phase system (ATPS). Four kinds of choline chloride (ChCl)-based DESs were synthesized to extract bovine serum albumin (BSA), and ChCl-glycerol was selected as the most suitable extraction solvent. Single-factor experiments were performed to investigate the factors affecting the extraction process, including the amount of DES, the concentration of salt, the mass of protein, the shaking time, the temperature and the pH value. Experimental results show that 98.16% of the BSA could be extracted into the DES-rich phase in a single-step extraction under the optimized conditions. A high extraction efficiency of 94.36% was achieved when the same conditions were applied to the extraction of trypsin (Try). Precision, repeatability and stability experiments were carried out, and the relative standard deviations (RSD) of the extraction efficiency were 0.4246% (n=3), 1.6057% (n=3) and 1.6132% (n=3), respectively. The conformation of BSA was not changed during the extraction process, according to UV-vis, FT-IR and CD spectra of BSA. Conductivity, dynamic light scattering (DLS) and transmission electron microscopy (TEM) were used to explore the extraction mechanism. It turned out that the formation of DES-protein aggregates plays a significant role in the separation process. All the results suggest that ChCl-based DES-ATPSs have the potential to open new possibilities in the separation of proteins. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Effect of cellulose-based fibers extracted from pineapple (Ananas ...

    African Journals Online (AJOL)

    New polyurethane foams were fabricated utilizing cellulose-based fibers extracted from pineapple (Ananas comosus) leaf as raw material. The pineapple leaf fibers (PALF) were treated with alkali and subsequently bleached to enhance its fiber-matrix adhesion. Polyurethane composites have been prepared by ...

  20. On-chip concentration of bacteria using a 3D dielectrophoretic chip and subsequent laser-based DNA extraction in the same chip

    International Nuclear Information System (INIS)

    Cho, Yoon-Kyoung; Kim, Tae-hyeong; Lee, Jeong-Gun

    2010-01-01

    We report the on-chip concentration of bacteria using a dielectrophoretic (DEP) chip with 3D electrodes and subsequent laser-based DNA extraction in the same chip. The DEP chip has a set of interdigitated Au post electrodes, 50 µm in height, that generate a network of non-uniform electric fields for efficient DEP trapping. The metal post array was fabricated by photolithography and subsequent Ni and Au electroplating. Three model bacteria samples (Escherichia coli, Staphylococcus epidermidis, Streptococcus mutans) were tested, and over 80-fold concentration was achieved within 2 min. Subsequently, on-chip DNA extraction from the concentrated bacteria in the 3D DEP chip was performed by laser irradiation using the laser-irradiated magnetic bead system (LIMBS) in the same chip. The extracted DNA was analyzed with silicon chip-based real-time polymerase chain reaction (PCR). The total process of on-chip bacteria concentration and subsequent DNA extraction can be completed within 10 min, including the manual operation time.

  1. Organization of extracting molecules of the diamide type: link with the extracting properties?

    International Nuclear Information System (INIS)

    Meridiano, Y.

    2009-02-01

    The aim of these studies is to establish a link between the different organizations of diamide extractants (used in the DIAMEX process) and their extracting properties. The effects of the key parameters governing liquid-liquid extraction (concentration of extractant, nature of solute, activity of the aqueous phase, nature of the diluent and temperature) are studied: 1) at the supramolecular scale, by characterizing extractant organization with vapor-pressure osmometry (VPO) and small angle neutron and X-ray scattering (SANS/SAXS) experiments; 2) at the molecular scale, by quantifying the extracted solutes (water, nitric acid, metal nitrate) and determining the stoichiometries of extracted complexes with electrospray mass spectrometry (ESI-MS) experiments. The DMDOHEMA molecule acts as a classical surfactant and forms aggregates of the reverse-micelle type. Taking into account the established supramolecular diagrams, a quantitative link between extractant structures and extracting properties was established. To model europium nitrate extraction, two approaches were developed: an approach based on mass action laws, in which extraction equilibria are proposed that take the supramolecular speciation into account; and an innovative approach that treats the extracted ions as adsorbed on a specific surface of the extractant molecule, which depends on the extractant organization state. Ion extraction can then be considered as a sum of isotherms corresponding to the different states of organization. This approach allows the extraction efficiency of an extracting molecule to be compared as a function of its organization state. (author)

  2. Structured prediction models for RNN based sequence labeling in clinical text.

    Science.gov (United States)

    Jagannatha, Abhyuday N; Yu, Hong

    2016-11-01

    Sequence labeling is a widely used method for named entity recognition and information extraction from unstructured natural language data. In the clinical domain, one major application of sequence labeling is the extraction of medical entities such as medications, indications, and side-effects from Electronic Health Record narratives. Sequence labeling in this domain presents its own set of challenges and objectives. In this work we experimented with various CRF-based structured learning models combined with Recurrent Neural Networks. We extend the previously studied LSTM-CRF models with explicit modeling of pairwise potentials. We also propose an approximate version of skip-chain CRF inference with RNN potentials. We use these methodologies for structured prediction in order to improve the exact phrase detection of various medical entities.
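
The pairwise potentials mentioned above enter inference through the standard Viterbi recursion used in LSTM-CRF decoding. A minimal sketch, with made-up emission and transition scores rather than the paper's trained models:

```python
# Illustrative sketch: exact decoding over unary (emission) scores plus
# pairwise (transition) potentials, i.e. the Viterbi recursion. All scores
# below are invented log-potentials.

def viterbi(unary, pairwise, labels):
    """Return the highest-scoring label sequence for one sentence."""
    n = len(unary)
    best = [{y: unary[0][y] for y in labels}]
    back = [{}]
    for t in range(1, n):
        best.append({})
        back.append({})
        for y in labels:
            prev = max(labels, key=lambda yp: best[t-1][yp] + pairwise[(yp, y)])
            best[t][y] = best[t-1][prev] + pairwise[(prev, y)] + unary[t][y]
            back[t][y] = prev
    y = max(labels, key=lambda yy: best[-1][yy])
    path = [y]
    for t in range(n - 1, 0, -1):
        y = back[t][y]
        path.append(y)
    return path[::-1]

labels = ["O", "B-Med", "I-Med"]
unary = [{"O": 0.1, "B-Med": 2.0, "I-Med": 0.0},
         {"O": 0.5, "B-Med": 0.2, "I-Med": 1.5},
         {"O": 1.8, "B-Med": 0.1, "I-Med": 0.2}]
pairwise = {(a, b): 0.0 for a in labels for b in labels}
pairwise[("O", "I-Med")] = -5.0      # penalize I-Med without a preceding mention
print(viterbi(unary, pairwise, labels))  # ['B-Med', 'I-Med', 'O']
```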

  3. Evaluation of anti-epileptic activity of leaf extracts of Punica granatum on experimental models of epilepsy in mice.

    Science.gov (United States)

    Viswanatha, Gollapalle L; Venkataranganna, Marikunte V; Prasad, Nunna Bheema Lingeswara; Ashok, Godavarthi

    2016-01-01

    This study aimed to examine the anti-epileptic activity of leaf extracts of Punica granatum in experimental models of epilepsy in Swiss albino mice. Petroleum ether (PLPG), methanolic (MLPG), and aqueous (ALPG) extracts of P. granatum leaves were initially evaluated against the 6-Hz-induced seizure model; the most potent extract was further evaluated against maximal electroshock (MES)- and pentylenetetrazole (PTZ)-induced convulsions. Further, the potent extract was evaluated for its influence on gamma-aminobutyric acid (GABA) levels in the brain, to explore the possible mechanism of action, and was subjected to the actophotometer test to assess possible locomotor deficits. In the 6-Hz seizure test, MLPG alleviated seizures significantly and dose-dependently at doses of 50, 100, 200, and 400 mg/kg. In contrast, PLPG and ALPG did not show any protection; only high doses of ALPG (400 and 800 mg/kg, p.o.) showed very slight inhibition. Based on these observations, only MLPG was tested in the MES and PTZ models. MLPG (50, 100, 200 and 400 mg/kg) offered significant and dose-dependent protection against MES-induced ( P < 0.01) and PTZ-induced ( P < 0.01) seizures in mice. Further, MLPG produced a significant increase in brain GABA levels ( P < 0.01) compared to control, with no significant change in locomotor activity at any tested dose (100, 200 and 400 mg/kg). Notably, the highest dose of MLPG (400 mg/kg, p.o.) and diazepam (5 mg/kg, p.o.) completely abolished convulsions in all the anticonvulsant tests. These findings suggest that MLPG possesses significant anticonvulsant activity, possibly mediated through enhanced GABA levels in the brain.

  4. Model-assisted template extraction SRAF application to contact holes patterns in high-end flash memory device fabrication

    Science.gov (United States)

    Seoud, Ahmed; Kim, Juhwan; Ma, Yuansheng; Jayaram, Srividya; Hong, Le; Chae, Gyu-Yeol; Lee, Jeong-Woo; Park, Dae-Jin; Yune, Hyoung-Soon; Oh, Se-Young; Park, Chan-Ha

    2018-03-01

    Sub-resolution assist feature (SRAF) insertion techniques have long been used effectively to increase process latitude in the lithography patterning process. Rule-based SRAF and model-based SRAF are complementary solutions, each with its own benefits, depending on the objectives of the application and the criticality of the impact on manufacturing yield, efficiency, and productivity. Rule-based SRAF provides superior geometric output consistency and faster runtime performance, but the associated recipe development time can be a concern. Model-based SRAF provides better coverage of more complicated pattern structures, in terms of shapes and sizes, with considerably less recipe development time, although consistency and performance may be impacted. In this paper, we introduce a new model-assisted template extraction (MATE) SRAF solution, which employs decision tree learning in a model-based solution to provide the benefits of both rule-based and model-based SRAF insertion approaches. The MATE solution automates the creation of rules/templates for SRAF insertion, based on the SRAF placement predicted by model-based solutions. The MATE SRAF recipe provides optimal lithographic quality across various manufacturing aspects in a very short time compared to traditional methods of rule optimization. Experiments using memory device pattern layouts compared the MATE solution to existing model-based SRAF and pixelated SRAF approaches in terms of lithographic process window quality, runtime performance, and geometric output consistency.
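
The decision-tree step can be illustrated with the simplest possible case, a depth-1 tree (decision stump) learned from model-based placements. The feature and samples below are invented for illustration, not the MATE tool's actual rule format:

```python
# Hedged sketch of the MATE idea: learn a simple insertion rule (a decision
# stump) from the placements a model-based SRAF tool produced.

def learn_stump(samples):
    """samples: list of (space_to_main_feature_nm, sraf_inserted_bool).
    Returns the threshold minimizing misclassifications for the rule
    'insert SRAF when space >= threshold'."""
    candidates = sorted({s for s, _ in samples})
    best_t, best_err = None, len(samples) + 1
    for t in candidates:
        err = sum((s >= t) != y for s, y in samples)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

# Model-based SRAF output on a few spacings (nm): no SRAF in tight spaces.
samples = [(80, False), (100, False), (140, True), (200, True), (260, True)]
t = learn_stump(samples)
print(t)  # 140 -- rule: insert an assist feature when spacing >= 140 nm
```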

  5. Preliminary study: kinetics of oil extraction from sandalwood by microwave-assisted hydrodistillation

    Science.gov (United States)

    Kusuma, H. S.; Mahfud, M.

    2016-04-01

    Sandalwood and its oil constitute one of the oldest known perfume materials, with a long history (more than 4000 years) of use, as mentioned in Sanskrit manuscripts. Sandalwood oil plays an important role as an export commodity in many countries and is widely used in the food, perfumery and pharmaceutical industries. The aim of this study is to verify the kinetics and mechanism of microwave-assisted hydrodistillation of sandalwood based on a second-order model. Microwave-assisted hydrodistillation was used to extract essential oil from sandalwood, with the extraction carried out in ten cycles of 15 min, for a total of 2.5 hours. The initial extraction rate, the extraction capacity and the second-order extraction rate constant were calculated using the model, and the kinetics confirmed that the extraction process follows the second-order model. The initial extraction rate, h, was 0.0232 g L-1 min-1; the extraction capacity, C S, was 0.6015 g L-1; the second-order extraction rate constant, k, was 0.0642 L g-1 min-1; and the coefficient of determination, R 2, was 0.9597.
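
The reported constants can be checked against the second-order model directly, since in that model the concentration follows C(t) = Cs²·k·t / (1 + Cs·k·t) and the initial rate satisfies h = k·Cs²:

```python
# Consistency check of the reported second-order kinetics constants.

k, Cs = 0.0642, 0.6015          # L g^-1 min^-1 and g L^-1, from the abstract

def conc(t_min):
    """Oil concentration (g/L) after t minutes, second-order model."""
    return Cs**2 * k * t_min / (1 + Cs * k * t_min)

h = k * Cs**2                   # initial extraction rate, g L^-1 min^-1
print(round(h, 4))              # 0.0232, matching the reported value
print(conc(150))                # approaches Cs for long extraction times
```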

  6. A Method for Extracting Suspected Parotid Lesions in CT Images using Feature-based Segmentation and Active Contours based on Stationary Wavelet Transform

    Science.gov (United States)

    Wu, T. Y.; Lin, S. F.

    2013-10-01

    Automatic suspected lesion extraction is an important application in computer-aided diagnosis (CAD). In this paper, we propose a method to automatically extract suspected parotid regions for clinical evaluation in head and neck CT images. Suspected lesion tissues in low-contrast regions are localized with feature-based segmentation (FBS) based on local texture features, and are then delineated accurately by modified active contour models (ACM). First, the stationary wavelet transform (SWT) is introduced; the derived wavelet coefficients are used both to compute the local features for FBS and to generate enhanced energy maps for the ACM computation. Geometric shape features (GSFs) are proposed to analyze each soft tissue region segmented by FBS; the regions whose GSFs are most similar to those of lesions are extracted, and this information also supplies the initial conditions for the fine delineation. Consequently, suspected lesions can be automatically localized and accurately delineated to aid clinical diagnosis. The performance of the proposed method is evaluated by comparison with results outlined by clinical experts. Experiments on 20 pathological CT data sets show that the true-positive (TP) rate for recognizing parotid lesions is about 94%, and the dimensional accuracy of the delineation results exceeds 93%.

  7. An automated approach for extracting Barrier Island morphology from digital elevation models

    Science.gov (United States)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depend on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The approach extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. This multi-scale RR approach is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and features are identified from the calculated RR values. The RR approach outperformed contemporary approaches and represents a fast, objective means of defining important beach and dune features for predicting barrier island response to storms. The RR method also does not require the dune toe, crest, or heel to be spatially continuous, which is important because dune morphology naturally varies alongshore.
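
The relative-relief computation can be sketched for a 1-D cross-shore transect. This is an assumed implementation of the averaging across window sizes, not the authors' code:

```python
# Assumed sketch of multi-scale relative relief on a 1-D elevation transect:
# RR(x, w) = (z(x) - min_w) / (max_w - min_w), averaged over window sizes w.

def relative_relief(z, windows=(3, 5, 7)):
    """Mean relative relief per point across several window half-widths."""
    rr = []
    for i in range(len(z)):
        vals = []
        for w in windows:
            lo, hi = max(0, i - w), min(len(z), i + w + 1)
            zmin, zmax = min(z[lo:hi]), max(z[lo:hi])
            vals.append((z[i] - zmin) / (zmax - zmin) if zmax > zmin else 0.0)
        rr.append(sum(vals) / len(vals))
    return rr

# Toy transect: beach, dune crest near index 5, back-barrier.
z = [0.5, 0.8, 1.5, 2.5, 3.5, 4.0, 3.2, 2.0, 1.2, 1.0]
rr = relative_relief(z)
print(max(range(len(z)), key=rr.__getitem__))  # crest = highest relative relief
```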

  8. Investigation of antineoplastic activity of chewing tablets based on dry oat extract and quercetin

    Directory of Open Access Journals (Sweden)

    Ярослав Ростиславович Андрійчук

    2015-07-01

    Full Text Available One of the main goals of domestic pharmaceutical science is the development of new medicines. To this end, a new tablet drug based on dry oat extract and quercetin was created, and its antineoplastic activity was investigated. The antineoplastic activity of the investigated drug based on dry oat extract and quercetin was experimentally confirmed.

  9. Microcontroller based two axis microtron beam extraction system

    International Nuclear Information System (INIS)

    Ashoka, H.; Jathar, M.; Meshram, V.; Rao, Nageswara

    2009-01-01

    A microtron is an electron accelerator used to accelerate an electron beam. It consists of an electromagnet with two poles joined by a yoke that completes the path for the magnetic flux lines. A compact microtron capable of accelerating electrons up to 12 MeV has been developed at RRCAT. The beam has to be extracted from various orbits depending upon the user requirement (the X-Y stage is built with an accuracy of 100 μm). This paper describes the design and development of a microcontroller based two-axis beam extraction system for the microtron, with a resolution of 50 μm for positioning the extraction tube with respect to the selected orbit. The two-axis motion controller is built around a current-controlled micro-stepping driver that uses a bipolar chopper drive for the stepper motors, with 2 A continuous drive capability per phase. The system provides user-selectable controls such as speed, steps, direction, and mode, and offers an RS-232 interface for accepting commands from a PC, as well as a local keyboard and LCD interface for stand-alone (local mode) operation. (author)

  10. Statistical MOSFET Parameter Extraction with Parameter Selection for Minimal Point Measurement

    Directory of Open Access Journals (Sweden)

    Marga Alisjahbana

    2013-11-01

    Full Text Available We present a method to statistically extract MOSFET model parameters from a minimal number of transistor I(V) characteristic curve measurements taken during fabrication process monitoring. It includes a sensitivity analysis of the model, test/measurement point selection, and a parameter extraction experiment on the process data. The actual extraction is based on a linear error model, the sensitivity of the MOSFET model with respect to the parameters, and Newton-Raphson iterations. Simulated results showed good accuracy of parameter extraction and I(V) curve fit for parameter deviations of up to 20% from nominal values, including for a process shift of 10% from nominal.
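
The extraction loop described above, a linear error model driven by Newton-Raphson (Gauss-Newton) iterations, can be sketched for the simple square-law MOSFET model. The device values are synthetic, not from the paper:

```python
# Hedged sketch: Gauss-Newton iterations fitting I_D = K * (V_GS - V_T)^2
# to a few I(V) points, linearizing the residuals at each step.

def fit_square_law(vg, idrain, K=1.5e-4, Vt=0.4, iters=30):
    for _ in range(iters):
        # residuals and Jacobian of r_i = I_i - K*(V_i - Vt)^2
        r  = [i - K * (v - Vt) ** 2 for v, i in zip(vg, idrain)]
        jK = [-(v - Vt) ** 2 for v in vg]
        jV = [2 * K * (v - Vt) for v in vg]
        # normal equations (J^T J) d = -J^T r, solved in closed form (2x2)
        a = sum(x * x for x in jK)
        b = sum(x * y for x, y in zip(jK, jV))
        c = sum(y * y for y in jV)
        g1 = -sum(x * e for x, e in zip(jK, r))
        g2 = -sum(y * e for y, e in zip(jV, r))
        det = a * c - b * b
        K += (c * g1 - b * g2) / det
        Vt += (a * g2 - b * g1) / det
    return K, Vt

# Synthetic data generated with K = 2e-4 A/V^2, Vt = 0.5 V.
vg = [0.8, 1.0, 1.2, 1.5, 1.8]
idrain = [2e-4 * (v - 0.5) ** 2 for v in vg]
K, Vt = fit_square_law(vg, idrain)
print(K, Vt)  # recovers ~2e-4 and ~0.5
```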

  11. Metal ion extractant in microemulsions: where solvent extraction and surfactant science meet

    International Nuclear Information System (INIS)

    Bauer, C.

    2011-01-01

    This work describes the supramolecular structure of mixtures of a hydrophilic surfactant, n-octyl-beta-glucoside (C8G1), and the hydrophobic metal ion extractant tributylphosphate (TBP) in n-dodecane/water, as well as in the presence of salts. In the first part, the basic solvent extraction system, composed of water, oil and extractant, is introduced. The focus lies on the extraction of multivalent metal ions from the aqueous phase. During this extraction process, and in the subsequent thermodynamic equilibrium, aggregation and phase transitions in supramolecular assemblies occur, as already described in the literature. Notably, these reports rest on individual studies and specific conclusions, while a general concept is still missing. We therefore suggest the use of generalized phase diagrams to present the physico-chemical behaviour of (amphiphilic) extractant systems. These phase diagrams facilitated the development of a thermodynamic model based on molecular geometry and packing of the extractant molecules in the oil phase. As a result, we are now in a position to predict the size and water content of extractant aggregates and thus verify the experimental results by calculation. The second part presents a systematic study of the aqueous and organic phases of water/C8G1 and water/oil/TBP mixtures, focusing on the interaction between metal ions and both amphiphilic molecules by means of small angle X-ray scattering (SAXS), dynamic light scattering (DLS) and UV-Vis spectroscopy. We confirmed the assumption that extraction of metal ions is driven by TBP, while C8G1 remains passive. In the third and last part, microemulsions of C8G1, TBP, water (and salt) and n-dodecane are characterized by small angle neutron scattering (SANS) and chemical analytics (Karl Fischer, total organic carbon, ICP-OES,...). The co-surfactant behaviour of TBP was highlighted by comparison to the classical n-alcohol (4≤n≤8) co

  12. Phenolics extraction from sweet potato peels: modelling and optimization by response surface modelling and artificial neural network.

    Science.gov (United States)

    Anastácio, Ana; Silva, Rúben; Carvalho, Isabel S

    2016-12-01

    Sweet potato peels (SPP) are a major waste generated during root processing and currently have little commercial value. Phenolics with free radical scavenging activity from SPP may represent a possible added-value product for the food industry. The aqueous extraction of phenolics from SPP was studied using a central composite design with solvent to solid ratio (30-60 mL g -1 ), time (30-90 min) and temperature (25-75 °C) as independent variables. Response surface methodology (RSM) and artificial neural network (ANN) analyses were compared for extraction modelling and optimisation. Temperature and solvent to solid ratio, both alone and in interaction, had a positive effect on the TPC, ABTS and DPPH assays. Time was significant only for the ABTS assay, with a negative influence both as a main effect and in interaction with the other independent variables. The RSM and ANN models predicted the same optimal extraction conditions: 60 mL g -1 solvent to solid ratio, 30 min and 75 °C. The responses obtained under the optimized conditions were as follows: 11.87 ± 0.69 mg GAE g -1 DM for TPC, 12.91 ± 0.42 mg TE g -1 DM for the ABTS assay and 46.35 ± 3.08 mg TE g -1 DM for the DPPH assay. SPP presented optimum extraction conditions and phenolic content similar to those of potato, tea fruit and bambangan peels. The predictive models and optimized extraction conditions offer an opportunity for food processors to generate products with high potential health benefits.
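
The RSM fitting step can be sketched for a single factor: fit a quadratic response by least squares and locate the stationary point. The response values below are synthetic, not the study's measurements:

```python
# Hedged illustration of the RSM step: quadratic fit via normal equations,
# then the stationary point -b1/(2*b2). Data below are invented.

def solve3(A, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_quadratic(T, y):
    """Least-squares fit of b0 + b1*T + b2*T^2 via the normal equations."""
    X = [[1.0, t, t * t] for t in T]
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    return solve3(A, b)

T = [25.0, 40.0, 50.0, 60.0, 75.0]                # temperatures, deg C
y = [5.0 + 0.2 * t - 0.002 * t * t for t in T]    # synthetic response
b0, b1, b2 = fit_quadratic(T, y)
print(round(-b1 / (2 * b2), 1))  # stationary point of the fitted surface: 50.0
```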

  13. Supercritical carbon dioxide hop extraction

    Directory of Open Access Journals (Sweden)

    Pfaf-Šovljanski Ivana I.

    2005-01-01

    Full Text Available Hops of the Magnum cultivar were extracted using supercritical carbon dioxide (SFE-CO2) as extractant. Extraction was carried out in two steps: the first at 150 bar and 40°C for 2.5 h (Extract A), and the second on the same hop sample at 300 bar and 40°C for 2.5 h (Extract B). The extraction kinetics of the hop-SFE-CO2 system was investigated. Two of the four most common compounds of hop aroma (α-humulene and β-caryophyllene) were detected in Extract A, as were isomerised α-acids and β-acids. The α-acid content in Extract B was high, indicating a bitter hop variety. Mathematical modeling of the Magnum cultivar extraction results was performed using an empirical model, a characteristic time model and a simple single-sphere model. The characteristic time model equations best fitted the experimental results; the empirical model equation fitted the results well, while the simple single-sphere model equation approximated them poorly.

  14. Driving profile modeling and recognition based on soft computing approach.

    Science.gov (United States)

    Wahab, Abdul; Quek, Chai; Tan, Chin Keong; Takeda, Kazuya

    2009-04-01

    Advancements in biometrics-based authentication have led to its increasing prominence, and it is being incorporated into everyday tasks. Existing vehicle security systems rely only on alarms or smart cards as forms of protection. A biometric driver recognition system utilizing driving behaviors is a highly novel and personalized approach that could be incorporated into existing vehicle security systems to form a multimodal identification system offering a greater degree of multilevel protection. In this paper, detailed studies have been conducted to model individual driving behavior and to identify features that may be efficiently and effectively used to profile each driver. Feature extraction techniques based on Gaussian mixture models (GMMs) are proposed and implemented. Features extracted from the accelerator and brake pedal pressure were then used as inputs to a fuzzy neural network (FNN) system to ascertain the identity of the driver. Two fuzzy neural networks, namely the evolving fuzzy neural network (EFuNN) and the adaptive network-based fuzzy inference system (ANFIS), are used to demonstrate the viability of the two proposed feature extraction techniques. Their performance was compared against an artificial neural network (NN) implementation using the multilayer perceptron (MLP) network and against a statistical method based on the GMM. Extensive testing shows great potential for the use of the FNN in real-time driver identification and verification. In addition, the profiling of driver behaviors has numerous other potential applications for law enforcement and companies operating bus and truck fleets.
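
The GMM feature-modelling step can be sketched with a two-component one-dimensional mixture fitted by expectation-maximization. The pedal-pressure-like samples are synthetic, and a production system would use a multivariate mixture library rather than this toy:

```python
# Toy sketch of the GMM step: two-component 1-D Gaussian mixture via EM.

import math

def em_gmm2(x, iters=25):
    m1, m2 = min(x), max(x)               # crude initialisation
    s1 = s2 = 1.0                         # variances
    w = 0.5                               # weight of component 1
    for _ in range(iters):
        def pdf(v, m, s):
            return math.exp(-(v - m) ** 2 / (2 * s)) / math.sqrt(2 * math.pi * s)
        # E-step: responsibility of component 1 for each sample
        r = [w * pdf(v, m1, s1) /
             (w * pdf(v, m1, s1) + (1 - w) * pdf(v, m2, s2)) for v in x]
        # M-step: update weight, means and variances
        n1 = sum(r)
        w = n1 / len(x)
        m1 = sum(ri * v for ri, v in zip(r, x)) / n1
        m2 = sum((1 - ri) * v for ri, v in zip(r, x)) / (len(x) - n1)
        s1 = sum(ri * (v - m1) ** 2 for ri, v in zip(r, x)) / n1 + 1e-6
        s2 = sum((1 - ri) * (v - m2) ** 2 for ri, v in zip(r, x)) / (len(x) - n1) + 1e-6
    return m1, m2

# Two driving "styles": gentle vs hard braking pressure (arbitrary units).
x = [0.9, 1.0, 1.1, 1.05, 4.8, 5.0, 5.2, 5.1]
m1, m2 = em_gmm2(x)
print(round(m1, 1), round(m2, 1))  # ~1.0 and ~5.0
```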

  15. Answer Extraction Based on Merging Score Strategy of Hot Terms

    Institute of Scientific and Technical Information of China (English)

    LE Juan; ZHANG Chunxia; NIU Zhendong

    2016-01-01

    Answer extraction (AE) is one of the key technologies in developing open-domain Question & answer (Q&A) systems. Its task is to yield the highest score for the expected answer based on an effective answer score strategy. We introduce an answer extraction method using a Merging score strategy (MSS) based on hot terms. The hot terms are defined according to their lexical and syntactic features to highlight the role of the question terms. To cope with the syntactic diversity of the corpus, we propose four improved candidate answer score algorithms, each based on the lexical function of hot terms and their syntactic relationships with the candidate answers. Two independent corpus score algorithms are proposed to tap the role of the corpus in ranking the candidate answers. Six algorithms are adopted in MSS to exploit the complementary action among the corpus, the candidate answers and the questions. Experiments demonstrate the effectiveness of the proposed strategy.

  16. Quantum Jarzynski equality of measurement-based work extraction.

    Science.gov (United States)

    Morikuni, Yohei; Tajima, Hiroyasu; Hatano, Naomichi

    2017-03-01

    Many studies of quantum-size heat engines assume that the dynamics of the internal system is unitary and that the extracted work is equal to the energy loss of the internal system. Both assumptions, however, deserve scrutiny. In the present paper, we analyze quantum-scale heat engines, employing the measurement-based formulation of work extraction recently introduced by Hayashi and Tajima [M. Hayashi and H. Tajima, arXiv:1504.06150]. We first demonstrate the inappropriateness of the unitary time evolution of the internal system (the first assumption above) using a simple two-level system; we show that the variance of the energy transferred to an external system diverges when the dynamics of the internal system is approximated by a unitary time evolution. Second, we derive the quantum Jarzynski equality based on the formulation of Hayashi and Tajima as a relation for the work measured by an external macroscopic apparatus. The right-hand side of the equality reduces to unity for "natural" cyclic processes but fluctuates wildly for noncyclic ones, often exceeding unity. This fluctuation should be detectable in experiments and would provide evidence for the present formulation.
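
For reference, the classical Jarzynski equality that the paper recasts for externally measured work (β is the inverse temperature, W the work, ΔF the free-energy difference; sign conventions vary with whether W is work done on or extracted from the system):

```latex
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}
\quad\Longleftrightarrow\quad
\left\langle e^{-\beta (W - \Delta F)} \right\rangle = 1 .
```

For a cyclic process, ΔF = 0 and the right-hand side is exactly unity, which is the case the abstract calls "natural"; the paper's result is that this no longer holds for noncyclic processes under measurement-based work extraction.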

  17. The Extraction of Post-Earthquake Building Damage Information Based on Convolutional Neural Network

    Science.gov (United States)

    Chen, M.; Wang, X.; Dou, A.; Wu, X.

    2018-04-01

    The seismic damage information of buildings extracted from remote sensing (RS) imagery is valuable for supporting relief efforts and effectively reducing losses caused by earthquakes. Both traditional pixel-based and object-oriented methods have shortcomings in extracting object information: pixel-based methods cannot make full use of the contextual information of objects, while object-oriented methods face the problems that image segmentation is often not ideal and that the choice of feature space is difficult. In this paper, a new strategy is proposed that combines a Convolutional Neural Network (CNN) with image segmentation to extract building damage information from remote sensing imagery. The key idea comprises two steps: first, the CNN predicts a damage probability for each pixel; then the probabilities are integrated within each segmentation spot. The method is tested by extracting collapsed and uncollapsed buildings from aerial imagery acquired over Longtoushan Town after the Ms 6.5 Ludian County, Yunnan Province earthquake. The results demonstrate the effectiveness of the proposed method in extracting building damage information after an earthquake.
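
The integration step, averaging per-pixel CNN probabilities within each segmentation spot, can be sketched directly. The probabilities, segment labels and threshold below are illustrative:

```python
# Minimal sketch of the paper's second step: label each segmentation spot by
# the mean of the per-pixel CNN damage probabilities inside it.

def classify_segments(prob, segments, threshold=0.5):
    """prob, segments: 2-D grids of equal shape; returns {segment_id: label}."""
    sums, counts = {}, {}
    for prow, srow in zip(prob, segments):
        for p, s in zip(prow, srow):
            sums[s] = sums.get(s, 0.0) + p
            counts[s] = counts.get(s, 0) + 1
    return {s: ("collapsed" if sums[s] / counts[s] >= threshold else "intact")
            for s in sums}

# CNN pixel probabilities of "collapsed", and a 2-segment partition.
prob = [[0.9, 0.8, 0.1],
        [0.7, 0.9, 0.2]]
segments = [[1, 1, 2],
            [1, 1, 2]]
print(classify_segments(prob, segments))  # {1: 'collapsed', 2: 'intact'}
```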

  18. Extraction of Multithread Channel Networks With a Reduced-Complexity Flow Model

    Science.gov (United States)

    Limaye, Ajay B.

    2017-10-01

    Quantitative measures of channel network geometry inform diverse applications in hydrology, sediment transport, ecology, hazard assessment, and stratigraphic prediction. These uses require a clear, objectively defined channel network. Automated techniques for extracting channels from topography are well developed for convergent channel networks and identify flow paths based on land-surface gradients. These techniques—even when they allow multiple flow paths—do not consistently capture channel networks with frequent bifurcations (e.g., in rivers, deltas, and alluvial fans). This paper uses multithread rivers as a template to develop a new approach for channel extraction suitable for channel networks with divergences. Multithread channels are commonly mapped using observed inundation extent, and I generalize this approach using a depth-resolving, reduced-complexity flow model to map inundation patterns for fixed topography across an arbitrary range of discharge. A case study for the Platte River, Nebraska, reveals that (1) the number of bars exposed above the water surface, bar area, and the number of wetted channel threads (i.e., braiding index) peak at intermediate discharge; (2) the anisotropic scaling of bar dimensions occurs for a range of discharge; and (3) the maximum braiding index occurs at a corresponding reference discharge that provides an objective basis for comparing the planform geometry of multithread rivers. Mapping by flow depth overestimates braiding index by a factor of 2. The new approach extends channel network extraction from topography to the full spectrum of channel patterns, with the potential for comparing diverse channel patterns at scales from laboratory experiments to natural landscapes.

  19. Analysis of current and alternative phenol based RNA extraction methodologies for cyanobacteria

    Directory of Open Access Journals (Sweden)

    Lindblad Peter

    2009-08-01

    Full Text Available Abstract Background The validity and reproducibility of gene expression studies depend on the quality of extracted RNA and the degree of genomic DNA contamination. Cyanobacteria are gram-negative prokaryotes that synthesize chlorophyll a and carry out photosynthetic water oxidation. These organisms possess an extended array of secondary metabolites that impair cell lysis, presenting particular challenges when it comes to nucleic acid isolation. Therefore, we used the NHM5 strain of Nostoc punctiforme ATCC 29133 to compare and improve existing phenol based chemistry and procedures for RNA extraction. Results With this work we identify and explore strategies for improved and lower cost high quality RNA isolation from cyanobacteria. All the methods studied are suitable for RNA isolation and its use for downstream applications. We analyse different Trizol based protocols, introduce procedural changes and describe an alternative RNA extraction solution. Conclusion It was possible to improve purity of isolated RNA by modifying protocol procedures. Further improvements, both in RNA purity and experimental cost, were achieved by using a new extraction solution, PGTX.

  20. Comparison of water extraction methods in Tibet based on GF-1 data

    Science.gov (United States)

    Jia, Lingjun; Shang, Kun; Liu, Jing; Sun, Zhongqing

    2018-03-01

    In this study, we compared four different water extraction methods applied to GF-1 data for different water types in Tibet: Support Vector Machine (SVM), Principal Component Analysis (PCA), a Decision Tree Classifier based on a False Normalized Difference Water Index (FNDWI-DTC), and PCA-SVM. The results show that all four methods can extract large water bodies, but only SVM and PCA-SVM obtain satisfactory results for small water bodies. The methods were evaluated by both overall accuracy (OAA) and Kappa coefficient (KC). The OAA of PCA-SVM, SVM, FNDWI-DTC and PCA are 96.68%, 94.23%, 93.99% and 93.01%, and the KCs are 0.9308, 0.8995, 0.8962 and 0.8842, respectively, consistent with visual inspection. In summary, SVM is better for extracting narrow rivers, while PCA-SVM is suitable for water extraction of various types. As for dark blue lakes, the PCA-based methods extract them more quickly and accurately.

  1. SIFT Based Vein Recognition Models: Analysis and Improvement

    Directory of Open Access Journals (Sweden)

    Guoqing Wang

    2017-01-01

    Full Text Available Scale-Invariant Feature Transform (SIFT) is increasingly being investigated as a way to realize less-constrained hand vein recognition systems. Contrast enhancement (CE), which compensates for a deficient dynamic range, is a must in SIFT-based frameworks to improve performance. However, our experiments provide evidence of a negative influence of CE on SIFT matching: while the number of keypoints extracted by gradient-based detectors increases greatly under different CE methods, the matching of the extracted invariant descriptors is negatively affected in terms of Precision-Recall (PR) and Equal Error Rate (EER). Rigorous experiments with state-of-the-art CE methods and those adopted in published SIFT-based hand vein recognition systems demonstrate this influence. Moreover, an improved SIFT model that imports the RootSIFT kernel and a Mirror Match Strategy into a unified framework is proposed to exploit the positive change in keypoint numbers while compensating for the negative influence brought by CE.
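
    RootSIFT is a published reformulation of SIFT matching: L1-normalise each descriptor, then take element-wise square roots, so that Euclidean distance between the results corresponds to the Hellinger kernel on the originals. A minimal sketch of that conversion, independent of the paper's full pipeline:

```python
import numpy as np

def root_sift(descriptors, eps=1e-7):
    """RootSIFT kernel: L1-normalise each SIFT descriptor, then take the
    element-wise square root; Euclidean distance on the results equals
    the Hellinger distance on the originals."""
    d = np.asarray(descriptors, dtype=float)
    d = d / (np.abs(d).sum(axis=1, keepdims=True) + eps)
    return np.sqrt(d)

# two toy 4-dimensional descriptors (real SIFT descriptors are 128-D)
desc = np.array([[4.0, 0.0, 0.0, 0.0],
                 [1.0, 1.0, 1.0, 1.0]])
rs = root_sift(desc)
```

    After the transform every descriptor is (approximately) L2-normalised, so standard nearest-neighbour SIFT matchers can be reused unchanged.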

  2. Rule Extracting based on MCG with its Application in Helicopter Power Train Fault Diagnosis

    International Nuclear Information System (INIS)

    Wang, M; Hu, N Q; Qin, G J

    2011-01-01

    In order to extract decision rules for fault diagnosis from incomplete historical test records, supporting knowledge-based damage assessment of helicopter power train structures, a method was proposed that can directly extract the optimal generalized decision rules from incomplete information based on granular computing (GrC). Based on a semantic analysis of unknown attribute values, the granule concept was extended to handle incomplete information. The maximum characteristic granule (MCG) was defined based on the characteristic relation, and MCGs were used to construct the resolution function matrix. The optimal generalized decision rule was introduced and, using the basic equivalent forms of propositional logic, the rules were extracted and reduced from the incomplete information table. The application of the method is illustrated with a fault diagnosis example for a power train, and its validity for knowledge acquisition was demonstrated.

  3. Rule Extracting based on MCG with its Application in Helicopter Power Train Fault Diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Wang, M; Hu, N Q; Qin, G J, E-mail: hnq@nudt.edu.cn, E-mail: wm198063@yahoo.com.cn [School of Mechatronic Engineering and Automation, National University of Defense Technology, ChangSha, Hunan, 410073 (China)

    2011-07-19

    In order to extract decision rules for fault diagnosis from incomplete historical test records, supporting knowledge-based damage assessment of helicopter power train structures, a method was proposed that can directly extract the optimal generalized decision rules from incomplete information based on granular computing (GrC). Based on a semantic analysis of unknown attribute values, the granule concept was extended to handle incomplete information. The maximum characteristic granule (MCG) was defined based on the characteristic relation, and MCGs were used to construct the resolution function matrix. The optimal generalized decision rule was introduced and, using the basic equivalent forms of propositional logic, the rules were extracted and reduced from the incomplete information table. The application of the method is illustrated with a fault diagnosis example for a power train, and its validity for knowledge acquisition was demonstrated.

  4. Image segmentation of overlapping leaves based on Chan–Vese model and Sobel operator

    Directory of Open Access Journals (Sweden)

    Zhibin Wang

    2018-03-01

    Full Text Available To improve the segmentation precision of overlapping crop leaves, this paper presents an effective image segmentation method based on the Chan–Vese model and Sobel operator. The approach consists of three stages. First, a feature that identifies hues with relatively high levels of green is used to extract the region of leaves and remove the background. Second, the Chan–Vese model and improved Sobel operator are implemented to extract the leaf contours and detect the edges, respectively. Third, a target leaf with a complex background and overlapping is extracted by combining the results obtained by the Chan–Vese model and Sobel operator. To verify the effectiveness of the proposed algorithm, a segmentation experiment was performed on 30 images of cucumber leaf. The mean error rate of the proposed method is 0.0428, which is a decrease of 6.54% compared with the mean error rate of the level set method. Experimental results show that the proposed method can accurately extract the target leaf from cucumber leaf images with complex backgrounds and overlapping regions.
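
    The Sobel step of the pipeline can be sketched in plain numpy as a 3x3 gradient-magnitude filter with zero-padded borders; the Chan–Vese contour evolution is omitted here, and the step image is an illustrative input, not the authors' data:

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude with 3x3 Sobel kernels; borders are zero-padded."""
    kx = np.array([[-1.0, 0.0, 1.0],
                   [-2.0, 0.0, 2.0],
                   [-1.0, 0.0, 1.0]])
    ky = kx.T
    img = np.asarray(img, dtype=float)
    pad = np.pad(img, 1)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    rows, cols = img.shape
    for i in range(3):
        for j in range(3):
            sub = pad[i:i + rows, j:j + cols]
            gx += kx[i, j] * sub
            gy += ky[i, j] * sub
    return np.hypot(gx, gy)

# vertical step edge: strong response along the transition, zero elsewhere
step = np.zeros((5, 5))
step[:, 3:] = 1.0
mag = sobel_magnitude(step)
```

    In the paper this edge map is combined with the region-based Chan–Vese result, so that leaf boundaries missed by one cue can be recovered from the other.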

  5. Extraction of domoic acid from seawater and urine using a resin based on 2-(trifluoromethyl)acrylic acid.

    Science.gov (United States)

    Piletska, Elena V; Villoslada, Fernando Navarro; Chianella, Iva; Bossi, Alessandra; Karim, Kal; Whitcombe, Michael J; Piletsky, Sergey A; Doucette, Gregory J; Ramsdell, John S

    2008-03-03

    A new solid-phase extraction (SPE) matrix with high affinity for the neurotoxin domoic acid (DA) was designed and tested. A computational modelling study led to the selection of 2-(trifluoromethyl)acrylic acid (TFMAA) as a functional monomer capable of imparting affinity towards domoic acid. Polymeric adsorbents containing TFMAA were synthesised and tested in high ionic strength solutions such as urine and seawater. The TFMAA-based polymers demonstrated excellent performance in solid-phase extraction of domoic acid, retaining the toxin while salts and other interfering compounds such as aspartic and glutamic acids were removed by washing and selective elution. It was shown that the TFMAA-based polymer provided the level of purification of domoic acid from urine and seawater acceptable for its quantification by high performance liquid chromatography-mass spectrometry (HPLC-MS) and enzyme-linked immunosorbent assay (ELISA) without any additional pre-concentration and purification steps.

  6. Model of pulse extraction from a copper laser amplifier

    International Nuclear Information System (INIS)

    Boley, C.D.; Warner, B.E.

    1997-03-01

    A computational model of pulse propagation through a copper laser amplifier has been developed. The model contains a system of 1-D (in the axial direction), time-dependent equations for the laser intensity and amplified spontaneous emission (ASE), coupled to rate equations for the atomic levels. Detailed calculations are presented for a high-power amplifier at Lawrence Livermore National Laboratory. The extracted power agrees with experiment near saturation. At lower input power the calculation overestimates experiment, probably because of increased ASE effects. 6 refs., 6 figs

  7. Incremental Ontology-Based Extraction and Alignment in Semi-structured Documents

    Science.gov (United States)

    Thiam, Mouhamadou; Bennacer, Nacéra; Pernelle, Nathalie; Lô, Moussa

    SHIRI is an ontology-based system for the integration of semi-structured documents related to a specific domain. The system's purpose is to allow users to access relevant parts of documents as answers to their queries. SHIRI uses RDF/OWL for the representation of resources and SPARQL for querying them. It relies on an automatic, unsupervised and ontology-driven approach for extraction, alignment and semantic annotation of tagged elements of documents. In this paper, we focus on the Extract-Align algorithm, which exploits a set of named entity and term patterns to extract term candidates to be aligned with the ontology. It proceeds in an incremental manner in order to populate the ontology with terms describing instances of the domain and to reduce access to external resources such as the Web. We evaluate it on an HTML corpus related to calls for papers in computer science, and the results we obtain are very promising. These results show how the incremental behaviour of the Extract-Align algorithm enriches the ontology and increases the number of terms (or named entities) aligned directly with the ontology.

  8. A method for extracting design rationale knowledge based on Text Mining

    Directory of Open Access Journals (Sweden)

    Liu Jihong

    2017-01-01

    Full Text Available Capturing design rationale (DR) knowledge and presenting it to designers in a usable form is of great significance for design reuse and design innovation. Since design rationale research began to develop in the 1970s, many teams have built their own design rationale systems. However, existing DR acquisition systems are not intelligent enough and still require designers to perform many manual operations. In addition, existing design documents contain a large amount of DR knowledge that has not been well mined. Therefore, a method and system are needed to better extract DR knowledge from design documents. We have proposed a DRKH (design rationale knowledge hierarchy) model for DR representation. The DRKH model has three layers: a design intent layer, a design decision layer and a design basis layer. In this paper, we use text mining to extract DR from design documents and construct the DR model. Finally, a welding robot design specification is taken as an example to demonstrate the system interface.

  9. MATCHING AERIAL IMAGES TO 3D BUILDING MODELS BASED ON CONTEXT-BASED GEOMETRIC HASHING

    Directory of Open Access Journals (Sweden)

    J. Jung

    2016-06-01

    Full Text Available In this paper, a new model-to-image framework to automatically align a single airborne image with existing 3D building models using geometric hashing is proposed. As a prerequisite for various applications such as data fusion, object tracking, change detection and texture mapping, the proposed registration method determines accurate exterior orientation parameters (EOPs) of a single image. This model-to-image matching process consists of three steps: 1) feature extraction, 2) similarity measure and matching, and 3) adjustment of the EOPs of the single image. For feature extraction, we propose two types of matching cues: edged corner points, representing the saliency of building corner points with associated edges, and contextual relations among the edged corner points within an individual roof. These matching features are extracted from both the 3D building models and the single airborne image. A set of matched corners is found with a given proximity measure through geometric hashing, and optimal matches are then finally determined by maximizing a matching cost encoding contextual similarity between matching candidates. The final matched corners are used to adjust the EOPs of the single airborne image by the least-squares method based on collinearity equations. The results show that acceptable accuracy of the single image's EOPs can be achieved by the proposed registration approach, offering an alternative to the labour-intensive manual registration process.

  10. a Quadtree Organization Construction and Scheduling Method for Urban 3d Model Based on Weight

    Science.gov (United States)

    Yao, C.; Peng, G.; Song, Y.; Duan, M.

    2017-09-01

    The increase in the precision and data volume of urban 3D models imposes higher requirements on the real-time rendering of digital city models. Improving the organization, management and scheduling of 3D model data in a 3D digital city can improve rendering effect and efficiency. Taking the complexity of urban models into account, this paper proposes a weight-based quadtree construction and scheduled-rendering method for urban 3D models: models are divided into different rendering weights according to certain rules, and quadtree construction and scheduled rendering are performed according to those weights. An algorithm is also proposed for extracting bounding boxes from model drawing primitives to generate LOD models automatically. Using the proposed algorithm, a 3D urban planning and management software package was developed; in practice the algorithm has proved efficient and feasible, with the rendering frame rate of both large and small scenes stable at around 25 frames per second.

  11. A QUADTREE ORGANIZATION CONSTRUCTION AND SCHEDULING METHOD FOR URBAN 3D MODEL BASED ON WEIGHT

    Directory of Open Access Journals (Sweden)

    C. Yao

    2017-09-01

    Full Text Available The increase in the precision and data volume of urban 3D models imposes higher requirements on the real-time rendering of digital city models. Improving the organization, management and scheduling of 3D model data in a 3D digital city can improve rendering effect and efficiency. Taking the complexity of urban models into account, this paper proposes a weight-based quadtree construction and scheduled-rendering method for urban 3D models: models are divided into different rendering weights according to certain rules, and quadtree construction and scheduled rendering are performed according to those weights. An algorithm is also proposed for extracting bounding boxes from model drawing primitives to generate LOD models automatically. Using the proposed algorithm, a 3D urban planning and management software package was developed; in practice the algorithm has proved efficient and feasible, with the rendering frame rate of both large and small scenes stable at around 25 frames per second.
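
    A minimal sketch of the weight-based organization: a quadtree that stores each model at the shallowest cell fully containing its bounding box and schedules rendering by a minimum weight. The class layout, names and depth limit are illustrative assumptions, not the paper's implementation:

```python
class QuadNode:
    """Quadtree cell over a square region; each model is stored at the
    shallowest cell that fully contains its bounding box."""

    def __init__(self, x, y, size, depth=0, max_depth=6):
        self.x, self.y, self.size = x, y, size
        self.depth, self.max_depth = depth, max_depth
        self.models = []        # (name, rendering weight) pairs
        self.children = None

    def _subdivide(self):
        h = self.size / 2.0
        self.children = [QuadNode(self.x + dx * h, self.y + dy * h, h,
                                  self.depth + 1, self.max_depth)
                         for dx in (0, 1) for dy in (0, 1)]

    def insert(self, name, weight, bx, by, bsize):
        if self.children is None and self.depth < self.max_depth:
            self._subdivide()
        if self.children is not None:
            for c in self.children:
                if (c.x <= bx and bx + bsize <= c.x + c.size and
                        c.y <= by and by + bsize <= c.y + c.size):
                    return c.insert(name, weight, bx, by, bsize)
        self.models.append((name, weight))
        return self

    def schedule(self, min_weight):
        """Collect models whose weight reaches min_weight, heaviest first."""
        found = list(self.models)
        if self.children is not None:
            for c in self.children:
                found.extend(c.schedule(min_weight))
        return sorted((m for m in found if m[1] >= min_weight),
                      key=lambda m: -m[1])
```

    Raising min_weight as the camera moves away gives the coarse-to-fine scheduling the abstract describes: heavyweight landmark models keep rendering while low-weight detail is culled.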

  12. Application of ionic liquids based enzyme-assisted extraction of chlorogenic acid from Eucommia ulmoides leaves

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Tingting; Sui, Xiaoyu, E-mail: suixiaoyu@outlook.com; Li, Li; Zhang, Jie; Liang, Xin; Li, Wenjing; Zhang, Honglian; Fu, Shuang

    2016-01-15

    A new approach for ionic liquid based enzyme-assisted extraction (ILEAE) of chlorogenic acid (CGA) from Eucommia ulmoides is presented, in which enzyme pretreatment is used in ionic liquid aqueous media to enhance extraction yield. For this purpose, the solubility of CGA and the activity of cellulase were investigated in eight 1-alkyl-3-methylimidazolium ionic liquids. Cellulase in 0.5 M [C6mim]Br aqueous solution was found to provide the best extraction performance. The factors of the ILEAE procedure, including extraction time, extraction phase pH, extraction temperature and enzyme concentration, were investigated. The newly developed approach offered advantages in terms of yield and efficiency compared with conventional extraction techniques. Scanning electron microscopy of plant samples indicated that cell walls treated with cellulase in ionic liquid solution were more amenable to extraction, reducing the mass-transfer barrier and leading to more efficient extraction. The proposed ILEAE method enables a continuous enzyme-assisted extraction process in which enzyme incubation and solvent extraction proceed simultaneously. In this research, we propose a novel view of enzyme-assisted extraction of active plant components that, besides enzyme-facilitated cell wall degradation, focuses on overcoming the poor permeability of ionic liquid solutions. - Highlights: • An ionic liquid based enzyme-assisted extraction method for natural products was explored. • ILEAE uses enzymatic treatment to improve the permeability of ionic liquid solutions. • Enzyme incubation and solvent extraction proceed simultaneously. • The ILEAE process simplifies operation and is suitable for more complete extraction.

  13. Application of ionic liquids based enzyme-assisted extraction of chlorogenic acid from Eucommia ulmoides leaves

    International Nuclear Information System (INIS)

    Liu, Tingting; Sui, Xiaoyu; Li, Li; Zhang, Jie; Liang, Xin; Li, Wenjing; Zhang, Honglian; Fu, Shuang

    2016-01-01

    A new approach for ionic liquid based enzyme-assisted extraction (ILEAE) of chlorogenic acid (CGA) from Eucommia ulmoides is presented, in which enzyme pretreatment is used in ionic liquid aqueous media to enhance extraction yield. For this purpose, the solubility of CGA and the activity of cellulase were investigated in eight 1-alkyl-3-methylimidazolium ionic liquids. Cellulase in 0.5 M [C6mim]Br aqueous solution was found to provide the best extraction performance. The factors of the ILEAE procedure, including extraction time, extraction phase pH, extraction temperature and enzyme concentration, were investigated. The newly developed approach offered advantages in terms of yield and efficiency compared with conventional extraction techniques. Scanning electron microscopy of plant samples indicated that cell walls treated with cellulase in ionic liquid solution were more amenable to extraction, reducing the mass-transfer barrier and leading to more efficient extraction. The proposed ILEAE method enables a continuous enzyme-assisted extraction process in which enzyme incubation and solvent extraction proceed simultaneously. In this research, we propose a novel view of enzyme-assisted extraction of active plant components that, besides enzyme-facilitated cell wall degradation, focuses on overcoming the poor permeability of ionic liquid solutions. - Highlights: • An ionic liquid based enzyme-assisted extraction method for natural products was explored. • ILEAE uses enzymatic treatment to improve the permeability of ionic liquid solutions. • Enzyme incubation and solvent extraction proceed simultaneously. • The ILEAE process simplifies operation and is suitable for more complete extraction.

  14. Trend extraction of rail corrugation measured dynamically based on the relevant low-frequency principal components reconstruction

    International Nuclear Information System (INIS)

    Li, Yanfu; Liu, Hongli; Ma, Ziji

    2016-01-01

    Rail corrugation dynamic measurement techniques are critical to guarantee transport security and guide rail maintenance. During the inspection process, low-frequency trends caused by rail fluctuation are usually superimposed on rail corrugation and seriously affect the assessment of rail maintenance quality. In order to extract and remove the nonlinear and non-stationary trends from the original mixed signals, a hybrid model based on ensemble empirical mode decomposition (EEMD) and modified principal component analysis (MPCA) is proposed in this paper. Compared with existing EMD-based de-trending methods, this method first recognizes that the low-frequency intrinsic mode functions (IMFs) thought to be underlying trend components may contain unrelated components, such as white noise and the low-frequency part of the signal itself, and proposes to use PCA to accurately extract the pure trends from the IMFs containing multiple components. On the other hand, because the energy contribution ratio between trends and mixed signals is unknown a priori, and the principal components (PCs) produced by PCA are arranged in order of decreasing energy without considering frequency distribution, the proposed method modifies traditional PCA and selects only the relevant low-frequency PCs to reconstruct the trends, based on the zero-crossing number (ZCN) of each PC. Extensive tests are presented to illustrate the effectiveness of the proposed method. The results show that the proposed EEMD-PCA-ZCN is an effective tool for trend extraction of rail corrugation measured dynamically. (paper)
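
    The PC-selection idea (keep only principal components with few zero crossings and reconstruct the trend from them) can be sketched in numpy. The EEMD step is omitted; the two "IMFs" below are synthetic, and the ZCN threshold is an illustrative choice:

```python
import numpy as np

def zero_crossings(x):
    """Number of sign changes in a 1-D signal (zeros counted as positive)."""
    s = np.sign(x)
    s[s == 0] = 1
    return int(np.count_nonzero(s[1:] != s[:-1]))

def trend_from_components(imfs, max_zcn):
    """PCA (via SVD) of candidate low-frequency IMFs; keep only the
    principal components whose zero-crossing number is at most max_zcn
    and reconstruct the trend from those components alone."""
    X = np.asarray(imfs, dtype=float)
    mean = X.mean(axis=1, keepdims=True)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    keep = np.array([zero_crossings(v) <= max_zcn for v in Vt])
    recon = (U[:, keep] * S[keep]) @ Vt[keep] + mean
    return recon.sum(axis=0)

# synthetic "IMFs": a slow ramp (the trend) plus a fast oscillation
t = np.linspace(0.0, 1.0, 200)
slow = t
fast = 0.1 * np.sin(40 * np.pi * t)
trend = trend_from_components(np.vstack([slow, fast]), max_zcn=2)
```

    Here the slow ramp survives the zero-crossing filter while the oscillatory component is discarded, so the reconstructed trend tracks the ramp rather than the highest-energy direction alone.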

  15. Aircraft Segmentation in SAR Images Based on Improved Active Shape Model

    Science.gov (United States)

    Zhang, X.; Xiong, B.; Kuang, G.

    2018-04-01

    In SAR image interpretation, aircraft are important targets that attract much attention. However, it is far from easy to segment an aircraft from the background completely and precisely in SAR images. Because of their complex structure, different kinds of electromagnetic scattering take place on aircraft surfaces. As a result, aircraft targets usually appear inhomogeneous and disconnected. It is a good idea to extract an aircraft target with the active shape model (ASM), since the incorporated geometric information controls variations of the shape during contour evolution. However, the linear dimensionality reduction used in the classic ASM makes the model rigid, which causes trouble when segmenting different types of aircraft. Aiming at this problem, an improved ASM based on ISOMAP is proposed in this paper. The ISOMAP algorithm is used to extract the shape information of the training set and make the model flexible enough to deal with different aircraft. Experiments based on real SAR data show that the proposed method achieves an obvious improvement in accuracy.

  16. EM simulation assisted parameter extraction for the modeling of transferred-substrate InP HBTs

    DEFF Research Database (Denmark)

    Johansen, Tom Keinicke; Weimann, Nils; Doerner, Ralf

    2017-01-01

    In this paper an electromagnetic (EM) simulation assisted parameter extraction procedure is demonstrated for accurate modeling of down-scaled transferred-substrate InP HBTs. The external parasitic network associated with via transitions and device electrodes is carefully extracted from calibrate...

  17. Research on Methods of High Coherent Target Extraction in Urban Area Based on Psinsar Technology

    Science.gov (United States)

    Li, N.; Wu, J.

    2018-04-01

    PSInSAR technology has been widely applied in ground deformation monitoring. Accurate identification of Persistent Scatterers (PS) is key to the success of PSInSAR data processing. In this paper, the theoretical models and specific algorithms of PS point extraction methods are summarized, and the characteristics and applicable conditions of each method, such as the Coherence Coefficient Threshold method, the Amplitude Threshold method, the Dispersion of Amplitude method and the Dispersion of Intensity method, are analyzed. Based on the merits and demerits of the different methods, an improved method for PS point extraction in urban areas is proposed that simultaneously uses backscattering characteristics and amplitude and phase stability to find PS points among all pixels. Shanghai is chosen as an example area for checking the improvements of the new method. The results show that the PS points extracted by the new method have high quality and stability and exhibit the expected strong scattering characteristics. Based on these high-quality PS points, the deformation rate along the line-of-sight (LOS) in the central urban area of Shanghai is obtained using 35 COSMO-SkyMed X-band SAR images acquired from 2008 to 2010; it varies from -14.6 mm/year to 4.9 mm/year. There is a large subsidence funnel at the boundary of the Hongkou and Yangpu districts with a maximum subsidence rate of more than 14 mm per year. The obtained ground subsidence rates are also compared with the results of spirit leveling and show good consistency. Our new method for PS point extraction is more reasonable and can improve the accuracy of the obtained deformation results.
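
    One widely used criterion of the kind this paper builds on is the amplitude dispersion index, D_A = sigma/mu, computed per pixel over the co-registered image stack; pixels with low D_A are stable scatterers. The 0.25 threshold below is the conventional value from the Dispersion of Amplitude literature, not necessarily the one used here:

```python
import numpy as np

def select_ps_candidates(amp_stack, threshold=0.25):
    """Amplitude dispersion index D_A = sigma/mu per pixel across a stack
    of co-registered SAR amplitude images; pixels with low D_A are stable
    scatterers and become PS candidates."""
    amp = np.asarray(amp_stack, dtype=float)    # shape: (n_images, rows, cols)
    mu = amp.mean(axis=0)
    sigma = amp.std(axis=0)
    d_a = np.divide(sigma, mu, out=np.full_like(mu, np.inf), where=mu > 0)
    return d_a < threshold

# two pixels over four acquisitions: one stable, one fluctuating
stack = np.array([[[10.0, 1.0]],
                  [[10.2, 9.0]],
                  [[9.8, 2.0]],
                  [[10.0, 8.0]]])
mask = select_ps_candidates(stack)
```

    The improved method in the paper combines this amplitude-stability cue with backscattering strength and phase stability rather than using any one criterion alone.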

  18. Effect of Withania somnifera leaf extract on the dietary supplementation in transgenic Drosophila model of Parkinson’s disease

    Directory of Open Access Journals (Sweden)

    YASIR HASAN SIDDIQUE

    2015-09-01

    Full Text Available The role of Withania somnifera L. leaf extract was studied in transgenic Drosophila model flies expressing normal human alpha synuclein (h-αS) in their neurons. The leaf extract was prepared in acetone and subjected to GC-MS analysis. W. somnifera extract at final concentrations of 0.25, 0.50 and 1.0 µL/mL was mixed with the diet, and the flies were allowed to feed for 24 days. The effect of the extract was studied on climbing ability, lipid peroxidation and protein carbonyl content in the brains of transgenic Drosophila. Exposure of PD model flies to the extract neither significantly delayed the loss of climbing ability nor reduced oxidative stress in the brains of transgenic Drosophila compared with untreated PD model flies. The results suggest that W. somnifera leaf extract is not potent in reducing PD symptoms in this transgenic Drosophila model of Parkinson's disease.

  19. Ingenious Snake: An Adaptive Multi-Class Contours Extraction

    Science.gov (United States)

    Li, Baolin; Zhou, Shoujun

    2018-04-01

    Active contour model (ACM) plays an important role in computer vision and medical image applications. Traditional ACMs were used to extract a single class of object contours, while the simultaneous extraction of multiple classes of contours of interest (i.e., various contours, both closed and open-ended) has not been solved so far. Therefore, a novel ACM named "Ingenious Snake" is proposed to adaptively extract these contours of interest. First, ridge points are extracted based on local phase measurement of the gradient vector flow field, so that the subsequent ridgeline initialization is automated and fast. Second, contour deformation and evolution are implemented with the ingenious snake. In the experiments, the results of initialization, deformation and evolution are compared with those of existing methods. The quantitative evaluation of the structure extraction is satisfactory with respect to effectiveness and accuracy.

  20. Prediction Model of Machining Failure Trend Based on Large Data Analysis

    Science.gov (United States)

    Li, Jirong

    2017-12-01

    Mechanical machining exhibits high complexity, strong coupling and many control factors, making the process prone to failure. To improve the accuracy of fault detection for large mechanical equipment, this paper studies fault trend prediction for machining and builds a prediction model based on fault data. Machining fault features are clustered using genetic-algorithm-optimized K-means, with feature extraction reflecting the correlation dimension of faults, and the spectral characteristics of abnormal vibration of complex mechanical parts during machining are analyzed. Abnormal-vibration features are extracted by multi-component spectral decomposition and Hilbert-based empirical mode decomposition, and the decomposition results are used to build the knowledge base of an intelligent expert system, which is combined with big-data analysis methods to realize machining fault trend prediction. Simulation results show that the method predicts fault trends in mechanical machining with good accuracy and judges faults in the machining process reliably, giving it good application value for analysis and fault diagnosis in the machining process.
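
    The clustering core mentioned above can be sketched with plain K-means (Lloyd's algorithm); the genetic-algorithm optimisation of the initial centers described in the abstract is replaced here by random initialisation for brevity, and the feature vectors are synthetic:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain K-means (Lloyd's algorithm) for clustering fault-feature
    vectors; random initialisation stands in for the GA seeding."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # squared Euclidean distance of every sample to every center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# two well-separated groups of 2-D feature vectors (e.g. normal vs. faulty)
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [10.0, 10.0], [10.1, 10.0], [10.0, 10.1]])
labels, centers = kmeans(X, 2)
```

    A GA wrapper would evolve candidate center sets and score each by the resulting within-cluster distance, which mitigates K-means' sensitivity to initialisation.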

  1. Shadow Detection Based on Regions of Light Sources for Object Extraction in Nighttime Video

    Directory of Open Access Journals (Sweden)

    Gil-beom Lee

    2017-03-01

    Full Text Available Intelligent video surveillance systems detect pre-configured surveillance events through background modeling, foreground and object extraction, object tracking, and event detection. Shadow regions inside video frames sometimes appear as foreground objects, interfere with ensuing processes, and finally degrade the event detection performance of the systems. Conventional studies have mostly used intensity, color, texture, and geometric information to perform shadow detection in daytime video, but these methods lack the capability of removing shadows in nighttime video. In this paper, a novel shadow detection algorithm for nighttime video is proposed; this algorithm partitions each foreground object based on the object’s vertical histogram and screens out shadow objects by validating their orientations heading toward regions of light sources. From the experimental results, it can be seen that the proposed algorithm shows more than 93.8% shadow removal and 89.9% object extraction rates for nighttime video sequences, and the algorithm outperforms conventional shadow removal algorithms designed for daytime videos.

  2. Techniques to extract physical modes in model-independent analysis of rings

    International Nuclear Information System (INIS)

    Wang, C.-X.

    2004-01-01

    A basic goal of Model-Independent Analysis is to extract the physical modes underlying the beam histories collected at a large number of beam position monitors, so that beam dynamics and machine properties can be deduced independently of specific machine models. Here we discuss techniques to achieve this goal, especially Principal Component Analysis and Independent Component Analysis.
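
    The Principal Component Analysis mentioned here reduces to an SVD of the centered turn-by-turn BPM matrix. The sketch below recovers the dominant spatial modes from a simulated single-frequency betatron signal; the tune (0.31) and BPM phases are made-up illustration values:

```python
import numpy as np

def principal_modes(bpm_history, n_modes):
    """PCA of a turn-by-turn BPM matrix (rows = turns, columns = BPMs)
    via SVD of the centered data; returns the n strongest spatial modes
    and their singular values."""
    X = np.asarray(bpm_history, dtype=float)
    X = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:n_modes], S[:n_modes]

# simulated single betatron mode sampled at 8 BPMs over 200 turns
t = np.arange(200)
phases = np.linspace(0.0, np.pi, 8)
bpm = np.cos(2 * np.pi * 0.31 * t[:, None] + phases[None, :])
modes, svals = principal_modes(bpm, 3)
```

    For a noise-free single-frequency signal the data matrix has rank 2 (its cosine and sine parts), so the third singular value vanishes to machine precision; in real data the gap between physical and noise singular values is what separates the modes.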

  3. Spectral features based tea garden extraction from digital orthophoto maps

    Science.gov (United States)

    Jamil, Akhtar; Bayram, Bulent; Kucuk, Turgay; Zafer Seker, Dursun

    2018-05-01

    Advancements in photogrammetry and remote sensing technologies have made it possible to extract useful, tangible information from data, which plays a pivotal role in various applications such as the management and monitoring of forests and agricultural lands. This study evaluated the effectiveness of spectral signatures for the extraction of tea gardens from 1 : 5000 scaled digital orthophoto maps of Rize city in Turkey. First, the normalized difference vegetation index (NDVI) was derived from the input images to suppress non-vegetation areas; NDVI values less than zero were discarded and the output image was normalized to the range 0-255. Individual pixels were then mapped into meaningful objects using a global region-growing technique. The resulting image was filtered and smoothed to reduce the impact of noise. Furthermore, geometric constraints were applied to remove small objects (less than 500 pixels), followed by a morphological opening operator to enhance the results. These objects served as building blocks for further image analysis. Finally, in the classification stage, a range of spectral values was empirically calculated for each band and applied to the candidate objects to extract tea gardens. For accuracy assessment, we employed an area-based similarity metric, overlapping the obtained tea garden boundaries with tea garden boundaries manually digitized by photogrammetry experts. The overall accuracy of the proposed method was 89 % for tea gardens across 10 sample orthophoto maps. We conclude that exploiting spectral signatures using object-based analysis is an effective technique for the extraction of dominant tree species from digital orthophoto maps.
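
    The NDVI step described above can be sketched in numpy, assuming the orthophotos provide a near-infrared band; discarding negative values and rescaling to 0-255 follow the text, while the sample pixel values are illustrative:

```python
import numpy as np

def ndvi_mask(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red); negative values (non-vegetation)
    are discarded and the remainder rescaled to 0-255, as in the text."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    denom = nir + red
    ndvi = np.divide(nir - red, denom,
                     out=np.zeros_like(denom), where=denom != 0)
    ndvi[ndvi < 0] = 0.0
    if ndvi.max() > 0:
        ndvi = ndvi / ndvi.max() * 255.0
    return ndvi.astype(np.uint8)

# one vegetated pixel (NIR >> Red) and one bare pixel (Red > NIR)
red = np.array([[50.0, 200.0]])
nir = np.array([[150.0, 100.0]])
out = ndvi_mask(red, nir)
```

    The resulting 8-bit image is what the region-growing and morphological steps in the pipeline would then operate on.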

  4. Extraction of Biomolecules Using Phosphonium-Based Ionic Liquids + K3PO4 Aqueous Biphasic Systems

    Science.gov (United States)

    Louros, Cláudia L. S.; Cláudio, Ana Filipa M.; Neves, Catarina M. S. S.; Freire, Mara G.; Marrucho, Isabel M.; Pauly, Jérôme; Coutinho, João A. P.

    2010-01-01

    Aqueous biphasic systems (ABS) provide an alternative and efficient approach for the extraction, recovery and purification of biomolecules through their partitioning between two liquid aqueous phases. In this work, the ability of hydrophilic phosphonium-based ionic liquids (ILs) to form ABS with aqueous K3PO4 solutions was evaluated for the first time. Ternary phase diagrams, with their respective tie-lines and tie-line lengths, formed by distinct phosphonium-based ILs, water, and K3PO4 at 298 K were measured and are reported. The studied phosphonium-based ILs were shown to be more effective in promoting ABS than their imidazolium-based counterparts with similar anions. Moreover, the extractive capability of such systems was assessed for distinct biomolecules (including amino acids, food colourants and alkaloids). Densities and viscosities of both aqueous phases, at the mass fraction compositions used for the biomolecule extractions, were also determined. The evaluated IL-based ABS have been shown to be promising extraction media, particularly for hydrophobic biomolecules, with several advantages over conventional polymer-inorganic salt ABS. PMID:20480041

  5. Base excision repair efficiency and mechanism in nuclear extracts are influenced by the ratio between volume of nuclear extraction buffer and nuclei—Implications for comparative studies

    International Nuclear Information System (INIS)

    Akbari, Mansour; Krokan, Hans E.

    2012-01-01

    Highlights: • We examine the effect of the volume of extraction buffer relative to the volume of isolated nuclei on the repair activity of nuclear extracts. • The base excision repair activity of nuclear extracts prepared from the same batch and number of nuclei varies inversely with the volume of nuclear extraction buffer. • The effect of the volume of extraction buffer on the BER activity of nuclear extracts can only be partially reversed by concentrating the more diluted extract by ultrafiltration. - Abstract: The base excision repair (BER) pathway corrects many different DNA base lesions and is important for genomic stability. The mechanism of BER cannot easily be investigated in intact cells, and therefore in vitro methods that reflect the in vivo processes are in high demand. Reconstitution of BER using purified proteins essentially mirrors the properties of the proteins used, and does not necessarily reflect the mechanism as it occurs in the cell. Nuclear extracts from cultured cells have the capacity to carry out complete BER and can give important information on the mechanism. Furthermore, candidate proteins in extracts can be inhibited or depleted in a controlled way, making defined extracts an important source for mechanistic studies. The major drawback is that there is no standardized method of preparing nuclear extracts for BER studies, and this does not appear to be a topic given much attention. Here we have examined the BER activity of nuclear extracts from HeLa cells, using as substrate a circular DNA molecule with either uracil or an AP-site in a defined position. We show that the BER activity of nuclear extracts from the same batch of cells varies inversely with the volume of nuclear extraction buffer relative to nuclei volume, in spite of identical protein concentrations in the BER assay mixture. Surprisingly, the uracil–DNA glycosylase activity (mainly UNG2), but not the amount of UNG2, also correlated negatively with the volume of extraction buffer. These studies demonstrate

  6. Ionic liquid-based microwave-assisted extraction of flavonoids from Bauhinia championii (Benth.) Benth.

    Science.gov (United States)

    Xu, Wei; Chu, Kedan; Li, Huang; Zhang, Yuqin; Zheng, Haiyin; Chen, Ruilan; Chen, Lidian

    2012-12-03

    An ionic liquid (IL)-based microwave-assisted approach for the extraction and determination of flavonoids from Bauhinia championii (Benth.) Benth. was proposed for the first time. Several ILs with different cations and anions, as well as the microwave-assisted extraction (MAE) conditions, including sample particle size, extraction time and liquid-solid ratio, were investigated. A 2 M 1-butyl-3-methylimidazolium bromide ([bmim]Br) solution with 0.80 M HCl was selected as the optimal solvent. The optimized conditions were a liquid-to-solid ratio of 30:1 and extraction for 10 min at 70 °C. Compared with conventional heat-reflux extraction (CHRE) and regular MAE, IL-MAE exhibited a higher extraction yield and a shorter extraction time (reduced from 1.5 h to 10 min). The optimized extraction samples were analysed by LC-MS/MS. IL extracts of Bauhinia championii (Benth.) Benth. consisted mainly of flavonoids, among which myricetin, quercetin and kaempferol, as well as β-sitosterol, triacontane and hexacontane, were identified. The study indicated that IL-MAE is an efficient and rapid method with simple sample preparation. LC-MS/MS was also used to determine the chemical composition of the ethyl acetate/MAE extract of Bauhinia championii (Benth.) Benth., and it may become a rapid method for determining the composition of new plant extracts.

  7. Object-based Morphological Building Index for Building Extraction from High Resolution Remote Sensing Imagery

    Directory of Open Access Journals (Sweden)

    LIN Xiangguo

    2017-06-01

    Full Text Available Building extraction from high-resolution remote sensing images is a hot research topic in the field of photogrammetry and remote sensing. In this article, an object-based morphological building index (OBMBI) is constructed from both image segmentation and graph-based top-hat reconstruction, and OBMBI is used for building extraction from high-resolution remote sensing images. First, bidirectional mapping relationships between pixels, objects and graph nodes are constructed. Second, the OBMBI image is built using both graph-based top-hat reconstruction and the above mapping relationships. Third, binary thresholding is performed on the OBMBI image, and the binary image is converted into vector format to derive the building polygons. Finally, post-processing is performed to optimize the extracted building polygons. Two images, an aerial image and a panchromatic satellite image, are used to test both the proposed method and the classic PanTex method. The experimental results suggest that the proposed method has higher building-extraction accuracy than the classic PanTex method. On average, the correctness, completeness and quality of our method are respectively 9.49%, 11.26% and 14.11% better than those of PanTex.
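    The paper's OBMBI operates on segmented objects via graph-based top-hat reconstruction; as a much-simplified, pixel-level illustration of why a white top-hat highlights building-sized bright structures while suppressing larger ones, consider this pure-NumPy sketch (synthetic data, structuring-element size illustrative):

```python
import numpy as np

def grey_filter(img, k, reduce_fn):
    """Min/max filter with a (2k+1)x(2k+1) flat structuring element,
    implemented with padded shifts (pure NumPy)."""
    p = np.pad(img.astype(float), k, mode="edge")
    h, w = img.shape
    stack = np.stack([p[i:i + h, j:j + w]
                      for i in range(2 * k + 1) for j in range(2 * k + 1)])
    return reduce_fn(stack, axis=0)

def white_top_hat(img, k=2):
    """White top-hat: image minus its grayscale opening. Bright structures
    smaller than the structuring element stand out; larger ones vanish."""
    opening = grey_filter(grey_filter(img, k, np.min), k, np.max)
    return img.astype(float) - opening

# Synthetic scene: a small bright "building" and a large bright field.
img = np.zeros((20, 20))
img[2:5, 2:5] = 100.0      # 3x3 building-like blob
img[10:19, 10:19] = 100.0  # 9x9 large region (not building-like)
tophat = white_top_hat(img, k=2)
print(tophat[3, 3], tophat[14, 14])   # building pixel high, large-field pixel zero
```

    Thresholding such a response and vectorizing the binary result corresponds to the third step described in the abstract.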

  8. Application of a 2D air flow model to soil vapor extraction and bioventing case studies

    International Nuclear Information System (INIS)

    Mohr, D.H.; Merz, P.H.

    1995-01-01

    Soil vapor extraction (SVE) is frequently the technology of choice to clean up hydrocarbon contamination in unsaturated soil. A two-dimensional air flow model provides a practical tool to evaluate pilot test data and estimate remediation rates for soil vapor extraction systems. The model predictions of soil vacuum versus distance are statistically compared to pilot test data for 65 SVE wells at 44 sites. For 17 of 21 sites with asphalt paving, the best agreement was obtained for boundary conditions with no barrier to air flow at the surface. The model predictions of air flow rates and streamlines around the well allow an estimate of the gasoline removal rates by both evaporation and bioremediation. The model can be used to quickly estimate the effective radius of influence, defined here as the maximum distance from the well at which there is enough air flow to remove the contaminant present within the allowable time. The effective radius of influence is smaller than a radius of influence defined by soil vacuum only. For a case study, in situ bioremediation rates were estimated using the air flow model and compared to independent estimates based on changes in soil temperature. These estimated bioremediation rates for heavy fuel oil ranged from 2.5 to 11 mg of oil degraded per kg of soil per day, in agreement with values in the literature

  9. Ab initio and density functional theoretical design and screening of model crown ether based ligand (host) for extraction of lithium metal ion (guest): effect of donor and electronic induction.

    Science.gov (United States)

    Boda, Anil; Ali, Sk Musharaf; Rao, Hanmanth; Ghosh, Sandip K

    2012-08-01

    The structures and the energetic and thermodynamic parameters of model crown ethers with different donors, cavities and electron-donating/withdrawing functional groups have been determined with ab initio MP2 and density functional theory in the gas and solvent phases. The calculated values of the binding energy/enthalpy for lithium ion complexation are marginally higher for hard-donor-based aza and oxa crowns than for soft-donor-based thia and phospha crowns. The calculated value of the binding enthalpy for the lithium metal ion with 12C4 at the MP2 level of theory is in good agreement with the available experimental result. The binding energy is altered by the inductive effect imparted by the electron-donating/withdrawing group in the crown ether, which is well correlated with the values of electron transfer. The role of entropy in the extraction of the hydrated lithium metal ion by ligands with different donors and functional groups has been demonstrated. The HOMO-LUMO gap decreases and the dipole moment of the ligand increases from the gas phase to the organic phase because of the dielectric constant of the solvent. The gas-phase binding energy is reduced in the solvent phase, as the solvent molecules weaken the metal-ligand binding. The theoretical values of the extraction energy for LiCl salt from aqueous solution into different organic solvents are validated by the experimental trend. The study presented here should contribute to the design of model host ligands and the screening of solvents for metal ion recognition, and can thus contribute to planning experiments.

  10. Effect of non-condensable gas on heat transfer in steam turbine condenser and modelling of ejector pump system by controlling the gas extraction rate through extraction tubes

    International Nuclear Information System (INIS)

    Strušnik, Dušan; Golob, Marjan; Avsec, Jurij

    2016-01-01

    Graphical abstract: Control of the amount of gas pumped through the extraction tubes. The connecting locations interconnect the extraction tubes for STC gas pumping. The extraction tubes are fitted with 3 control valves to control the amount of pumped gas depending on its temperature. The amount of pumped gas is increased through the extraction tubes where the pumped gases are cooler and simultaneously decreased through the extraction tubes where the pumped gases are warmer. As a result, pumping of a larger amount of NCG and a smaller amount of CG is ensured, given that the NCG concentration is highest at the colder locations. This way, the total amount of gas pumped from the STC can be reduced, and the SEPS operates more efficiently and consumes less energy. - Highlights: • Impact of non-condensable gas on heat transfer in a steam turbine condenser. • The ejector system is optimised by selecting a Laval nozzle diameter. • Simulation model of the control of the amount of pumped gases through extraction tubes. • Neural network and fuzzy logic systems used to control the gas extraction rate. • The simulation model was designed using real process data from the thermal power plant. - Abstract: The paper describes the impact of non-condensable gas (NCG) on heat transfer in a steam turbine condenser (STC) and the modelling of the steam ejector pump system (SEPS) by controlling the gas extraction rate through the extraction tubes. The ideal connection points for NCG extraction from the STC are identified by analysing the impact of the NCG on heat transfer and measuring the existing system at a thermal power plant in Slovenia. A simulation model is designed using Matlab software and the Simulink, Neural Network, Fuzzy Logic and Curve Fitting Toolboxes to control the extraction rate of the gas pumped from the STC through the extraction tubes, thus optimising the operation of the steam ejector pump system (SEPS). The

  11. A Method of Road Extraction from High-resolution Remote Sensing Images Based on Shape Features

    Directory of Open Access Journals (Sweden)

    LEI Xiaoqi

    2016-02-01

    Full Text Available Road extraction from high-resolution remote sensing images is an important and difficult task. Since remote sensing images contain complicated information, methods that extract roads by spectral, texture and linear features have certain limitations. Moreover, many methods need human intervention to obtain road seeds (semi-automatic extraction), which entails great human dependence and low efficiency. A road-extraction method that uses image segmentation based on the principle of local gray consistency and integrates shape features is proposed in this paper. First, the image is segmented, and then linear and curved roads are obtained by using several object shape features, rectifying methods that extract only linear roads. Second, road extraction is carried out based on region growing: the road seeds are selected automatically and the road network is extracted. Finally, the extracted roads are regularized by combining edge information. In the experiments, images with relatively uniform road gray levels and poorly illuminated road surfaces were chosen, and the results prove that the method of this study is promising.
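    The region-growing step built on local gray consistency can be sketched as follows: starting from a seed, 4-neighbours are absorbed while their gray value stays close to the running region mean. The image and tolerance below are toy values, not the paper's data:

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=10.0):
    """Grow a region from `seed`, adding 4-neighbours whose gray value
    stays within `tol` of the running region mean (local gray consistency)."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    total, count = float(img[seed]), 1
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(float(img[ny, nx]) - total / count) <= tol:
                    mask[ny, nx] = True
                    total += float(img[ny, nx])
                    count += 1
                    queue.append((ny, nx))
    return mask

# Toy image: a bright "road" stripe (value 200) through darker terrain (50).
img = np.full((8, 8), 50, dtype=np.uint8)
img[3, :] = 200
grown = region_grow(img, seed=(3, 0), tol=20.0)
print(grown.sum())   # only the 8 stripe pixels are absorbed
```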

  12. Isogeometric finite element data structures based on Bézier extraction of T-splines

    NARCIS (Netherlands)

    Scott, M.A.; Borden, M.J.; Verhoosel, C.V.; Sederberg, T.W.; Hughes, T.J.R.

    2011-01-01

    We develop finite element data structures for T-splines based on Bézier extraction generalizing our previous work for NURBS. As in traditional finite element analysis, the extracted Bézier elements are defined in terms of a fixed set of polynomial basis functions, the so-called Bernstein basis. The
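    The central identity of Bézier extraction is that each element's smooth basis is a linear combination N = C_e B of the fixed Bernstein basis, with all spline-specific information carried by the extraction operator C_e. A minimal sketch of the Bernstein side (the identity operator stands in for a real C_e, which would come from knot insertion):

```python
import numpy as np
from math import comb

def bernstein_basis(p, xi):
    """Bernstein polynomials B_{i,p} on [-1, 1] (the Bezier parent element)."""
    t = 0.5 * (xi + 1.0)                     # map to [0, 1]
    return np.array([comb(p, i) * t**i * (1 - t)**(p - i) for i in range(p + 1)])

# Element basis functions are recovered as N = C_e @ B, where C_e is the
# element-local extraction operator. The identity operator reproduces the
# Bernstein basis itself; spline bases use banded C_e from knot insertion.
p = 3
xi = np.linspace(-1.0, 1.0, 7)
B = bernstein_basis(p, xi)                   # shape (p+1, len(xi))
C_e = np.eye(p + 1)                          # placeholder extraction operator
N = C_e @ B

print(B.sum(axis=0))                         # partition of unity
```

    Because B is fixed per polynomial degree, element routines written for Bernstein polynomials serve NURBS and T-spline analysis unchanged; only C_e varies per element.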

  13. Caustic-Side Solvent Extraction: Prediction of Cesium Extraction from Actual Wastes and Actual Waste Simulants

    International Nuclear Information System (INIS)

    Delmau, L.H.; Haverlock, T.J.; Sloop, F.V. Jr.; Moyer, B.A.

    2003-01-01

    This report presents the work that followed the CSSX model development completed in FY2002. The cesium and potassium extraction model developed there was based on extraction data obtained from simple aqueous media. It was tested to ensure the validity of its predictions of cesium extraction from actual waste. Compositions of the actual tank waste were obtained from Savannah River Site personnel and were used to prepare defined simulants and to predict cesium distribution ratios using the model. It was therefore possible to compare the cesium distribution ratios obtained from the actual waste, the simulant, and the predicted values. It was determined that the predicted values agree with the measured values for the simulants. Predicted values also agreed, with three exceptions, with measured values for the tank wastes. Discrepancies were attributed in part to the uncertainty in the cation/anion balance in the actual waste composition, but likely more so to the uncertainty in the potassium concentration in the waste, given the demonstrated large competing effect of this metal on cesium extraction. It was demonstrated that the upper limit for the potassium concentration in the feed ought not to exceed 0.05 M in order to maintain suitable cesium distribution ratios

  14. Model-based recognition of 3-D objects by geometric hashing technique

    International Nuclear Information System (INIS)

    Severcan, M.; Uzunalioglu, H.

    1992-09-01

    A model-based object recognition system is developed for recognition of polyhedral objects. The system consists of feature extraction, modelling and matching stages. Linear features are used for object descriptions. Lines are obtained from edges using rotation transform. For modelling and recognition process, geometric hashing method is utilized. Each object is modelled using 2-D views taken from the viewpoints on the viewing sphere. A hidden line elimination algorithm is used to find these views from the wire frame model of the objects. The recognition experiments yielded satisfactory results. (author). 8 refs, 5 figs
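    The geometric hashing scheme referred to above can be illustrated in 2-D: offline, every ordered pair of model features defines a basis, all other features are expressed in that basis's invariant coordinates and hashed; online, a scene basis is guessed and hash-table hits vote for (model, basis) pairs. A small self-contained sketch with invented point data:

```python
import numpy as np
from collections import defaultdict
from itertools import permutations

Q = 0.25  # quantization step for hash keys (illustrative)

def basis_coords(points, b0, b1):
    """Express points in the similarity-invariant frame defined by (b0, b1)."""
    e1 = b1 - b0
    e2 = np.array([-e1[1], e1[0]])           # perpendicular axis
    scale = e1 @ e1
    rel = points - b0
    return np.stack([rel @ e1, rel @ e2], axis=1) / scale

def quantize(u):
    return int(round(float(u) / Q))

def build_table(model):
    """Hash quantized invariant coordinates for every ordered basis pair."""
    table = defaultdict(list)
    for i, j in permutations(range(len(model)), 2):
        for u, v in basis_coords(model, model[i], model[j]):
            table[(quantize(u), quantize(v))].append((i, j))
    return table

def vote(table, scene, i, j):
    """Count table hits for scene points under scene basis pair (i, j)."""
    votes = defaultdict(int)
    for u, v in basis_coords(scene, scene[i], scene[j]):
        for basis in table.get((quantize(u), quantize(v)), ()):
            votes[basis] += 1
    return max(votes.values(), default=0)

model = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.0], [0.5, 1.5]])
table = build_table(model)

# Scene: the model rotated 30 degrees, scaled by 2 and translated.
a = np.deg2rad(30.0)
R = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
scene = 2.0 * model @ R.T + np.array([5.0, -3.0])

print(vote(table, scene, 0, 1))   # all four scene points vote for a model basis
```

    The paper's system applies the same idea to line features of 2-D views of polyhedra rather than raw points; the hashing and voting machinery is unchanged.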

  15. Automatic Traffic-Based Internet Control Message Protocol (ICMP) Model Generation for ns-3

    Science.gov (United States)

    2015-12-01

    ARL-TR-7543 ● DEC 2015 ● US Army Research Laboratory. Automatic Traffic-Based Internet Control Message Protocol (ICMP) Model Generation for ns-3, by Jaime C Acosta and Felipe Jovel, Survivability/Lethality Analysis Directorate, ARL, with Felipe Sotelo and Caesar ... Stated future work includes supporting more protocols (especially at different layers of the OSI model) and implementing an inference engine to extract inter- and intra-packet dependencies.

  16. Building extraction for 3D city modelling using airborne laser ...

    African Journals Online (AJOL)

    Light detection and ranging (LiDAR) technology has become a standard tool for three-dimensional mapping because it offers a fast rate of data acquisition with an unprecedented level of accuracy. This study presents an approach to accurately extract and model buildings in three-dimensional space from airborne laser scanning ...

  17. Development of a Kelp-Type Structure Module in a Coastal Ocean Model to Assess the Hydrodynamic Impact of Seawater Uranium Extraction Technology

    Directory of Open Access Journals (Sweden)

    Taiping Wang

    2014-02-01

    Full Text Available With the rapid growth of global energy demand, interest in extracting uranium from seawater for nuclear energy has been renewed. While extracting seawater uranium is not yet commercially viable, it serves as a “backstop” to conventional uranium resources and provides an essentially unlimited supply of uranium. With recent technology advances, extracting uranium from seawater could be economically feasible only if the extraction devices are deployed at a large scale (e.g., several hundred km2). There is concern, however, that the large-scale deployment of adsorbent farms could affect the hydrodynamic flow field in an oceanic setting. In this study, a kelp-type structure module based on the classic momentum sink approach was incorporated into a coastal ocean model to simulate the blockage effect of a farm of passive uranium extraction devices on the flow field. The module was quantitatively validated against laboratory flume experiments for both velocity and turbulence profiles. Model results suggest that the reduction in ambient currents could range from 4% to 10% for the adsorbent farm dimensions and mooring densities previously described in the literature and typical drag coefficients.
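    The classic momentum sink approach mentioned above represents the submerged structure as a quadratic drag term added to the momentum equation. A minimal sketch of that term and its decelerating effect on a uniform current; the drag coefficient and frontal-area density here are illustrative, not the values calibrated in the paper:

```python
import numpy as np

def momentum_sink(u, cd=1.0, a=0.05):
    """Quadratic-drag momentum sink for a submerged canopy:
    F = -0.5 * Cd * a * |u| * u   (per unit mass),
    where Cd is a drag coefficient and a is the frontal area per unit
    volume. Both values are illustrative assumptions."""
    return -0.5 * cd * a * np.abs(u) * u

# Explicit time stepping of a uniform current decelerated by the canopy.
u, dt = 0.5, 1.0                      # m/s initial current, s time step
for _ in range(600):                  # 10 minutes of deceleration
    u = u + dt * momentum_sink(u)
print(u)                              # reduced ambient current
```

    In the actual coastal ocean model, this sink is applied only in grid cells occupied by the kelp-like structures, so the flow partly diverts around the farm rather than simply decaying.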

  18. Chinese character recognition based on Gabor feature extraction and CNN

    Science.gov (United States)

    Xiong, Yudian; Lu, Tongwei; Jiang, Yongyuan

    2018-03-01

    As an important application in the fields of text line recognition and office automation, Chinese character recognition has become an important subject of pattern recognition. However, due to the large number of Chinese characters and the complexity of their structure, Chinese character recognition presents great difficulty. To solve this problem, this paper proposes a method for printed Chinese character recognition based on Gabor feature extraction and a Convolutional Neural Network (CNN). The main steps are preprocessing, feature extraction, and training/classification. First, the gray-scale Chinese character image is binarized and normalized to reduce the redundancy of the image data. Second, each image is convolved with Gabor filters of different orientations, and feature maps for eight orientations of the Chinese characters are extracted. Third, the feature maps from the Gabor filters and the original image are convolved with learned kernels, and the results of the convolution are the input of the pooling layer. Finally, the feature vector is used for classification and recognition. In addition, the generalization capacity of the network is improved by the Dropout technique. The experimental results show that this method can effectively extract the characteristics of Chinese characters and recognize them.
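    The eight-orientation Gabor filter bank used in the second step can be generated in a few lines of NumPy; all filter parameters (kernel size, sigma, wavelength, aspect ratio) are illustrative choices, not those tuned in the paper:

```python
import numpy as np

def gabor_kernel(theta, ksize=15, sigma=3.0, lambd=6.0, gamma=0.5):
    """Real part of a Gabor filter at orientation theta (illustrative params)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * xr / lambd)

# Eight orientations, as in the pipeline described above.
thetas = [k * np.pi / 8.0 for k in range(8)]
bank = np.stack([gabor_kernel(t) for t in thetas])
print(bank.shape)   # → (8, 15, 15)
```

    Convolving the normalized character image with each kernel yields the eight orientation feature maps that, together with the original image, feed the CNN's first convolutional layer.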

  19. Comparisons of feature extraction algorithm based on unmanned aerial vehicle image

    Directory of Open Access Journals (Sweden)

    Xi Wenfei

    2017-07-01

    Full Text Available Feature point extraction technology has become a research hotspot in photogrammetry and computer vision. The commonly used point feature extraction operators are the SIFT operator, Forstner operator, Harris operator and Moravec operator, among others. With its high spatial resolution, unmanned aerial vehicle (UAV) imagery differs from traditional aerial imagery. Based on these characteristics of UAV images, this paper uses the operators referred to above to extract feature points from building images, grassland images, shrubbery images, and vegetable greenhouse images. Through practical case analysis, the performance, advantages, disadvantages and adaptability of each algorithm are compared and analyzed by considering their speed and accuracy. Finally, suggestions on how to adapt the different algorithms to diverse environments are proposed.
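    Of the operators compared above, the Harris operator is the simplest to sketch: it scores each pixel with R = det(M) - k·trace(M)², where M is the locally smoothed structure tensor of the image gradients. A pure-NumPy illustration on a synthetic image (smoothing window and k are common textbook choices, not the paper's settings):

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k*trace(M)^2 from the structure
    tensor M of image gradients (3x3 box smoothing; parameters illustrative)."""
    iy, ix = np.gradient(img.astype(float))
    ixx, iyy, ixy = ix * ix, iy * iy, ix * iy

    def box3(a):                      # 3x3 box filter via padded shifts
        p = np.pad(a, 1, mode="edge")
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    sxx, syy, sxy = box3(ixx), box3(iyy), box3(ixy)
    return sxx * syy - sxy**2 - k * (sxx + syy)**2

# Synthetic image: a bright square; the response should peak at its corners
# and go negative along its edges.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
peak = np.unravel_index(np.argmax(R), R.shape)
print(peak)
```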

  20. Lane Detection in Video-Based Intelligent Transportation Monitoring via Fast Extracting and Clustering of Vehicle Motion Trajectories

    Directory of Open Access Journals (Sweden)

    Jianqiang Ren

    2014-01-01

    Full Text Available Lane detection is a crucial process in video-based transportation monitoring systems. This paper proposes a novel method to detect the lane center via rapid extraction and high-accuracy clustering of vehicle motion trajectories. First, we use the activity map to automatically realize the extraction of the road region, the calibration of the dynamic camera, and the setting of three virtual detecting lines. Secondly, the three virtual detecting lines and a local background model with traffic flow feedback are used to extract and group vehicle feature points on a per-vehicle basis. Then, the feature point groups are described accurately by an edge-weighted dynamic graph and modified by a motion-similarity Kalman filter during sparse feature point tracking. After obtaining the vehicle trajectories, a rough k-means incremental clustering with the Hausdorff distance is designed to realize rapid online extraction of the lane center with high accuracy. The use of rough sets effectively reduces the accuracy loss that results from irregularly running trajectories. Experimental results prove that the proposed method can detect the lane center position efficiently, that the time spent on subsequent tasks can be reduced appreciably, and that the safety of traffic surveillance systems can be enhanced significantly.
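    The Hausdorff distance used as the clustering metric above measures how far two trajectories are apart at their worst-matched points, so trajectories in the same lane score low and those in different lanes score high. A minimal sketch with invented trajectories:

```python
import numpy as np

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two trajectories given as
    (n, 2) arrays of sampled (x, y) points."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Three toy trajectories: two in one lane (y near 0), one in another (y = 3).
t = np.linspace(0.0, 10.0, 20)
traj1 = np.stack([t, 0.1 * np.ones_like(t)], axis=1)
traj2 = np.stack([t, -0.1 * np.ones_like(t)], axis=1)
traj3 = np.stack([t, 3.0 * np.ones_like(t)], axis=1)

print(hausdorff(traj1, traj2))   # small: same lane
print(hausdorff(traj1, traj3))   # large: different lanes
```

    Thresholding this distance is what lets an incremental k-means-style procedure assign incoming trajectories to existing lane clusters or spawn new ones online.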

  1. Automatic Extraction of Urban Built-Up Area Based on Object-Oriented Method and Remote Sensing Data

    Science.gov (United States)

    Li, L.; Zhou, H.; Wen, Q.; Chen, T.; Guan, F.; Ren, B.; Yu, H.; Wang, Z.

    2018-04-01

    Built-up areas mark the use of urban construction land in different periods of development; their accurate extraction is key to studies of the changes of urban expansion. This paper studies a technology for the automatic extraction of urban built-up areas based on an object-oriented method and remote sensing data, and realizes the automatic extraction of the main built-up area of a city, which greatly saves manpower. First, construction land is extracted with the object-oriented method; the main technical steps include: (1) multi-resolution segmentation; (2) feature construction and selection; (3) information extraction of construction land based on a rule set. The characteristic parameters used in the rule set mainly include the mean of the red band (Mean R), the Normalized Difference Vegetation Index (NDVI), the ratio of residential index (RRI), and the mean of the blue band (Mean B); through the combination of these characteristic parameters, the construction land information can be extracted. Based on the degree of adaptability, distance and area of the object domain, the urban built-up area can then be quickly and accurately delineated from the construction land information without depending on other data or expert knowledge, achieving automatic extraction of the urban built-up area. In this paper, Beijing is used as the experimental area for the method, and the results show that the built-up area is extracted automatically with a boundary accuracy of 2359.65 m, meeting the requirements. The automatic extraction of urban built-up areas is highly practical and can be applied to the monitoring of changes in the main built-up area of a city.
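    A rule set over per-object spectral features, as in step (3) above, amounts to a conjunction of threshold tests. The sketch below uses the feature names from the abstract, but the objects and all threshold values are hypothetical:

```python
# Hypothetical per-object features after segmentation (names follow the
# abstract: Mean_R, NDVI, RRI, Mean_B); all values and thresholds are
# illustrative only, not the paper's calibrated rule set.
objects = [
    {"Mean_R": 110, "NDVI": 0.05, "RRI": 1.4, "Mean_B": 105},  # built-up-like
    {"Mean_R": 60,  "NDVI": 0.55, "RRI": 0.7, "Mean_B": 40},   # vegetation-like
    {"Mean_R": 95,  "NDVI": 0.10, "RRI": 1.2, "Mean_B": 98},   # built-up-like
]

def is_construction(obj, ndvi_max=0.2, rri_min=1.0, mean_r_min=80):
    """Rule set: low NDVI (non-vegetation), a high residential index and a
    bright red band jointly flag construction-land objects."""
    return (obj["NDVI"] < ndvi_max and obj["RRI"] >= rri_min
            and obj["Mean_R"] >= mean_r_min)

flags = [is_construction(o) for o in objects]
print(flags)   # → [True, False, True]
```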

  2. Validating Cross-Perspective Topic Modeling for Extracting Political Parties' Positions from Parliamentary Proceedings

    NARCIS (Netherlands)

    van der Zwaan, J.M.; Marx, M.; Kamps, J.; Kaminka, G.A.; Fox, M.; Bouquet, P.; Hüllermeyer, E.; Dignum, V.; Dignum, F.; van Harmelen, F.

    2016-01-01

    In the literature, different topic models have been introduced that target the task of viewpoint extraction. Because, generally, these studies do not present thorough validations of the models they introduce, it is not clear in advance which topic modeling technique will work best for our use case

  3. Optimization of microwave-assisted extraction of total extract, stevioside and rebaudioside-A from Stevia rebaudiana (Bertoni) leaves, using response surface methodology (RSM) and artificial neural network (ANN) modelling.

    Science.gov (United States)

    Ameer, Kashif; Bae, Seong-Woo; Jo, Yunhee; Lee, Hyun-Gyu; Ameer, Asif; Kwon, Joong-Ho

    2017-08-15

    Stevia rebaudiana (Bertoni) contains stevioside and rebaudioside-A (Reb-A). We compared response surface methodology (RSM) and artificial neural network (ANN) modelling for their estimation and predictive capabilities in building effective models with maximum responses. A 5-level, 3-factor central composite design was used to optimize microwave-assisted extraction (MAE) to obtain maximum yields of the target responses as a function of extraction time (X1: 1-5 min), ethanol concentration (X2: 0-100%) and microwave power (X3: 40-200 W). Maximum values of the three output parameters, 7.67% total extract yield, 19.58 mg/g stevioside yield, and 15.3 mg/g Reb-A yield, were obtained under the optimum extraction conditions of X1 = 4 min, X2 = 75%, and X3 = 160 W. The ANN model demonstrated higher efficiency than the RSM model. Hence, RSM can demonstrate the interaction effects of the inherent MAE parameters on the target responses, whereas ANN can reliably model the MAE process with better predictive and estimation capabilities. Copyright © 2017. Published by Elsevier Ltd.
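    The RSM side of such a comparison fits a full second-order polynomial (intercept, linear, squared and two-factor interaction terms) to the designed experiments by least squares. A self-contained sketch with synthetic stand-in data for the three factors (time, ethanol, power), not the paper's measurements:

```python
import numpy as np

def quadratic_design_matrix(X):
    """Full second-order model in 3 factors: intercept, linear, squared
    and two-factor interaction terms (10 columns), as used in RSM."""
    x1, x2, x3 = X.T
    return np.column_stack([
        np.ones(len(X)), x1, x2, x3,
        x1**2, x2**2, x3**2,
        x1 * x2, x1 * x3, x2 * x3,
    ])

# Hypothetical (time, ethanol, power) settings and noiseless yields: a
# stand-in for central-composite-design data, with invented coefficients.
rng = np.random.default_rng(1)
X = rng.uniform([1, 0, 40], [5, 100, 200], size=(20, 3))
true_beta = np.array([1.0, 0.8, 0.05, 0.01, -0.1, -0.0004,
                      -0.00002, 0.001, 0.0005, 0.0001])
y = quadratic_design_matrix(X) @ true_beta

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
pred = quadratic_design_matrix(X) @ beta
print(np.max(np.abs(pred - y)))          # near zero: surface recovered
```

    An ANN replaces the fixed quadratic basis with learned nonlinearities, which is why it can track curvature the second-order polynomial cannot, at the cost of losing the explicit interaction coefficients RSM provides.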

  4. The use of carrier RNA to enhance DNA extraction from microfluidic-based silica monoliths.

    Science.gov (United States)

    Shaw, Kirsty J; Thain, Lauren; Docker, Peter T; Dyer, Charlotte E; Greenman, John; Greenway, Gillian M; Haswell, Stephen J

    2009-10-12

    DNA extraction was carried out on silica-based monoliths within a microfluidic device. Solid-phase DNA extraction methodology was applied, in which the DNA binds to silica in the presence of a chaotropic salt, such as guanidine hydrochloride, and is eluted in a low ionic strength solution, such as water. The addition of poly-A carrier RNA to the chaotropic salt solution resulted in a marked increase in the effective amount of DNA that could be recovered (25 ng) compared to the absence of RNA (5 ng) using the silica-based monolith. These findings confirm that techniques utilising nucleic acid carrier molecules can enhance DNA extraction methodologies in microfluidic applications.

  5. Model and simulation of a vacuum sieve tray for T extraction from liquid PbLi breeding blankets

    International Nuclear Information System (INIS)

    Mertens, M.A.J.; Demange, D.; Frances, L.

    2016-01-01

    Highlights: • A simulation tool was developed to analyse, optimise and scale up VST set-ups. • This tool predicts that efficiencies higher than 90% can be reached. • Upscaling to DEMO breeding blanket flow rates results in feasibly sized designs. - Abstract: Tritium self-sufficiency within a nuclear fusion reactor is necessary to demonstrate nuclear fusion as a viable source of energy. Tritium can be produced within liquid eutectic PbLi but then has to be extracted to be refuelled to the plasma. The vacuum sieve tray (VST) method is based on the extraction of tritium from millimetre-scale oscillating PbLi droplets falling inside a vacuum chamber. A simulation tool was developed to describe the fluid dynamics occurring along the PbLi flow, and it was used to study the influence of the different geometrical and operational parameters on VST performance. The simulation predicts that extraction efficiencies over 90% can easily be reached, in accordance with theory and previous experimental results. The size of the VST extraction unit for a fusion reactor is estimated based on the findings from the single-nozzle model, assuming no T reabsorption, and is found to be in the feasible range. Nevertheless, two approaches are discussed which may further reduce this size by up to 90%. The simulation tool proved to be an easy and powerful way to analyse and optimise VST set-ups at any scale.
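    A useful back-of-envelope check on droplet extraction efficiency is the classic series solution for diffusion out of a rigid sphere with a zero-concentration surface. This is a simplified stand-in for the paper's fluid-dynamic model (it ignores the internal oscillations that enhance transport), and the diffusivity below is chosen purely for demonstration, not a PbLi property:

```python
import numpy as np

def extraction_efficiency(D, R, t, nmax=200):
    """Fraction of dissolved gas released from a rigid sphere of radius R
    after residence time t (diffusion with a zero-concentration surface):
        eta = 1 - (6/pi^2) * sum_n (1/n^2) * exp(-n^2 pi^2 D t / R^2)
    """
    n = np.arange(1, nmax + 1)
    series = np.sum(np.exp(-(n * np.pi / R) ** 2 * D * t) / n**2)
    return 1.0 - (6.0 / np.pi**2) * series

# Illustrative numbers: 1 mm radius droplet, two candidate fall times, and
# an assumed effective diffusivity of 1e-6 m^2/s.
D, R = 1.0e-6, 1.0e-3
print(extraction_efficiency(D, R, 0.1))   # shorter fall: partial release
print(extraction_efficiency(D, R, 0.5))   # longer fall: near-complete release
```

    The strong dependence on D t / R² is what makes nozzle diameter and fall height the dominant design parameters of a VST unit.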

  6. The application of SVR model in the improvement of QbD: a case study of the extraction of podophyllotoxin.

    Science.gov (United States)

    Zhai, Chun-Hui; Xuan, Jian-Bang; Fan, Hai-Liu; Zhao, Teng-Fei; Jiang, Jian-Lan

    2018-05-03

    To further optimize process design by increasing the stability of the design space, we introduced the Support Vector Regression (SVR) model. In this work, the extraction of podophyllotoxin was researched as a case study based on Quality by Design (QbD). We compared the fitting performance of SVR and the quadratic polynomial model (QPM) most commonly used in QbD, and analysed the two design spaces obtained by SVR and QPM. As a result, SVR outperformed QPM in prediction accuracy, model stability and generalization ability. The introduction of SVR into QbD made the extraction process of podophyllotoxin well designed and easier to control. The better fitting performance of SVR improved the application effect of QbD, and the universal applicability of SVR, especially for non-linear, complicated and weakly regular problems, widens the application field of QbD.

  7. Object Tracking Using Adaptive Covariance Descriptor and Clustering-Based Model Updating for Visual Surveillance

    Directory of Open Access Journals (Sweden)

    Lei Qin

    2014-05-01

    Full Text Available We propose a novel approach for tracking an arbitrary object in video sequences for visual surveillance. The first contribution of this work is an automatic feature extraction method that is able to extract compact discriminative features from a feature pool before computing the region covariance descriptor. As the feature extraction method is adaptive to a specific object of interest, we refer to the region covariance descriptor computed using the extracted features as the adaptive covariance descriptor. The second contribution is a weakly supervised method for updating the object appearance model during tracking. The method performs a mean-shift clustering procedure among the tracking result samples accumulated over a period of time and selects a group of reliable samples for updating the object appearance model. As such, the object appearance model is kept up to date and is prevented from contamination even in the case of tracking mistakes. We conducted comparative experiments on real-world video sequences, which confirmed the effectiveness of the proposed approaches. The tracking system that integrates the adaptive covariance descriptor and the clustering-based model updating method accomplished stable object tracking on challenging video sequences.
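    The region covariance descriptor at the core of the method is simply the covariance matrix of per-pixel feature vectors over a region. The sketch below uses a common fixed feature set [x, y, I, |Ix|, |Iy|] on a random patch; the paper's contribution is selecting these features adaptively from a larger pool:

```python
import numpy as np

def region_covariance(region):
    """Region covariance descriptor: covariance of per-pixel feature
    vectors [x, y, I, |Ix|, |Iy|] over an image region (a common fixed
    choice; the adaptive descriptor selects features per object)."""
    h, w = region.shape
    y, x = np.mgrid[0:h, 0:w]
    gy, gx = np.gradient(region.astype(float))
    feats = np.stack([x.ravel().astype(float), y.ravel().astype(float),
                      region.ravel().astype(float),
                      np.abs(gx).ravel(), np.abs(gy).ravel()])
    return np.cov(feats)            # 5x5 symmetric positive semi-definite

rng = np.random.default_rng(0)
patch = rng.uniform(0.0, 255.0, (16, 16))
C = region_covariance(patch)
print(C.shape)   # → (5, 5)
```

    Because the descriptor is a fixed-size symmetric matrix regardless of region size, candidate windows of different scales can be compared directly, which is what makes it convenient for tracking.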

  8. INFLUENCE OF HERBAL EXTRACTS ON METABOLIC DISTURBANCES IN DIABETES MELLITUS AND INSULIN RESISTANCE MODEL

    Directory of Open Access Journals (Sweden)

    T. V. Yakimova

    2015-01-01

    Full Text Available The aim of this research was to assess the influence of herbal extracts, used with diets of different fat content, on metabolic processes in a diabetes mellitus and insulin resistance model. Material and methods. The experiments were performed on 90 noninbred male albino rats. Diabetes mellitus was modeled with twice-repeated intraperitoneal streptozotocin (30 mg/kg) injections. For insulin resistance formation, animals were fed a meal with 30% fat content. Against this background, rats were administered intragastric nettle leaf (Urtica dioica L., 100 mg/kg) or burdock root (Arctium lappa L., 25 mg/kg) extracts, or the intraperitoneal insulin preparation Actrapid HM Penfill (3 mg/kg), daily for 10 days. During the period of agent administration, one half of the animals continued to receive food with high fat content; the other half received a diet with 8% fat content. A third group of rats received only food with low fat content, without extract or insulin administration. Glucose, glycosylated hemoglobin, creatinine, urea and uric acid content were measured in blood; glycogen and protein content and aminotransferase and glucose-6-phosphatase activity in liver homogenates; and glycogen and protein content in muscle homogenates. Results. After streptozotocin injections and a diet with 30% fat content, the blood glucose level became 4.0–5.3-fold higher than in intact animals; hemoglobin glycosylation and blood creatinine, urea and uric acid content increased; glycogen content rose and protein quantity decreased in liver and muscle homogenates; and aminotransferase and glucose-6-phosphatase activity increased in liver homogenates. In animals fed only the 8% fat diet, hyperglycemia, creatinine blood retention and liver glycogen content diminished and liver protein resources recovered. Administration of the nettle or burdock extracts to animals that continued to receive the high fat meal decreased the blood glucose, glycosylated hemoglobin and creatinine content, the liver

  9. A MODELLING APPROACH TO EXTRA VIRGIN OLIVE OIL EXTRACTION

    Directory of Open Access Journals (Sweden)

    Marco Daou

    2007-12-01

    Full Text Available The present work describes a feasibility assessment of a new approach to virgin olive oil production control systems. A predicting or simulating algorithm is implemented as artificial neural network-based software, using data found in the literature for parameters related to the olive grove, the process, and the machine. Testing and validation proved this tool is able to answer two questions frequently asked by olive oil mill operators, using few agronomic and technological parameters and saving time and cost: – which quality level can be expected for oil extracted from a defined olive lot following a defined process (predicting mode); – which set of process and machine parameters would yield the highest quality level for oil extracted from a defined olive lot (simulating mode).

  10. Novel Ontologies-based Optical Character Recognition-error Correction Cooperating with Graph Component Extraction

    Directory of Open Access Journals (Sweden)

    Sarunya Kanjanawattana

    2017-01-01

    Full Text Available Extracting graph information clearly benefits readers who are interested in interpreting graphs, because significant information presented in a graph can thereby be obtained. A typical tool used to transform image-based characters into computer-editable characters is optical character recognition (OCR). Unfortunately, OCR cannot guarantee perfect results, because it is sensitive to noise and input quality. This becomes a serious problem because misrecognition conveys misleading information to readers. In this study, we present a novel method for OCR-error correction on bar graphs using semantics, such as ontologies and dependency parsing. Moreover, we used the graph component extraction proposed in our previous study to omit irrelevant parts from graph components; it was applied to clean and prepare input data for the OCR-error correction. The main objectives of this paper are to extract significant information from graphs using OCR and to correct OCR errors using semantics. As a result, our method provided remarkable performance, with the highest accuracies and F-measures. Moreover, we found that our input data contained less noise thanks to the efficiency of our graph component extraction. Based on this evidence, we conclude that our solution to the OCR problem achieves its objectives.

  11. Distributed Classification of Localization Attacks in Sensor Networks Using Exchange-Based Feature Extraction and Classifier

    Directory of Open Access Journals (Sweden)

    Su-Zhe Wang

    2016-01-01

    Full Text Available Secure localization under different forms of attack has become an essential task in wireless sensor networks. Despite significant research efforts in detecting malicious nodes, the problem of localization attack type recognition has not yet been well addressed. Motivated by this concern, we propose a novel exchange-based attack classification algorithm. This is achieved by a distributed expectation maximization extractor integrated with the PECPR-MKSVM classifier. First, the mixed distribution features based on probabilistic modeling are extracted using a distributed expectation maximization algorithm. After feature extraction, by introducing theory from the support vector machine, an extensive contractive Peaceman-Rachford splitting method is derived to build the distributed classifier that diffuses the iterative calculation among neighbor sensors. To verify the efficiency of the distributed recognition scheme, four groups of experiments were carried out under various conditions. The average success rate of the proposed classification algorithm for external attacks is excellent, reaching about 93.9% in some cases. These test results demonstrate that the proposed algorithm produces a much higher recognition rate and is more robust and efficient even in the presence of an excessive malicious scenario.

  12. A Continuous-Exchange Cell-Free Protein Synthesis System Based on Extracts from Cultured Insect Cells

    Science.gov (United States)

    Stech, Marlitt; Quast, Robert B.; Sachse, Rita; Schulze, Corina; Wüstenhagen, Doreen A.; Kubick, Stefan

    2014-01-01

    In this study, we present a novel technique for the synthesis of complex prokaryotic and eukaryotic proteins by using a continuous-exchange cell-free (CECF) protein synthesis system based on extracts from cultured insect cells. Our approach consists of two basic elements: First, protein synthesis is performed in insect cell lysates which harbor endogenous microsomal vesicles, enabling a translocation of de novo synthesized target proteins into the lumen of the insect vesicles or, in the case of membrane proteins, their embedding into a natural membrane scaffold. Second, cell-free reactions are performed in a two-chamber dialysis device for 48 h. The combination of the eukaryotic cell-free translation system based on insect cell extracts and the CECF translation system results in significantly prolonged reaction life times and increased protein yields compared to conventional batch reactions. In this context, we demonstrate the synthesis of various representative model proteins, among them cytosolic proteins, pharmacologically relevant membrane proteins and glycosylated proteins in an endotoxin-free environment. Furthermore, the cell-free system used in this study is well-suited for the synthesis of biologically active tissue-type plasminogen activator, a complex eukaryotic protein harboring multiple disulfide bonds. PMID:24804975

  13. Protective effects of seahorse extracts in a rat castration and testosterone-induced benign prostatic hyperplasia model and mouse oligospermatism model.

    Science.gov (United States)

    Xu, Dong-Hui; Wang, Li-Hong; Mei, Xue-Ting; Li, Bing-Ji; Lv, Jun-Li; Xu, Shi-Bo

    2014-03-01

    This study investigated the effects of seahorse (Hippocampus spp.) extracts in a rat model of benign prostatic hyperplasia (BPH) and mouse model of oligospermatism. Compared to the sham operated group, castration and testosterone induced BPH, indicated by increased penile erection latency; decreased penis nitric oxide synthase (NOS) activity; reduced serum acid phosphatase (ACP) activity; increased prostate index; and epithelial thickening, increased glandular perimeter, increased proliferating cell nuclear antigen (PCNA) index and upregulation of basic fibroblast growth factor (bFGF) in the prostate. Seahorse extracts significantly ameliorated the histopathological changes associated with BPH, reduced the latency of penile erection and increased penile NOS activity. Administration of seahorse extracts also reversed epididymal sperm viability and motility in mice treated with cyclophosphamide (CP). Seahorse extracts have potential as a candidate marine drug for treating BPH without inducing the side effects of erectile dysfunction (ED) or oligospermatism associated with the BPH drug finasteride. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Quantitative extraction of the bedrock exposure rate based on unmanned aerial vehicle data and Landsat-8 OLI image in a karst environment

    Science.gov (United States)

    Wang, Hongyan; Li, Qiangzi; Du, Xin; Zhao, Longcai

    2017-12-01

    In the karst regions of southwest China, rocky desertification is one of the most serious problems in land degradation. The bedrock exposure rate is an important index to assess the degree of rocky desertification in karst regions. Because of its inherent merits of macro-scale coverage, frequency, efficiency, and synthesis, remote sensing is a promising method to monitor and assess karst rocky desertification on a large scale. However, actual measurement of the bedrock exposure rate is difficult, and existing remote-sensing methods cannot directly be exploited to extract the bedrock exposure rate owing to the high complexity and heterogeneity of karst environments. Therefore, using unmanned aerial vehicle (UAV) and Landsat-8 Operational Land Imager (OLI) data for Xingren County, Guizhou Province, a method for quantitative extraction of the bedrock exposure rate based on multi-scale remote-sensing data was developed. Firstly, we used an object-oriented method to carry out accurate classification of UAV images. From the results of rock extraction, the bedrock exposure rate was calculated at the 30 m grid scale. Part of the calculated samples was used as training data; the other data were used for model validation. Secondly, in each grid the band reflectivity of Landsat-8 OLI data was extracted and a variety of rock and vegetation indexes (e.g., NDVI and SAVI) were calculated. Finally, a network model was established to extract the bedrock exposure rate. The correlation coefficient of the network model was 0.855, that of the validation model was 0.677, and the root mean square error of the validation model was 0.073. This method is valuable for wide-scale estimation of the bedrock exposure rate in karst environments. Using the quantitative inversion model, a distribution map of the bedrock exposure rate in Xingren County was obtained.
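The vegetation indexes mentioned above are simple band ratios computed from the red and near-infrared reflectance. A minimal sketch (the band values and the SAVI soil factor L = 0.5 are illustrative assumptions, not values from the record):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + 1e-12)

def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index; L=0.5 is the conventional soil factor."""
    return (1.0 + L) * (nir - red) / (nir + red + L)

# Toy reflectance values for four 30 m grid cells
# (Landsat-8 OLI band 4 = red, band 5 = near-infrared)
red = np.array([[0.10, 0.12], [0.30, 0.28]])
nir = np.array([[0.45, 0.50], [0.32, 0.30]])
print(ndvi(nir, red))   # high for vegetated pixels, near zero for exposed rock
print(savi(nir, red))
```

Low index values flag cells dominated by exposed bedrock, which is why such indexes serve as inputs to the grid-scale inversion model.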

  15. Solvent-free microwave extraction of essential oil from Melaleuca leucadendra L.

    Directory of Open Access Journals (Sweden)

    Widya Ismanto Aviarina

    2018-01-01

    Full Text Available Cajuput (Melaleuca leucadendra L.) oil is a potentially important commodity for the country’s foreign exchange, but these essential oils are still extracted by conventional methods such as hydrodistillation, which take a long time to produce essential oil of good quality. It is therefore necessary to optimize the extraction process using a more effective and efficient method. In this study, extraction was carried out using the solvent-free microwave extraction method, which is considered more effective and efficient than conventional methods. The optimum yield in the extraction of cajuput oil using solvent-free microwave extraction is 1.0674%, obtained at a feed-to-distiller (F/D) ratio of 0.12 g/mL with a microwave power of 400 W. First-order and second-order kinetic modelling was performed for the extraction of cajuput oil by solvent-free microwave extraction. Based on this modelling, the second-order kinetic model (R2 = 0.9901) represents the experimental extraction results better than the first-order kinetic model (R2 = 0.9854).
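The second-order extraction kinetics favored above is commonly fitted in its linearized form t/Ct = 1/(k·Cs²) + t/Cs, so a straight-line fit of t/Ct against t recovers the saturation yield Cs from the slope and the rate constant k from the intercept. A sketch on synthetic data (the rate constant and time points below are assumptions, not the cajuput measurements):

```python
import numpy as np

# Second-order extraction model: Ct = Cs**2 * k * t / (1 + Cs * k * t)
Cs_true, k_true = 1.07, 0.15          # assumed saturation yield (%) and rate constant
t = np.array([5., 10., 20., 30., 45., 60., 90.])   # extraction times (min)
Ct = Cs_true**2 * k_true * t / (1 + Cs_true * k_true * t)

# Linearized form: t/Ct = 1/(k*Cs^2) + t/Cs  ->  slope = 1/Cs
slope, intercept = np.polyfit(t, t / Ct, 1)
Cs_fit = 1.0 / slope
k_fit = 1.0 / (intercept * Cs_fit**2)
print(f"Cs = {Cs_fit:.3f}, k = {k_fit:.3f}")
```

With real yield-versus-time data, the R² of this line is what the record compares against the first-order fit.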

  16. Optical Sensing Material for pH Detection based on the Use of Roselle Extract

    International Nuclear Information System (INIS)

    Nurul Huda Abd Karim; Musa Ahmad; Mohammad Osman Herman; Ahmad Mahir Mokhtar

    2008-01-01

    This research assessed the potential of the natural colour extract of Hibiscus Sabdariffa L. (roselle) as a sensing material. The pH sensor was developed based on the natural reddish colour of the roselle calyx, delphinidin-3-sambubioside, immobilised in a glass fibre filter paper. In free solution, the roselle extract was characterised using a UV-visible spectrophotometer to study the effects of pH, extract concentration, response time, repeatability and photostability. The study showed that the natural colour extract can be used as a sensing material for the development of an optical pH sensor. (author)

  17. Information Extraction of High-Resolution Remotely Sensed Image Based on Multiresolution Segmentation

    Directory of Open Access Journals (Sweden)

    Peng Shao

    2014-08-01

    Full Text Available The principle of multiresolution segmentation is presented in detail in this study, and the Canny algorithm was applied for edge detection of a remotely sensed image based on this principle. The target image was divided into regions based on object-oriented multiresolution segmentation and edge detection. Furthermore, an object hierarchy was created, and a series of features (water bodies, vegetation, roads, residential areas, bare land) and other information were extracted using spectral and geometrical features. The results indicate that edge detection has a positive effect on multiresolution segmentation, and the overall accuracy of information extraction reaches 94.6% according to the confusion matrix.
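Canny edge detection of the kind applied before segmentation can be reproduced with scikit-image. The synthetic scene and parameter values below are illustrative assumptions, not the study's imagery or settings:

```python
import numpy as np
from skimage import feature

# Synthetic "remotely sensed" scene: a bright field (e.g. bare land)
# containing a darker rectangular region (e.g. a water body)
img = np.full((100, 100), 0.8)
img[30:70, 20:60] = 0.2
img += np.random.default_rng(1).normal(0.0, 0.01, img.shape)

# Canny: Gaussian smoothing, gradient, non-maximum suppression, hysteresis
edges = feature.canny(img, sigma=1.5, low_threshold=0.05, high_threshold=0.1)
print(edges.sum(), "edge pixels")   # roughly the rectangle's outline
```

The resulting boolean edge map is the kind of layer that can constrain region boundaries in a subsequent multiresolution segmentation.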

  18. Orthosiphon stamineus Leaf Extract Affects TNF-α and Seizures in a Zebrafish Model

    Directory of Open Access Journals (Sweden)

    Brandon Kar Meng Choo

    2018-02-01

    Full Text Available Epileptic seizures result from abnormal brain activity and can affect motor, autonomic and sensory function, as well as memory, cognition, behavior, and emotional state. Effective anti-epileptic drugs (AEDs) are available but have tolerability issues due to their side effects. The Malaysian herb Orthosiphon stamineus is a traditional epilepsy remedy and possesses anti-inflammatory, anti-oxidant and free-radical scavenging abilities, all of which are known to protect against seizures. This experiment thus aimed to explore whether an ethanolic leaf extract of O. stamineus has the potential to be a novel symptomatic treatment for epileptic seizures in a zebrafish model, and the effects of the extract on the expression levels of several seizure-associated genes in the zebrafish brain. The results of this study indicate that O. stamineus has the potential to be a novel symptomatic treatment for epileptic seizures as it is pharmacologically active against seizures in a zebrafish model. The anti-convulsive effect of this extract is comparable to that of diazepam at higher doses and can surpass diazepam in certain cases. Treatment with the extract also counteracts the upregulation of NF-κB, NPY and TNF-α as a result of a pentylenetetrazol (PTZ)-induced seizure. The anti-convulsive action of this extract could be at least partially due to its downregulation of TNF-α. Future work could include discovery of the active anti-convulsive compound, as well as confirming that the extract does not cause cognitive impairment in zebrafish.

  19. Investigations into the Reusability of Amidoxime-Based Polymeric Adsorbents for Seawater Uranium Extraction

    Energy Technology Data Exchange (ETDEWEB)

    Kuo, Li-Jung [Marine; Pan, Horng-Bin [Department; Wai, Chien M. [Department; Byers, Margaret F. [Nuclear; Schneider, Erich [Nuclear; Strivens, Jonathan E. [Marine; Janke, Christopher J. [Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, Tennessee 37831, United States; Das, Sadananda [Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, Tennessee 37831, United States; Mayes, Richard T. [Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, Tennessee 37831, United States; Wood, Jordana R. [Marine; Schlafer, Nicholas [Marine; Gill, Gary A. [Marine

    2017-09-29

    The ability to re-use amidoxime-based polymeric adsorbents is a critical component in reducing the overall cost of the technology to extract uranium from seawater. This report describes an evaluation of adsorbent reusability over multiple re-use (adsorption/stripping) cycles in real seawater exposures with potassium bicarbonate (KHCO3) elution, using several amidoxime-based polymeric adsorbents. The KHCO3 elution technique achieved ~100% recovery of uranium adsorption capacity in the first re-use. Subsequent re-uses showed significant drops in adsorption capacity. After the 4th re-use with the ORNL AI8 adsorbent, the 56-day adsorption capacity dropped to 28% of its original value. FTIR spectra revealed a conversion of the amidoxime ligands to carboxylate groups during extended seawater exposure, becoming more significant with longer exposure time. Ca and Mg adsorption capacities also increased with each re-use cycle, supporting the hypothesis that long-term exposure converted amidoxime to carboxylate, enhancing the adsorption of Ca and Mg. Shorter seawater exposure (adsorption/stripping) cycles (28 vs. 42 days) had higher adsorption capacities after re-use, but the shorter cycle time did not produce better overall performance in terms of cumulative exposure time. Recovery of uranium capacity on re-use may also vary across adsorbent formulations. Through multiple re-uses, the AI8 adsorbent can harvest 10 g uranium/kg adsorbent in ~140 days using a 28-day adsorption/stripping cycle, a performance much better than would be achieved with a single, very long-term use of the adsorbent (saturation capacity = 7.4 g U/kg adsorbent). A time-dependent seawater exposure model was developed to evaluate the costs associated with reusing amidoxime-based adsorbents in real seawater exposures. The predicted cost to extract uranium from seawater ranged from $610 to $830/kg U.
Model simulation suggests that a short

  20. Extraction optimization and pixel-based chemometric analysis of semi-volatile organic compounds in groundwater

    DEFF Research Database (Denmark)

    Christensen, Peter; Tomasi, Giorgio; Kristensen, Mette

    2017-01-01

    In this study, we tested the combination of solid phase extraction (SPE) with dispersive liquid-liquid microextraction (DLLME), or with stir bar sorptive extraction (SBSE), as an extraction method for semi-VOCs in groundwater. Combining SPE with DLLME or SBSE resulted in better separation of peaks in an unresolved complex mixture. SPE-DLLME was chosen as the preferred extraction method. SPE-DLLME covered a larger polarity range (logKo/w 2.0-11.2), had higher extraction efficiency at logKo/w 2.0-3.8 and 5.8-11.2, and was faster compared to SPE-SBSE. SPE-DLLME extraction combined with chemical analysis by gas chromatography-mass spectrometry (GC-MS) and pixel-based data analysis of summed extraction ion chromatograms (sEICs) was tested as a new method for chemical fingerprinting of semi-VOCs in 15 groundwater samples. The results demonstrate that SPE-DLLME-GC-MS provides an excellent compromise between compound...

  1. Validation of a DNA IQ-based extraction method for TECAN robotic liquid handling workstations for processing casework.

    Science.gov (United States)

    Frégeau, Chantal J; Lett, C Marc; Fourney, Ron M

    2010-10-01

    A semi-automated DNA extraction process for casework samples based on the Promega DNA IQ™ system was optimized and validated on TECAN Genesis 150/8 and Freedom EVO robotic liquid handling stations configured with fixed tips and a TECAN TE-Shake™ unit. The use of an orbital shaker during the extraction process promoted efficiency with respect to DNA capture, magnetic bead/DNA complex washes and DNA elution. Validation studies determined the reliability and limitations of this shaker-based process. Reproducibility with regards to DNA yields for the tested robotic workstations proved to be excellent and not significantly different than that offered by the manual phenol/chloroform extraction. DNA extraction of animal:human blood mixtures contaminated with soil demonstrated that a human profile was detectable even in the presence of abundant animal blood. For exhibits containing small amounts of biological material, concordance studies confirmed that DNA yields for this shaker-based extraction process are equivalent or greater to those observed with phenol/chloroform extraction as well as our original validated automated magnetic bead percolation-based extraction process. Our data further supports the increasing use of robotics for the processing of casework samples. Crown Copyright © 2009. Published by Elsevier Ireland Ltd. All rights reserved.

  2. Kernel-based discriminant feature extraction using a representative dataset

    Science.gov (United States)

    Li, Honglin; Sancho Gomez, Jose-Luis; Ahalt, Stanley C.

    2002-07-01

    Discriminant Feature Extraction (DFE) is widely recognized as an important pre-processing step in classification applications. Most DFE algorithms are linear and thus can only explore the linear discriminant information among the different classes. Recently, there have been several promising attempts to develop nonlinear DFE algorithms, among which is Kernel-based Feature Extraction (KFE). The efficacy of KFE has been experimentally verified on both synthetic data and real problems. However, KFE has some known limitations. First, KFE does not work well for strongly overlapped data. Second, KFE employs all of the training set samples during the feature extraction phase, which can result in significant computation when applied to very large datasets. Finally, KFE can result in overfitting. In this paper, we propose a substantial improvement to KFE that overcomes the above limitations by using a representative dataset, which consists of critical points generated by data-editing techniques and centroid points determined using the Frequency Sensitive Competitive Learning (FSCL) algorithm. Experiments show that this new KFE algorithm performs well on significantly overlapped datasets, and it also reduces computational complexity. Further, by controlling the number of centroids, the overfitting problem can be effectively alleviated.
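A kernel-based nonlinear feature extractor in the same family can be sketched with scikit-learn's KernelPCA. This is a related method for illustration, not the KFE objective of the record, and the representative-dataset construction via data editing and FSCL is omitted:

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

# Two classes that are not linearly separable in the input space
X, y = make_circles(n_samples=300, factor=0.3, noise=0.05, random_state=0)

# Nonlinear feature extraction with an RBF kernel (gamma chosen for illustration)
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=5.0)
Z = kpca.fit_transform(X)
print(Z.shape)   # (300, 2)

# The kernel feature space reflects the radial class structure,
# which a linear extractor cannot capture
print(Z[y == 1, 0].mean(), Z[y == 0, 0].mean())
```

Like KFE, kernel PCA operates on pairwise kernel evaluations over the training set, which is exactly the computational burden the record's representative dataset is designed to reduce.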

  3. MATHEMATICAL MODELING AND SIMULATION OF SUPERCRITICAL CO2 EXTRACTION OF ZIZIPHORA TENUIOR VOLATILES

    Directory of Open Access Journals (Sweden)

    Bizhan Honarvar

    2016-01-01

    Full Text Available Ziziphora Tenuior is an edible medicinal plant belonging to the Labiatae family. It is often used as a treatment for diseases such as edema, insomnia, and hypertension in Turkey, Iran and China. The main components of the Ziziphora Tenuior essential oil are p-mentha-3-en-8-ol and pulegone. In this study, the extraction of Ziziphora essential oil is described by a two-dimensional mathematical model, and the effects of variations in several extraction parameters on the extraction yield are examined. Among these parameters were fluid flow rate, extractor diameter and length, and mean particle size.

  4. Feature extraction algorithm for space targets based on fractal theory

    Science.gov (United States)

    Tian, Balin; Yuan, Jianping; Yue, Xiaokui; Ning, Xin

    2007-11-01

    In order to offer the potential of extending the life of satellites and reducing launch and operating costs, satellite servicing, including conducting repairs, upgrading and refueling spacecraft on-orbit, will become much more frequent. Future space operations can be executed more economically and reliably using machine vision systems, which can meet the real-time and tracking-reliability requirements of image tracking for space surveillance systems. Machine vision has been applied to relative-pose estimation for spacecraft, for which feature extraction is the basis. This paper presents a fractal-geometry-based edge extraction algorithm that can be used for determining and tracking the relative pose of an observed satellite during proximity operations in a machine vision system. The method computes the fractal-dimension distribution of the gray-level image using the Differential Box-Counting (DBC) approach of fractal theory to restrain noise. After this, consecutive edges are detected using mathematical morphology. The validity of the proposed method is examined by processing and analyzing images of space targets. The edge extraction method not only extracts the outline of the target but also keeps the inner details. Meanwhile, edge extraction is processed only in the moving area, greatly reducing computation. Simulation results compare edge detection using the presented method with other detection methods. The results indicate that the presented algorithm is a valid method for solving the relative-pose problem for spacecraft.
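The Differential Box-Counting step referenced above estimates a fractal dimension by counting, for each box size, how many gray-level boxes are needed to cover the intensity surface, then fitting the log-log slope. A minimal sketch on a synthetic texture (box sizes and image are illustrative; the paper's per-pixel fractal-dimension map and morphology stage are not reproduced):

```python
import numpy as np

def dbc_fractal_dimension(img, sizes=(2, 4, 8, 16)):
    """Estimate the fractal dimension of a gray-level image with
    Differential Box-Counting (in the style of Sarkar & Chaudhuri)."""
    M = img.shape[0]              # assume a square M x M image
    G = 256                       # gray-level range
    log_n, log_inv_s = [], []
    for s in sizes:               # each s must divide M
        h = G * s / M             # box height in gray levels
        n = 0
        for i in range(0, M, s):
            for j in range(0, M, s):
                block = img[i:i + s, j:j + s]
                # boxes of height h needed to cover the surface over this cell
                n += int(block.max() // h) - int(block.min() // h) + 1
        log_n.append(np.log(n))
        log_inv_s.append(np.log(M / s))
    # slope of log N(s) versus log(1/s) is the fractal dimension
    slope, _ = np.polyfit(log_inv_s, log_n, 1)
    return slope

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))   # rough synthetic texture
D = dbc_fractal_dimension(img)
print(f"estimated fractal dimension: {D:.2f}")
```

For an intensity surface the estimate lies between 2 (perfectly smooth) and 3 (maximally rough), which is what lets the fractal-dimension map suppress noise relative to structural edges.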

  5. A graph-Laplacian-based feature extraction algorithm for neural spike sorting.

    Science.gov (United States)

    Ghanbari, Yasser; Spence, Larry; Papamichalis, Panos

    2009-01-01

    Analysis of extracellular neural spike recordings is highly dependent upon the accuracy of neural waveform classification, commonly referred to as spike sorting. Feature extraction is an important stage of this process because it can limit the quality of clustering which is performed in the feature space. This paper proposes a new feature extraction method (which we call Graph Laplacian Features, GLF) based on minimizing the graph Laplacian and maximizing the weighted variance. The algorithm is compared with Principal Components Analysis (PCA, the most commonly-used feature extraction method) using simulated neural data. The results show that the proposed algorithm produces more compact and well-separated clusters compared to PCA. As an added benefit, tentative cluster centers are output which can be used to initialize a subsequent clustering stage.
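A Laplacian-eigenmap-style sketch of graph-based feature extraction for spike waveforms follows. This is related to, but not identical with, the record's GLF objective (which also maximizes a weighted variance); the toy waveforms and Gaussian affinity are assumptions:

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
# Toy "spike waveforms": two clusters of 30-sample traces
t = np.linspace(0.0, 1.0, 30)
spikes = np.vstack([
    np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=(40, 30)),
    np.exp(-((t - 0.3) ** 2) / 0.01) + 0.1 * rng.normal(size=(40, 30)),
])

# Gaussian-weighted affinity matrix and graph Laplacian L = D - W
d2 = cdist(spikes, spikes, "sqeuclidean")
W = np.exp(-d2 / np.median(d2))
L = np.diag(W.sum(axis=1)) - W

# The smallest non-trivial eigenvectors of L give low-dimensional features
vals, vecs = eigh(L)
features = vecs[:, 1:3]          # skip the constant eigenvector
print(features.shape)            # (80, 2)
```

Minimizing the graph Laplacian keeps similar waveforms close in the feature space, which is what yields the compact, well-separated clusters the record reports relative to PCA.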

  6. Extraction of Terraces on the Loess Plateau from High-Resolution DEMs and Imagery Utilizing Object-Based Image Analysis

    Directory of Open Access Journals (Sweden)

    Hanqing Zhao

    2017-05-01

    Full Text Available Terraces are typical artificial landforms on the Loess Plateau, with ecological functions in water and soil conservation, agricultural production, and biodiversity. Recording the spatial distribution of terraces is the basis of monitoring their extent and understanding their ecological effects. The current terrace extraction method relies mainly on high-resolution imagery, but its accuracy is limited because vegetation coverage distorts the features of terraces in imagery. High-resolution topographic data reflecting the morphology of true terrace surfaces are therefore needed. Terrace extraction on the Loess Plateau is challenging because of the complex terrain and the diverse vegetation that followed the implementation of “vegetation recovery”. This study presents an automatic method of extracting terraces based on 1 m resolution digital elevation models (DEMs), with 0.3 m resolution Worldview-3 imagery as auxiliary information, used for object-based image analysis (OBIA). A multi-resolution segmentation method was used in which slope, positive and negative terrain index (PN), accumulative curvature slope (AC), and slope of slope (SOS) were determined as input layers for image segmentation by correlation analysis and the Sheffield entropy method. The main DEM-based classification features were chosen from the terrain features derived from terrain factors and from texture features obtained by gray-level co-occurrence matrix (GLCM) analysis; these features were then determined by importance analysis using classification and regression tree (CART) analysis. Extraction rules based on DEMs were generated from the classification features, with a total classification accuracy of 89.96%. The red and near-infrared bands of the imagery were used to exclude construction land, which is easily confused with small terraces. As a result, the total classification accuracy increased to 94%. The proposed method ensures comprehensive consideration of terrain, texture, shape, and

  7. Image feature extraction based on the camouflage effectiveness evaluation

    Science.gov (United States)

    Yuan, Xin; Lv, Xuliang; Li, Ling; Wang, Xinzhu; Zhang, Zhi

    2018-04-01

    The key step in camouflage effectiveness evaluation is how to combine human visual physiological and psychological features to select effective evaluation indexes. Building on predecessors' comprehensive camouflage evaluation methods, this paper chooses suitable indexes in combination with image quality awareness, and optimizes those indexes against human subjective perception, thus refining the theory of index extraction.

  8. Antinociceptive and anti-inflammatory effects of Urtica dioica leaf extract in animal models.

    Science.gov (United States)

    Hajhashemi, Valiollah; Klooshani, Vahid

    2013-01-01

    This study aimed to examine the antinociceptive and anti-inflammatory effects of Urtica dioica leaf extract in animal models. A hydroalcoholic extract of the plant leaves was prepared by the percolation method. Male Swiss mice (25-35 g) and male Wistar rats (180-200 g) were randomly distributed into control, standard drug, and three experimental groups (n=6 in each group). Acetic acid-induced writhing, the formalin test, and carrageenan-induced paw edema were used to assess the antinociceptive and anti-inflammatory effects. The extract dose-dependently reduced acetic acid-induced abdominal twitches. In the formalin test, the extract at none of the applied doses (100, 200, and 400 mg/kg) could suppress the licking behavior of the first phase, while doses of 200 and 400 mg/kg significantly inhibited the second phase. In the carrageenan test, the extract at a dose of 400 mg/kg significantly inhibited the paw edema by 26%. The results confirm the folkloric use of the plant extract in painful and inflammatory conditions. Further studies are needed to characterize the active constituents and the mechanism of action of the plant extract.

  9. Fatigue Feature Extraction Analysis based on a K-Means Clustering Approach

    Directory of Open Access Journals (Sweden)

    M.F.M. Yunoh

    2015-06-01

    Full Text Available This paper focuses on clustering analysis using a K-means approach for fatigue feature dataset extraction. The aim of this study is to group the scattered dataset as homogeneously as possible. Kurtosis, the wavelet-based energy coefficient and fatigue damage are calculated for all segments after the extraction process using the wavelet transform, and are used as input data for the K-means clustering approach. K-means clustering calculates the average distance of each group from the centroid and gives the objective function values. Based on the results, the maximum value of the objective function is seen for two centroid clusters, with a value of 11.58. The minimum objective function value, 8.06, is found for five centroid clusters. The objective function thus reaches its lowest value when the number of clusters equals five, which is therefore the best clustering for this dataset.
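The K-means objective described above (the sum of squared distances of samples to their assigned centroid, i.e. inertia) can be evaluated across cluster counts with scikit-learn. The synthetic three-feature data stands in for the kurtosis / wavelet-energy / fatigue-damage triples; the blob layout is an assumption:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Toy 3-feature dataset: kurtosis, wavelet energy coefficient, fatigue damage
X, _ = make_blobs(n_samples=200, n_features=3, centers=5, random_state=7)

for k in (2, 3, 4, 5):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    print(f"k={k}: objective = {km.inertia_:.2f}")
```

Inertia decreases as k grows, mirroring the record's drop from 11.58 at two centroids to 8.06 at five; the lowest objective value identifies the preferred clustering.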

  10. Evaluation of DNA extraction methods for PCR-based detection of Listeria monocytogenes from vegetables.

    Science.gov (United States)

    Vojkovska, H; Kubikova, I; Kralik, P

    2015-03-01

    Epidemiological data indicate that raw vegetables are associated with outbreaks of Listeria monocytogenes. Therefore, there is a demand for rapid and sensitive methods, such as PCR assays, for the detection and accurate discrimination of L. monocytogenes. However, the efficiency of PCR methods can be negatively affected by inhibitory compounds commonly found in vegetable matrices, which may cause false-negative results. Therefore, the sample processing and DNA isolation steps must be carefully evaluated prior to the introduction of such methods into routine practice. In this study, we compared the ability of three column-based and four magnetic bead-based commercial DNA isolation kits to extract DNA of the model micro-organism L. monocytogenes from raw vegetables. The DNA isolation efficiency of all kits was determined using a triplex real-time qPCR assay designed to specifically detect L. monocytogenes. The best-performing kit, the PowerSoil™ Microbial DNA Isolation Kit, is suitable for the extraction of amplifiable DNA from L. monocytogenes cells in vegetables, with efficiencies ranging between 29.6 and 70.3%. Coupled with the triplex real-time qPCR assay, this DNA isolation kit is applicable to samples with bacterial loads of 10³ L. monocytogenes cells per gram. Several recent outbreaks of L. monocytogenes have been associated with the consumption of fruits and vegetables. Real-time PCR assays allow fast detection and accurate quantification of microbes, but their success depends on how well template DNA can be extracted. The results of this study suggest that the PowerSoil™ Microbial DNA Isolation Kit can be used for the extraction of amplifiable DNA from L. monocytogenes cells in vegetables, with efficiencies ranging between 29.6 and 70.3%, for samples with bacterial loads of 10³ cells per gram. © 2014

  11. A Metabolomics-Based Strategy for the Mechanism Exploration of Traditional Chinese Medicine: Descurainia sophia Seeds Extract and Fractions as a Case Study

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2017-01-01

    Full Text Available A UPLC-QTOF-MS based metabolomics study was conducted to explore potential biomarkers, which would increase our understanding of the model, and to assess the integral efficacy of Descurainia sophia seed extract (DS-A). Additionally, DS-A was split into five fractions in descending order of polarity, which were used together to illustrate the mechanism. The 26 identified biomarkers were mainly related to disturbances in phenylalanine, tyrosine, tryptophan, purine, arginine, and proline metabolism. Furthermore, a heat map, hierarchical cluster analysis (HCA), and a correlation network diagram of the biomarkers perturbed by modeling were all constructed. The heat map and HCA results suggested that the fat oil fraction could reverse the abnormal metabolism in the model to some extent, while the metabolic inhibitory effect produced by the other four fractions helped to relieve cardiac load and compensate for the insufficient energy supply induced by the existing heart and lung injury in the model rats. Briefly, the split fractions interfered with the model from different aspects and ultimately constituted the overall effect of the extract. In conclusion, the metabolomics method, combined with split fractions of an extract, is a powerful approach for illustrating the pathologic changes of a Chinese medicine syndrome and the action mechanisms of traditional Chinese medicine.

  12. Hardware Prototyping of Neural Network based Fetal Electrocardiogram Extraction

    Science.gov (United States)

    Hasan, M. A.; Reaz, M. B. I.

    2012-01-01

    The aim of this paper is to model an algorithm for Fetal ECG (FECG) extraction from the composite abdominal ECG (AECG) using VHDL (Very High Speed Integrated Circuit Hardware Description Language) for FPGA (Field Programmable Gate Array) implementation. An artificial neural network that provides an efficient and effective way of separating the FECG signal from the composite AECG signal has been designed. The proposed method gives an accuracy of 93.7% for R-peak detection in FHR monitoring. The designed VHDL model was synthesized and fitted into Altera's Stratix II EP2S15F484C3 using Quartus II version 8.0 Web Edition for FPGA implementation.
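The paper's network itself is not reproduced here, but the underlying separation idea can be sketched in software with a single-weight LMS adaptive canceller (an illustrative analogue, not the authors' ANN): the abdominal signal is modelled as a scaled maternal ECG plus a small fetal component, and adapting a weight against the maternal reference leaves the fetal component as the residual. All signals and the 500 Hz sampling rate below are synthetic assumptions.

```python
import math

# Illustrative LMS adaptive canceller (an assumption, not the paper's ANN):
# abdominal = 2.0 * maternal + fetal; the weight w learns the maternal
# coupling, so the residual e approximates the fetal contribution.
def lms_cancel(abdominal, maternal, mu=0.01):
    w = 0.0
    residual = []
    for d, x in zip(abdominal, maternal):
        e = d - w * x          # residual = estimated fetal contribution
        w += mu * e * x        # LMS weight update
        residual.append(e)
    return w, residual

n = 10000
t = [i / 500.0 for i in range(n)]                             # 500 Hz (assumed)
maternal = [math.sin(2 * math.pi * 1.2 * ti) for ti in t]     # ~72 bpm
fetal = [0.1 * math.sin(2 * math.pi * 2.4 * ti) for ti in t]  # ~144 bpm, weak
abdominal = [2.0 * m + f for m, f in zip(maternal, fetal)]

w, residual = lms_cancel(abdominal, maternal)
```

After convergence the weight approaches the true coupling factor of 2.0 and the residual is dominated by the small, faster fetal component.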

  13. Designing a Knowledge Management Excellence Model Based on Interpretive Structural Modeling

    Directory of Open Access Journals (Sweden)

    Mirza Hassan Hosseini

    2014-09-01

    Full Text Available Despite a well-developed academic and practical background, and the recognition of knowledge management (KM) as a competitive advantage, many organizations have failed to utilize it effectively. Among the reasons for this failure are methodological deficiencies in recognizing and translating the dimensions of KM, and the lack of a systematic approach to establishing causal relationships among KM factors. This article attempts to design an organizational knowledge management excellence model. The model was designed on the basis of library research, interviews with experts and interpretive structural modeling (ISM), which was used to identify and determine the relationships between the factors of KM excellence. Accordingly, 9 key criteria of KM excellence and 29 sub-criteria were extracted, and the relationships and sequence of factors were defined and arranged in 5 levels to form the organizational KM excellence model. Finally, the concepts were applied in Defense Organizations to illustrate the proposed methodology.

  14. Mass transfer and kinetic modelling of supercritical CO2 extraction of fresh tea leaves (Camellia sinensis L.)

    Directory of Open Access Journals (Sweden)

    Pravin Vasantrao Gadkari

    Full Text Available Abstract Supercritical carbon dioxide extraction was employed to extract solids from fresh tea leaves (Camellia sinensis L.) at various pressures (15 to 35 MPa) and temperatures (313 to 333 K), with the addition of ethanol as a polarity modifier. The diffusion model and the Langmuir model fit the experimental data well, with correlation coefficients greater than 0.94. Caffeine solubility was determined in supercritical CO2 and the Gordillo model was employed to correlate the experimental solubility values; it fit well, with a correlation coefficient of 0.91 and an average absolute relative deviation of 8.91%. Total phenol content of the spent materials varied from 57 to 85.2 mg of gallic acid equivalent per g of spent material, total flavonoid content varied from 50.4 to 58.2 mg of rutin equivalent per g of spent material, and the IC50 value (antioxidant content) varied from 27.20 to 38.11 µg of extract per mL. There was a significant reduction in polyphenol, flavonoid and antioxidant content in the extract when supercritical CO2 extraction was carried out at the higher pressure of 35 MPa.
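The paper's diffusion and Langmuir models are not reproduced here, but the general shape of such a kinetic fit can be illustrated with a simple first-order yield curve, y(t) = y_inf (1 - exp(-k t)), whose rate constant is recovered from synthetic data by grid search (all numbers below are hypothetical, not the study's):

```python
import math

# Illustrative extraction-kinetics fit (a sketch, not the paper's exact
# models): fractional yield y(t) = y_inf * (1 - exp(-k t)); the rate
# constant k is recovered by minimizing the sum of squared errors.
def sse(k, y_inf, times, yields):
    return sum((y - y_inf * (1 - math.exp(-k * t))) ** 2
               for t, y in zip(times, yields))

times = [0, 10, 20, 40, 60, 90, 120]               # minutes (hypothetical)
k_true, y_inf = 0.05, 1.0
yields = [y_inf * (1 - math.exp(-k_true * t)) for t in times]

# Grid search over candidate rate constants 0.001 ... 0.200 1/min.
candidates = [i / 1000.0 for i in range(1, 201)]
k_fit = min(candidates, key=lambda k: sse(k, y_inf, times, yields))
```

With real data one would fit y_inf jointly with k and compare models via the correlation coefficient and average absolute relative deviation, as the abstract reports.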

  15. Acute administration of ginger (Zingiber officinale rhizomes) extract on timed intravenous pentylenetetrazol infusion seizure model in mice.

    Science.gov (United States)

    Hosseini, Abdolkarim; Mirazi, Naser

    2014-03-01

    Zingiber officinale (Zingiberaceae), or ginger, which is used in traditional medicine, has antioxidant activity and neuroprotective effects. The effects of this plant on clonic seizures have not yet been studied. The present study evaluated the anticonvulsant effect of ginger in a model of clonic seizures induced with pentylenetetrazole (PTZ) in male mice, using the timed i.v. PTZ-infusion seizure model. Different doses of the hydroethanolic extract of Z. officinale (25, 50, and 100 mg/kg) were administered intraperitoneally (i.p.) 2 and 24 h before PTZ infusion. Phenobarbital sodium (30 mg/kg), a reference standard, was also tested for comparison. The effect of ginger on the appearance of three separate seizure endpoints (myoclonic seizure, generalized clonus and the forelimb tonic extension phase) was recorded. The results showed that the ginger extract has anticonvulsant effects in all the treatment groups tested, as it significantly increased the seizure threshold. The hydroethanolic extract of Z. officinale significantly increased the onset time of myoclonic seizure at doses of 25-100 mg/kg (p<0.001), significantly delayed generalized clonus (p<0.001) and increased the threshold for forelimb tonic extension (p<0.01) when given 2 and 24 h before PTZ infusion, compared with the control group. Based on these results, the hydroethanolic extract of ginger has anticonvulsant effects, possibly through an interaction with inhibitory and excitatory systems, antioxidant mechanisms, oxidative stress and calcium channel inhibition. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Synergistic solvent extraction of Eu(III) and Tb(III) with mixtures of various organophosphorus extractants

    International Nuclear Information System (INIS)

    Reddy, B.V.; Reddy, L.K.; Reddy, A.S.

    1994-01-01

    Synergistic solvent extraction of Eu(III) and Tb(III) from thiocyanate solutions with mixtures of 2-ethylhexylphosphonic acid mono-2-ethylhexyl ester (EHPNA) and di-2-ethylhexylphosphoric acid (DEHPA) or tributyl phosphate (TBP) or trioctylphosphine oxide (TOPO) or triphenylphosphine oxide (TPhPO) in benzene has been studied. The mechanism of extraction can be explained by a simple chemically based model. The equilibrium constants of the mixed-ligand species of the various neutral donors have been determined by non-linear regression analysis. (author) 13 refs.; 9 figs.; 2 tabs

  17. Modeling the Daly Gap: The Influence of Latent Heat Production in Controlling Magma Extraction and Eruption

    Science.gov (United States)

    Nelson, B. K.; Ghiorso, M. S.; Bachmann, O.; Dufek, J.

    2011-12-01

    A century-old issue in volcanology is the origin of the gap in chemical compositions observed in magmatic series on ocean islands and arcs - the "Daly Gap". If the gap forms during differentiation from a mafic parent, models that predict the dynamics of magma extraction as a function of chemical composition must simulate a process that results in volumetrically biased, bimodal compositions of erupted magmas. The probability of magma extraction is controlled by magma dynamical processes, which have a complex response to magmatic heat evolution. Heat loss from the magmatic system is far from a simple, monotonic function of time. It is modified by the crystallization sequence, chamber margin heat flux, and is buffered by latent heat production. We use chemical and thermal calculations of MELTS (Ghiorso & Sack, 1995) as input to the physical model of QUANTUM (Dufek & Bachmann, 2010) to predict crystallinity windows of most probable magma extraction. We modeled two case studies: volcanism on Tenerife, Canary Islands, and the Campanian Ignimbrite (CI) of Campi Flegrei, Italy. Both preserve a basanitic to phonolitic lineage and have comparable total alkali concentrations; however, CI has high and Tenerife has low K2O/Na2O. Modeled thermal histories of differentiation for the two sequences contrast strongly. In Tenerife, the rate of latent heat production is almost always greater than sensible heat production, with spikes in the ratio of latent to sensible heats of up to 40 associated with the appearance of Fe-Ti oxides at near 50% crystallization. This punctuated heat production must cause magma temperature change to stall or slow in time. The extended time spent at ≈50% crystallinity, associated with dynamical processes that enhance melt extraction near 50% crystallinity, suggests the magma composition at this interval should be common. In Tenerife, the modeled composition coincides with that of the first peak in the bimodal frequency-composition distribution. In our

  18. 3D Finite Element Modelling of Non-Crimp Fabric Based Fibre Composite Based on X-Ray CT Data

    DEFF Research Database (Denmark)

    Jespersen, Kristine Munk; Asp, Leif; Mikkelsen, Lars Pilgaard

    2017-01-01

    initiation and progression in the material. In the current study, the real bundle structure inside a non-crimp fabric based fibre composite is extracted from 3D X-ray CT images and imported into ABAQUS for numerical modelling. The local stress concentrations when loaded in tension caused by the fibre bundle

  19. Assessment of antidiarrhoeal activity of the methanol extract of Xylocarpus granatum bark in mice model.

    Science.gov (United States)

    Rouf, Razina; Uddin, Shaikh Jamal; Shilpi, Jamil Ahmad; Alamgir, Mahiuddin

    2007-02-12

    The methanol extract of Xylocarpus granatum bark was studied for its antidiarrhoeal properties in experimental diarrhoea induced by castor oil and magnesium sulphate in mice. At oral doses of 250 and 500 mg/kg, the methanol extract showed significant and dose-dependent antidiarrhoeal activity in both models. The extract also significantly reduced intestinal transit in the charcoal meal test when compared to atropine sulphate (5 mg/kg; i.m.). The results showed that the extract of Xylocarpus granatum bark has significant antidiarrhoeal activity, which supports its traditional use in herbal medicine.

  20. LitPathExplorer: a confidence-based visual text analytics tool for exploring literature-enriched pathway models.

    Science.gov (United States)

    Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia

    2018-04-15

    Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e. events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: (i) extracting events from the literature that corroborate existing models with evidence; (ii) discovering new events which can update models; and (iii) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61 and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.

  1. Composition and color stability of anthocyanin-based extract from purple sweet potato

    Directory of Open Access Journals (Sweden)

    Xiu-li He

    2015-09-01

    Full Text Available Purple sweet potato (PSP) can provide food products with an attractive color, besides nutritional benefits, in food processing. The composition and color stability of an aqueous anthocyanin-based PSP extract were therefore investigated in order to promote its wide use in the food industry. PSP anthocyanins were extracted with water, and nine individual anthocyanins (48.72 µg mL⁻¹ in total, a yield of 24.36 mg/100 g fresh PSP) were found by HPLC analysis. The PSP extract also contained 17.11 mg mL⁻¹ of protein, 0.44 mg mL⁻¹ of dietary fiber, 2.82 mg mL⁻¹ of reducing sugars, 4.02 µg mL⁻¹ of Se, 54.21 µg mL⁻¹ of Ca and 60.83 µg mL⁻¹ of Mg. Changes in the color and stability of the PSP extract, as affected by pH, heat, light and the extraction process, were further evaluated. Results indicated that PSP anthocyanins had good stability at pH 2.0-6.0, and the color of the PSP extract remained stable during 30 days of storage at 20 °C in the dark. Both UV and fluorescent exposure weakened the color stability of the PSP extract, with UV showing the more drastic effect. A steaming pretreatment of fresh PSP is beneficial to color stability.

  2. Extracting Vegetation Coverage in Dry-hot Valley Regions Based on Alternating Angle Minimum Algorithm

    Science.gov (United States)

    Y Yang, M.; Wang, J.; Zhang, Q.

    2017-07-01

    Vegetation coverage is one of the most important indicators of ecological environment change, and is also an effective index for the assessment of land degradation and desertification. Dry-hot valley regions have sparse surface vegetation, and the spectral signal of such vegetation is weakly represented in remote sensing imagery, so the commonly used vegetation index methods for calculating vegetation coverage face considerable limitations there. Therefore, in this paper, the Alternating Angle Minimum (AAM) algorithm, a deterministic model, is adopted for endmember selection and pixel unmixing of MODIS images in order to extract vegetation coverage, and an accuracy test is carried out using a Landsat TM image from the same period. The results show that, in dry-hot valley regions with sparse vegetation, the AAM model has a high unmixing accuracy and the extracted vegetation coverage is close to the actual situation, so applying the AAM model to the extraction of vegetation coverage in dry-hot valley regions is promising.
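The AAM algorithm itself (endmember selection by alternating angle minimization) is not reproduced here, but the pixel-unmixing step it feeds can be illustrated with a two-endmember linear mixing model: each pixel p is modelled as p = f·veg + (1-f)·soil, and the cover fraction f has a closed-form least-squares solution. The endmember spectra below are hypothetical reflectance values.

```python
# Illustrative two-endmember linear unmixing (an assumption; not the AAM
# algorithm itself). For p = f*veg + (1-f)*soil, the least-squares cover
# fraction is f = <p - soil, veg - soil> / ||veg - soil||^2.
def unmix_fraction(pixel, veg, soil):
    num = sum((p - s) * (v - s) for p, v, s in zip(pixel, veg, soil))
    den = sum((v - s) ** 2 for v, s in zip(veg, soil))
    f = num / den
    return max(0.0, min(1.0, f))   # clip to the physically meaningful range

# Hypothetical 4-band reflectance endmembers.
veg = [0.05, 0.08, 0.45, 0.50]
soil = [0.20, 0.25, 0.30, 0.35]
pixel = [0.3 * v + 0.7 * s for v, s in zip(veg, soil)]  # true cover 30 %
f = unmix_fraction(pixel, veg, soil)
```

Clipping to [0, 1] mirrors the abundance constraints usually imposed in fractional-cover mapping.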

  3. Neuroprotective Effects of Herbal Extract (Rosa canina, Tanacetum vulgare and Urtica dioica) on Rat Model of Sporadic Alzheimer's Disease.

    Science.gov (United States)

    Daneshmand, Parvaneh; Saliminejad, Kioomars; Dehghan Shasaltaneh, Marzieh; Kamali, Koorosh; Riazi, Gholam Hossein; Nazari, Reza; Azimzadeh, Pedram; Khorram Khorshid, Hamid Reza

    2016-01-01

    Sporadic Alzheimer's Disease (SAD) is caused by genetic risk factors, aging and oxidative stress. A herbal extract of Rosa canina (R. canina), Tanacetum vulgare (T. vulgare) and Urtica dioica (U. dioica) has a beneficial role in aging as an anti-inflammatory and anti-oxidative agent. In this study, the neuroprotective effects of this herbal extract in a rat model of SAD were investigated. The rats were divided into control, sham, model, herbal extract-treated and ethanol-treated groups. Drug interventions were started on the 21st day after modeling, and each treatment group was given the drugs by the intraperitoneal (I.P.) route for 21 days. The expression levels of five genes important for the pathogenesis of SAD, including Syp, Psen1, Mapk3, Map2 and Tnf-α, were compared by qPCR between the hippocampi of herbal extract-treated SAD model rats and controls. The Morris Water Maze was used to test the spatial learning and memory ability of the rats. Treatment of the SAD model rats with the herbal extract induced a significant change in the expression of Syp (p=0.001) and Psen1 (p=0.029). In the Morris Water Maze, the spatial learning deficits seen in the model group were improved in the herbal extract-treated group. This herbal extract could have anti-dementia properties and improve spatial learning and memory in the SAD rat model.

  4. An Integrated Hydrologic Model and Remote Sensing Synthesis Approach to Study Groundwater Extraction During a Historic Drought in the California Central Valley

    Science.gov (United States)

    Thatch, L. M.; Maxwell, R. M.; Gilbert, J. M.

    2017-12-01

    Over the past century, groundwater levels in California's San Joaquin Valley have dropped more than 30 meters in some areas due to excessive groundwater extraction to irrigate agricultural lands and feed a growing population. Between 2012 and 2016 California experienced the worst drought in its recorded history, further exacerbating this groundwater depletion. Due to a lack of groundwater regulation, the exact quantities of groundwater extracted in California are unknown and hard to quantify. We use a synthesis of integrated hydrologic model simulations and remote sensing products to quantify the impact of drought and groundwater pumping on Central Valley water tables. The ParFlow-CLM model was used to evaluate groundwater depletion in the San Joaquin River basin under multiple groundwater extraction scenarios simulated from pre-drought through recent drought years. Extraction scenarios included pre-development conditions, with no groundwater pumping; historical conditions, based on decreasing groundwater level measurements; and estimated groundwater extraction rates calculated from the deficit between the predicted crop water demand, based on county land use surveys, and available surface water supplies. Results were compared to NASA's Gravity Recovery and Climate Experiment (GRACE) data products to constrain water table decline from groundwater extraction during severe drought. This approach untangles the various factors leading to groundwater depletion within the San Joaquin Valley, both during drought and in years of normal recharge, to help evaluate which areas are most susceptible to groundwater overdraft and to further evaluate the spatially and temporally variable sustainable yield.
Recent efforts to improve water management and ensure reliable water supplies are highlighted by California's Sustainable Groundwater Management Act (SGMA) which mandates Groundwater Sustainability Agencies to determine the maximum quantity of groundwater that can be withdrawn through

  5. Microalgae based biorefinery: evaluation of oil extraction methods in terms of efficiency, costs, toxicity and energy in lab-scale

    Directory of Open Access Journals (Sweden)

    Ángel Darío González-Delgado

    2013-06-01

    Full Text Available Several alternatives for the extraction and transformation of microalgal metabolites are being studied with the aim of achieving the total utilization of this energy crop of great worldwide interest. Microalgae oil extraction is a key stage in microalgal biodiesel production chains, and its efficiency significantly affects the global process efficiency. In this study, five oil extraction methods were compared in lab-scale, taking as parameters, besides extraction efficiency, the cost of performing each method, its energy requirements, and the toxicity of the solvents used, in order to elucidate the convenience of their incorporation into a microalgae-based biorefinery topology. The methods analyzed were solvent extraction assisted by high-speed homogenization (SHE), continuous reflux solvent extraction (CSE), hexane-based extraction (HBE), cyclohexane-based extraction (CBE) and ethanol-hexane extraction (EHE). The microalgae strains Nannochloropsis sp., Guinardia sp., Closterium sp., Amphiprora sp. and Navicula sp., obtained from a Colombian microalgae bioprospecting program, were used for this evaluation. In addition, the morphological response of the strains to the oil extraction methods was evaluated by optical microscopy. The results show that, although no single oil extraction method excels in all parameters evaluated, CSE, SHE and HBE appear as promising alternatives, while HBE is the most convenient for lab-scale use and is potentially scalable for implementation in a microalgae-based biorefinery.

  6. Survey of simulation methods for modeling pulsed sieve-plate extraction columns

    International Nuclear Information System (INIS)

    Burkhart, L.

    1979-03-01

    The report first briefly considers the use of liquid-liquid extraction in nuclear fuel reprocessing and then describes the operation of the pulse column. Currently available simulation models of the column are reviewed, followed by an analysis of the information presently available from which the necessary parameters can be obtained for use in a model of the column. Finally, overall conclusions are given regarding the information needed to develop an accurate model of the column for materials accountability in fuel reprocessing plants. 156 references

  7. Knowledge Based 3d Building Model Recognition Using Convolutional Neural Networks from LIDAR and Aerial Imageries

    Science.gov (United States)

    Alidoost, F.; Arefi, H.

    2016-06-01

    In recent years, with the development of high-resolution data acquisition technologies, many different approaches and algorithms have been presented for extracting accurate and timely updated 3D building models, a key element of city structures for numerous urban mapping applications. In this paper, a novel model-based approach is proposed for the automatic recognition of building roof models, such as flat, gable, hip, and pyramid hip roofs, based on deep structures for hierarchical learning of features extracted from both LiDAR and aerial ortho-photos. The main steps of this approach are building segmentation, feature extraction and learning, and finally building roof labeling in a supervised, pre-trained Convolutional Neural Network (CNN) framework, yielding an automatic recognition system for various types of buildings over an urban area. In this framework, the height information provides invariant geometric features that help the convolutional neural network localize the boundary of each individual roof. A CNN is a kind of feed-forward neural network based on the multilayer perceptron concept, consisting of a number of convolutional and subsampling layers in an adaptable structure; it is widely used in pattern recognition and object detection. Since the training dataset is a small library of labeled models for different roof shapes, the learning time can be decreased significantly by using pre-trained models. The experimental results highlight the effectiveness of the deep learning approach in automatically detecting and extracting building roof patterns, considering the complementary nature of height and RGB information.
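The two layer types the abstract names, convolution and subsampling, can be sketched generically in a few lines (a minimal illustration, not the paper's network; the 4x4 "height patch" and averaging kernel below are hypothetical):

```python
# Minimal sketch of a CNN's two building blocks: a valid 2-D convolution
# followed by 2x2 max-pooling (the "subsampling" layer).
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

def maxpool2x2(fmap):
    return [[max(fmap[i][j], fmap[i][j + 1],
                 fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]

# A hypothetical 4x4 height patch and a 3x3 averaging kernel:
# conv gives a 2x2 feature map, pooling reduces it to a single value.
patch = [[1, 2, 3, 4], [2, 3, 4, 5], [3, 4, 5, 6], [4, 5, 6, 7]]
kernel = [[1 / 9.0] * 3 for _ in range(3)]
pooled = maxpool2x2(conv2d(patch, kernel))
```

A real CNN stacks many such learned-kernel layers with nonlinearities in between; frameworks implement them with far more efficient batched operations.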

  8. Ship Detection Based on Multiple Features in Random Forest Model for Hyperspectral Images

    Science.gov (United States)

    Li, N.; Ding, L.; Zhao, H.; Shi, J.; Wang, D.; Gong, X.

    2018-04-01

    A novel ship detection method which aims to make full use of both the spatial and spectral information of hyperspectral images is proposed. First, a band with a high signal-to-noise ratio in the near-infrared or short-wave infrared range is used to segment land and sea with the Otsu threshold segmentation method. Second, multiple features, including spectral and texture features, are extracted from the hyperspectral images: principal component analysis (PCA) is used to extract spectral features, and the Grey Level Co-occurrence Matrix (GLCM) is used to extract texture features. Finally, a Random Forest (RF) model is introduced to detect ships based on the extracted features. To illustrate the effectiveness of the method, we carry out experiments on EO-1 data, comparing a single feature with different multiple-feature combinations. Compared with the traditional single-feature method and a Support Vector Machine (SVM) model, the proposed method can stably detect ships against a complex background and can effectively improve detection accuracy.
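The land/sea segmentation step relies on Otsu's method, which picks the threshold maximizing the between-class variance of the intensity histogram. A pure-Python sketch (assuming 8-bit pixel values; hyperspectral radiances would first be scaled to this range):

```python
# Illustrative Otsu threshold on 8-bit pixel values: choose the threshold
# t that maximizes the between-class variance w0*w1*(mu0 - mu1)^2.
def otsu_threshold(pixels):
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * hist[i] for i in range(256))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (total_sum - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Hypothetical bimodal scene: dark sea (~20) and bright land (~200).
pixels = [18, 20, 22, 19, 21] * 20 + [198, 200, 202, 199, 201] * 20
t = otsu_threshold(pixels)
```

Pixels with values at or below the returned threshold fall in the dark (sea) class; everything above is land.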

  9. A Mining Algorithm for Extracting Decision Process Data Models

    Directory of Open Access Journals (Sweden)

    Cristina-Claudia DOLEAN

    2011-01-01

    Full Text Available The paper introduces an algorithm that mines logs of user interaction with simulation software. It outputs a model that explicitly shows the data perspective of the decision process, namely the Decision Data Model (DDM. In the first part of the paper we focus on how the DDM is extracted by our mining algorithm. We introduce it as pseudo-code and, then, provide explanations and examples of how it actually works. In the second part of the paper, we use a series of small case studies to prove the robustness of the mining algorithm and how it deals with the most common patterns we found in real logs.

  10. Complex Road Intersection Modelling Based on Low-Frequency GPS Track Data

    Science.gov (United States)

    Huang, J.; Deng, M.; Zhang, Y.; Liu, H.

    2017-09-01

    It is widely accepted that the digital map has become an indispensable guide for daily travel. Traditional road network maps are produced in time-consuming and labour-intensive ways, such as digitizing printed maps and extraction from remote sensing images. At present, the large volume of GPS trajectory data collected by floating vehicles makes it feasible to extract highly detailed and up-to-date road network information. Road intersections are often accident-prone areas and are critical to route planning, and the connectivity of a road network is mainly determined by the topological geometry of its intersections. A few studies have paid attention to detecting complex road intersections and mining the associated traffic information (e.g., connectivity, topology and turning restrictions) from massive GPS traces. To the authors' knowledge, recent studies have mainly used high-frequency (1 s sampling rate) trajectory data to detect crossroad regions or extract rough intersection models. It is still difficult to model complex road intersections geometrically and semantically from low-frequency (20-100 s), easily available trajectory data. This paper thus attempts to construct precise models of complex road intersections from low-frequency GPS traces. We propose to first extract complex road intersections by an LCSS-based (Longest Common Subsequence) trajectory clustering method, then delineate the geometric shapes of the intersections with a K-segment principal curve algorithm, and finally infer the traffic constraint rules inside them.
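The LCSS similarity that underlies the clustering step can be sketched as a dynamic program over two point sequences, where two GPS points "match" if they fall within a spatial tolerance eps (the coordinates and the 10 m tolerance below are hypothetical, not the paper's parameters):

```python
import math

# Illustrative LCSS similarity between two GPS traces: classic DP with a
# spatial matching tolerance, normalized by the shorter trace length.
def lcss_similarity(traj_a, traj_b, eps=10.0):
    n, m = len(traj_a), len(traj_b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if math.dist(traj_a[i - 1], traj_b[j - 1]) <= eps:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[n][m] / min(n, m)

# Hypothetical traces in metres: b follows the same road as a, c does not.
a = [(0, 0), (10, 0), (20, 0), (30, 0)]
b = [(0, 2), (10, 2), (20, 2), (30, 2)]    # same path, ~2 m offset
c = [(0, 500), (10, 500), (20, 500)]       # a different road
sim_ab = lcss_similarity(a, b)
sim_ac = lcss_similarity(a, c)
```

Traces whose pairwise similarity exceeds a chosen cutoff would then be grouped into the same cluster, from which the intersection geometry is delineated.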

  11. The algorithm of fast image stitching based on multi-feature extraction

    Science.gov (United States)

    Yang, Chunde; Wu, Ge; Shi, Jing

    2018-05-01

    This paper proposes an improved image registration method combining Hu invariant moment contour information with feature point detection, aiming to solve problems of traditional image stitching algorithms such as the time-consuming feature point extraction process, redundant invalid information and inefficiency. First, pixel neighborhoods are used to extract contour information, and the Hu invariant moments are employed as a similarity measure so that SIFT feature points are extracted only in similar regions. Then the Euclidean distance is replaced with the Hellinger kernel to improve the initial matching efficiency and obtain fewer mismatched points, after which the affine transformation matrix between the images is estimated. Finally, a local color mapping method is adopted to correct uneven exposure, and an improved multiresolution fusion algorithm is used to blend the mosaic images and achieve seamless stitching. Experimental results confirm the high accuracy and efficiency of the method proposed in this paper.
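The Hellinger kernel that replaces the Euclidean distance in the matching step can be sketched in a few lines: on L1-normalized descriptors x and y it is K(x, y) = Σ sqrt(x_i · y_i), equal to 1 for identical distributions and 0 for disjoint ones (the toy descriptors below are hypothetical):

```python
import math

# Illustrative Hellinger kernel on L1-normalised descriptors (histograms):
# K(x, y) = sum_i sqrt(x_i * y_i), in [0, 1].
def hellinger_kernel(x, y):
    sx, sy = sum(x), sum(y)
    return sum(math.sqrt((a / sx) * (b / sy)) for a, b in zip(x, y))

d1 = [4, 0, 2, 2]
d2 = [4, 0, 2, 2]      # identical descriptor
d3 = [0, 8, 0, 0]      # completely different descriptor
k_same = hellinger_kernel(d1, d2)
k_diff = hellinger_kernel(d1, d3)
```

Because the kernel compares square-rooted, normalized bins rather than raw differences, it is less dominated by large histogram entries, which is why it tends to yield fewer mismatches than plain Euclidean distance on SIFT-like descriptors.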

  12. Investigation of cultivated lavender (Lavandula officinalis L.) extraction and its extracts

    Directory of Open Access Journals (Sweden)

    Nađalin Vesna

    2014-01-01

    Full Text Available In this study the essential oil content of lavender flowers and leaves was determined by hydrodistillation, and the physical and chemical characteristics of the isolated oils were determined. Using CO2 in the supercritical state, extraction of lavender flowers was performed at a selected solvent flow under isothermal and isobaric conditions. Qualitative and quantitative analysis of the obtained essential oil and supercritical extracts (SFE) was carried out by gas chromatography in combination with mass spectrometry (GC/MS) and gas chromatography with a flame ionisation detector (GC/FID). The individual SFE extracts obtained at different extraction times were also analysed. The main components of the analysed samples were linalool, linalool acetate, lavandulol, caryophyllene oxide, lavandulyl acetate, terpinen-4-ol and others. Two proposed models were used to model the extraction system lavender flower - supercritical CO2 on the basis of experimental results obtained by examining the extraction kinetics of this system. The applied models fitted the experimental results well.

  13. Analogy between gambling and measurement-based work extraction

    Science.gov (United States)

    Vinkler, Dror A.; Permuter, Haim H.; Merhav, Neri

    2016-04-01

    In information theory, one area of interest is gambling, where mutual information characterizes the maximal gain in wealth growth rate due to knowledge of side information; the betting strategy that achieves this maximum is named the Kelly strategy. In the field of physics, it was recently shown that mutual information can characterize the maximal amount of work that can be extracted from a single heat bath using measurement-based control protocols, i.e. using ‘information engines’. However, to the best of our knowledge, no relation between gambling and information engines has been presented before. In this paper, we briefly review the two concepts and then demonstrate an analogy between gambling, where bits are converted into wealth, and information engines, where bits representing measurements are converted into energy. From this analogy follows an extension of gambling to the continuous-valued case, which is shown to be useful for investments in currency exchange rates or in the stock market using options. Moreover, the analogy enables us to use well-known methods and results from one field to solve problems in the other. We present three such cases: maximum work extraction when the probability distributions governing the system and measurements are unknown, work extraction when some energy is lost in each cycle, e.g. due to friction, and an analysis of systems with memory. In all three cases, the analogy enables us to use known results in order to obtain new ones.
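    The Kelly strategy mentioned above has a closed form for a binary bet with win probability p and payout odds b; a sketch that checks the formula against a brute-force search over the log-wealth growth rate (even-odds example only, all numbers illustrative):

    ```python
    from math import log

    def growth_rate(f, p, b):
        """Expected log-wealth growth per bet when wagering fraction f:
        W(f) = p*ln(1 + f*b) + (1 - p)*ln(1 - f)."""
        return p * log(1 + f * b) + (1 - p) * log(1 - f)

    def kelly_fraction(p, b):
        """Closed-form maximiser f* = p - (1 - p)/b
        (for even odds b = 1 this reduces to f* = 2p - 1)."""
        return p - (1 - p) / b

    p, b = 0.6, 1.0
    f_star = kelly_fraction(p, b)
    # Brute-force check: scan wager fractions and pick the best growth rate.
    grid = [i / 1000 for i in range(0, 999)]
    best = max(grid, key=lambda f: growth_rate(f, p, b))
    print(f_star, best)  # both ≈ 0.2
    ```

    In the analogy, side information raises the achievable growth rate by the mutual information between the side information and the outcome, just as measurements raise the extractable work of an information engine.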

  14. Integrating Multiple On-line Knowledge Bases for Disease-Lab Test Relation Extraction.

    Science.gov (United States)

    Zhang, Yaoyun; Soysal, Ergin; Moon, Sungrim; Wang, Jingqi; Tao, Cui; Xu, Hua

    2015-01-01

    A computable knowledge base containing relations between diseases and lab tests would be a great resource for many biomedical informatics applications. This paper describes our initial step towards establishing a comprehensive knowledge base of disease and lab test relations utilizing three public on-line resources. LabTestsOnline, MedlinePlus and Wikipedia are integrated to create a freely available, computable disease-lab test knowledge base. Disease and lab test concepts are identified using MetaMap, and relations between diseases and lab tests are determined based on source-specific rules. Experimental results demonstrate a high precision for relation extraction, with Wikipedia achieving the highest precision of 87%. Combining the three sources reached a recall of 51.40% when compared with a subset of disease-lab test relations extracted from a reference book. Moreover, we found additional disease-lab test relations in the on-line resources, indicating that they are complementary to existing reference books for building a comprehensive disease and lab test relation knowledge base.
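    The reported precision and recall follow mechanically from set intersection over (disease, lab test) pairs; a minimal sketch with invented example pairs (none of these relations are from the paper):

    ```python
    def precision_recall(extracted, reference):
        """Precision = |E ∩ R| / |E|, recall = |E ∩ R| / |R|,
        where E and R are sets of (disease, lab_test) tuples."""
        tp = len(extracted & reference)
        return tp / len(extracted), tp / len(reference)

    # Hypothetical reference-book relations vs. relations mined on-line.
    reference = {("diabetes", "HbA1c"), ("anemia", "CBC"), ("gout", "uric acid")}
    extracted = {("diabetes", "HbA1c"), ("anemia", "CBC"), ("anemia", "ferritin")}

    p, r = precision_recall(extracted, reference)
    print(p, r)  # 2 of 3 extracted pairs are in the reference: 0.667, 0.667
    ```

    The extra pair ("anemia", "ferritin") illustrates the paper's last point: an extracted relation absent from the reference lowers measured precision even when it is genuinely complementary knowledge.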

  15. Selective extraction of phospholipids from dairy products by micro-solid phase extraction based on titanium dioxide microcolumns followed by MALDI-TOF-MS analysis

    DEFF Research Database (Denmark)

    Calvano, Cosima; Jensen, Ole; Zambonin, Carlo

    2009-01-01

    A new micro-solid phase extraction (micro-SPE) procedure based on titanium dioxide microcolumns was developed for the selective extraction of phospholipids (PLs) from dairy products before matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF-MS) analysis. All the extraction steps (loading, washing, and elution) were optimized using a synthetic mixture of PL standards, and the procedure was subsequently applied to food samples such as milk, chocolate milk and butter. The whole method demonstrated to be simpler than traditional approaches and it appears very...

  16. SEP modeling based on global heliospheric models at the CCMC

    Science.gov (United States)

    Mays, M. L.; Luhmann, J. G.; Odstrcil, D.; Bain, H. M.; Schwadron, N.; Gorby, M.; Li, Y.; Lee, K.; Zeitlin, C.; Jian, L. K.; Lee, C. O.; Mewaldt, R. A.; Galvin, A. B.

    2017-12-01

    Heliospheric models provide contextual information on conditions in the heliosphere, including the background solar wind conditions and shock structures, and are used as input to SEP models, providing an essential tool for understanding SEP properties. The global 3D MHD WSA-ENLIL+Cone model provides a time-dependent background heliospheric description into which a spherically shaped hydrodynamic CME can be inserted. ENLIL simulates solar wind parameters, and additionally one can extract the magnetic topologies of observer-connected magnetic field lines and all plasma and shock properties along those field lines. An accurate representation of the background solar wind is necessary for simulating transients. ENLIL simulations also drive SEP models such as the Solar Energetic Particle Model (SEPMOD) (Luhmann et al. 2007, 2010) and the Energetic Particle Radiation Environment Module (EPREM) (Schwadron et al. 2010). The Community Coordinated Modeling Center (CCMC) is in the process of making these SEP models available to the community and offering a system to run SEP models driven by a variety of heliospheric models available at the CCMC. SEPMOD injects protons onto a sequence of observer field lines at intensities dependent on the connected shock source strength, which are then integrated at the observer to approximate the proton flux. EPREM couples with MHD models such as ENLIL and computes energetic particle distributions based on the focused transport equation along a Lagrangian grid of nodes that propagate out with the solar wind. The coupled ENLIL and SEP models allow us to derive the longitudinal distribution of SEP profiles for different types of events throughout the heliosphere. In this presentation we demonstrate several case studies of SEP event modeling at different observers based on WSA-ENLIL+Cone simulations.

  17. Bubble feature extracting based on image processing of coal flotation froth

    Energy Technology Data Exchange (ETDEWEB)

    Wang, F.; Wang, Y.; Lu, M.; Liu, W. [China University of Mining and Technology, Beijing (China). Dept of Chemical Engineering and Environment

    2001-11-01

    Using image processing, the contrast between the bubbles on the surface of flotation froth and the image background was enhanced, and the bubble edges were extracted. A model relating the statistical features of the bubbles in the image to the cleaned coal can thus be established. It is feasible to extract the bubbles by processing coal flotation froth images on the basis of analysing bubble shape. By processing 51 groups of images sampled from a laboratory column, it was found that histogram equalization of the image gradation and median filtering obviously improve the dynamic contrast range and the brightness of the bubbles. Finally, threshold cutting and bubble edge detection for extracting the bubbles are also discussed, to describe bubble features such as size and shape in the froth image and to distinguish coal flotation froth images. 6 refs., 3 figs.
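    The histogram equalization step credited with improving bubble contrast can be sketched as follows (pure-Python greyscale image as nested lists; the tiny demo image is illustrative only):

    ```python
    def equalize(img, levels=256):
        """Histogram equalization: remap grey levels through the
        normalised cumulative histogram so the output spans the full
        dynamic range, raising bubble/background contrast."""
        flat = [v for row in img for v in row]
        hist = [0] * levels
        for v in flat:
            hist[v] += 1
        cdf, total = [], 0
        for h in hist:
            total += h
            cdf.append(total)
        n = len(flat)
        cdf_min = min(c for c in cdf if c > 0)
        if n == cdf_min:            # constant image: nothing to stretch
            return [row[:] for row in img]
        lut = [round((c - cdf_min) / (n - cdf_min) * (levels - 1)) for c in cdf]
        return [[lut[v] for v in row] for row in img]

    # A low-contrast patch (levels 100..102) is stretched to 0..255.
    low_contrast = [[100, 100, 101], [101, 102, 102]]
    print(equalize(low_contrast))
    ```

    A median filter would then be applied to the equalized image before thresholding and edge detection, as the abstract describes.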

  18. Computer-based modeling in extract sciences research - I ...

    African Journals Online (AJOL)

    Specifically, in the discipline of chemistry, it has been of great utility. Its use dates back to the 17th Century and includes such wide areas as computational chemistry, chemoinformatics, molecular mechanics, chemical dynamics, molecular dynamics, molecular graphics and algorithms. Modeling has been employed ...

  19. Extracting reaction networks from databases-opening Pandora's box.

    Science.gov (United States)

    Fearnley, Liam G; Davis, Melissa J; Ragan, Mark A; Nielsen, Lars K

    2014-11-01

    Large quantities of information describing the mechanisms of biological pathways continue to be collected in publicly available databases. At the same time, experiments have increased in scale, and biologists increasingly use pathways defined in online databases to interpret the results of experiments and generate hypotheses. Emerging computational techniques that exploit the rich biological information captured in reaction systems require formal standardized descriptions of pathways to extract these reaction networks and avoid the alternative: time-consuming and largely manual literature-based network reconstruction. Here, we systematically evaluate the effects of commonly used knowledge representations on the seemingly simple task of extracting a reaction network describing signal transduction from a pathway database. We show that this process is in fact surprisingly difficult, and the pathway representations adopted by various knowledge bases have dramatic consequences for reaction network extraction, connectivity, capture of pathway crosstalk and in the modelling of cell-cell interactions. Researchers constructing computational models built from automatically extracted reaction networks must therefore consider the issues we outline in this review to maximize the value of existing pathway knowledge. © The Author 2013. Published by Oxford University Press.

  20. Comparison of two silica-based extraction methods for DNA isolation from bones.

    Science.gov (United States)

    Rothe, Jessica; Nagy, Marion

    2016-09-01

    One of the most demanding DNA extractions is from bones and teeth, due to the robustness of the material and its relatively low DNA content. The greatest challenge is the manifold nature of the material, which is shaped by various factors, including age, storage, environmental conditions, and contamination with inhibitors. However, most published protocols do not distinguish between different types or qualities of bone material, but are described as generally applicable. Our laboratory works with two different extraction methods, based on silica membranes or on silica beads. We compared the amplification success of the two methods on bone samples of different qualities and in the presence of inhibitors. We found that DNA extraction using the silica membrane method results in a higher DNA yield but also a higher risk of co-extracting impurities, which can act as inhibitors. In contrast, the silica bead method shows decreased co-extraction of inhibitors but also a lower DNA yield. Based on our own experience, each bone sample has to be considered independently regarding the analysis and extraction method. The most ambitious task is therefore determining the quality of the bone material, which requires substantial experience. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  1. A new license plate extraction framework based on fast mean shift

    Science.gov (United States)

    Pan, Luning; Li, Shuguang

    2010-08-01

    License plate extraction is considered the most crucial step of an automatic license plate recognition (ALPR) system. In this paper, a region-based hybrid license plate detection method is proposed to solve practical problems against complex backgrounds that contain a large amount of distracting information. In this method, coarse license plate location is first carried out to obtain the head part of the vehicle. Then a new fast mean shift method, based on random sampling of the kernel density estimate (KDE), is adopted to segment the color vehicle images in order to obtain candidate license plate regions. The remarkable speed-up it brings makes mean shift segmentation more suitable for this application. Feature extraction and classification are used to accurately separate license plates from the other candidate regions. Finally, tilted license plates are rectified for subsequent recognition steps.
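    The mode seeking behind mean shift segmentation is easy to sketch; below is a one-dimensional flat-kernel toy version (the paper's fast KDE-sampling variant and the colour-space application are not reproduced here):

    ```python
    def mean_shift(points, bandwidth=2.0, iters=50):
        """Flat-kernel mean shift: each point is repeatedly moved to the
        mean of its neighbours within `bandwidth` until it stops moving.
        Points that settle on the same mode form one segment/cluster."""
        modes = []
        for p in points:
            x = p
            for _ in range(iters):
                neigh = [q for q in points if abs(q - x) <= bandwidth]
                new_x = sum(neigh) / len(neigh)
                if abs(new_x - x) < 1e-6:
                    break
                x = new_x
            modes.append(round(x, 3))
        return modes

    # Two well-separated groups collapse onto two modes.
    data = [1.0, 1.2, 0.8, 10.0, 10.3, 9.7]
    modes = mean_shift(data)
    print(modes)  # three points at mode 1.0, three at mode 10.0
    ```

    In the segmentation setting the same iteration runs over pixel feature vectors (colour plus position), and the random KDE sampling the paper proposes reduces the number of points each mean computation must visit.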

  2. A Concealed Car Extraction Method Based on Full-Waveform LiDAR Data

    Directory of Open Access Journals (Sweden)

    Chuanrong Li

    2016-01-01

    Full Text Available Extracting concealed cars from point cloud data acquired by airborne laser scanning has gained popularity in recent years. However, due to the occlusion effect, the number of laser points returned from cars concealed under trees is insufficient, making the extraction of concealed cars difficult and unreliable. In this paper, a 3D point cloud segmentation and classification approach based on full-waveform LiDAR is presented. The approach first employs the autocorrelation G coefficient and the echo ratio to determine the areas containing concealed cars. The points in these areas are then segmented with regard to the elevation distribution of concealed cars. Building on these steps, a strategy integrating backscattered waveform features and the view histogram descriptor is developed to train sample data of concealed cars and generate the feature pattern. Finally, concealed cars are classified by pattern matching. The approach was validated with full-waveform LiDAR data, and experimental results demonstrate that it can extract concealed cars with an accuracy of more than 78.6% in the experimental areas.

  3. RESEARCH ON REMOTE SENSING GEOLOGICAL INFORMATION EXTRACTION BASED ON OBJECT ORIENTED CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Gao

    2018-04-01

    Full Text Available Northern Tibet belongs to the sub-cold arid climate zone of the plateau and is rarely visited; geological working conditions there are very poor. However, stratum exposure is good and human interference is very small, so research on the automatic classification and extraction of remote sensing geological information has typical significance and good application prospects. Based on object-oriented classification in northern Tibet, using WorldView-2 high-resolution remote sensing data combined with tectonic information and image enhancement, the lithological spectral features, shape features, spatial locations and topological relations of various kinds of geological information were mined. By setting thresholds within a hierarchical classification, eight kinds of geological information were classified and extracted. Accuracy analysis against existing geological maps shows an overall accuracy of 87.8561%, indicating that the object-oriented classification method is effective and feasible for this study area and provides a new idea for the automatic extraction of remote sensing geological information.

  4. Using computer-extracted image features for modeling of error-making patterns in detection of mammographic masses among radiology residents

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jing, E-mail: jing.zhang2@duke.edu; Ghate, Sujata V.; Yoon, Sora C. [Department of Radiology, Duke University School of Medicine, Durham, North Carolina 27705 (United States); Lo, Joseph Y. [Department of Radiology, Duke University School of Medicine, Durham, North Carolina 27705 (United States); Duke Cancer Institute, Durham, North Carolina 27710 (United States); Departments of Biomedical Engineering and Electrical and Computer Engineering, Duke University, Durham, North Carolina 27705 (United States); Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Kuzmiak, Cherie M. [Department of Radiology, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, North Carolina 27599 (United States); Mazurowski, Maciej A. [Department of Radiology, Duke University School of Medicine, Durham, North Carolina 27705 (United States); Duke Cancer Institute, Durham, North Carolina 27710 (United States); Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States)

    2014-09-15

    Purpose: Mammography is the most widely accepted and utilized screening modality for early breast cancer detection. Providing high quality mammography education to radiology trainees is essential, since excellent interpretation skills are needed to ensure the highest benefit of screening mammography for patients. The authors have previously proposed a computer-aided education system based on trainee models. Those models relate human-assessed image characteristics to trainee error. In this study, the authors propose to build trainee models that utilize features automatically extracted from images using computer vision algorithms to predict likelihood of missing each mass by the trainee. This computer vision-based approach to trainee modeling will allow for automatically searching large databases of mammograms in order to identify challenging cases for each trainee. Methods: The authors’ algorithm for predicting the likelihood of missing a mass consists of three steps. First, a mammogram is segmented into air, pectoral muscle, fatty tissue, dense tissue, and mass using automated segmentation algorithms. Second, 43 features are extracted using computer vision algorithms for each abnormality identified by experts. Third, error-making models (classifiers) are applied to predict the likelihood of trainees missing the abnormality based on the extracted features. The models are developed individually for each trainee using his/her previous reading data. The authors evaluated the predictive performance of the proposed algorithm using data from a reader study in which 10 subjects (7 residents and 3 novices) and 3 experts read 100 mammographic cases. Receiver operating characteristic (ROC) methodology was applied for the evaluation. Results: The average area under the ROC curve (AUC) of the error-making models for the task of predicting which masses will be detected and which will be missed was 0.607 (95% CI,0.564-0.650). 
This value was statistically significantly different
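    The ROC evaluation above reduces to a simple statistic: AUC equals the Mann-Whitney probability that a positive case (here, a missed mass) receives a higher model score than a negative one. A minimal sketch, with all scores invented for illustration:

    ```python
    def roc_auc(scores_pos, scores_neg):
        """AUC as the Mann-Whitney statistic: the fraction of
        (positive, negative) pairs where the positive scores higher,
        counting ties as 1/2."""
        wins = 0.0
        for sp in scores_pos:
            for sn in scores_neg:
                if sp > sn:
                    wins += 1.0
                elif sp == sn:
                    wins += 0.5
        return wins / (len(scores_pos) * len(scores_neg))

    missed   = [0.9, 0.7, 0.65]   # hypothetical model scores for missed masses
    detected = [0.2, 0.6, 0.7]    # hypothetical scores for detected masses
    print(roc_auc(missed, detected))
    ```

    An AUC of 0.5 would mean the error-making model cannot separate missed from detected masses at all; the paper's 0.607 indicates a modest but real predictive signal.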

  5. Using computer-extracted image features for modeling of error-making patterns in detection of mammographic masses among radiology residents

    International Nuclear Information System (INIS)

    Zhang, Jing; Ghate, Sujata V.; Yoon, Sora C.; Lo, Joseph Y.; Kuzmiak, Cherie M.; Mazurowski, Maciej A.

    2014-01-01

    Purpose: Mammography is the most widely accepted and utilized screening modality for early breast cancer detection. Providing high quality mammography education to radiology trainees is essential, since excellent interpretation skills are needed to ensure the highest benefit of screening mammography for patients. The authors have previously proposed a computer-aided education system based on trainee models. Those models relate human-assessed image characteristics to trainee error. In this study, the authors propose to build trainee models that utilize features automatically extracted from images using computer vision algorithms to predict likelihood of missing each mass by the trainee. This computer vision-based approach to trainee modeling will allow for automatically searching large databases of mammograms in order to identify challenging cases for each trainee. Methods: The authors’ algorithm for predicting the likelihood of missing a mass consists of three steps. First, a mammogram is segmented into air, pectoral muscle, fatty tissue, dense tissue, and mass using automated segmentation algorithms. Second, 43 features are extracted using computer vision algorithms for each abnormality identified by experts. Third, error-making models (classifiers) are applied to predict the likelihood of trainees missing the abnormality based on the extracted features. The models are developed individually for each trainee using his/her previous reading data. The authors evaluated the predictive performance of the proposed algorithm using data from a reader study in which 10 subjects (7 residents and 3 novices) and 3 experts read 100 mammographic cases. Receiver operating characteristic (ROC) methodology was applied for the evaluation. Results: The average area under the ROC curve (AUC) of the error-making models for the task of predicting which masses will be detected and which will be missed was 0.607 (95% CI,0.564-0.650). 
This value was statistically significantly different

  6. Towards a more accurate extraction of the SPICE netlist from MAGIC based layouts

    Energy Technology Data Exchange (ETDEWEB)

    Geronimo, G.D.

    1998-08-01

    The extraction of the SPICE netlist from MAGIC-based layouts is investigated. It is assumed that the layout is fully coherent with the corresponding mask representation. The extraction proceeds in three steps: (1) extraction of the .EXT file from the layout, through the MAGIC command extract; (2) extraction of the netlist from the .EXT file through the ext2spice extractor; and (3) correction of the netlist through the ext2spice.corr program. Each of these steps introduces some approximations, most of which can be optimized, and some errors, most of which can be corrected. The aim of this work is to describe each step, the approximations and errors in each step, and the corresponding optimizations and corrections to be made in order to improve the accuracy of the extraction. The HP AMOS14TB 0.5 µm process with linear capacitor and silicide block options and the corresponding SCN3MLC_SUBM.30.tech27 technology file are used in the following examples.

  7. Modelling copper solvent extraction from acidic sulphate solutions using MOC 45

    Directory of Open Access Journals (Sweden)

    Alguacil, F. J.

    1998-05-01

    Full Text Available A mathematical model to predict extraction has been developed for the Cu-MOC 45 system. The model consists of sets of nonlinear mass action and mass balance equations in which the dimerization of the oxime is also considered. The predictive model calculates the equilibrium concentrations from the total oxime concentration, total copper concentration, initial pH value and the O/A volume phase ratio. The model suggests that the extraction of copper can be described by the existence of two species in the organic solution: CuR2 (kext = 4.2) and Cu(HR2)2 (kext = 10,000). The initial oxime concentration determines which of the two species predominates in the organic solution. The model can be used to predict the copper extraction isotherm as well as copper stripping with sulphuric acid.

    A mathematical model is developed to predict the extraction process in the Cu-MOC 45 system. The model consists of a set of equations in which the dimerization of the oxime is considered. It predicts extraction from the total oxime and copper concentrations, the initial pH and the O/A phase ratio. Copper extraction corresponds to the existence of two species in the organic phase: CuR2 (kext = 4.2) and Cu(HR2)2 (kext = 10,000). Which of the two species predominates in this phase depends on the initial oxime concentration. The model can also predict the extraction isotherms and the stripping of copper with sulphuric acid.

  8. Antioxidant activity and sensory analysis of murtilla (Ugni molinae Turcz.) fruit extracts in an oil model system

    Directory of Open Access Journals (Sweden)

    T. R. Augusto-Obara

    2017-03-01

    Full Text Available An oil model system was used to analyze the antioxidant activity of Chilean fruit extracts and to determine their odor sensory effect. Hydroalcoholic extracts from wild and 14-4 genotype murtilla (Ugni molinae Turcz.) fruit were assessed by the Response Surface Methodology. The optimal conditions for producing high total phenolic-content extracts were 49.5% (v/v) ethanol at 30 ºC, which yielded 18.39 and 26.14 mg GAE·g-1 dry matter, respectively. The optimized extracts were added to a lipid model system and evaluated via the Schaal Oven Test. After 96 hours, 150 and 200 mg·kg-1 oil of the wild and 14-4 genotype extracts, respectively, showed an antioxidant capacity similar to TBHQ (200 mg·kg-1 oil) in terms of peroxide values and odor. Thus, murtilla fruit extracts are a natural source of antioxidants for protecting lipidic foods, such as soybean oil.

  9. Antioxidant activity and sensory analysis of murtilla (Ugni molinae Turcz.) fruit extracts in an oil model system

    International Nuclear Information System (INIS)

    Augusto-Obara, T.R.; Pirce, F.; Scheuermann, E.; Spoto, M.H.F.; Vieira, T.M.F.S.

    2017-01-01

    An oil model system was used to analyze the antioxidant activity of Chilean fruit extracts and to determine their odor sensory effect. Hydroalcoholic extracts from wild and 14-4 genotype murtilla (Ugni molinae Turcz.) fruit were assessed by the Response Surface Methodology. The optimal conditions for producing high total phenolic-content extracts were 49.5% (v/v) ethanol at 30 ºC, which yielded 18.39 and 26.14 mg GAE·g−1 dry matter, respectively. The optimized extracts were added to a lipid model system and evaluated via the Schaal Oven Test. After 96 hours, 150 and 200 mg·kg−1 oil of the wild and 14-4 genotype extracts, respectively, showed an antioxidant capacity similar to TBHQ (200 mg·kg−1 oil) in terms of peroxide values and odor. Thus, murtilla fruit extracts are a natural source of antioxidants for protecting lipidic foods, such as soybean oil.

  10. The Extract of Lycium depressum Stocks Enhances Wound Healing in Streptozotocin-Induced Diabetic Rats.

    Science.gov (United States)

    Naji, Siamak; Zarei, Leila; Pourjabali, Masoumeh; Mohammadi, Rahim

    2017-06-01

    In diabetes, impaired wound healing and other tissue abnormalities are considered major concerns. The aim of the present study was to assess the wound-healing activity of methanolic extracts of Lycium depressum leaves. A total of 60 healthy male diabetic Wistar rats, weighing approximately 160 to 180 g and 7 weeks of age, were randomized into 10 groups covering incision and excision wound models: a sham surgery group (SHAM) with creation of wounds and no treatment; a base formulation group (FG) with creation of wounds and application of the base formulation ointment; treatment group 1 (TG1) with 1 g of powdered plant extract in ointment; treatment group 2 (TG2) with 2 g; and treatment group 3 (TG3) with 4 g of powdered plant extract in ointment. Wounds were induced by excision- and incision-based wound models in male rats. The mature green leaves of L. depressum were collected and authenticated, and extractions of the dried leaves were carried out. For wound-healing activity, the extracts were applied topically in the form of ointment and compared with the control groups. Healing was assessed based on excision, incision, hydroxyproline estimation, and biomechanical and biochemical studies. The extract of L. depressum leaves enhanced wound contraction, decreased epithelialization time, increased hydroxyproline content, and improved mechanical indices and histological characteristics in the treatment groups compared with SHAM and FG (P < 0.05), indicating enhanced wound healing in a diabetes-induced model.

  11. Arrhythmia Classification Based on Multi-Domain Feature Extraction for an ECG Recognition System

    Directory of Open Access Journals (Sweden)

    Hongqiang Li

    2016-10-01

    Full Text Available Automatic recognition of arrhythmias is particularly important in the diagnosis of heart diseases. This study presents an electrocardiogram (ECG) recognition system based on multi-domain feature extraction to classify ECG beats. An improved wavelet threshold method for ECG signal pre-processing is applied to remove noise interference. A novel multi-domain feature extraction method is proposed; this method employs kernel-independent component analysis for nonlinear feature extraction and uses the discrete wavelet transform to extract frequency-domain features. The proposed system utilises a support vector machine classifier optimized with a genetic algorithm to recognize different types of heartbeats. An ECG acquisition experimental platform, in which ECG beats are collected as ECG data for classification, was constructed to demonstrate the effectiveness of the system in ECG beat classification. The presented system, when applied to the MIT-BIH arrhythmia database, achieves a high classification accuracy of 98.8%. Experimental results based on the ECG acquisition experimental platform show that the system obtains a satisfactory classification accuracy of 97.3% and is able to classify ECG beats efficiently for the automatic identification of cardiac arrhythmias.
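    The frequency-domain half of such a multi-domain feature extractor can be sketched with a hand-rolled Haar DWT; the wavelet choice and the per-level energy features below are illustrative stand-ins, not the paper's exact settings:

    ```python
    from math import sqrt

    def haar_dwt(signal):
        """One level of the Haar discrete wavelet transform: pairwise
        low-pass (approximation) and high-pass (detail) coefficients."""
        approx = [(signal[i] + signal[i + 1]) / sqrt(2)
                  for i in range(0, len(signal) - 1, 2)]
        detail = [(signal[i] - signal[i + 1]) / sqrt(2)
                  for i in range(0, len(signal) - 1, 2)]
        return approx, detail

    def wavelet_energy_features(signal, levels=3):
        """Relative energy of the detail band at each decomposition
        level: a simple frequency-domain feature vector for one
        heartbeat segment (any odd trailing sample is dropped)."""
        feats, a = [], list(signal)
        for _ in range(levels):
            a, d = haar_dwt(a)
            feats.append(sum(x * x for x in d))
        total = sum(feats) or 1.0
        return [f / total for f in feats]

    beat = [1, 2, 3, 4, 5, 6, 7, 8]      # toy stand-in for a sampled beat
    fs = wavelet_energy_features(beat)
    print(fs)  # for this ramp, most energy sits in the coarsest detail band
    ```

    Such per-beat feature vectors, concatenated with the nonlinear (kernel ICA) features, would then feed the GA-tuned SVM classifier the abstract describes.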

  12. A physiologically based biokinetic model for cesium in the human body

    International Nuclear Information System (INIS)

    Leggett, R.W.; Williams, L.R.; Melo, D.R.; Lipsztein, J.L.

    2003-01-01

    A physiologically descriptive model of the biological behavior of cesium in the human body has been constructed around a detailed blood flow model. The rate of transfer from plasma into a tissue is determined by the blood perfusion rate and the tissue-specific extraction fraction of Cs during passage from arterial to venous plasma. Information on tissue-specific extraction of Cs is supplemented with information on the Cs analogues, K and Rb, and known patterns of discrimination between these metals by tissues. The rate of return from a tissue to plasma is estimated from the relative contents of Cs in plasma and the tissue at equilibrium as estimated from environmental studies. Transfers of Cs other than exchange between plasma and tissues (e.g. secretions into the gastrointestinal tract) are based on a combination of physiological considerations and empirical data on Cs or related elements. Model predictions are consistent with the sizable database on the time-dependent distribution and retention of radiocesium in the human body
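    The plasma-to-tissue transfer described above (blood perfusion rate times a tissue-specific extraction fraction in, an equilibrium-derived return rate out) can be sketched as a two-compartment simulation. All parameter values below are illustrative placeholders, not the published model's:

    ```python
    def simulate_cs(perfusion, extraction, k_return, hours, dt=0.01):
        """Euler integration of a minimal plasma <-> tissue exchange:
          uptake rate = perfusion * extraction * C_plasma
          return rate = k_return * C_tissue
        Starts with all activity in plasma; returns (plasma, tissue)
        contents as fractions of the initial amount."""
        plasma, tissue = 1.0, 0.0
        for _ in range(int(hours / dt)):
            uptake = perfusion * extraction * plasma
            back = k_return * tissue
            plasma += (back - uptake) * dt
            tissue += (uptake - back) * dt
        return plasma, tissue

    # Hypothetical rates (per hour): at equilibrium the tissue/plasma
    # ratio tends to (perfusion * extraction) / k_return = 10 here.
    p, t = simulate_cs(perfusion=5.0, extraction=0.1, k_return=0.05, hours=48)
    print(p, t)
    ```

    The published model applies this pattern per tissue, with organ-specific perfusion and extraction values informed by K and Rb analogue data, rather than the single lumped tissue used in this sketch.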

  13. Modeling the Supercritical Fluid Extraction of Essential Oils from Plant Materials

    Czech Academy of Sciences Publication Activity Database

    Sovová, Helena

    2012-01-01

    Roč. 1250, SI (2012), s. 27-33 ISSN 0021-9673 R&D Projects: GA TA ČR TA01010578 Institutional support: RVO:67985858 Keywords : supercritical fluid extraction * essential oils * model for kinetics Subject RIV: CI - Industrial Chemistry, Chemical Engineering Impact factor: 4.612, year: 2012

  14. FinFET centric variability-aware compact model extraction and generation technology supporting DTCO

    OpenAIRE

    Wang, Xingsheng; Cheng, Binjie; Reid, David; Pender, Andrew; Asenov, Plamen; Millar, Campbell; Asenov, Asen

    2015-01-01

    In this paper, we present a FinFET-focused variability-aware compact model (CM) extraction and generation technology supporting design-technology co-optimization. The 14-nm CMOS technology generation silicon on insulator FinFETs are used as testbed transistors to illustrate our approach. The TCAD simulations include a long-range process-induced variability using a design of experiment approach and short-range purely statistical variability (mismatch). The CM extraction supports a hierarchical...

  15. Computational Fluid Dynamics Based Extraction of Heat Transfer Coefficient in Cryogenic Propellant Tanks

    Science.gov (United States)

    Yang, H. Q.; West, Jeff

    2015-01-01

    Current reduced-order thermal models for cryogenic propellant tanks are based on correlations built for flat plates collected in the 1950s. The use of these correlations suffers from inaccurate geometry representation, inaccurate gravity orientation, ambiguous length scales and a lack of detailed validation. The work presented under this task uses the first-principles-based Computational Fluid Dynamics (CFD) technique to compute heat transfer from the tank wall to the cryogenic fluids, and extracts and correlates the equivalent heat transfer coefficient to support a reduced-order thermal model. The CFD tool was first validated against available experimental data and commonly used correlations for natural convection along a vertically heated wall. Good agreement between the present predictions and experimental data was found for flows in laminar as well as turbulent regimes. The convective heat transfer between the tank wall and cryogenic propellant, and that between the tank wall and ullage gas, were then simulated. The results showed that commonly used heat transfer correlations for either vertical or horizontal plates over-predict the heat transfer rate for the cryogenic tank, in some cases by as much as one order of magnitude. A characteristic length scale has been defined that can collapse the heat transfer coefficients for different fill levels onto a single curve. This curve can be used for reduced-order heat transfer model analysis.
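The flat-plate correlations the abstract benchmarks against express the wall heat transfer coefficient through a Nusselt-number relation Nu = f(Ra, Pr). As an illustration of that style of reduced-order model, the well-known Churchill-Chu vertical-plate correlation can be sketched as follows (this is not the paper's CFD-derived correlation, and the fluid properties below are illustrative placeholders):

```python
def nu_churchill_chu(Ra, Pr):
    """Churchill-Chu correlation for natural convection on a vertical plate,
    one of the classical flat-plate correlations such reduced-order models use."""
    f = (1.0 + (0.492 / Pr) ** (9.0 / 16.0)) ** (8.0 / 27.0)
    return (0.825 + 0.387 * Ra ** (1.0 / 6.0) / f) ** 2

def h_from_nu(Nu, k_fluid, L):
    """Heat transfer coefficient from the Nusselt number, fluid conductivity,
    and a characteristic length -- the length scale the paper calls ambiguous."""
    return Nu * k_fluid / L

Nu = nu_churchill_chu(Ra=1e9, Pr=0.7)          # turbulent-range Rayleigh number
h = h_from_nu(Nu, k_fluid=0.026, L=1.0)        # air-like properties, illustrative
print(f"Nu = {Nu:.0f}, h = {h:.2f} W/m2K")
```

The paper's point is that applying such a plate correlation to a tank geometry can misestimate h badly, which motivates the CFD-extracted replacement.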

  16. A Technology-Neutral Role-Based Collaboration Model for Software Ecosystems

    DEFF Research Database (Denmark)

    Stanciulescu, Stefan; Rabiser, Daniela; Seidl, Christoph

    2016-01-01

    In large-scale software ecosystems, many developers contribute extensions to a common software platform. Due to the independent development efforts and the lack of a central steering mechanism, similar functionality may be developed multiple times by different developers. We tackle this problem by contributing a role-based collaboration model for software ecosystems to make such implicit similarities explicit and to raise awareness among developers during their ongoing efforts. We extract this model based on realization artifacts in a specific programming language located in a particular source code ... efforts and information of ongoing development efforts. Finally, using the collaborations defined in the formalism we model real artifacts from Marlin, a firmware for 3D printers, and we show that for the selected scenarios, the five collaborations were sufficient to raise awareness and make implicit ...

  17. Automated Extraction Of Associations Between Methylated Genes and Diseases From Biomedical Literature

    KAUST Repository

    Bin Res, Arwa A.

    2012-01-01

    ... Based on this model, we developed a tool that automates extraction of associations between methylated genes and diseases from electronic text. Our study contributed an efficient method for extracting specific types of associations from free text.

  18. Stage wise modeling of liquid-extraction column (RDC)

    International Nuclear Information System (INIS)

    Bastani, B.

    2004-01-01

    A stage-wise forward mixing model considering coalescence and redispersion of drops was used to predict the performance of rotating disc liquid extraction contactors (RDC). Experimental data previously obtained in two RDC columns of 7.62 cm diameter, 73.6 cm height and 21.9 cm diameter, 150 cm height were used to evaluate the model predictions. Drop-side mass transfer coefficients were predicted by applying the Handlos-Baron drop model, and Olney's model was used to predict velocities. According to the results obtained, the following could be concluded: (1) if the height of coalescence and redispersion, i.e. h = hp·Qp/Q, could be estimated, the stage-wise forward mixing with coalescence and redispersion model will predict the column height and efficiency with acceptable accuracy; (2) the stage-wise modeling predictions are highly dependent on the number of stages used when the number of stages is less than 10; and (3) application of continuous-phase mass transfer and axial dispersion coefficients (Kc and Ec) obtained from the solute concentration profile along the column height will predict the column performance more accurately than the Calderbank and Moo-Young (for Kc) and Kumar-Hartland (for Ec) correlations.

  19. Image processor of model-based vision system for assembly robots

    International Nuclear Information System (INIS)

    Moribe, H.; Nakano, M.; Kuno, T.; Hasegawa, J.

    1987-01-01

    A special-purpose image preprocessor for the visual system of assembly robots has been developed. The main function unit is composed of lookup tables to exploit the advantages of semiconductor memory: large-scale integration, high speed and low price. More than one unit may be operated in parallel since it is designed on the standard IEEE 796 bus. The operation time of the preprocessor in line segment extraction is typically 200 ms per 500 segments, though it varies with the complexity of the scene image. The gray-scale visual system, supported by a model-based analysis program using the extracted line segments, recognizes partially visible or overlapping industrial workpieces, and detects their locations and orientations.

  20. Computer - based modeling in extract sciences research -III ...

    African Journals Online (AJOL)

    Molecular modeling techniques have been of great applicability in the study of the biological sciences and other exact science fields like agriculture, mathematics, computer science and the like. In this write up, a list of computer programs for predicting, for instance, the structure of proteins has been provided. Discussions on ...

  1. A Novel Technique for Shape Feature Extraction Using Content Based Image Retrieval

    Directory of Open Access Journals (Sweden)

    Dhanoa Jaspreet Singh

    2016-01-01

    With the advent of technology and multimedia information, digital images are increasing very quickly. Various techniques are being developed to retrieve and search the digital information or data contained in images. Traditional text-based image retrieval systems fall short: they are time consuming, since they require manual image annotation, and annotations differ from person to person. An alternative is the Content Based Image Retrieval (CBIR) system, which retrieves and searches for images using their contents rather than text, keywords, etc. A lot of exploration has been carried out in the area of Content Based Image Retrieval (CBIR) with various feature extraction techniques. Shape is a significant image feature as it reflects human perception. Moreover, shape is quite simple for the user to employ when defining an object in an image, compared with other features such as colour and texture. Over and above, if applied alone, no descriptor will give fruitful results. Further, by combining a descriptor with an improved classifier, one can use the positive features of both. So, an attempt is made to establish an algorithm for accurate shape feature extraction in Content Based Image Retrieval (CBIR). The main objectives of this project are: (a) to propose an algorithm for shape feature extraction using CBIR, (b) to evaluate the performance of the proposed algorithm and (c) to compare the proposed algorithm with state-of-the-art techniques.

  2. Man-Made Object Extraction from Remote Sensing Imagery by Graph-Based Manifold Ranking

    Science.gov (United States)

    He, Y.; Wang, X.; Hu, X. Y.; Liu, S. H.

    2018-04-01

    The automatic extraction of man-made objects from remote sensing imagery is useful in many applications. This paper proposes an algorithm for extracting man-made objects automatically by integrating a graph model with the manifold ranking algorithm. Initially, we estimate an a priori value for the man-made objects using symmetric and contrast features. A graph model is established to represent the spatial relationships among pre-segmented superpixels, which are used as the graph nodes. Multiple characteristics, namely colour, texture and main direction, are used to compute the weights of adjacent nodes. Manifold ranking effectively explores the relationships among all the nodes in the feature space as well as the initial query assignment; thus, it is applied to generate a ranking map, which indicates the scores of the man-made objects. The man-made objects are then segmented on the basis of the ranking map. Two typical segmentation algorithms are compared with the proposed algorithm. Experimental results show that the proposed algorithm can extract man-made objects with a high recognition rate and a low omission rate.
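Manifold ranking on a superpixel graph has a closed form: with affinity matrix W, symmetric normalisation S = D^(-1/2) W D^(-1/2) and query vector y, the ranking scores are f* = (I - alpha*S)^(-1) y. A toy numpy sketch of that computation (the graph weights and query node are invented, not taken from the paper):

```python
import numpy as np

# Toy affinity graph over 6 "superpixels"; W[i, j] would come from
# colour/texture/direction similarity (values here are illustrative).
W = np.array([
    [0., .9, .1, 0., 0., 0.],
    [.9, 0., .8, .1, 0., 0.],
    [.1, .8, 0., .2, 0., 0.],
    [0., .1, .2, 0., .9, .7],
    [0., 0., 0., .9, 0., .8],
    [0., 0., 0., .7, .8, 0.],
])
y = np.array([1., 0., 0., 0., 0., 0.])   # node 0 is the initial query seed

def manifold_rank(W, y, alpha=0.9):
    d = W.sum(axis=1)
    Dinv = np.diag(1.0 / np.sqrt(d))
    S = Dinv @ W @ Dinv                                 # symmetric normalisation
    n = W.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * S, y)    # closed-form ranking

scores = manifold_rank(W, y)
print(np.round(scores, 3))
```

Nodes tightly connected to the query (1 and 2) end up with higher scores than the weakly connected cluster (3, 4, 5), which is exactly how the ranking map separates man-made regions from background.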

  3. Solid-Liquid Extraction Kinetics of Total Phenolic Compounds (TPC from Red Dates

    Directory of Open Access Journals (Sweden)

    Bee Lin Chua

    2018-01-01

    Red dates are one of the most famous herbal plants used in traditional Chinese medicine. They contain a large amount of bioactive compounds. The objectives of this research were to optimise the crude extract yield and total phenolic compounds (TPC) yield from red dates using response surface methodology (RSM) and to model the extraction kinetics of TPC yield from red dates. Date fruits were dried in an oven at temperatures of 50°C, 60°C, 70°C and 80°C until a constant weight was obtained. The optimum drying temperature was 60°C as it gave the highest crude extract yield and TPC yield. Besides that, single-factor experiments were used to determine the optimum range of four extraction parameters: liquid-solid ratio (10-30 ml/g); ultrasonic power (70-90%); extraction temperature (50-70°C); and extraction time (40-60 min). The optimum ranges of the four parameters were further optimised using the Box-Behnken Design (BBD) of RSM. The extraction conditions that gave the highest crude extract yield and TPC yield were chosen. The optimum values for liquid-solid ratio, ultrasonic power, extraction temperature and extraction time were 30 ml/g, 70%, 60°C and 60 min, respectively. The two equations generated from RSM were reliable and can be used to predict the crude extract yield and TPC yield. The higher the extraction temperature, liquid-solid ratio and extraction time, and the lower the ultrasonic power, the higher the crude extract and TPC yields. Finally, the results of TPC yield versus time based on the optimum extraction parameters from RSM optimisation were fitted to three extraction kinetic models (Peleg's model, Page's model and Ponomaryov's model). The most suitable kinetic model to represent the extraction of TPC from red dates was Page's model, as its coefficient of determination (R2) was the closest to unity (0.9663) while its root mean square error (RMSE) was the closest to zero (0.001534).
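Page's model can be fitted by linearisation: writing the fractional yield as y(t) = 1 - exp(-k t^n) gives ln(-ln(1 - y)) = ln k + n ln t, a straight line in ln t. A sketch of that fit with made-up yield data (not the paper's measurements):

```python
import numpy as np

# Hypothetical TPC extraction data: time (min) and yield as a fraction of the
# equilibrium yield. Values are illustrative, not from the paper.
t = np.array([5., 10., 20., 30., 40., 50., 60.])
y = np.array([0.35, 0.55, 0.75, 0.85, 0.91, 0.94, 0.96])

# Page's model: y(t) = 1 - exp(-k * t**n), linearised and fitted by least squares.
X = np.log(t)
Y = np.log(-np.log(1.0 - y))
n, ln_k = np.polyfit(X, Y, 1)
k = np.exp(ln_k)

# Goodness of fit, mirroring the R2 / RMSE criteria used in the abstract.
y_hat = 1.0 - np.exp(-k * t**n)
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
print(f"k = {k:.4f}, n = {n:.3f}, R2 = {r2:.4f}, RMSE = {rmse:.5f}")
```

A model is then preferred, as in the abstract, when R2 is closest to unity and RMSE closest to zero.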

  4. Modeling Innovative Power Take-Off Based on Double-Acting Hydraulic Cylinders Array for Wave Energy Conversion

    Directory of Open Access Journals (Sweden)

    Juan Carlos Antolín-Urbaneja

    2015-03-01

    One of the key systems of a wave energy converter for extraction of wave energy is the Power Take-Off (PTO) device. This device transforms the mechanical energy of a moving body into electrical energy. This paper describes the model of an innovative PTO based on an array of independently activated double-acting hydraulic cylinders. The model has been developed using a simulation tool, based on a port-based approach to modelling hydraulic systems. The components and subsystems used in the model have been parameterized as real components and their values experimentally obtained from an existing prototype. In fact, the model takes into account most of the hydraulic losses of each component. The simulations show the flexibility to apply different restraining torques to the input movement depending on the geometrical configuration and the hydraulic cylinders on duty, easily modified by a control law. The combination of these two actions provides suitable flexibility to adapt the device to different sea states whilst optimizing the energy extraction. The model has been validated using a real test bench, showing good correlation between simulation and experimental tests.

  5. PALM KERNEL OIL SOLUBILITY EXAMINATION AND ITS MODELING IN EXTRACTION PROCESS USING SUPERCRITICAL CARBON DIOXIDE

    Directory of Open Access Journals (Sweden)

    Wahyu Bahari Setianto

    2013-11-01

    Application of supercritical carbon dioxide (SC-CO2) to vegetable oil extraction has become an attractive technique due to its high solubility, short extraction time and simple purification. The method is considered an earth-friendly technology due to the absence of chemical usage. The solubility of a solute in SC-CO2 is important data for application of SC-CO2 extraction. In this work, the equilibrium solubility of palm kernel oil (PKO) in SC-CO2 has been examined using extraction curve analysis. The examinations were performed at temperature and pressure ranges of 323.15 K to 353.15 K and 20.7 to 34.5 MPa, respectively. The experimental solubility ranged from 0.0160 to 0.0503 g oil/g CO2, depending on the extraction conditions. The experimental solubility data were well correlated with a solvent-density-based model, with an absolute percent deviation of 0.96.
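A common solvent-density-based solubility model of the kind the abstract refers to is Chrastil's equation, ln S = k ln(rho) + a/T + b, which is linear in its parameters and can be fitted by ordinary least squares. A sketch with invented (rho, T, S) points, since the paper's data are not reproduced here:

```python
import numpy as np

# Illustrative (CO2 density [g/L], temperature [K], solubility [g oil/g CO2])
# points -- placeholders, not the paper's measurements.
rho = np.array([700., 750., 800., 850., 900.])
T   = np.array([323.15, 333.15, 343.15, 353.15, 323.15])
S   = np.array([0.018, 0.022, 0.030, 0.040, 0.048])

# Chrastil density-based model: ln S = k*ln(rho) + a/T + b
A = np.column_stack([np.log(rho), 1.0 / T, np.ones_like(rho)])
coef, *_ = np.linalg.lstsq(A, np.log(S), rcond=None)
k, a, b = coef

S_hat = np.exp(A @ coef)
aad = 100.0 * np.mean(np.abs(S_hat - S) / S)   # average absolute deviation, %
print(f"k = {k:.2f}, a = {a:.1f}, b = {b:.2f}, AAD = {aad:.2f}%")
```

The fitted exponent k is often interpreted as an association number of solvent molecules per solute molecule; the deviation metric plays the same role as the 0.96% figure quoted in the abstract.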

  6. Random Forest Based Coarse Locating and KPCA Feature Extraction for Indoor Positioning System

    Directory of Open Access Journals (Sweden)

    Yun Mo

    2014-01-01

    With the fast development of mobile terminals, positioning techniques based on fingerprinting methods have drawn attention from many researchers and even world-famous companies. To overcome some shortcomings of existing fingerprinting systems and further improve system performance, on the one hand, we propose a coarse positioning method based on random forest, which is able to customize several subregions and classify a test point into a region with outstanding accuracy compared with some typical clustering algorithms. On the other hand, through mathematical analysis, the proposed kernel principal component analysis algorithm is applied for radio map processing, which may provide better robustness and adaptability compared with linear feature extraction methods and manifold learning techniques. We build both a theoretical model and a real environment for verifying feasibility and reliability. The experimental results show that the proposed indoor positioning system achieves 99% coarse locating accuracy and improves fine positioning accuracy by 15% on average in a strongly noisy environment, compared with some typical fingerprinting-based methods.
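The kernel PCA step used for radio-map processing projects the RSSI fingerprints onto the leading eigenvectors of a centred kernel matrix. A numpy-only sketch with synthetic fingerprints (the RBF kernel and the gamma value are illustrative choices, not necessarily the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical RSSI radio map: 60 reference points x 8 access points (dBm).
X = rng.normal(-60.0, 8.0, size=(60, 8))

def kernel_pca(X, n_components=3, gamma=0.01):
    """Kernel PCA with an RBF kernel (numpy-only sketch)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # pairwise squared distances
    K = np.exp(-gamma * d2)
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one       # centre in feature space
    vals, vecs = np.linalg.eigh(Kc)                  # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    return vecs * np.sqrt(np.maximum(vals, 1e-12))   # projected coordinates

Z = kernel_pca(X)
print(Z.shape)
```

The reduced coordinates Z would then feed the random-forest region classifier and the fine positioning stage in place of the raw fingerprints.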

  7. A Robust Gradient Based Method for Building Extraction from LiDAR and Photogrammetric Imagery

    Directory of Open Access Journals (Sweden)

    Fasahat Ullah Siddiqui

    2016-07-01

    Existing automatic building extraction methods are not effective in extracting buildings which are small in size and have transparent roofs. The application of a large area threshold prohibits detection of small buildings, and the use of ground points in generating the building mask prevents detection of transparent buildings. In addition, the existing methods use numerous parameters to extract buildings in complex environments, e.g., hilly areas and high vegetation. However, the empirical tuning of a large number of parameters reduces the robustness of building extraction methods. This paper proposes a novel Gradient-based Building Extraction (GBE) method to address these limitations. The proposed method transforms the Light Detection And Ranging (LiDAR) height information into an intensity image without interpolation of point heights and then analyses the gradient information in the image. Generally, building roof planes have a constant height change along the slope of a roof plane whereas trees have a random height change. With such an analysis, buildings of a greater range of sizes with a transparent or opaque roof can be extracted. In addition, a local colour matching approach is introduced as a post-processing stage to eliminate trees. This stage of our proposed method does not require any manual setting, and all parameters are set automatically from the data. The other post-processing stages, including variance, point density and shadow elimination, are also applied to verify the extracted buildings, where comparatively fewer empirically set parameters are used. The performance of the proposed GBE method is evaluated on two benchmark data sets using object- and pixel-based metrics (completeness, correctness and quality). Our experimental results show the effectiveness of the proposed method in eliminating trees, extracting buildings of all sizes, and extracting buildings with and without transparent roofs. When compared with current state ...

  8. Antidepressant, anxiolytic and anti-nociceptive activities of ethanol extract of Steudnera colocasiifolia K. Koch leaves in mice model

    Directory of Open Access Journals (Sweden)

    Mohammad Shah Hafez Kabir

    2015-11-01

    Objective: To evaluate the antidepressant, anxiolytic and antinociceptive activities of ethanol extract of Steudnera colocasiifolia K. Koch (S. colocasiifolia) leaves. Methods: Swiss albino mice treated with 1% Tween solution, standard drugs and ethanol extract of S. colocasiifolia, respectively, were subjected to neurological and antinociceptive investigations. The tail suspension test and forced swimming test were used for testing antidepressant activity, where the measured parameter is immobility time. Anxiolytic activity was evaluated by the hole board model. The antinociceptive potential of the extract was also screened for centrally acting analgesic activity using the formalin-induced licking response model, and the acetic acid-induced writhing test was used for testing peripheral analgesic action. Results: Ethanol extract of S. colocasiifolia significantly decreased the period of immobility in both tested models (tail suspension and forced swimming) of antidepressant activity. In the hole board model, there was a dose-dependent (at 100 and 200 mg/kg) and significant increase in the number of head dips compared with the control (1% Tween solution) (P < 0.05 and P < 0.001). In the formalin-induced licking model, a significant inhibition of pain compared to standard diclofenac sodium was observed (P < 0.05 and P < 0.001). In the acetic acid-induced test, there was a significant reduction of writhing response and pain in mice treated with leaf extract of S. colocasiifolia at 200 mg/kg body weight (P < 0.05 and P < 0.001). Conclusions: The results confirmed the prospective antidepressant, anxiolytic and antinociceptive activities of ethanol extract of S. colocasiifolia leaves.

  9. A comparison of two colorimetric assays, based upon Lowry and Bradford techniques, to estimate total protein in soil extracts.

    Science.gov (United States)

    Redmile-Gordon, M A; Armenise, E; White, R P; Hirsch, P R; Goulding, K W T

    2013-12-01

    Soil extracts usually contain large quantities of dissolved humified organic material, typically reflected by a high polyphenolic content. Since polyphenols seriously confound quantification of extracted protein, minimising this interference is important to ensure measurements are representative. Although the Bradford colorimetric assay is used routinely in soil science for rapid quantification of protein in soil extracts, it has several limitations. We therefore investigated an alternative colorimetric technique based on the Lowry assay (frequently used to measure protein and humic substances as distinct pools in microbial biofilms). The accuracies of both the Bradford assay and a modified Lowry microplate method were compared in factorial combination. Protein was quantified in soil extracts (extracted with citrate), including standard additions of model protein (BSA) and polyphenol (Sigma H1675-2). Using the Lowry microplate assay described, no interfering effects of citrate were detected even at concentrations up to 5 times greater than are typically used to extract soil protein. Moreover, the Bradford assay was found to be highly susceptible to two simultaneous and confounding artefacts: 1) the colour development due to added protein was greatly inhibited by polyphenol concentration, and 2) substantial colour development was caused directly by the polyphenol addition. In contrast, the Lowry method enabled distinction between colour development of protein and non-protein origin, providing a more accurate quantitative analysis. These results suggest that the modified Lowry method is a more suitable measure of extract protein (defined by standard equivalents) because it is less confounded by the high polyphenolic content that is so typical of soil extracts.

  10. Multispectral Image Road Extraction Based Upon Automated Map Conflation

    Science.gov (United States)

    Chen, Bin

    Road network extraction from remotely sensed imagery enables many important and diverse applications such as vehicle tracking, drone navigation, and intelligent transportation studies. There are, however, a number of challenges to road detection from an image. Road pavement material, width, direction, and topology vary across a scene. Complete or partial occlusions caused by nearby buildings, trees, and the shadows cast by them, make maintaining road connectivity difficult. The problems posed by occlusions are exacerbated with the increasing use of oblique imagery from aerial and satellite platforms. Further, common objects such as rooftops and parking lots are made of materials similar or identical to road pavements. This problem of common materials is a classic case of a single land cover material existing for different land use scenarios. This work addresses these problems in road extraction from geo-referenced imagery by leveraging the OpenStreetMap digital road map to guide image-based road extraction. The crowd-sourced cartography has the advantages of worldwide coverage that is constantly updated. The derived road vectors follow only roads and so can serve to guide image-based road extraction with minimal confusion from occlusions and changes in road material. On the other hand, the vector road map has no information on road widths and misalignments between the vector map and the geo-referenced image are small but nonsystematic. Properly correcting misalignment between two geospatial datasets, also known as map conflation, is an essential step. A generic framework requiring minimal human intervention is described for multispectral image road extraction and automatic road map conflation. The approach relies on the road feature generation of a binary mask and a corresponding curvilinear image. A method for generating the binary road mask from the image by applying a spectral measure is presented. The spectral measure, called anisotropy-tunable distance (ATD

  11. Evaluation of DNA Extraction Methods Suitable for PCR-based Detection and Genotyping of Clostridium botulinum

    DEFF Research Database (Denmark)

    Auricchio, Bruna; Anniballi, Fabrizio; Fiore, Alfonsina

    2013-01-01

    Sufficient quality and quantity of extracted DNA is critical to detecting and performing genotyping of Clostridium botulinum by means of PCR-based methods. An ideal extraction method has to optimize DNA yield, minimize DNA degradation, allow multiple samples to be extracted, and be efficient in terms of cost, time, labor, and supplies. Eleven botulinum toxin–producing clostridia strains and 25 samples (10 food, 13 clinical, and 2 environmental samples) naturally contaminated with botulinum toxin–producing clostridia were used to compare 4 DNA extraction procedures: Chelex® 100 matrix, Phenol ...

  12. Generating unstable resonances for extraction schemes based on transverse splitting

    Directory of Open Access Journals (Sweden)

    M. Giovannozzi

    2009-02-01

    A few years ago, a novel multiturn extraction scheme was proposed, based on particle trapping inside stable resonances. Numerical simulations and experimental tests have confirmed the feasibility of such a scheme for low-order resonances. While the third-order resonance is generically unstable and those higher than fourth order are generically stable, the fourth-order resonance can be either stable or unstable depending on the specifics of the system under consideration. By means of normal form analysis, a general approach to controlling the stability of the fourth-order resonance has been derived. This approach is based on the control of the amplitude detuning, and the general form for a lattice with an arbitrary number of sextupole and octupole families is derived in this paper. Numerical simulations have confirmed the analytical results and have shown that, when crossing the unstable fourth-order resonance, the region around the center of the phase space is depleted and particles are trapped only in the four stable islands. A four-turn extraction could be designed using this technique.

  13. Base excision repair efficiency and mechanism in nuclear extracts are influenced by the ratio between volume of nuclear extraction buffer and nuclei-Implications for comparative studies

    DEFF Research Database (Denmark)

    Akbari, Mansour; Krokan, Hans E

    2012-01-01

    The base excision repair (BER) pathway corrects many different DNA base lesions and is important for genomic stability. The mechanism of BER cannot easily be investigated in intact cells and therefore in vitro methods that reflect the in vivo processes are in high demand. Reconstitution of BER...... using purified proteins essentially mirror properties of the proteins used, and does not necessarily reflect the mechanism as it occurs in the cell. Nuclear extracts from cultured cells have the capacity to carry out complete BER and can give important information on the mechanism. Furthermore......, candidate proteins in extracts can be inhibited or depleted in a controlled way, making defined extracts an important source for mechanistic studies. The major drawback is that there is no standardized method of preparing nuclear extract for BER studies, and it does not appear to be a topic given much...

  14. A knowledge representation meta-model for rule-based modelling of signalling networks

    Directory of Open Access Journals (Sweden)

    Adrien Basso-Blandin

    2016-03-01

    The study of cellular signalling pathways and their deregulation in disease states, such as cancer, is a large and extremely complex task. Indeed, these systems involve many parts and processes but are studied piecewise, and their literatures and data are consequently fragmented, distributed and sometimes, at least apparently, inconsistent. This makes it extremely difficult to build significant explanatory models, with the result that effects in these systems that are brought about by many interacting factors are poorly understood. The rule-based approach to modelling has shown some promise for the representation of the highly combinatorial systems typically found in signalling, where many of the proteins are composed of multiple binding domains, capable of simultaneous interactions, and/or peptide motifs controlled by post-translational modifications. However, the rule-based approach requires highly detailed information about the precise conditions for each and every interaction, which is rarely available from any one single source. Rather, these conditions must be painstakingly inferred and curated, by hand, from information contained in many papers, each of which contains only part of the story. In this paper, we introduce a graph-based meta-model, attuned to the representation of cellular signalling networks, which aims to ease this massive cognitive burden on the rule-based curation process. This meta-model is a generalization of that used by Kappa and BNGL which allows for the flexible representation of knowledge at various levels of granularity. In particular, it allows us to deal with information which has either too little, or too much, detail with respect to the strict rule-based meta-model. Our approach provides a basis for the gradual aggregation of fragmented biological knowledge extracted from the literature into an instance of the meta-model, from which we can define an automated translation into executable Kappa programs.

  15. Homomorphic encryption-based secure SIFT for privacy-preserving feature extraction

    Science.gov (United States)

    Hsu, Chao-Yung; Lu, Chun-Shien; Pei, Soo-Chang

    2011-02-01

    Privacy has received much attention but is still largely ignored in the multimedia community. In a cloud computing scenario, where the server is resource-abundant and capable of finishing the designated tasks, it is envisioned that secure media retrieval and search with privacy preservation will be treated seriously. In view of the fact that the scale-invariant feature transform (SIFT) has been widely adopted in various fields, this paper is the first to address the problem of secure SIFT feature extraction and representation in the encrypted domain. Since all the operations in SIFT must be moved to the encrypted domain, we propose a homomorphic encryption-based secure SIFT method for privacy-preserving feature extraction and representation based on the Paillier cryptosystem. In particular, homomorphic comparison is a must for SIFT feature detection but is still a challenging issue for homomorphic encryption methods. To overcome this problem, we investigate a quantization-like secure comparison strategy. Experimental results demonstrate that the proposed homomorphic encryption-based SIFT performs comparably to the original SIFT on image benchmarks, while additionally preserving privacy. We believe that this work is an important step toward privacy-preserving multimedia retrieval in environments where privacy is a major concern.
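The Paillier cryptosystem underlying the scheme is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, which is what allows SIFT's additions to be carried out in the encrypted domain. A textbook sketch with toy-sized primes (insecure by construction, for illustration only; real keys use primes of roughly 1024 bits):

```python
import random
from math import gcd

# Textbook Paillier with demo primes -- a sketch of the additive homomorphism,
# NOT a secure implementation.
p, q = 1009, 1013
n = p * q
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
n2 = n * n
g = n + 1                                      # a standard choice of generator

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(123), encrypt(456)
# Multiplying ciphertexts adds the plaintexts: Dec(c1 * c2 mod n^2) = m1 + m2
print("homomorphic sum:", decrypt((c1 * c2) % n2))
```

Comparison, by contrast, has no such algebraic shortcut under Paillier, which is why the paper needs its quantization-like secure comparison strategy.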

  16. A three-layer model of self-assembly induced surface-energy variation experimentally extracted by using nanomechanically sensitive cantilevers

    International Nuclear Information System (INIS)

    Zuo Guomin; Li Xinxin

    2011-01-01

    This research is aimed at elucidating surface-energy (or interfacial-energy) variation during the process of molecule-layer self-assembly on a solid surface. A quasi-quantitative plotting model is proposed and established to distinguish the surface-energy variation contributed by the three characteristic layers of a thiol-on-gold self-assembled monolayer (SAM), namely the assembly-medium-correlative gold/head-group layer, the chain/chain interaction layer and the tail/medium layer, respectively. The data for building the model are experimentally extracted from a set of correlative thiol self-assemblies in different media. The variation in surface energy during self-assembly is obtained by in situ recording of the self-assembly induced nanomechanical surface-stress using integrated micro-cantilever sensors. Based on the correlative self-assembly experiment, and by using the nanomechanically sensitive self-sensing cantilevers to monitor the self-assembly induced surface-stress in situ, the experimentally extracted separate contributions of the three layers to the overall surface-energy change aid a comprehensive understanding of the self-assembly mechanism. Moreover, the quasi-quantitative modeling method is helpful for optimal design, molecule synthesis and performance evaluation of molecule self-assembly for application-specific surface functionalization.

  17. FAST DISCRETE CURVELET TRANSFORM BASED ANISOTROPIC FEATURE EXTRACTION FOR IRIS RECOGNITION

    Directory of Open Access Journals (Sweden)

    Amol D. Rahulkar

    2010-11-01

    Full Text Available Feature extraction plays a very important role in iris recognition. Recent research on multiscale analysis provides a good opportunity to extract more accurate information for iris recognition. In this work, new directional iris texture features based on the 2-D Fast Discrete Curvelet Transform (FDCT) are proposed. The proposed approach divides the normalized iris image into six sub-images, and the curvelet transform is applied independently to each sub-image. The anisotropic feature vector for each sub-image is derived using the directional energies of the curvelet coefficients. These six feature vectors are combined to create the resultant feature vector. During recognition, a nearest neighbor classifier based on Euclidean distance is used for authentication. The effectiveness of the proposed approach has been tested on two different databases, namely UBIRIS and MMU1. Experimental results show the superiority of the proposed approach.
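
    The energy feature and Euclidean nearest-neighbor matcher described above can be sketched as follows; the curvelet transform itself requires a dedicated library (e.g., CurveLab), so random arrays stand in for the FDCT subband coefficients here:

```python
import numpy as np

def directional_energy_features(subbands):
    """Energy of each directional subband: E = sum of squared coefficient magnitudes."""
    return np.array([np.sum(np.abs(c) ** 2) for c in subbands])

def nearest_neighbor(query, gallery):
    """Index of the enrolled template closest to the query in Euclidean distance."""
    dists = [np.linalg.norm(query - t) for t in gallery]
    return int(np.argmin(dists))

# Hypothetical data: 3 enrolled irises, 4 subbands each (stand-ins for the
# FDCT coefficients of the six normalized iris sub-images).
rng = np.random.default_rng(0)
gallery = [directional_energy_features([rng.normal(size=(8, 8)) for _ in range(4)])
           for _ in range(3)]
probe = gallery[1] + rng.normal(scale=1e-3, size=4)   # noisy re-capture of iris 1
assert nearest_neighbor(probe, gallery) == 1
```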

  18. A feature extraction algorithm based on corner and spots in self-driving vehicles

    Directory of Open Access Journals (Sweden)

    Yupeng FENG

    2017-06-01

    Full Text Available To address the poor real-time performance of visual odometry on embedded systems with limited computing resources, an image matching method based on Harris and SIFT is proposed, namely the Harris-SIFT algorithm. Following a review of the SIFT algorithm, the principle of the Harris-SIFT algorithm is presented. First, the Harris algorithm is used to extract image corners as candidate feature points, and scale-invariant feature transform (SIFT) descriptors are then computed at those candidate points. Finally, the algorithm is simulated in Matlab through an example, and its complexity and other performance aspects are analyzed. The experimental results show that the proposed method reduces computational complexity and improves the speed of feature extraction. The Harris-SIFT algorithm can be used in real-time visual odometry systems, enabling wider application of visual odometry in embedded navigation systems.
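
    A minimal numpy sketch of the Harris corner response that supplies the candidate points (the structure-tensor formulation R = det(M) - k*trace(M)^2; a 3x3 box window stands in for the usual Gaussian weighting):

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k*trace(M)^2 from the structure tensor M."""
    Iy, Ix = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box3(a):
        """3x3 box filter (edge-padded), a stand-in for Gaussian window weighting."""
        p = np.pad(a, 1, mode="edge")
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    Sxx, Syy, Sxy = box3(Ixx), box3(Iyy), box3(Ixy)
    return (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2

# A white square on black: corners respond positively, edges negatively,
# flat regions not at all.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
R = harris_response(img)
assert R[8, 8] > 0 and R[8, 16] < 0 and R[16, 16] == 0.0
```

    Thresholding R and keeping local maxima yields the corner list that, in the Harris-SIFT pipeline, replaces SIFT's own (more expensive) scale-space keypoint detection.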

  19. Data-based modelling of the Earth's dynamic magnetosphere: a review

    Directory of Open Access Journals (Sweden)

    N. A. Tsyganenko

    2013-10-01

    Full Text Available This paper reviews the main advances in the area of data-based modelling of the Earth's distant magnetic field achieved during the last two decades. The essence and principal goal of the approach is to extract maximum information from available data, using physically realistic and flexible mathematical structures, parameterized by the most relevant and routinely accessible observables. Accordingly, the paper concentrates on three aspects of the modelling: (i) mathematical methods to develop a computational "skeleton" of a model, (ii) spacecraft databases, and (iii) parameterization of the magnetospheric models by the solar wind drivers and/or ground-based indices. The review is followed by a discussion of the main issues concerning further progress in the area, in particular, methods to assess the models' performance and the accuracy of field line mapping. The material presented in the paper is organized along the lines of the author's Julius Bartels Medal Lecture at the 2013 General Assembly of the European Geosciences Union.

  20. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two rapid extraction methods for earthquake-collapsed building information, based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three criteria: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and addresses the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, it achieves more scientific and accurate results than the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing

  1. A High-Speed Vision-Based Sensor for Dynamic Vibration Analysis Using Fast Motion Extraction Algorithms

    Directory of Open Access Journals (Sweden)

    Dashan Zhang

    2016-04-01

    Full Text Available The development of image sensors and optics enables the application of vision-based techniques to the non-contact dynamic vibration analysis of large-scale structures. As an emerging technology, a vision-based approach allows remote measurement and adds no mass to the measured object, in contrast to traditional contact measurements. In this study, a high-speed vision-based sensor system is developed to extract structural vibration signals in real time. A fast motion extraction algorithm is required for this system because the maximum sampling frequency of the charge-coupled device (CCD) sensor can reach up to 1000 Hz. Two efficient subpixel-level motion extraction algorithms, namely the modified Taylor approximation refinement algorithm and the localization refinement algorithm, are integrated into the proposed vision sensor. Quantitative analysis shows that both modified algorithms are at least five times faster than conventional upsampled cross-correlation approaches and achieve satisfactory error performance. The practicability of the developed sensor is evaluated by an experiment in a laboratory environment and a field test. Experimental results indicate that the developed high-speed vision-based sensor system can extract accurate dynamic structural vibration signals by tracking either artificial targets or natural features.
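
    The abstract does not spell out the two refinement algorithms; a common lightweight alternative to upsampled cross-correlation is a three-point parabolic fit around the integer correlation peak, sketched here on synthetic data:

```python
import numpy as np

def subpixel_peak(corr):
    """Refine the integer argmax of a 1-D correlation curve by fitting a
    parabola through (y[i-1], y[i], y[i+1]) and returning its vertex."""
    i = int(np.argmax(corr))
    ym, y0, yp = corr[i - 1], corr[i], corr[i + 1]
    denom = ym - 2 * y0 + yp
    delta = 0.5 * (ym - yp) / denom if denom != 0 else 0.0
    return i + delta

# Synthetic correlation curve with its true peak at 10.3 samples
x = np.arange(20)
corr = np.exp(-((x - 10.3) ** 2) / 4.0)
est = subpixel_peak(corr)
assert abs(est - 10.3) < 0.05
```

    The fit costs three multiplications per peak, which is why such refinements can keep up with a 1000 Hz frame rate where full upsampled cross-correlation cannot.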

  2. Optimal Management of Geothermal Heat Extraction

    Science.gov (United States)

    Patel, I. H.; Bielicki, J. M.; Buscheck, T. A.

    2015-12-01

    Geothermal energy technologies use the constant heat flux from the subsurface in order to produce heat or electricity for societal use. As such, a geothermal energy system is not inherently variable, like systems based on wind and solar resources, and an operator can conceivably control the rate at which heat is extracted and used directly, or converted into a commodity that is used. Although geothermal heat is a renewable resource, this heat can be depleted over time if the rate of heat extraction exceeds the natural rate of renewal (Rybach, 2003). For heat extraction used for commodities that are sold on the market, sustainability entails balancing the rate at which the reservoir renews with the rate at which heat is extracted and converted into profit, on a net present value basis. We present a model that couples natural resource economic approaches for managing renewable resources with simulations of geothermal reservoir performance in order to develop an optimal heat mining strategy that balances economic gain with the performance and renewability of the reservoir. Similar optimal control approaches have been extensively studied for renewable natural resource management of fisheries and forests (Bonfil, 2005; Gordon, 1954; Weitzman, 2003). Those models determine an optimal path of extraction of fish or timber, by balancing the regeneration of stocks of fish or timber that are not harvested with the profit from the sale of the fish or timber that is harvested. Our model balances the regeneration of reservoir temperature with the net proceeds from extracting heat and converting it to electricity that is sold to consumers. We used the Non-isothermal Unconfined-confined Flow and Transport (NUFT) model (Hao, Sun, & Nitao, 2011) to simulate the performance of a sedimentary geothermal reservoir under a variety of geologic and operational situations. The results of NUFT are incorporated into the natural resource economics model to determine production strategies that

  3. Evaluating operational vacuum for landfill biogas extraction.

    Science.gov (United States)

    Fabbricino, Massimiliano

    2007-01-01

    This manuscript proposes a practical methodology for estimating the operational vacuum for biogas extraction from municipal landfills. The procedure is based on two sub-models, which simulate landfill gas production from organic waste decomposition and the distribution of gas pressure and gas movement induced by suction at a blower station. The two models are coupled in a single mass balance equation, yielding a relationship between the operational vacuum and the amount of landfill gas that can be extracted from an assigned system of vertical wells. To illustrate the procedure, it is applied to a case study, where good agreement between simulated and measured data, within ±30%, is obtained.
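
    The gas-production sub-model is not given in the abstract; a standard stand-in is the first-order-decay formulation used by the widely adopted LandGEM model, sketched here (the parameter values k and L0 are illustrative defaults, not the paper's):

```python
import math

def landfill_gas_rate(t_years, waste_mass_Mg, k=0.05, L0=170.0):
    """First-order-decay (LandGEM-style) methane generation rate for a single
    waste batch placed at t = 0:
        Q(t) = k * L0 * M * exp(-k * t)   [m^3 CH4 / yr]
    k  -- decay constant (1/yr), L0 -- methane generation potential (m^3/Mg)."""
    return k * L0 * waste_mass_Mg * math.exp(-k * t_years)

# 10,000 Mg of waste: the generation rate decays monotonically after placement,
# which is what drives the time-dependence of the required extraction vacuum.
rates = [landfill_gas_rate(t, 10_000) for t in range(0, 30, 5)]
assert all(a > b for a, b in zip(rates, rates[1:]))
```

    In a full model the landfill is treated as a sum of such batches, one per year of waste placement, and the summed Q(t) feeds the pressure-distribution sub-model.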

  4. Physics-based statistical model and simulation method of RF propagation in urban environments

    Science.gov (United States)

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient close-formed parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.

  5. Syringe needle-based sampling coupled with liquid-phase extraction for determination of the three-dimensional distribution of l-ascorbic acid in apples.

    Science.gov (United States)

    Tang, Sheng; Lee, Hian Kee

    2016-05-15

    A novel syringe needle-based sampling approach coupled with liquid-phase extraction (NBS-LPE) was developed and applied to the extraction of l-ascorbic acid (AsA) in apple. In NBS-LPE, only a small amount of apple flesh (ca. 10 mg) was sampled directly using a syringe needle and placed in a glass insert for liquid extraction of AsA by 80 μL of oxalic acid-acetic acid. The extract was then directly analyzed by liquid chromatography. This new procedure is simple, convenient, almost organic-solvent-free, and causes far less damage to the fruit. To demonstrate the applicability of NBS-LPE, AsA levels at different sampling points in a single apple were determined to reveal the spatial distribution of the analyte in a three-dimensional model. The results also showed that the method has good sensitivity (limit of detection of 0.0097 mg/100 g; limit of quantification of 0.0323 mg/100 g), acceptable reproducibility (relative standard deviation of 5.01%, n = 6), a wide linear range between 0.05 and 50 mg/100 g, and good linearity (r² = 0.9921). This extraction technique and modeling approach can be used to measure and monitor a wide range of compounds in various parts of different soft-matrix fruits and vegetables, including single specimens. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Supercritical fluid extraction of soybean oil from the surface of spiked quartz sand - modelling study

    OpenAIRE

    Stela Jokić; B. Nagy; K. Aladić; B. Simándi

    2013-01-01

    The extraction of soybean oil from the surface of spiked quartz sand using supercritical CO2 was investigated. Sand was used as the solid matrix; since it is not a porous material, internal diffusion is absent and all the soluble material lies on the surface of the particles. Sovová's model was used to obtain an analytical solution for the required extraction yield curves; the model simplifies when internal diffusion can be neglected. The external mass transfer coefficient was det...

  7. Pore-Water Extraction Scale-Up Study for the SX Tank Farm

    Energy Technology Data Exchange (ETDEWEB)

    Truex, Michael J.; Oostrom, Martinus; Wietsma, Thomas W.; Last, George V.; Lanigan, David C.

    2013-01-15

    The phenomena related to pore-water extraction from unsaturated sediments have been previously examined with limited laboratory experiments and numerical modeling. However, key scale-up issues have not yet been addressed. Laboratory experiments and numerical modeling were conducted to specifically examine pore-water extraction for sediment conditions relevant to the vadose zone beneath the SX Tank Farm at Hanford Site in southeastern Washington State. Available SX Tank Farm data were evaluated to generate a conceptual model of the subsurface for a targeted pore-water extraction application in areas with elevated moisture and Tc-99 concentration. The hydraulic properties of the types of porous media representative of the SX Tank Farm target application were determined using sediment mixtures prepared in the laboratory based on available borehole sediment particle size data. Numerical modeling was used as an evaluation tool for scale-up of pore-water extraction for targeted field applications.

  8. Ant-based extraction of rules in simple decision systems over ontological graphs

    Directory of Open Access Journals (Sweden)

    Pancerz Krzysztof

    2015-06-01

    Full Text Available In the paper, the problem of extraction of complex decision rules in simple decision systems over ontological graphs is considered. The extracted rules are consistent with the dominance principle similar to that applied in the dominance-based rough set approach (DRSA). In our study, we propose to use a heuristic algorithm, utilizing the ant-based clustering approach, searching the semantic spaces of concepts presented by means of ontological graphs. Concepts included in the semantic spaces are values of attributes describing objects in simple decision systems

  9. Development of the IMB Model and an Evidence-Based Diabetes Self-management Mobile Application.

    Science.gov (United States)

    Jeon, Eunjoo; Park, Hyeoun-Ae

    2018-04-01

    This study developed a diabetes self-management mobile application based on the information-motivation-behavioral skills (IMB) model, evidence extracted from clinical practice guidelines, and requirements identified through focus group interviews (FGIs) with diabetes patients. We developed the diabetes self-management (DSM) app in accordance with the four stages of the system development life cycle. The functional and knowledge requirements of the users were extracted through FGIs with 19 diabetes patients. A system diagram, data models, a database, an algorithm, screens, and menus were designed. An Android app and a server with an SSL protocol were developed. The algorithm, heuristics, and usability of the DSM app were evaluated, and the app was then modified based on the heuristic and usability evaluations. A total of 11 requirement themes were identified through the FGIs. Sixteen functions and 49 knowledge rules were extracted. The system design comprised a client part and a server part, 78 data models, a database with 10 tables, an algorithm, a menu structure with 6 main menus, and 40 user screens. The DSM app required Android version 4.4 or higher for Bluetooth connectivity. The proficiency and efficiency scores of the algorithm were 90.96% and 92.39%, respectively. Fifteen issues were revealed through the heuristic evaluation, and the app was modified to address three of them; it was also modified to address five comments received through the usability evaluation. The DSM app was developed on the basis of behavioral change theory through the IMB model. It was designed to be evidence-based, user-centered, and effective. A full evaluation of the app's effect on the DSM behavior changes of diabetes patients remains necessary.

  10. Kinetics of microwave assisted extraction of pectin from Balinese orange peel using Pseudo-homogeneous model

    Science.gov (United States)

    Megawati, Wulansarie, Ria; Faiz, Merisa Bestari; Adi, Susatyo; Sammadikun, Waliyuddin

    2018-03-01

    The objective of this work was to study the homogeneous kinetics of pectin extraction from Balinese orange peel using microwave-assisted extraction (MAE). The experimental data showed that as the power increases (180 to 600 W), the extraction yield of pectin also increases (12.2 to 30.6% w/w). Moreover, as the extraction time lengthens (10, 15, and 20 min), the yield of pectin increases (8.8, 20.2, and 40.5% w/w). Beyond 20 min (25 and 30 min), the yield starts to decrease (36.6 and 22.9% w/w). This phenomenon indicates pectin degradation. Therefore, pectin extraction is a series reaction, i.e., extraction followed by degradation. The calculations showed that a pseudo-homogeneous series model can quantitatively describe the extraction kinetics. The kinetic constants can be expressed by the Arrhenius equation with frequency factors of 1.58 × 10⁵ and 2.29 × 10⁵ 1/min, and activation energies of 64,350 and 56,571 J/mol for extraction and degradation, respectively.
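
    The series model and the Arrhenius constants reported above can be combined directly. The sketch below assumes an illustrative isothermal temperature of 373 K and a unit initial pectin content (both assumptions, not the paper's conditions), and reproduces the rise-then-fall shape of the yield curve:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def rate_constant(A, Ea, T):
    """Arrhenius equation: k = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T))

def pectin_yield(t, T=373.0, A0=1.0):
    """Series first-order model A --k1--> B --k2--> C
    (extractable pectin -> dissolved pectin -> degraded pectin):
        B(t) = A0 * k1/(k2 - k1) * (exp(-k1*t) - exp(-k2*t)),  t in minutes."""
    k1 = rate_constant(1.58e5, 64350.0, T)   # extraction (constants from the abstract)
    k2 = rate_constant(2.29e5, 56571.0, T)   # degradation (constants from the abstract)
    return A0 * k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))

# The series model gives the observed rise-then-fall of the yield over time
ys = [pectin_yield(t) for t in (0.0, 500.0, 1100.0, 3000.0, 8000.0)]
assert ys[2] > ys[1] > ys[0] and ys[2] > ys[3] > ys[4]
```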

  11. A New Approach to Urban Road Extraction Using High-Resolution Aerial Image

    Directory of Open Access Journals (Sweden)

    Jianhua Wang

    2016-07-01

    Full Text Available Road information is fundamental not only in the military field but also in daily life. Automatic road extraction from remote sensing images can provide references for city planning as well as for transportation database and map updating. However, owing to the spectral similarity between roads and impervious structures, current methods relying solely on spectral characteristics are often ineffective. By contrast, the detailed information discernible from high-resolution aerial images enables road extraction with spatial texture features. In this study, a knowledge-based method incorporating the spatial texture feature into urban road extraction is established. The spatial texture feature is initially extracted by the local Moran's I, and the derived texture is added to the spectral bands of the image for segmentation. Subsequently, features such as brightness, standard deviation, rectangularity, aspect ratio, and area are selected to form a hypothesis-and-verification model based on road knowledge. Finally, roads are extracted by applying the hypothesis-and-verification model and are post-processed using mathematical morphology. The newly proposed method is evaluated in two experiments. Results show that the completeness, correctness, and quality of the results reach approximately 94%, 90%, and 86%, respectively, indicating that the proposed method is effective for urban road extraction.
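
    The local Moran's I used for the texture band has a compact form, I_i = z_i * sum_j(w_ij * z_j) with standardized values z; a sketch with a 3x3 neighborhood of unit weights (the weighting scheme is an assumption, as the abstract does not specify it):

```python
import numpy as np

def local_morans_i(img):
    """Per-pixel local Moran's I: I_i = z_i * sum of the 8 standardized
    neighbor values (unit weights, edge-padded borders)."""
    z = (img - img.mean()) / img.std()
    p = np.pad(z, 1, mode="edge")
    neigh = sum(p[i:i + img.shape[0], j:j + img.shape[1]]
                for i in range(3) for j in range(3)) - z   # 8-neighbor sum
    return z * neigh

# A homogeneous bright stripe (road-like) yields a stronger positive I
# inside the stripe than in the background, so it stands out as texture.
img = np.zeros((20, 20))
img[:, 9:12] = 1.0
I = local_morans_i(img)
assert I[10, 10] > I[10, 0]
```

    Stacking this I band alongside the spectral bands before segmentation is the step the method relies on to separate spectrally similar roads and rooftops.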

  12. A combined model of heat and mass transfer for the in situ extraction of volatile water from lunar regolith

    Science.gov (United States)

    Reiss, P.

    2018-05-01

    Chemical analysis of lunar soil samples often involves thermal processing to extract their volatile constituents, such as loosely adsorbed water. For the characterization of volatiles and their bonding mechanisms it is important to determine their desorption temperature. However, due to the low thermal diffusivity of lunar regolith, it might be difficult to reach a uniform heat distribution in a sample that is larger than only a few particles. Furthermore, the mass transport through such a sample is restricted, which might lead to a significant delay between actual desorption and measurable outgassing of volatiles from the sample. The entire volatiles extraction process depends on the dynamically changing heat and mass transfer within the sample, and is influenced by physical parameters such as porosity, tortuosity, gas density, temperature and pressure. To correctly interpret measurements of the extracted volatiles, it is important to understand the interaction between heat transfer, sorption, and gas transfer through the sample. The present paper discusses the molecular kinetics and mechanisms that are involved in the thermal extraction process and presents a combined parametrical computation model to simulate this process. The influence of water content on the gas diffusivity and thermal diffusivity is discussed and the issue of possible resorption of desorbed molecules within the sample is addressed. Based on the multi-physical computation model, a case study for the ProSPA instrument for in situ analysis of lunar volatiles is presented, which predicts relevant dynamic process parameters, such as gas pressure and process duration.
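
    The heat-transfer half of such a model reduces, in its simplest form, to transient conduction with a very low thermal diffusivity; a 1-D explicit finite-difference sketch (the diffusivity, geometry, and temperature values are illustrative assumptions, not ProSPA's parameters):

```python
import numpy as np

def heat_sample(alpha=1.0e-8, L=0.01, nx=50, dt=None, t_end=600.0,
                T0=250.0, T_wall=450.0):
    """Explicit 1-D transient conduction through a regolith sample:
        dT/dt = alpha * d2T/dx2
    with a heated wall at x = 0 and an insulated far end. alpha ~ 1e-8 m^2/s
    reflects the very low thermal diffusivity of regolith in vacuum."""
    dx = L / (nx - 1)
    if dt is None:
        dt = 0.4 * dx ** 2 / alpha            # respect the explicit stability limit
    T = np.full(nx, T0)
    for _ in range(int(t_end / dt)):
        T[0] = T_wall                          # heated boundary
        T[1:-1] += alpha * dt / dx ** 2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[-1] = T[-2]                          # insulated boundary
    return T

T = heat_sample()
# After 10 minutes a strong gradient persists across a 1 cm sample:
# the far end lags far behind the heater, illustrating the non-uniform
# heating problem raised in the abstract.
assert T[0] == 450.0 and T[-1] < 300.0
```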

  13. Comparison of mobility extraction methods based on field-effect measurements for graphene

    Directory of Open Access Journals (Sweden)

    Hua Zhong

    2015-05-01

    Full Text Available Carrier mobility extraction methods for graphene based on field-effect measurements are explored and compared through theoretical analysis and experimental results. A group of graphene devices with different channel lengths were fabricated and measured, and carrier mobility was extracted from their electrical transfer curves using three different methods, whose accuracy and applicability were compared. The transfer length method (TLM) can obtain accurate density-dependent mobility and contact resistance at relatively high carrier density based on data from a group of devices, and can therefore act as a standard to verify the other methods. As two of the most popular methods, the direct transconductance method (DTM) and the fitting method (FTM) can extract mobility easily from the transfer curve of a single graphene device. DTM yields an underestimated mobility at any carrier density owing to the neglect of contact resistance; its accuracy can be improved by fabricating field-effect transistors with long channels and good contacts. FTM assumes a constant mobility independent of carrier density, and can then obtain estimates of mobility, contact resistance, and residual density by fitting a transfer curve. However, FTM tends to obtain a mobility value near the Dirac point and thus overestimates the carrier mobility of graphene. Compared with DTM and FTM, TLM offers a much more accurate, carrier-density-dependent mobility that reflects the complete mobility properties of graphene.
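
    The TLM extraction described above amounts to a linear fit of total resistance against normalized channel length; a sketch on hypothetical device data:

```python
import numpy as np

def tlm_extract(lengths_um, resistances_ohm, width_um, carrier_density_cm2,
                q=1.602e-19):
    """Transfer length method: fit R_total = 2*R_c + R_sheet * (L/W).
    The slope gives the sheet resistance, the intercept the contact
    resistance, and mobility follows from mu = 1 / (q * n_s * R_sheet)."""
    slope, intercept = np.polyfit(np.array(lengths_um) / width_um,
                                  resistances_ohm, 1)
    R_sheet = slope                      # ohm per square
    R_contact = intercept / 2.0          # ohm per contact
    mu = 1.0 / (q * carrier_density_cm2 * R_sheet)   # cm^2/(V*s)
    return R_sheet, R_contact, mu

# Hypothetical devices: R_sheet = 500 ohm/sq, 2*R_c = 200 ohm, W = 10 um,
# n_s = 1e12 cm^-2 (all made-up numbers for illustration)
Ls = [2.0, 4.0, 8.0, 16.0]
Rs = [200 + 500 * L / 10.0 for L in Ls]
R_sheet, R_c, mu = tlm_extract(Ls, Rs, 10.0, 1e12)
assert abs(R_sheet - 500) < 1e-6 and abs(R_c - 100) < 1e-6
```

    Because the fit separates R_sheet from the contacts, the resulting mobility is free of the contact-resistance bias that causes DTM's underestimate.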

  14. Annotation and retrieval system of CAD models based on functional semantics

    Science.gov (United States)

    Wang, Zhansong; Tian, Ling; Duan, Wenrui

    2014-11-01

    CAD model retrieval based on functional semantics is more significant than content-based 3D model retrieval during the mechanical conceptual design phase. However, relevant research is still not fully developed. Therefore, a functional semantic-based CAD model annotation and retrieval method is proposed to support mechanical conceptual design and design reuse, inspire designer creativity through existing CAD models, shorten the design cycle, and reduce costs. Firstly, a CAD model functional semantic ontology is constructed to formally represent the functional semantics of CAD models and to describe the mechanical conceptual design space comprehensively and consistently. Secondly, an approach to represent CAD models as attributed adjacency graphs (AAG) is proposed. In this method, the geometry and topology data are extracted from STEP models. On the basis of the AAG, the functional semantics of CAD models are annotated semi-automatically by matching CAD models that contain the partial features whose functional semantics have been annotated manually, thereby constructing a CAD model repository that supports retrieval based on functional semantics. Thirdly, a CAD model retrieval algorithm that supports multi-function extended retrieval is proposed to explore more potential creative design knowledge at the semantic level. Finally, a prototype system, called the Functional Semantic-based CAD Model Annotation and Retrieval System (FSMARS), is implemented. A case study demonstrates that FSMARS can successfully obtain multiple potential CAD models that conform to a desired function. The proposed research addresses actual needs and presents a new way to acquire CAD models in the mechanical conceptual design phase.

  15. Anti-inflammatory Activity of Ethanol Extract of Beluntas Leaves (Pluchea indica L.) on Complete Freund's Adjuvant-induced Inflammatory Model

    Directory of Open Access Journals (Sweden)

    Reza Setiawan Sudirman

    2017-12-01

    Full Text Available Research on the anti-inflammatory effect of Beluntas leaf extract in a CFA (Complete Freund's Adjuvant)-induced inflammatory model has been conducted. The objective of this research was to determine the effect of Beluntas leaf extract in alleviating CFA-induced paw edema in mice (Mus musculus). Fifteen mice were divided into 5 groups. Group I was treated with NaCMC. Groups II, III, and IV were given suspensions of Beluntas leaf extract at 100 mg/kg, 300 mg/kg, and 500 mg/kg BW, respectively. Group V was a positive control treated with a suspension of diclofenac sodium, 0.1 ml/10 g orally. The determination of anti-inflammatory potency was based on the average time needed to ameliorate the edema volume. The shortest period of edema reduction was produced by diclofenac sodium (within 9.33 days), followed by Beluntas leaf extract at 300 mg/kg (within 12 days), 500 mg/kg (within 14.33 days), and 100 mg/kg (within 17.67 days), respectively. These results differ significantly from the negative control group, in which the edema volume did not decrease during 18 days of observation. In conclusion, the ethanol extract of Beluntas leaves has an effective anti-inflammatory effect.

  16. [Realization of Heart Sound Envelope Extraction Implemented on LabVIEW Based on Hilbert-Huang Transform].

    Science.gov (United States)

    Tan, Zhixiang; Zhang, Yi; Zeng, Deping; Wang, Hua

    2015-04-01

    We propose a heart sound envelope extraction system implemented in LabVIEW based on the Hilbert-Huang transform (HHT). A sound card was first used to collect the heart sound, and the complete program for signal acquisition, preprocessing, and envelope extraction was then implemented in LabVIEW based on HHT theory. Finally, a case study demonstrated that the system can easily collect heart sounds, preprocess them, and extract the envelope. The system effectively retains and displays the characteristics of the heart sound envelope, and its program and methods are relevant to other research areas, such as vibration and voice analysis.
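
    The envelope step of such a system rests on the analytic signal; a numpy sketch of the FFT-based Hilbert transform (the HHT's empirical mode decomposition stage is omitted here):

```python
import numpy as np

def envelope(x):
    """Signal envelope via the analytic signal (FFT-based Hilbert transform):
    zero out negative frequencies, double positive ones, take the magnitude."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

# Amplitude-modulated tone: the envelope recovers the slow modulation,
# just as the heart sound envelope tracks the S1/S2 bursts.
t = np.linspace(0, 1, 2000, endpoint=False)
x = (1 + 0.5 * np.sin(2 * np.pi * 2 * t)) * np.sin(2 * np.pi * 100 * t)
env = envelope(x)
assert np.max(np.abs(env - (1 + 0.5 * np.sin(2 * np.pi * 2 * t)))) < 1e-6
```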

  17. Derivation of groundwater flow-paths based on semi-automatic extraction of lineaments from remote sensing data

    Directory of Open Access Journals (Sweden)

    U. Mallast

    2011-08-01

    Full Text Available In this paper we present a semi-automatic method to infer groundwater flow-paths based on the extraction of lineaments from digital elevation models. This method is especially adequate in remote and inaccessible areas where in-situ data are scarce. The combined method of linear filtering and object-based classification provides a lineament map with a high degree of accuracy. Subsequently, lineaments are differentiated into geological and morphological lineaments using auxiliary information and finally evaluated in terms of hydro-geological significance. Using the example of the western catchment of the Dead Sea (Israel/Palestine), the orientation and location of the differentiated lineaments are compared to characteristics of known structural features. We demonstrate that a strong correlation between lineaments and structural features exists. Using Euclidean distances between lineaments and wells provides an assessment criterion to evaluate the hydraulic significance of detected lineaments. Based on this analysis, we suggest that the statistical analysis of lineaments allows a delineation of flow-paths and thus significant information on groundwater movements. To validate the flow-paths we compare them to existing results of groundwater models that are based on well data.

  18. Extraction of citronella (Cymbopogon nardus) essential oil using supercritical CO2: experimental data and mathematical modeling

    Directory of Open Access Journals (Sweden)

    C. F. Silva

    2011-06-01

    Full Text Available Citronella essential oil has more than eighty components, of which the most important are citronellal, geranial, and limonene. They are present at high concentrations in the oil and are responsible for its repellent properties. The oil was extracted using supercritical carbon dioxide owing to the high selectivity of this solvent. The operational conditions studied covered temperatures from 313.15 to 353.15 K and applied pressures of 6.2, 10.0, 15.0, and 180.0 MPa. Higher extraction efficiencies were obtained at higher pressures. At constant temperature, the amount of extracted oil increased with pressure, whereas at constant pressure it decreased with increasing temperature. The composition of the essential oil was complex; several main components dominated, and some waxes were present in the oils extracted above 10.0 MPa. The results were modeled using a predictive mathematical model, which reproduced the extraction curves over the maximum process time.

  19. Design Analysis of Power Extracting Unit of an Onshore OWC Based Wave Energy Power Plant using Numerical Simulation

    Directory of Open Access Journals (Sweden)

    Zahid Suleman

    2011-07-01

    Full Text Available This research paper describes the design and analysis of the power extracting unit of an onshore OWC (Oscillating Water Column)-based wave energy power plant with a capacity of about 100 kW. The OWC is modeled as the solid piston of a reciprocating pump. The power extracting unit is designed analytically using the theory of reciprocating pumps and principles of fluid mechanics. Pro-E and ANSYS Workbench software were used to verify the analytical design. The analytical results for the flow velocity in the turbine duct are compared with the simulation results and found to be in good agreement. The results achieved by this research will assist in the overall design of the power plant, which is the ultimate goal of this work.

  20. Short-term impact of deep sand extraction and ecosystem-based landscaping on macrozoobenthos and sediment characteristics.

    Science.gov (United States)

    de Jong, Maarten F; Baptist, Martin J; Lindeboom, Han J; Hoekstra, Piet

    2015-08-15

    We studied short-term changes in macrozoobenthos in a 20 m deep borrow pit. A boxcorer was used to sample macrobenthic infauna, and a bottom sledge was used to sample macrobenthic epifauna. Sediment characteristics were determined from the boxcore samples, while bed shear stress and near-bed salinity were estimated with a hydrodynamic model. Two years after the cessation of sand extraction, macrozoobenthic biomass had increased fivefold in the deepest areas. Species composition changed significantly, and the white furrow shell (Abra alba) became abundant. Several sediment characteristics also changed significantly in the deepest parts. Macrozoobenthic species composition and biomass correlated significantly with time after the cessation of sand extraction and with sediment and hydrographical characteristics. Ecosystem-based landscaped sand bars were found to be effective in influencing sediment characteristics and the macrozoobenthic assemblage. Significant changes in epifauna occurred in the deepest parts in 2012, coinciding with the highest sedimentation rate. We recommend continued monitoring to investigate medium- and long-term impacts. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Extraction Of Audio Features For Emotion Recognition System Based On Music

    Directory of Open Access Journals (Sweden)

    Kee Moe Han

    2015-08-01

    Full Text Available Music is the combination of melody, linguistic information, and the vocalist's emotion. Since music is a work of art, analyzing the emotion in music by computer is a difficult task. Many approaches have been developed to detect the emotions contained in music, but the results are not satisfactory because emotion is very complex. In this paper, evaluations of audio features extracted from music files are presented. The extracted features are used to classify the different emotion classes of the vocalists. Musical feature extraction is performed using the Music Information Retrieval (MIR) toolbox. A database of 100 music clips is used to classify the emotions perceived in the clips. Music may contain many emotions according to the vocalist's mood, such as happy, sad, nervous, bored, or peaceful. In this paper, the audio features related to the emotions of the vocalists are extracted for use in a music-based emotion recognition system.
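The study uses the MATLAB MIR toolbox; as a language-neutral illustration of the kind of low-level descriptors such toolboxes compute, the sketch below extracts per-frame RMS energy and zero-crossing rate with plain NumPy (the frame sizes and the test tone are arbitrary choices, not the paper's settings):

```python
import numpy as np

def frame_features(signal, frame_len=1024, hop=512):
    """Per-frame RMS energy and zero-crossing rate of a mono signal."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        rms = np.sqrt(np.mean(frame ** 2))
        zcr = np.mean(np.abs(np.diff(np.sign(frame))) > 0)
        feats.append((rms, zcr))
    return np.array(feats)

# Synthetic example: one second of a 440 Hz tone sampled at 22,050 Hz.
sr = 22050
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
features = frame_features(tone)
print(features.shape)   # (n_frames, 2)
print(features[0])      # RMS ≈ 0.354, ZCR ≈ 0.04
```

In a real system, frame-level descriptors like these (plus spectral and timbral features) would be aggregated per clip and fed to the emotion classifier.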

  2. Extracting the relevant delays in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    In this contribution, we suggest a convenient way to use generalisation error to extract the relevant delays from a time-varying process, i.e. the delays that lead to the best prediction performance. We design a generalisation-based algorithm that takes its inspiration from traditional variable selection, and more precisely stepwise forward selection. The method is compared to other forward selection schemes, as well as to nonparametric tests aimed at estimating the embedding dimension of time series. The final application extends these results to the efficient estimation of FIR filters on some...
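A minimal sketch of the idea, assuming a linear predictor and a held-out validation set as the generalisation estimate (the cited work uses more refined estimators; the process below is synthetic, with true delays 1 and 4):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic process that depends only on delays 1 and 4.
n = 600
x = np.zeros(n)
for t in range(4, n):
    x[t] = 0.6 * x[t - 1] - 0.5 * x[t - 4] + 0.1 * rng.standard_normal()

max_delay = 8
X = np.column_stack([x[max_delay - d : n - d] for d in range(1, max_delay + 1)])
y = x[max_delay:]
split = len(y) // 2
Xtr, Xva, ytr, yva = X[:split], X[split:], y[:split], y[split:]

def val_mse(cols):
    """Validation error of a linear predictor using the given delay columns."""
    w, *_ = np.linalg.lstsq(Xtr[:, cols], ytr, rcond=None)
    return np.mean((Xva[:, cols] @ w - yva) ** 2)

# Stepwise forward selection: greedily add the delay that most reduces the
# validation (generalisation) error; stop when no remaining delay helps.
selected, remaining, best = [], list(range(max_delay)), np.inf
while remaining:
    mse, c = min((val_mse(selected + [c]), c) for c in remaining)
    if mse >= best:
        break
    best = mse
    remaining.remove(c)
    selected.append(c)

print("selected delays:", sorted(d + 1 for d in selected))
```

The true delays 1 and 4 should be among those selected; delays that only improve the training fit are rejected by the validation criterion.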

  3. Numerical modelling of lithospheric flexure in front of subduction zones in Japan and its role to initiate melt extraction from the LVZ.

    Science.gov (United States)

    Bessat, A.; Pilet, S.; Duretz, T.; Schmalholz, S. M.

    2017-12-01

    Petit-spot volcanoes were discovered fifteen years ago by Japanese researchers on the top of the subducting plate offshore Japan (Hirano 2006). This discovery is of great significance, as it highlights the importance of tectonic processes for the initiation of intraplate volcanism. The location of these small lava flows is unusual and seems to be related to plate flexure, which may facilitate the extraction of low-degree melts from the base of the lithosphere, a hypothesis previously suggested to explain changes in electric and seismic properties at 70-90 km depth, i.e. within the low velocity zone (LVZ) (Sifré 2014). A critical question relates to the process by which these low-degree melts are extracted from the LVZ. Early models suggested that extension associated with plate bending allows large cracks to propagate across the lithosphere and could promote the extraction of low-degree melts from the base of the lithosphere (Hirano 2006 & Yamamoto 2014). However, the study of petit-spot mantle xenoliths from Japan (Pilet 2016) has demonstrated that low-degree melts are not extracted directly to the surface but percolate through, interact with and metasomatize the oceanic lithosphere. In order to understand the melt extraction process in the region of plate bending, we performed 2D thermo-mechanical simulations of a Japanese-type subduction. The numerical model considers viscoelastoplastic deformation. This allows the quantification of the state of stress, strain rates, and viscosities, which control the percolation of melt initially stored at the base of the lithosphere. Initial results show that plate flexure changes the distribution of the deformation mechanisms in the flexure zone between 40 km and 80 km depth. A change of the dominant deformation mechanism from diffusion creep to dislocation creep, and from there to Peierls creep, was observed about 200 to 300 km from the trench. These changes are linked to the increase in stress in the flexure zone. At the

  4. Fabrication and Antibacterial Effects of Polycarbonate/Leaf Extract Based Thin Films

    Directory of Open Access Journals (Sweden)

    R. Mahendran

    2016-01-01

    Full Text Available We report the preparation and antibacterial activities of leaf-extract-incorporated polycarbonate thin films, prepared to improve the antibacterial characteristics of the host polycarbonate (PC). Crude extracts of Azadirachta indica, Psidium guajava, Acalypha indica, Andrographis paniculata, and Ocimum sanctum were prepared by maceration using dimethylformamide as solvent. The leaf extracts (LE) were incorporated into the PC matrix by the solution blending method, and the thin films were fabricated by the Thermally Induced Phase Separation (TIPS) technique. The antibacterial activities of the as-prepared films were evaluated against E. coli and S. aureus by the disk diffusion method. The inhibitory effects of the PC/LE films are higher for S. aureus than for E. coli, whereas the pristine PC film did not exhibit any remarkable antibacterial characteristics. Further, model fruit (Prunus) studies revealed that the PC/LE films retained the freshness of the fruits for more than 11 days. This study demonstrates that the PC/LE films have excellent antibacterial activities; thus, the films could be a promising candidate for active antibacterial packaging applications.

  5. Morphological operation based dense houses extraction from DSM

    Science.gov (United States)

    Li, Y.; Zhu, L.; Tachibana, K.; Shimamura, H.

    2014-08-01

    This paper presents a method for reshaping the DSM and extracting markers and masks of densely packed houses based on mathematical morphology (MM). In high-density housing areas, houses in a digital surface model (DSM) are almost joined together, and most segmentation methods cannot completely separate them. We propose to first label the markers of the buildings and then segment them into masks by watershed. To avoid detecting more than one marker for a house, or no marker at all because of a higher neighbour, the DSM is morphologically reshaped. This is carried out by an MM operation using a disk-shaped structuring element (SE) of a size similar to that of the houses. The sizes of the houses therefore need to be estimated before reshaping. A granulometry generated by opening-by-reconstruction of the nDSM is proposed to detect the scales of the off-terrain objects. It is a histogram of the global volume of the top hats of the convex objects over continuous scales; an obvious step change in the profile means that many objects of similar size occur at that scale. In the reshaping procedure, slices of each object are derived by morphological filtering at the detected continuous scales and stacked to reconstruct a dome. The markers are then detected on the basis of the domes.
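A toy sketch of the marker idea on a synthetic DSM, using SciPy's grey-scale morphology (a flat square structuring element stands in for the disk SE, and the layout and heights are invented):

```python
import numpy as np
from scipy import ndimage

# Toy normalised DSM: two houses joined by a low structure (heights in metres).
ndsm = np.zeros((20, 20))
ndsm[2:7, 2:7] = 10.0      # house A, 5 x 5 footprint
ndsm[2:7, 9:14] = 9.0      # house B, 5 x 5 footprint
ndsm[4, 7:9] = 5.0         # low connecting structure

# White top-hat with an SE slightly larger than the house footprint
# suppresses the terrain and keeps only house-scale convex objects.
tophat = ndimage.white_tophat(ndsm, size=(7, 7))

# Threshold the top-hat to obtain one marker per house, then label them.
markers, n_houses = ndimage.label(tophat > 6.0)
print(n_houses)  # two separate house markers
```

In the paper's pipeline, these labelled markers would seed a watershed segmentation to produce one mask per house.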

  6. Comparison of the Effectiveness of Water-Based Extraction of Substances from Dry Tea Leaves with the Use of Magnetic Field Assisted Extraction Techniques

    Directory of Open Access Journals (Sweden)

    Grzegorz Zaguła

    2017-10-01

    Full Text Available This article presents the findings of a study investigating the feasibility of using a magnetic field assisted technique for the water-based extraction of mineral components, polyphenols, and caffeine from dry black and green tea leaves. The authors present a concept of applying constant and variable magnetic fields in the process of producing water-based infusions from selected types of tea. Analyses investigating the effectiveness of the proposed technique in comparison with conventional infusion methods assessed the contents of selected mineral components (i.e., Al, Ca, Cu, K, Mg, P, S, and Zn), which were determined by ICP-OES. The contents of caffeine and polyphenolic compounds were assessed using HPLC. A variable magnetic field increased the effectiveness of extraction of the mineral components, caffeine, and polyphenols. The findings support the conclusion that variable magnetic field assisted extraction is useful for obtaining biologically valuable components in tea infusions.

  7. Wavelet-Based Feature Extraction in Fault Diagnosis for Biquad High-Pass Filter Circuit

    OpenAIRE

    Yuehai Wang; Yongzheng Yan; Qinyong Wang

    2016-01-01

    Fault diagnosis for analog circuits has become a prominent factor in improving the reliability of integrated circuits, owing to the irreplaceable role of analog circuits in modern integrated systems. Fault diagnosis based on intelligent algorithms has become a popular research topic, since efficient feature extraction and selection are critical and intricate tasks in analog fault diagnosis. Further, it is extremely important to propose some general guidelines for optimal feature extraction and selection. In ...
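The abstract is truncated, but wavelet-based feature extraction for circuit diagnosis typically means computing sub-band energies of the circuit's response. A self-contained sketch with a hand-rolled one-level Haar transform (the signals and parameters are illustrative, not from the paper):

```python
import numpy as np

def haar_dwt(signal):
    """One level of the orthonormal Haar wavelet transform."""
    s = np.asarray(signal, dtype=float)
    even, odd = s[0::2], s[1::2]
    approx = (even + odd) / np.sqrt(2.0)   # low-pass coefficients
    detail = (even - odd) / np.sqrt(2.0)   # high-pass coefficients
    return approx, detail

def wavelet_energy_features(signal, levels=3):
    """Relative energy of the detail band at each decomposition level."""
    total = np.sum(np.asarray(signal, dtype=float) ** 2)
    a, feats = signal, []
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append(np.sum(d ** 2) / total)
    return feats

# A clean step response vs. a noisy one (crude stand-ins for a healthy
# and a faulty filter output).
t = np.linspace(0.0, 1.0, 256)
clean = np.where(t > 0.5, 1.0, 0.0)
rng = np.random.default_rng(1)
noisy = clean + 0.1 * rng.standard_normal(clean.size)

print(wavelet_energy_features(clean))   # detail energies ~0 for the smooth step
print(wavelet_energy_features(noisy))   # noise shows up in the detail bands
```

Feature vectors of this kind, one entry per sub-band, are what a classifier would then use to distinguish fault classes.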

  8. The analgesic and anti-inflammatory activities of the extract of Albizia lebbeck in animal model.

    Science.gov (United States)

    Saha, Achinto; Ahmed, Muniruddin

    2009-01-01

    The extract of the bark of Albizia lebbeck Benth., obtained by cold extraction with a mixture of equal proportions of petroleum ether, ethyl acetate and methanol, was chosen for pharmacological screening. In the carrageenan-induced rat paw edema model, the extract at the 400 mg/kg dose level showed 36.68% (p<0.001) inhibition of edema volume at the end of 4 h. In the acetic acid-induced writhing test, the extract at the 200 and 400 mg/kg dose levels showed 39.9% and 52.4% inhibition of writhing, respectively. In the radiant heat tail-flick method, the crude extract produced 40.74% (p<0.001) and 61.48% (p<0.001) prolongation of the tail-flick latency 30 minutes after oral administration at the 200 and 400 mg/kg dose levels, respectively.

  9. Matching Aerial Images to 3D Building Models Using Context-Based Geometric Hashing

    Directory of Open Access Journals (Sweden)

    Jaewook Jung

    2016-06-01

    Full Text Available A city is a dynamic entity whose environment is continuously changing over time. Accordingly, its virtual city models also need to be regularly updated to support accurate model-based decisions for various applications, including urban planning, emergency response and autonomous navigation. The concept of continuous city modeling is to progressively reconstruct city models by accommodating changes recognized in the spatio-temporal domain, while preserving unchanged structures. A first critical step for continuous city modeling is to coherently register remotely sensed data taken at different epochs with existing building models. This paper presents a new model-to-image registration method using context-based geometric hashing (CGH) to align a single image with existing 3D building models. This model-to-image registration process consists of three steps: (1) feature extraction; (2) similarity measurement and matching; and (3) estimation of the exterior orientation parameters (EOPs) of the single image. For feature extraction, we propose two types of matching cues: edged corner features, representing the saliency of building corner points with associated edges, and contextual relations among the edged corner features within an individual roof. A set of matched corners is found with a given proximity measure through geometric hashing, and the optimal matches are then determined by maximizing a matching cost that encodes the contextual similarity between matching candidates. The final matched corners are used to adjust the EOPs of the single airborne image by the least-squares method based on collinearity equations. The results show that acceptable accuracy of the EOPs of a single image is achievable using the proposed registration approach, offering an alternative to the labor-intensive manual registration process.

  10. Template-based automatic extraction of the joint space of foot bones from CT scan

    Science.gov (United States)

    Park, Eunbi; Kim, Taeho; Park, Jinah

    2016-03-01

    Clean bone segmentation is critical in studying joint anatomy for measuring the spacing between bones. However, separation of coupled bones in CT images is sometimes difficult due to ambiguous gray values arising from noise and the heterogeneity of bone materials, as well as narrowing of the joint space. For fine reconstruction of the individual local boundaries, manual operation is common practice, and the segmentation remains a bottleneck. In this paper, we present an automatic method for extracting the joint space by applying graph cut on a Markov random field model within a region of interest (ROI) that is identified by a template of 3D bone structures. The template includes an encoded articular surface, which identifies the tight region of the high-intensity bone boundaries together with the fuzzy joint area of interest. The localized shape information from the template model within the ROI effectively separates the nearby bones. By narrowing the ROI down to a region including only two types of tissue, the object extraction problem is reduced to binary segmentation and solved via graph cut. Based on the shape of the joint space marked by the template, the hard constraint is set by initial seeds that are automatically generated by thresholding and morphological operations. The performance and robustness of the proposed method are evaluated on 12 volumes of ankle CT data, each of which includes a set of four tarsal bones (calcaneus, talus, navicular and cuboid).

  11. Iris recognition based on key image feature extraction.

    Science.gov (United States)

    Ren, X; Tian, Q; Zhang, J; Wu, S; Zeng, Y

    2008-01-01

    In iris recognition, feature extraction can be influenced by factors such as illumination and contrast, and thus the features extracted may be unreliable, which can cause a high rate of false results in iris pattern recognition. In order to obtain stable features, an algorithm was proposed in this paper to extract key features of a pattern from multiple images. The proposed algorithm built an iris feature template by extracting key features and performed iris identity enrolment. Simulation results showed that the selected key features have high recognition accuracy on the CASIA Iris Set, where both contrast and illumination variance exist.
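The abstract does not specify how key features are selected across the multiple images; one common realisation of the idea is to keep only the positions of a binary iris code that are stable (identical) across all enrolment images. A hedged sketch of that stable-bit scheme with hypothetical codes (not the paper's algorithm or data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical binary iris codes: 4 enrolment images of the same eye,
# each a noisy copy of an underlying 256-bit template.
template = rng.integers(0, 2, size=256)

def noisy_copy(code, flip_prob=0.1):
    """Flip each bit independently with the given probability."""
    flips = rng.random(code.size) < flip_prob
    return np.where(flips, 1 - code, code)

codes = np.stack([noisy_copy(template) for _ in range(4)])

# "Key" bits are those that agree across all enrolment images; only these
# stable positions are kept in the enrolment template.
stable = np.all(codes == codes[0], axis=0)
key_template = codes[0][stable]
print(f"{stable.sum()} of 256 bits are stable across enrolment images")

# Matching: compare a probe only on the stable positions (Hamming distance).
probe = noisy_copy(template)
dist = np.mean(probe[stable] != key_template)
print(f"normalised Hamming distance to probe: {dist:.3f}")
```

Restricting the comparison to stable positions is what makes the template robust to illumination- and contrast-induced bit flips.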

  12. Joint Feature Extraction and Classifier Design for ECG-Based Biometric Recognition.

    Science.gov (United States)

    Gutta, Sandeep; Cheng, Qi

    2016-03-01

    Traditional biometric recognition systems often utilize physiological traits such as fingerprint, face, iris, etc. Recent years have seen a growing interest in electrocardiogram (ECG)-based biometric recognition techniques, especially in the field of clinical medicine. In existing ECG-based biometric recognition methods, feature extraction and classifier design are usually performed separately. In this paper, a multitask learning approach is proposed, in which feature extraction and classifier design are carried out simultaneously. Weights are assigned to the features within the kernel of each task. We decompose the matrix consisting of all the feature weights into sparse and low-rank components. The sparse component determines the features that are relevant to identify each individual, and the low-rank component determines the common feature subspace that is relevant to identify all the subjects. A fast optimization algorithm is developed, which requires only the first-order information. The performance of the proposed approach is demonstrated through experiments using the MIT-BIH Normal Sinus Rhythm database.

  13. Aligning observed and modelled behaviour based on workflow decomposition

    Science.gov (United States)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

    When business processes are largely supported by information systems, the availability of event logs generated by these systems, as well as the need for appropriate process models, increases. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique based on a workflow decomposition method is proposed in this paper. Petri nets (PNs) are used to describe business processes, and conformance checking between event logs and process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on the state equation method of PN theory improves the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
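The state equation mentioned above is the linear-algebraic core of such alignment checks: for incidence matrix C, initial marking m0 and Parikh vector sigma (firing counts) of a candidate trace, any reachable marking must satisfy m = m0 + C·sigma. A minimal sketch on a hypothetical three-place workflow net (not one of the paper's models):

```python
import numpy as np

# A tiny workflow net: p0 -> t0 -> p1 -> t1 -> p2
# Incidence matrix C[place, transition] = tokens produced - tokens consumed.
C = np.array([
    [-1,  0],   # p0: consumed by t0
    [ 1, -1],   # p1: produced by t0, consumed by t1
    [ 0,  1],   # p2: produced by t1
])
m0 = np.array([1, 0, 0])        # one token in the source place
sigma = np.array([1, 1])        # Parikh vector: t0 and t1 each fire once

m_final = m0 + C @ sigma
print(m_final)   # [0 0 1]: the token reaches the sink place
```

The state equation is only a necessary condition for replayability, but traces whose Parikh vectors violate it can be rejected cheaply, which is what makes it attractive for speeding up conformance checking.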

  14. TWOPOT: a computer model of the two-pot extractive distillation concept for nitric acid

    International Nuclear Information System (INIS)

    Jubin, R.T.; Holland, W.D.; Counce, R.M.; Beckwith, D.R.

    1985-05-01

    A mathematical model, TWOPOT, of the "two-pot" extractive distillation concept for nitric acid concentration has been developed. Predictions from a computer simulation using this model show excellent agreement with the experimental data. This model is recommended for use in the design of large-scale equipment for similar purposes. 9 refs., 15 figs., 2 tabs

  15. Continuous Extraction of Subway Tunnel Cross Sections Based on Terrestrial Point Clouds

    Directory of Open Access Journals (Sweden)

    Zhizhong Kang

    2014-01-01

    Full Text Available An efficient method for the continuous extraction of subway tunnel cross sections from terrestrial point clouds is proposed. First, the continuous central axis of the tunnel is extracted using a 2D projection of the point cloud and curve fitting with the RANSAC (RANdom SAmple Consensus) algorithm, and the axis is optimized using a global extraction strategy based on segment-wise fitting. The cross-sectional planes, which are orthogonal to the central axis, are then determined at every interval. The cross-sectional points are extracted by intersecting straight lines, rotated orthogonally around the central axis within the cross-sectional plane, with the tunnel point cloud. An interpolation algorithm based on quadric parametric surface fitting, using the BaySAC (Bayesian SAmpling Consensus) algorithm, is proposed to compute a cross-sectional point when it cannot be acquired directly from the tunnel points along the extraction direction of interest. Because the standard shape of the tunnel cross section is a circle, circle fitting is implemented using RANSAC to reduce the noise. The proposed approach is tested on terrestrial point clouds covering a 150-m-long segment of a Shanghai subway tunnel, acquired using an LMS VZ-400 laser scanner. The results indicate that the proposed quadric parametric surface fitting using the optimized BaySAC achieves a higher overall fitting accuracy (0.9 mm) than that obtained by the plain RANSAC (1.6 mm). The results also show that the proposed cross section extraction algorithm achieves high accuracy (millimeter level), assessed by comparing the fitted radii with the designed radius of the cross section and by comparing corresponding chord lengths in different cross sections, and high efficiency (less than 3 s per section on average).
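The final circle-fitting step can be sketched as a RANSAC loop over three-point samples with an algebraic (Kasa) least-squares fit, applied here to a synthetic noisy cross section with outliers (the radius, noise level and outlier fraction are invented, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(7)

def fit_circle(pts):
    """Algebraic (Kasa) circle fit: returns centre (a, b) and radius r."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(pts))])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a**2 + b**2)

def ransac_circle(pts, n_iter=200, tol=0.02):
    """Keep the 3-point hypothesis with the most inliers, then refit."""
    best_inliers = None
    for _ in range(n_iter):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        a, b, r = fit_circle(sample)
        resid = np.abs(np.hypot(pts[:, 0] - a, pts[:, 1] - b) - r)
        inliers = resid < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_circle(pts[best_inliers])

# Synthetic tunnel cross section: radius 2.75 m plus noise and outliers.
theta = rng.uniform(0, 2 * np.pi, 400)
pts = np.column_stack([2.75 * np.cos(theta), 2.75 * np.sin(theta)])
pts += rng.normal(0, 0.005, pts.shape)          # 5 mm scan noise
pts[:40] += rng.uniform(0.1, 0.5, (40, 2))      # cables/fixtures as outliers
a, b, r = ransac_circle(pts)
print(f"fitted radius: {r:.3f} m")              # close to the design 2.75 m
```

Comparing the fitted radius with the design radius, as done in the paper's evaluation, then gives a direct accuracy measure per section.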

  16. Extracted sericin from silk waste for film formation

    Directory of Open Access Journals (Sweden)

    Rungsinee Sothornvit

    2010-03-01

    Full Text Available Sericin is the second main component of cocoons; it is removed in the silk reeling process of the raw silk industry and in the silk waste degumming of the spun silk industry. Serine, the main amino acid of sericin, exhibits skin-moisturizing and antiwrinkle action, which motivated its use for film formation in this study. The extraction of sericin from two silk wastes, pieced cocoons and inferior knubbs, was studied to find the optimum extraction conditions. Boiling water extraction was examined using response surface methodology (RSM) to identify the factors important for sericin extraction. The two factors considered were time and temperature. Both factors needed to be included as independent parameters in the prediction equation to improve the model fit, giving R2 = 0.84. The extracted sericin contained 18.24% serine, 9.83% aspartate, and 5.51% glycine, with a molecular weight of 132 kDa. Film formation from the extracted sericin was then carried out to find the optimum conditions. Extracted sericin could not form a stand-alone film; therefore, a polysaccharide polymer, glucomannan, was incorporated together with glycerol to form a flexible film. The sericin-based films were characterized in terms of solubility and permeability before application. It was found that the sericin-based films showed film flexibility and solubility without an increase in film water vapor permeability.

  17. Implicit Three-Dimensional Geo-Modelling Based on HRBF Surface

    Science.gov (United States)

    Gou, J.; Zhou, W.; Wu, L.

    2016-10-01

    Three-dimensional (3D) geological models are important representations of the results of regional geological surveys. However, the process of constructing 3D geological models from two-dimensional (2D) geological elements remains difficult and time-consuming. This paper proposes a method for migrating from 2D elements to 3D models. First, the geological interfaces are constructed using the Hermite Radial Basis Function (HRBF) to interpolate the boundaries and attitude data. Then, the subsurface geological bodies are extracted from the spatial map area using a Boolean operation between the HRBF surfaces and the fundamental body. Finally, the top surfaces of the geological bodies are constructed by coupling the geological boundaries to digital elevation models. Based on this workflow, a prototype system was developed, and typical geological structures (e.g., folds, faults, and strata) were simulated. Geological models were constructed through this workflow based on realistic regional geological survey data. To extend the application of this 3D modelling approach to other kinds of geo-objects, mining ore body models and urban geotechnical engineering stratum models were also constructed by this method from drill-hole data. The model construction process was rapid, and the resulting models accorded with the constraints of the original data.
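HRBF interpolation itself constrains both values and gradients at the data points; the simpler value-only variant below conveys the implicit-surface idea: on-surface points get value 0, points offset along the normals get ±d, and the zero level set of the interpolant reconstructs the interface (a 2D circle stands in for a geological interface; all data are invented):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# On-surface points: a circle of radius 1, with outward unit normals.
theta = np.linspace(0, 2 * np.pi, 24, endpoint=False)
surface = np.column_stack([np.cos(theta), np.sin(theta)])
normals = surface.copy()                   # unit normals for a unit circle

# Off-surface constraints: +d outside, -d inside (d = 0.1).
d = 0.1
pts = np.vstack([surface, surface + d * normals, surface - d * normals])
vals = np.concatenate([np.zeros(24), np.full(24, d), np.full(24, -d)])

f = RBFInterpolator(pts, vals, kernel="thin_plate_spline")

# The zero level set of f approximates the interface: inside < 0 < outside.
print(f([[0.0, 0.0]]))   # negative: the centre lies inside the interface
print(f([[2.0, 0.0]]))   # positive: this point lies outside
```

The sign of the interpolant is what makes the Boolean operations between surfaces and the fundamental body straightforward: a point's inside/outside status is just a function evaluation.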

  18. Interaction of Plant Extracts with Central Nervous System Receptors

    Directory of Open Access Journals (Sweden)

    Kenneth Lundstrom

    2017-02-01

    Full Text Available Background: Plant extracts have been used in traditional medicine for the treatment of various maladies, including neurological diseases. Several central nervous system (CNS) receptors have been demonstrated to interact with plant extracts and their components, affecting their pharmacology and thereby potentially playing a role in human disease and treatment. For instance, extracts from Hypericum perforatum (St. John's wort) targeted several CNS receptors. Similarly, extracts from Piper nigrum, Stephania cambodica, and Styphnolobium japonicum inhibited agonist-induced activity of the human neurokinin-1 receptor. Methods: Different methods have been established for receptor binding and functional assays, based on radioactive and fluorescence-labeled ligands, in cell lines and primary cell cultures. Behavioral studies of the effects of plant extracts have been conducted in rodents. Plant extracts have further been subjected to mood and cognition studies in humans. Results: Mechanisms of action at the molecular and cellular levels have been elucidated for medicinal plants, supporting the standardization of herbal products and the identification of active extract compounds. In several studies, plant extracts demonstrated affinity for a number of CNS receptors in parallel, indicating the complexity of this interaction. In vivo studies showed modifications of CNS receptor affinity and behavioral responses in animal models after treatment with medicinal herbs. Certain plant extracts demonstrated neuroprotection and enhanced cognitive performance, respectively, when evaluated in humans. Notably, the penetration of plant extracts and their protective effect on the blood-brain barrier are discussed. Conclusion: The affinity of plant extracts and their isolated compounds for CNS receptors indicates an important role for medicinal plants in the treatment of neurological disorders. Moreover, studies in animal and human models have confirmed a scientific basis for the

  19. Quaternary ammonium based task specific ionic liquid for the efficient and selective extraction of neptunium

    Energy Technology Data Exchange (ETDEWEB)

    Gupta, Nishesh Kumar [National Institute of Technology, Odisha (India). Dept. of Chemistry; Sengupta, Arijit [Bhabha Atomic Research Centre, Mumbai (India). Radiochemistry Div.; Biswas, Sujoy [Bhabha Atomic Research Centre, Mumbai (India). Uranium Extraction Div.

    2017-07-01

    Liquid-liquid extraction of neptunium from aqueous acidic solution using a quaternary ammonium based task specific ionic liquid (TSIL) was investigated. The extraction of Np was dominated by a 'cation exchange' mechanism: NpO₂²⁺ was extracted via the [NpO₂·Hpth]⁺ species, while NpO₂⁺ was extracted into the ionic liquid as [NpO₂·H·Hpth]⁺. The extraction process was thermodynamically spontaneous but kinetically slow. Na₂CO₃ as strippant showed quantitative back extraction of the neptunium ions from the TSIL. The TSIL showed excellent radiolytic stability up to 500 kGy of gamma exposure. Finally, the TSIL was employed for the processing of simulated high level waste solutions, revealing its high selectivity towards neptunium.

  20. Optimization of PEG-based extraction of polysaccharides from Dendrobium nobile Lindl. and bioactivity study.

    Science.gov (United States)

    Zhang, Yi; Wang, Hongxin; Wang, Peng; Ma, ChaoYang; He, GuoHua; Rahman, Md Ramim Tanver

    2016-11-01

    Polyethylene glycol (PEG) was employed as a green solvent to extract polysaccharides. The optimal conditions for the PEG-based ultrasonic extraction of Dendrobium nobile Lindl. polysaccharide (JCP) were determined by response surface methodology. Under the optimal conditions (extraction temperature of 58.5 °C, ultrasound power of 193 W, and polyethylene glycol-200 (PEG-200) concentration of 45%), the highest JCP yield was 15.23±0.57%, close to the predicted yield of 15.57%. UV and FT-IR analysis revealed the general characteristic absorption peaks of both the water-extracted (JCPw) and the PEG-200-extracted (JCPp) polysaccharides. Thermal analysis of both JCPs was performed with a Thermal Gravimetric Analyzer (TGA) and a Differential Scanning Calorimeter (DSC). The antioxidant activities of the two polysaccharides were also compared, and no significant difference in vitro was observed. Copyright © 2016 Elsevier B.V. All rights reserved.
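The response-surface step above amounts to fitting a full second-order polynomial to designed experiments and solving for its stationary point. A hedged numerical sketch on synthetic data (the true optimum and design points are invented to mimic the reported conditions, not the study's measurements):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical true response: yield peaks at T = 58 °C, PEG conc = 45 %.
def true_yield(T, C):
    return 15.5 - 0.01 * (T - 58.0) ** 2 - 0.02 * (C - 45.0) ** 2

# A 3 x 3 factorial design over the experimental ranges, with noise.
T = np.array([40, 40, 40, 58, 58, 58, 75, 75, 75], dtype=float)
C = np.array([30, 45, 60, 30, 45, 60, 30, 45, 60], dtype=float)
y = true_yield(T, C) + rng.normal(0, 0.1, T.size)

# Full second-order model: y = b0 + b1*T + b2*C + b3*T^2 + b4*C^2 + b5*T*C.
X = np.column_stack([np.ones_like(T), T, C, T**2, C**2, T * C])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b3, b4, b5 = beta

# Stationary point of the fitted surface (both partial derivatives = 0):
# b1 + 2*b3*T + b5*C = 0  and  b2 + 2*b4*C + b5*T = 0.
A = np.array([[2 * b3, b5], [b5, 2 * b4]])
T_opt, C_opt = np.linalg.solve(A, [-b1, -b2])
print(f"predicted optimum: T = {T_opt:.1f} °C, PEG-200 conc = {C_opt:.1f} %")
```

Provided the fitted quadratic is concave (negative curvature in both factors), the stationary point is the predicted optimum reported by RSM software.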