WorldWideScience

Sample records for extraction system based

  1. Pattern Based Term Extraction Using ACABIT System

    CERN Document Server

    Takeuchi, Koichi; Koyama, Teruo; Daille, Béatrice; Romary, Laurent

    2009-01-01

    In this paper, we propose a pattern-based term extraction approach for Japanese, applying the ACABIT system originally developed for French. The proposed approach evaluates termhood using morphological patterns of basic terms and term variants. After extracting term candidates, the ACABIT system filters out non-terms from the candidates based on log-likelihood. This approach is well suited to Japanese term extraction because most Japanese terms are compound nouns or simple phrasal patterns.
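
    The log-likelihood filtering step mentioned above can be illustrated with a small sketch. This is not the ACABIT implementation; it is a generic Dunning log-likelihood ratio for a two-word candidate term computed from contingency-table counts (all counts and the threshold are illustrative).

    ```python
    import math

    def llr(k11, k12, k21, k22):
        """Dunning log-likelihood ratio for a two-word candidate term.
        k11: windows containing both words, k12: first word only,
        k21: second word only, k22: neither word."""
        def s(*counts):  # sum of c * ln(c / total) over the non-zero counts
            total = sum(counts)
            return sum(c * math.log(c / total) for c in counts if c > 0)
        return 2.0 * (s(k11, k12, k21, k22)
                      - s(k11 + k12, k21 + k22)    # row sums
                      - s(k11 + k21, k12 + k22))   # column sums

    # Candidates scoring below a chosen threshold would be filtered out as non-terms.
    print(llr(110, 2442, 111, 29114))  # larger values indicate stronger association
    ```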

  2. Statistical feature extraction based iris recognition system

    Indian Academy of Sciences (India)

    ATUL BANSAL; RAVINDER AGARWAL; R K SHARMA

    2016-05-01

    Iris recognition systems have been proposed by numerous researchers using different feature extraction techniques for accurate and reliable biometric authentication. In this paper, a statistical feature extraction technique based on correlation between adjacent pixels has been proposed and implemented. A Hamming distance based metric has been used for matching. Performance of the proposed iris recognition system (IRS) has been measured by recording false acceptance rate (FAR) and false rejection rate (FRR) at different thresholds in the distance metric. System performance has been evaluated by computing statistical features along two directions, namely, the radial direction of the circular iris region and the angular direction extending from pupil to sclera. Experiments have also been conducted to study the effect of the number of statistical parameters on FAR and FRR. Results obtained from the experiments based on different sets of statistical features of iris images show that there is a significant improvement in equal error rate (EER) when the number of statistical parameters for feature extraction is increased from three to six. Further, it has also been found that increasing radial/angular resolution, with normalization in place, improves EER for the proposed iris recognition system.
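
    A rough illustration of the Hamming-distance matching and the FAR/FRR measurement described above (not the authors' code; array shapes and the threshold sweep are assumptions):

    ```python
    import numpy as np

    def hamming_distance(code_a, code_b):
        """Fraction of disagreeing bits between two binary iris codes."""
        a, b = np.asarray(code_a, bool), np.asarray(code_b, bool)
        return np.count_nonzero(a ^ b) / a.size

    def far_frr(genuine_dists, impostor_dists, threshold):
        """Error rates at one decision threshold on the distance metric."""
        genuine = np.asarray(genuine_dists)
        impostor = np.asarray(impostor_dists)
        frr = np.mean(genuine > threshold)    # same-eye pairs wrongly rejected
        far = np.mean(impostor <= threshold)  # different-eye pairs wrongly accepted
        return far, frr

    # Sweeping the threshold and finding where FAR equals FRR gives the equal error rate (EER).
    ```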

  3. Extracting Answer in QA System: Learning Based

    Directory of Open Access Journals (Sweden)

    Megha Mishra

    2012-06-01

    Full Text Available Converting questions to effective queries is crucial to open-domain question answering systems. In this paper, we present a web-based unsupervised learning approach for transforming a given natural-language question into an effective query. The method involves querying a search engine for web passages that contain the answer to the question and extracting patterns that characterize fine-grained classification for answers. Independent evaluation on a set of questions shows that the proposed approach outperforms a naive keyword-based approach.

  4. An Effective History-based Background Extraction System

    Directory of Open Access Journals (Sweden)

    Fatimah Khalid

    2012-01-01

    Full Text Available Problem statement: In many vision-based surveillance systems, the first step is accomplished by detecting moving objects resulting from the subtraction of the current captured frame from the extracted background. Thus, the results of these systems mainly depend on the accuracy of the background image. Approach: In this study, a proposed background extraction system is presented to model the background using a simple method, to initialize the model, to extract the moving objects and to construct the final background. Our model saves the history of each pixel separately. It uses the saved information to extract the background using a probability-based method. It then updates the pixel's history according to the value of that pixel in the current captured image. Results: Results of the experiments certify that not only is the quality of the final extracted background the best among four recently re-implemented methods, but the time consumption of the extraction is also acceptable. Conclusion: Since history-based methods use temporal information extracted from several previous frames, they are less sensitive to noise and sudden changes when extracting the background image.
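
    A minimal sketch of a per-pixel history model of the kind described above (the history length, mode-based estimate and foreground threshold are assumptions, not the authors' exact method):

    ```python
    import numpy as np
    from collections import deque

    class HistoryBackground:
        """Per-pixel history model: the most frequent intensity observed in
        the last `history_len` frames is taken as the background value."""
        def __init__(self, history_len=50):
            self.history = deque(maxlen=history_len)

        def update(self, frame_u8):
            self.history.append(np.asarray(frame_u8, np.uint8))
            stack = np.stack(self.history)                      # (t, H, W)
            most_probable = np.apply_along_axis(
                lambda v: np.bincount(v, minlength=256).argmax(), 0, stack)
            return most_probable.astype(np.uint8)

        def foreground_mask(self, frame_u8, background, thresh=30):
            diff = np.abs(frame_u8.astype(int) - background.astype(int))
            return diff > thresh   # moving-object mask from background subtraction
    ```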

  5. Arduino-based automation of a DNA extraction system.

    Science.gov (United States)

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    There have been many studies to detect infectious diseases with the molecular genetic method. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic bead, which is part of a portable molecular genetic test system. This DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators provide a sequence of steps in the DNA extraction process, such as transporting, mixing, and washing for the gene specimen, magnetic bead, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delay, and input-output manipulation. It drives the stepper motor with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish the motor's reference position, and hard limit checking to prevent any over-travelling. The proposed system was implemented and its functionality was investigated, especially regarding positioning accuracy and velocity profile.

  6. Model-Based Extracted Water Desalination System for Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Dees, Elizabeth M. [General Electric Global Research Center, Niskayuna, NY (United States); Moore, David Roger [General Electric Global Research Center, Niskayuna, NY (United States); Li, Li [Pennsylvania State Univ., University Park, PA (United States); Kumar, Manish [Pennsylvania State Univ., University Park, PA (United States)

    2017-05-28

    Over the last 1.5 years, GE Global Research and Pennsylvania State University defined a model-based, scalable, and multi-stage extracted water desalination system that yields clean water, concentrated brine, and, optionally, salt. The team explored saline brines spanning the expected range for extracted water from carbon sequestration reservoirs (40,000 up to 220,000 ppm total dissolved solids, TDS). In addition, the team validated the system performance at pilot scale with field-sourced water using GE's pre-pilot and lab facilities. This project encompassed four principal tasks, in addition to Project Management and Planning: 1) identify a deep saline formation carbon sequestration site and a partner that are suitable for supplying extracted water; 2) conduct a techno-economic assessment and down-selection of pre-treatment and desalination technologies to identify a cost-effective system for extracted water recovery; 3) validate the down-selected processes at the lab/pre-pilot scale; and 4) define the scope of the pilot desalination project. Highlights from each task are described below. Deep saline formation characterization: The deep saline formations associated with the five DOE NETL 1260 Phase 1 projects were characterized with respect to their mineralogy and formation water composition. Sources of high-TDS feed water other than extracted water were explored for high-TDS desalination applications, including unconventional oil and gas and seawater reverse osmosis concentrate. Techno-economic analysis of desalination technologies: Techno-economic evaluations of alternate brine concentration technologies, including humidification-dehumidification (HDH), membrane distillation (MD), forward osmosis (FO), turboexpander-freeze, solvent extraction and high pressure reverse osmosis (HPRO), were conducted. These technologies were evaluated against conventional falling film-mechanical vapor recompression (FF-MVR) as a baseline desalination process. Furthermore, a

  7. An Automated Video Object Extraction System Based on Spatiotemporal Independent Component Analysis and Multiscale Segmentation

    Directory of Open Access Journals (Sweden)

    Zhang Xiao-Ping

    2006-01-01

    Full Text Available Video content analysis is essential for efficient and intelligent utilizations of vast multimedia databases over the Internet. In video sequences, object-based extraction techniques are important for content-based video processing in many applications. In this paper, a novel technique is developed to extract objects from video sequences based on spatiotemporal independent component analysis (stICA and multiscale analysis. The stICA is used to extract the preliminary source images containing moving objects in video sequences. The source image data obtained after stICA analysis are further processed using wavelet-based multiscale image segmentation and region detection techniques to improve the accuracy of the extracted object. An automated video object extraction system is developed based on these new techniques. Preliminary results demonstrate great potential for the new stICA and multiscale-segmentation-based object extraction system in content-based video processing applications.

  8. A construction scheme of web page comment information extraction system based on frequent subtree mining

    Science.gov (United States)

    Zhang, Xiaowen; Chen, Bingfeng

    2017-08-01

    Based on the frequent subtree mining algorithm, this paper proposes a construction scheme for a web page comment information extraction system, referred to as the FSM system. The paper briefly introduces the overall system architecture and its modules, then describes the core of the system in detail, and finally presents a system prototype.

  9. MINUTIAE EXTRACTION BASED ON ARTIFICIAL NEURAL NETWORKS FOR AUTOMATIC FINGERPRINT RECOGNITION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Necla ÖZKAYA

    2007-01-01

    Full Text Available Automatic fingerprint recognition systems are utilised for personal identification by comparing local ridge characteristics and their relationships. A critical stage in personal identification is extracting features from the input fingerprint images automatically, quickly and reliably. In this study, a new approach based on artificial neural networks for extracting minutiae from fingerprint images is developed and introduced. The results show that artificial neural networks achieve minutiae extraction from fingerprint images with high accuracy.

  10. Diagonal Based Feature Extraction for Handwritten Alphabets Recognition System using Neural Network

    CERN Document Server

    Pradeep, J; Himavathi, S; 10.5121/ijcsit.2011.3103

    2011-01-01

    An off-line handwritten alphabetical character recognition system using a multilayer feed-forward neural network is described in this paper. A new method, called diagonal-based feature extraction, is introduced for extracting the features of the handwritten alphabets. Fifty data sets, each containing 26 alphabets written by various people, are used for training the neural network, and 570 different handwritten alphabetical characters are used for testing. The proposed recognition system performs quite well, yielding higher levels of recognition accuracy compared to systems employing the conventional horizontal and vertical methods of feature extraction. This system is suitable for converting handwritten documents into structural text form and recognizing handwritten names.
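
    A minimal sketch of the diagonal feature extraction idea (zone size and the averaging scheme are assumptions, not the paper's exact values): the binary character image is split into small zones, and each zone contributes one feature equal to the average of its diagonal-wise means.

    ```python
    import numpy as np

    def diagonal_features(char_img, zone=10):
        """Split a binary character image into zone x zone blocks; for each
        block, average the pixel values along every diagonal, then average
        those diagonal means into a single feature per block."""
        img = np.asarray(char_img, float)
        h, w = img.shape
        feats = []
        for r in range(0, h - zone + 1, zone):
            for c in range(0, w - zone + 1, zone):
                block = img[r:r + zone, c:c + zone]
                diag_means = [block[::-1, :].diagonal(k).mean()
                              for k in range(-zone + 1, zone)]
                feats.append(np.mean(diag_means))
        return np.array(feats)  # fed to a multilayer feed-forward neural network
    ```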

  11. Improved extraction of fluoroquinolones with recyclable ionic-liquid-based aqueous biphasic systems.

    Science.gov (United States)

    Almeida, Hugo F D; Freire, Mara G; Marrucho, Isabel M

    2016-05-07

    In the past few years, the improvement of advanced analytical tools allowed to confirm the presence of trace amounts of metabolized and unchanged active pharmaceutical ingredients (APIs) in wastewater treatment plants (WWTPs) as well as in freshwater surfaces. It is known that the continuous contact with APIs, even at very low concentrations (ng L(-1)-μg L(-1)), leads to serious human health problems. In this context, this work shows the feasibility of using ionic-liquid-based aqueous biphasic systems (IL-based ABS) in the extraction of quinolones present in aqueous media. In particular, ABS composed of imidazolium- and phosphonium-based ILs and aluminium-based salts (already used in water treatment plants) were evaluated in one-step extractions of six fluoroquinolones (FQs), namely ciprofloxacin, enrofloxacin, moxifloxacin, norfloxacin, ofloxacin and sarafloxacin, and extraction efficiencies up to 98% were obtained. Despite the large interest devoted to IL-based ABS as extractive systems of outstanding performance, their recyclability/reusability has seldomly been studied. An efficient extraction/cleaning process of the IL-rich phase is here proposed by FQs induced precipitation. The recycling of the IL and its further reuse without losses in the ABS extractive performance for FQs were established, as confirmed by the four consecutive removal/extraction cycles evaluated. This novel recycling strategy supports IL-based ABS as sustainable and cost-efficient extraction platforms.

  12. Texture based feature extraction methods for content based medical image retrieval systems.

    Science.gov (United States)

    Ergen, Burhan; Baykara, Muhammet

    2014-01-01

    The development of content-based image retrieval (CBIR) systems used for image archiving is ongoing and remains an important research topic. Although some studies have addressed general image archiving, the CBIR systems proposed for archiving medical images are not very efficient. In the presented study, the retrieval efficiency of spatial methods used for feature extraction in medical image retrieval systems is examined. The algorithms investigated in this study are the gray level co-occurrence matrix (GLCM), the gray level run length matrix (GLRLM), and the Gabor wavelet, accepted as spatial methods. In the experiments, a database was built including hundreds of medical images such as brain, lung, sinus, and bone. The results obtained in this study show that queries based on statistics obtained from the GLCM are satisfactory. However, it is observed that the Gabor wavelet is the most effective and accurate method.
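
    A small sketch of GLCM-based texture features for retrieval, using scikit-image (function names per recent releases; the distances, angles and similarity measure are illustrative choices, not the paper's settings):

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_features(gray_img_u8):
        """Gray-level co-occurrence matrix statistics used as a feature vector."""
        glcm = graycomatrix(gray_img_u8, distances=[1, 2],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=256, symmetric=True, normed=True)
        props = ("contrast", "correlation", "energy", "homogeneity")
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    # Retrieval: rank database images by (for example) the Euclidean distance
    # between the query's feature vector and each stored feature vector.
    ```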

  13. A green deep eutectic solvent-based aqueous two-phase system for protein extracting

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Kaijia; Wang, Yuzhi, E-mail: wyzss@hnu.edu.cn; Huang, Yanhua; Li, Na; Wen, Qian

    2015-03-15

    Highlights: • A strategy for protein purification with a deep eutectic solvent (DES)-based aqueous two-phase system. • Choline chloride-glycerin DES was selected as the extraction solvent. • Bovine serum albumin and trypsin were used as the analytes. • An aggregation phenomenon was detected in the mechanism research. - Abstract: As a new type of green solvent, deep eutectic solvent (DES) has been applied for the extraction of proteins with an aqueous two-phase system (ATPS) in this work. Four kinds of choline chloride (ChCl)-based DESs were synthesized to extract bovine serum albumin (BSA), and ChCl-glycerol was selected as the most suitable extraction solvent. Single-factor experiments were carried out to investigate the effects of the extraction conditions, including the amount of DES, the concentration of salt, the mass of protein, the shaking time, the temperature and the pH value. Experimental results show that 98.16% of the BSA could be extracted into the DES-rich phase in a single-step extraction under the optimized conditions. A high extraction efficiency of 94.36% was achieved when the same conditions were applied to the extraction of trypsin (Try). Precision, repeatability and stability experiments were studied, and the relative standard deviations (RSD) of the extraction efficiency were 0.4246% (n = 3), 1.6057% (n = 3) and 1.6132% (n = 3), respectively. The conformation of BSA was not changed during the extraction process, according to the UV-vis, FT-IR and CD spectra of BSA. Conductivity measurements, dynamic light scattering (DLS) and transmission electron microscopy (TEM) were used to explore the mechanism of the extraction. It turned out that the formation of DES-protein aggregates plays a significant role in the separation process. All the results suggest that ChCl-based DES-ATPS have the potential to provide new possibilities in the separation of proteins.
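
    For reference, the extraction efficiency reported in records like this one is conventionally defined as the fraction of the protein that ends up in the extracting (here DES-rich) phase; a common formulation (standard definition, not quoted from the paper) is:

    ```latex
    E\% \;=\; \frac{m_{\mathrm{DES}}}{m_{\mathrm{total}}}\times 100
         \;=\; \frac{C_{\mathrm{DES}}\,V_{\mathrm{DES}}}
                    {C_{\mathrm{DES}}\,V_{\mathrm{DES}} + C_{\mathrm{bottom}}\,V_{\mathrm{bottom}}}\times 100
    ```

    where C and V denote the protein concentration in and the volume of each phase.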

  14. COSMO-RS-based extractant screening for phenol extraction as model system

    NARCIS (Netherlands)

    Burghoff, B.; Goetheer, E.L.V.; Haan, A.B. de

    2008-01-01

    The focus of this investigation is the development of a fast and reliable extractant screening approach. Phenol extraction is selected as the model process. A quantum chemical conductor-like screening model for real solvents (COSMO-RS) is combined with molecular design considerations. For this purpo

  15. Biosensor method and system based on feature vector extraction

    Science.gov (United States)

    Greenbaum, Elias [Knoxville, TN]; Rodriguez, Jr., Miguel; Qi, Hairong [Knoxville, TN]; Wang, Xiaoling [San Jose, CA]

    2012-04-17

    A method of biosensor-based detection of toxins comprises the steps of providing at least one time-dependent control signal generated by a biosensor in a gas or liquid medium, and obtaining a time-dependent biosensor signal from the biosensor in the gas or liquid medium to be monitored or analyzed for the presence of one or more toxins selected from chemical, biological or radiological agents. The time-dependent biosensor signal is processed to obtain a plurality of feature vectors using at least one of amplitude statistics and a time-frequency analysis. At least one parameter relating to toxicity of the gas or liquid medium is then determined from the feature vectors based on reference to the control signal.

  16. Multiple-Feature Extracting Modules Based Leak Mining System Design

    Science.gov (United States)

    Cho, Ying-Chiang; Pan, Jen-Yi

    2013-01-01

    Over the years, human dependence on the Internet has increased dramatically. A large amount of information is placed on the Internet and retrieved from it daily, which makes web security in terms of online information a major concern. In recent years, the most problematic issues in web security have been e-mail address leakage and SQL injection attacks. There are many possible causes of information leakage, such as inadequate precautions during the programming process, which lead to the leakage of e-mail addresses entered online or insufficient protection of database information, a loophole that enables malicious users to steal online content. In this paper, we implement a crawler mining system that is equipped with SQL injection vulnerability detection, by means of an algorithm developed for the web crawler. In addition, we analyze portal sites of the governments of various countries or regions in order to investigate the information leaking status of each site. Subsequently, we analyze the database structure and content of each site, using the data collected. Thus, we make use of practical verification in order to focus on information security and privacy through black-box testing. PMID:24453892

  17. Extraction of proteins with ionic liquid aqueous two-phase system based on guanidine ionic liquid.

    Science.gov (United States)

    Zeng, Qun; Wang, Yuzhi; Li, Na; Huang, Xiu; Ding, Xueqin; Lin, Xiao; Huang, Songyun; Liu, Xiaojie

    2013-11-15

    Eight kinds of green ionic liquids were synthesized, and an ionic liquid aqueous two-phase system (ILATPS) based on the 1,1,3,3-tetramethylguanidine acrylate (TMGA) guanidine ionic liquid was studied for the first time for the extraction of proteins. Single-factor experiments proved that the extraction efficiency of bovine serum albumin (BSA) was influenced by the mass of IL, K2HPO4 and BSA, and also related to the separation time and temperature. The optimum conditions were determined through an orthogonal experiment on the five factors described above. The results showed that under the optimum conditions, the extraction efficiency could reach up to 99.6243%. The relative standard deviations (RSD) of the extraction efficiencies in the precision, repeatability and stability experiments were 0.8156% (n=5), 1.6173% (n=5) and 1.6292% (n=5), respectively. UV-vis and FT-IR spectra confirmed that there were no chemical interactions between BSA and the ionic liquid in the extraction process, and the conformation of the protein was not changed after extraction. Conductivity, DLS and TEM were combined to investigate the microstructure of the top phase and the possible mechanism of the extraction. The results showed that hydrophobic interaction, hydrogen bonding interaction and the salting-out effect played important roles in the transfer process, and the aggregation and embrace phenomenon was the main driving force for the separation. All these results proved that guanidine ionic-liquid-based ATPSs have the potential to offer new possibilities in the extraction of proteins.

  18. Ant-based extraction of rules in simple decision systems over ontological graphs

    Directory of Open Access Journals (Sweden)

    Pancerz Krzysztof

    2015-06-01

    Full Text Available In the paper, the problem of extraction of complex decision rules in simple decision systems over ontological graphs is considered. The extracted rules are consistent with the dominance principle similar to that applied in the dominance-based rough set approach (DRSA). In our study, we propose to use a heuristic algorithm, utilizing the ant-based clustering approach, searching the semantic spaces of concepts presented by means of ontological graphs. Concepts included in the semantic spaces are values of attributes describing objects in simple decision systems.

  19. Arrhythmia Classification Based on Multi-Domain Feature Extraction for an ECG Recognition System

    Directory of Open Access Journals (Sweden)

    Hongqiang Li

    2016-10-01

    Full Text Available Automatic recognition of arrhythmias is particularly important in the diagnosis of heart diseases. This study presents an electrocardiogram (ECG recognition system based on multi-domain feature extraction to classify ECG beats. An improved wavelet threshold method for ECG signal pre-processing is applied to remove noise interference. A novel multi-domain feature extraction method is proposed; this method employs kernel-independent component analysis in nonlinear feature extraction and uses discrete wavelet transform to extract frequency domain features. The proposed system utilises a support vector machine classifier optimized with a genetic algorithm to recognize different types of heartbeats. An ECG acquisition experimental platform, in which ECG beats are collected as ECG data for classification, is constructed to demonstrate the effectiveness of the system in ECG beat classification. The presented system, when applied to the MIT-BIH arrhythmia database, achieves a high classification accuracy of 98.8%. Experimental results based on the ECG acquisition experimental platform show that the system obtains a satisfactory classification accuracy of 97.3% and is able to classify ECG beats efficiently for the automatic identification of cardiac arrhythmias.
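
    A condensed sketch of the frequency-domain part of such a pipeline: discrete-wavelet-transform features per heartbeat plus an SVM classifier (the wavelet family, decomposition level and SVM settings are assumptions; the paper's kernel-ICA step and genetic-algorithm optimization are omitted here):

    ```python
    import numpy as np
    import pywt
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    def dwt_features(beat, wavelet="db4", level=4):
        """Summary statistics of DWT coefficients of one ECG beat segment."""
        coeffs = pywt.wavedec(beat, wavelet, level=level)
        return np.hstack([(c.mean(), c.std(), np.abs(c).max()) for c in coeffs])

    def train_classifier(beats, labels):
        """beats: list of 1-D beat segments; labels: arrhythmia class per beat."""
        X = np.array([dwt_features(b) for b in beats])
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
        clf.fit(X, labels)
        return clf
    ```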

  20. A Neuro-Fuzzy System for Extracting Environment Features Based on Ultrasonic Sensors

    Directory of Open Access Journals (Sweden)

    Evelio José González

    2009-12-01

    Full Text Available In this paper, a method to extract features of the environment based on ultrasonic sensors is presented. A 3D model of a set of sonar systems and a workplace has been developed. The target of this approach is to extract in a short time, while the vehicle is moving, features of the environment. Particularly, the approach shown in this paper has been focused on determining walls and corners, which are very common environment features. In order to prove the viability of the devised approach, a 3D simulated environment has been built. A Neuro-Fuzzy strategy has been used in order to extract environment features from this simulated model. Several trials have been carried out, obtaining satisfactory results in this context. After that, some experimental tests have been conducted using a real vehicle with a set of sonar systems. The obtained results reveal the satisfactory generalization properties of the approach in this case.

  1. Aqueous two-phase system based on natural quaternary ammonium compounds for the extraction of proteins.

    Science.gov (United States)

    Zeng, Chao-Xi; Xin, Rui-Pu; Qi, Sui-Jian; Yang, Bo; Wang, Yong-Hua

    2016-02-01

    Aqueous two-phase systems, based on the use of natural quaternary ammonium compounds, were developed to establish a benign biotechnological route for efficient protein separation. In this study, aqueous two-phase systems of two natural resources, betaine and choline, with polyethylene glycol (PEG 400/600) or inorganic salts (K2HPO4/K3PO4) were formed. It was shown that in the K2HPO4-containing aqueous two-phase system, hydrophobic interactions were an important driving force of protein partitioning, while protein size played a vital role in aqueous two-phase systems that contained polyethylene glycol. An extraction efficiency of more than 90% for bovine serum albumin in the betaine/K2HPO4 aqueous two-phase system can be obtained, and this betaine-based aqueous two-phase system provided a gentle and stable environment for the protein. In addition, after investigation of the cluster phenomenon in the betaine/K2HPO4 aqueous two-phase systems, it was suggested that this phenomenon also played a significant role in protein extraction in this system. The development of aqueous two-phase systems based on natural quaternary ammonium compounds not only provides an effective and greener aqueous two-phase method that meets the requirements of green chemistry but may also help to solve the mystery of the compartmentalization of biomolecules in cells.

  2. Information extraction system

    Science.gov (United States)

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.

  3. [Optimization of genomic DNA extraction with magnetic bead-based semi-automatic system].

    Science.gov (United States)

    Ling, Jie; Wang, Hao; Zhang, Shuai; Zhang, Dan-dan; Lai, Mao-de; Zhu, Yi-min

    2012-05-01

    To develop a rapid and effective method for genomic DNA extraction with a magnetic bead-based semi-automatic system. DNA was extracted from whole blood samples semi-automatically with a nucleic acid automatic extraction system. The concentration and purity of the samples were determined by UV spectrophotometry. An orthogonal design was used to analyze the main effects of lysis time, blood volume, magnetic bead quantity and ethanol concentration on the DNA yield, as well as the two-way interactions of these factors. Lysis time, blood volume, magnetic bead quantity and ethanol concentration were all associated with DNA yield, which was higher under the condition of 15 min lysis time, 100 μl blood volume, 80 μl of magnetic beads and 80% ethanol. A significant association was found between the magnetic bead quantity and the DNA purity OD260/OD280 (P=0.008). An interaction between blood volume and lysis time also existed (P=0.013). DNA purity was better when the extraction conditions were 40 μl of magnetic beads, 15 min lysis time and 100 μl blood volume. Magnetic bead quantity and ethanol concentration were associated with the DNA purity OD260/OD230 (P=0.017). The optimized magnetic bead-based semi-automatic system provides a rapid and effective method for extracting genomic DNA from whole blood samples.

  4. A knowledge-based decision support system in bioinformatics: an application to protein complex extraction

    Directory of Open Access Journals (Sweden)

    Fiannaca Antonino

    2013-01-01

    Full Text Available Abstract Background We introduce a Knowledge-based Decision Support System (KDSS in order to face the Protein Complex Extraction issue. Using a Knowledge Base (KB coding the expertise about the proposed scenario, our KDSS is able to suggest both strategies and tools, according to the features of input dataset. Our system provides a navigable workflow for the current experiment and furthermore it offers support in the configuration and running of every processing component of that workflow. This last feature makes our system a crossover between classical DSS and Workflow Management Systems. Results We briefly present the KDSS' architecture and basic concepts used in the design of the knowledge base and the reasoning component. The system is then tested using a subset of Saccharomyces cerevisiae Protein-Protein interaction dataset. We used this subset because it has been well studied in literature by several research groups in the field of complex extraction: in this way we could easily compare the results obtained through our KDSS with theirs. Our system suggests both a preprocessing and a clustering strategy, and for each of them it proposes and eventually runs suited algorithms. Our system's final results are then composed of a workflow of tasks, that can be reused for other experiments, and the specific numerical results for that particular trial. Conclusions The proposed approach, using the KDSS' knowledge base, provides a novel workflow that gives the best results with regard to the other workflows produced by the system. This workflow and its numeric results have been compared with other approaches about PPI network analysis found in literature, offering similar results.

  5. Parallel Feature Extraction System

    Institute of Scientific and Technical Information of China (English)

    MA Huimin; WANG Yan

    2003-01-01

    Very high-speed image processing is needed in some applications, especially for weapons. In this paper, a high-speed image feature extraction system with a parallel structure was implemented on a complex programmable logic device (CPLD); it can perform image feature extraction in several microseconds, almost without delay. The system design is presented through an application instance of a flying plane, whose infrared image includes two kinds of features: geometric shape features in the binary image and temperature features in the gray image. Feature extraction is accordingly performed on both kinds of features. Edges and areas are two of the most important features of an image. Angles often occur where different parts of the target's image connect, indicating that one area ends and another begins. These three key features can form the whole presentation of an image, so this parallel feature extraction system includes three processing modules: edge extraction, angle extraction and area extraction. The parallel structure is realized by a group of processors: every detector is followed by one processing route, every route has the same circuit form, and all routes work simultaneously under a common clock to perform feature extraction. The extraction system has a simple structure, small volume, high speed, and good stability against noise. It can be used in battlefield recognition systems.

  6. Curtailment of soil vapor extraction systems at McClellan Air Force Base

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, T.E. [BDM Federal, McClellan AFB, CA (United States); Mook, P.H. Jr.; Wong, K.B. [SM-ALC/EMR, McClellan AFB, CA (United States)

    1997-12-31

    McClellan Air Force Base (AFB), located near Sacramento, California, is one of the Strategic Environmental Research and Development Program's National Environmental Technology Test Sites. McClellan AFB has implemented soil vapor extraction (SVE) as an Engineering Evaluation/Cost Analysis (EE/CA) non-time-critical remedial action for volatile organic compounds in soil. Operation and maintenance costs for SVE systems are increasingly becoming a major component of the environmental clean-up budget. In an effort to reduce costs, while assuring the protection of public health and the environment, a risk-based strategy has been developed for the curtailment and eventual shut-down of SVE systems at McClellan AFB. This paper presents an overview of the SVE EE/CA process and a detailed description of the development and implementation of the curtailment strategy. Included in the discussion are details of the public and regulatory involvement in the process.

  7. Research on the image fusion and target extraction based on bionic compound eye system

    Science.gov (United States)

    Zhang, Shaowei; Hao, Qun; Song, Yong; Wang, Zihan; Zhang, Kaiyu; Zhang, Shiyu

    2015-08-01

    People attach more and more importance to the bionic compound eye due to its advantages such as small volume, large field of view and sensitivity to high-speed moving objects. Small field of view and large volume are the disadvantages of traditional image sensors; to avoid these defects, this paper builds a compound eye system based on the insect compound eye structure and visual processing mechanism. At the center of this system is a primary sensor with high resolution, surrounded by six other sensors with low resolution. Based on this system, the paper studies a target image fusion and extraction method using the planar compound eye structure. A control module is designed that combines the distinguishing features of the high-resolution image with local features of the low-resolution images so as to conduct target detection, recognition and location. Compared with traditional approaches, the arrangement of high resolution in the center and low resolution around it gives this system the advantages of both high resolution and a large field of view, and enables the system to detect objects quickly and recognize them accurately.

  8. [Application in methane extraction of fiber methane monitoring system based on spectral absorption].

    Science.gov (United States)

    Zhao, Yan-jie; Wang, Chang; Liu, Tong-yu; Wang, Zhe; Wei, Yu-bin; Li, Yan-fang; Shang, Ying; Wang, Qian

    2010-10-01

    An optical fiber distributed multi-point methane real-time monitoring system based on the methane spectral absorption characteristic is researched, and its application in methane extraction is presented. A 1665 nm distributed feedback (DFB) laser is used as the light source, with a triangular signal modulating the light frequency of the DFB laser. Using a combination of a C8051F410 single-chip computer, an A/D conversion circuit, a communication circuit, a display circuit, etc., the concentration of methane can be monitored and displayed on the screen, and alarm and communication functions are provided. The laser wavelength shift is adaptively adjusted by means of a built-in gas calibration cell so as to lock onto a methane absorption line. Several field tests have been conducted at home and abroad. The results show that the system has good stability and sensitivity. Distributed multi-point methane concentration monitoring is realized in the range of 0%-100%, and a sensitivity on the order of ppm has been achieved. The system has wide application in methane extraction.
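
    The spectral-absorption principle behind such a sensor follows the Beer-Lambert law; in a minimal form (generic symbols, not taken from the paper), the received intensity at the absorption wavelength and the recovered methane concentration are:

    ```latex
    I(\lambda) = I_0(\lambda)\,e^{-\alpha(\lambda)\,C\,L}
    \quad\Longrightarrow\quad
    C = \frac{1}{\alpha(\lambda)\,L}\,\ln\!\frac{I_0(\lambda)}{I(\lambda)}
    ```

    where α(λ) is the absorption coefficient of methane at the 1665 nm line, L the optical path length, and C the gas concentration.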

  9. Random Forest Based Coarse Locating and KPCA Feature Extraction for Indoor Positioning System

    Directory of Open Access Journals (Sweden)

    Yun Mo

    2014-01-01

    Full Text Available With the fast development of mobile terminals, positioning techniques based on the fingerprinting method draw attention from many researchers and even world-famous companies. To overcome some shortcomings of existing fingerprinting systems and further improve system performance, we propose, on the one hand, a coarse positioning method based on random forest, which is able to customize several subregions and classify a test point into the correct region with outstanding accuracy compared with some typical clustering algorithms. On the other hand, through mathematical analysis, the proposed kernel principal component analysis algorithm is applied for radio map processing, which may provide better robustness and adaptability compared with linear feature extraction methods and manifold learning techniques. We build both a theoretical model and a real environment for verifying feasibility and reliability. The experimental results show that the proposed indoor positioning system achieves 99% coarse locating accuracy and improves fine positioning accuracy by 15% on average in a strongly noisy environment compared with some typical fingerprinting-based methods.
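
    An illustrative scikit-learn sketch of the two ideas combined, coarse region classification with a random forest and kernel-PCA compression of RSSI fingerprints (hyperparameters and the kNN fine stage are assumptions, not the paper's exact pipeline):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.decomposition import KernelPCA
    from sklearn.neighbors import KNeighborsRegressor

    def build_positioning(rssi, region, xy):
        """rssi: (n_samples, n_access_points) fingerprints; region: coarse
        subregion labels; xy: (n_samples, 2) reference-point coordinates."""
        coarse = RandomForestClassifier(n_estimators=200).fit(rssi, region)
        kpca = KernelPCA(n_components=10, kernel="rbf").fit(rssi)
        fine = KNeighborsRegressor(n_neighbors=4).fit(kpca.transform(rssi), xy)
        return coarse, kpca, fine

    def locate(sample, coarse, kpca, fine):
        sample = np.asarray(sample).reshape(1, -1)
        subregion = coarse.predict(sample)[0]              # coarse stage
        position = fine.predict(kpca.transform(sample))[0]  # fine stage
        return subregion, position
    ```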

  10. SPS extraction systems

    CERN Multimedia

    CERN PhotoLab

    1973-01-01

    One of the 3-m long electrostatic septa. The septum itself consists of 0.15 mm thick molybdenum wires with a 1.5 mm pitch. Each of the two SPS extraction systems will contain four of these electrostatic septa.

  11. Development of green betaine-based deep eutectic solvent aqueous two-phase system for the extraction of protein.

    Science.gov (United States)

    Li, Na; Wang, Yuzhi; Xu, Kaijia; Huang, Yanhua; Wen, Qian; Ding, Xueqin

    2016-05-15

    Six new kinds of green betaine-based deep eutectic solvents (DESs) have been synthesized. Deep eutectic solvent aqueous two-phase systems (DES-ATPS) were established and successfully applied to the extraction of protein. Betaine-urea (Be-U) was selected as the most suitable extractant. Single-factor experiments were carried out to determine the optimum conditions of the extraction process, such as the salt concentration, the mass of DES, the separation time, the amount of protein, the temperature and the pH value. The extraction efficiency reached 99.82% under the optimum conditions. Mixed-sample and practical-sample analyses are discussed. A back-extraction experiment was implemented, and the back-extraction efficiency reached 32.66%. Precision, repeatability and stability experiments were investigated. UV-vis, FT-IR and circular dichroism (CD) spectra confirmed that the conformation of the protein was not changed during the process of extraction. The mechanisms of extraction were investigated by dynamic light scattering (DLS), conductivity measurements and transmission electron microscopy (TEM). DES-protein aggregates and the embrace phenomenon play considerable roles in the separation process. All of these results indicate that betaine-based DES-ATPS may provide a potential new method for the separation of proteins.

  12. Enhanced extraction of proteins using cholinium-based ionic liquids as phase-forming components of aqueous biphasic systems.

    Science.gov (United States)

    Quental, Maria V; Caban, Magda; Pereira, Matheus M; Stepnowski, Piotr; Coutinho, João A P; Freire, Mara G

    2015-09-01

    Aqueous biphasic systems (ABS) composed of ionic liquids (ILs) are promising platforms for the extraction and purification of proteins. In this work, a series of alternative and biocompatible ABS composed of cholinium-based ILs and polypropylene glycol were investigated. The respective ternary phase diagrams, tie-lines, tie-line lengths and critical points were determined at 25°C. The extraction performance of these systems for commercial bovine serum albumin (BSA) was then evaluated. The stability of BSA at the IL-rich phase was ascertained by size exclusion high-performance liquid chromatography and Fourier transform infrared spectroscopy. Appropriate ILs lead to the complete extraction of BSA for the IL-rich phase, in a single step, while maintaining the protein's native conformation. Furthermore, to evaluate the performance of these systems when applied to real matrices, the extraction of BSA from bovine serum was additionally carried out, revealing that the complete extraction of BSA was maintained and achieved in a single step. The remarkable extraction efficiencies obtained are far superior to those observed with typical polymer-based ABS. Therefore, the proposed ABS may be envisaged as a more effective and biocompatible approach for the separation and purification of other value-added proteins.

  13. A HOS-based Blind Signal Extraction Method for Chaotic MIMO Systems

    Institute of Scientific and Technical Information of China (English)

    GONG Yun-rui; HE Di; HE Chen; JIANG Ling-ge

    2008-01-01

    A novel method to extract multiple input and multiple output (MIMO) chaotic signals was proposed using a blind neural algorithm after transmission over a nonideal channel. The MIMO scheme with different chaotic signal generators was presented. In order to separate the chaotic source signals using only the sensor signals at the receivers, a blind neural extraction algorithm based on a higher-order statistics (HOS) technique was used to recover the primary chaotic signals. Simulation results show that the proposed approach has good performance in separating the primary chaotic signals even under a nonideal channel.

  14. Mobile helium-3 mining and extraction system and its benefits toward lunar base self-sufficiency

    Science.gov (United States)

    Sviatoslavsky, I. N.; Jacobs, M.

    The paper examines the issues of extracting He-3 from lunar regolith using mobile miners and its implications for the fusion-energy resupply of a lunar base. These issues include excavating, conveying, beneficiating, and heating the regolith, as well as collecting, transporting, and condensing the released solar-wind products. The benefits of such an operation toward lunar base self-sufficiency are described along with terrestrial benefits.

  15. Fuzzy Linguistic Knowledge Based Behavior Extraction for Building Energy Management Systems

    Energy Technology Data Exchange (ETDEWEB)

    Dumidu Wijayasekara; Milos Manic

    2013-08-01

    A significant portion of world energy production is consumed by building Heating, Ventilation and Air Conditioning (HVAC) units. Thus, along with occupant comfort, energy efficiency is also an important factor in HVAC control. Modern buildings use advanced Multiple Input Multiple Output (MIMO) control schemes to realize these goals. However, since the performance of HVAC units depends on many criteria, including uncertainties in weather, number of occupants, and thermal state, the performance of current state-of-the-art systems is sub-optimal. Furthermore, because of the large number of sensors in buildings and the high frequency of data collection, a large amount of information is available. Therefore, important behavior of buildings that compromises energy efficiency or occupant comfort is difficult to identify. This paper presents an easy-to-use and understandable framework for identifying such behavior. The presented framework uses a human-understandable knowledge base to extract important behavior of buildings and present it to users via a graphical user interface. The presented framework was tested on a building in the Pacific Northwest and was shown to be able to identify important behavior related to energy efficiency and occupant comfort.

  16. Liquid-liquid Extraction System Based on Non-ionic Surfactant-salt-H2O and Mechanism of Drug Extraction

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The extraction behavior of chlorpromazine hydrochloride (CPZ) and procaine hydrochloride (PCN) in the system described in the title was studied. The research shows that the extraction efficiency of CPZ can reach 96% with two extractions, while that of PCN is 77%. This system produces distribution coefficients (KD) of 12.3 and 2.6 for CPZ and PCN, respectively. The extraction mechanism is deduced from the ultraviolet and molecular fluorescence spectral variations of the drugs in the system studied.

  17. A Robust Iris Identification System Based on Wavelet Packet Decomposition and Local Comparisons of the Extracted Signatures

    Science.gov (United States)

    Rossant, Florence; Mikovicova, Beata; Adam, Mathieu; Trocan, Maria

    2010-12-01

    This paper presents a complete iris identification system including three main stages: iris segmentation, signature extraction, and signature comparison. An accurate and robust pupil and iris segmentation process, taking into account eyelid occlusions, is first detailed and evaluated. Then, an original wavelet-packet-based signature extraction method and a novel identification approach, based on the fusion of local distance measures, are proposed. Performance measurements validating the proposed iris signature and demonstrating the benefit of our local-based signature comparison are provided. Moreover, an exhaustive evaluation of robustness, with regards to the acquisition conditions, attests the high performances and the reliability of our system. Tests have been conducted on two different databases, the well-known CASIA database (V3) and our ISEP database. Finally, a comparison of the performances of our system with the published ones is given and discussed.
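
    A compact sketch of building a wavelet-packet signature from a normalized iris image using PyWavelets (the wavelet, depth and energy-based signature are assumptions, not the authors' exact choices):

    ```python
    import numpy as np
    import pywt

    def iris_signature(normalized_iris, wavelet="db2", level=2):
        """Energy of each wavelet-packet subband, used as a signature vector."""
        wp = pywt.WaveletPacket2D(data=normalized_iris, wavelet=wavelet,
                                  mode="symmetric", maxlevel=level)
        nodes = wp.get_level(level, order="natural")
        return np.array([np.sum(np.square(n.data)) for n in nodes])

    def local_distances(sig_a, sig_b):
        """Per-subband (local) absolute differences, to be fused for matching."""
        return np.abs(np.asarray(sig_a) - np.asarray(sig_b))
    ```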

  18. A Robust Iris Identification System Based on Wavelet Packet Decomposition and Local Comparisons of the Extracted Signatures

    Directory of Open Access Journals (Sweden)

    Rossant Florence

    2010-01-01

    Full Text Available Abstract This paper presents a complete iris identification system including three main stages: iris segmentation, signature extraction, and signature comparison. An accurate and robust pupil and iris segmentation process, taking into account eyelid occlusions, is first detailed and evaluated. Then, an original wavelet-packet-based signature extraction method and a novel identification approach, based on the fusion of local distance measures, are proposed. Performance measurements validating the proposed iris signature and demonstrating the benefit of our local-based signature comparison are provided. Moreover, an exhaustive evaluation of robustness, with regards to the acquisition conditions, attests the high performances and the reliability of our system. Tests have been conducted on two different databases, the well-known CASIA database (V3) and our ISEP database. Finally, a comparison of the performances of our system with the published ones is given and discussed.

  19. A Robust Iris Identification System Based on Wavelet Packet Decomposition and Local Comparisons of the Extracted Signatures

    Directory of Open Access Journals (Sweden)

    Maria Trocan

    2010-01-01

    Full Text Available This paper presents a complete iris identification system including three main stages: iris segmentation, signature extraction, and signature comparison. An accurate and robust pupil and iris segmentation process, taking into account eyelid occlusions, is first detailed and evaluated. Then, an original wavelet-packet-based signature extraction method and a novel identification approach, based on the fusion of local distance measures, are proposed. Performance measurements validating the proposed iris signature and demonstrating the benefit of our local-based signature comparison are provided. Moreover, an exhaustive evaluation of robustness, with regards to the acquisition conditions, attests the high performances and the reliability of our system. Tests have been conducted on two different databases, the well-known CASIA database (V3) and our ISEP database. Finally, a comparison of the performances of our system with the published ones is given and discussed.

  20. An Agent Based System Framework for Mining Data Record Extraction from Search Engine Result Pages

    Directory of Open Access Journals (Sweden)

    Dr.K.L Shunmuganathan

    2012-04-01

    Full Text Available Nowadays, the huge amount of information distributed through the Web motivates the study of techniques to be adopted in order to extract relevant data in an efficient and reliable way. Information extraction (IE) from semistructured Web documents plays an important role for a variety of information agents. In this paper, a framework for a WebIE system built on the JADE platform is proposed, using a non-visual automatic wrapper to extract data records from search engine result pages, which contain important information for meta search engines and computer users. The paper describes the different agents used in WebIE, how they communicate with one another, and how the agents are managed. A Multi-Agent System (MAS) provides an efficient, decentralized way for agents to communicate. A prototype model is developed for this study and used to solve the complex problems that arise in WebIE. Our wrapper consists of a series of agent filters that detect and remove irrelevant data regions from the web page. In this paper, we propose a highly effective and efficient algorithm for automatically mining result records from search engine response pages.

  1. Silica-based ionic liquid coating for 96-blade system for extraction of aminoacids from complex matrixes

    Energy Technology Data Exchange (ETDEWEB)

    Mousavi, Fatemeh; Pawliszyn, Janusz, E-mail: janusz@uwaterloo.ca

    2013-11-25

    Highlights: • Silica-based 1-vinyl-3-octadecylimidazolium bromide ionic liquid was synthesized and characterized. • The synthesized polymer was immobilized on the stainless steel blades using polyacrylonitrile glue. • SiImC18-PAN 96-blade SPME was applied as an extraction phase for extraction of highly polar compounds in a grape matrix. • This system provides high extraction efficiency and reproducibility for up to 50 extractions from tartaric buffer and 20 extractions from grape pulp. -- Abstract: 1-Vinyl-3-octadecylimidazolium bromide ionic liquid [C18VIm]Br was prepared and used for the modification of mercaptopropyl-functionalized silica (Si-MPS) through surface radical chain-transfer addition. The synthesized octadecylimidazolium-modified silica (SiImC18) was characterized by thermogravimetric analysis (TGA), infrared spectroscopy (IR), 13C NMR and 29Si NMR spectroscopy, and used as an extraction phase for the automated 96-blade solid-phase microextraction (SPME) system with thin-film geometry using polyacrylonitrile (PAN) glue. The new proposed extraction phase was applied for the extraction of amino acids from grape pulp, and an LC-MS/MS method was developed for the separation of the model compounds. Extraction efficiency, reusability, linearity, limit of detection, limit of quantitation and matrix effect were evaluated. The whole sample preparation process for the proposed method requires 270 min for 96 samples simultaneously (60 min preconditioning, 90 min extraction, 60 min desorption and 60 min for the carryover step) using the 96-blade SPME system. Inter-blade and intra-blade reproducibility were in the respective ranges of 5-13 and 3-10% relative standard deviation (RSD) for all model compounds. Limits of detection and quantitation of the proposed SPME-LC-MS/MS system for the analysis of the analytes were found to range from 0.1 to 1.0 and 0.5 to 3.0 μg L⁻¹, respectively. Standard addition calibration was

  2. Advanced integrated solvent extraction systems

    Energy Technology Data Exchange (ETDEWEB)

    Horwitz, E.P.; Dietz, M.L.; Leonard, R.A. [Argonne National Lab., IL (United States)

    1997-10-01

    Advanced integrated solvent extraction systems are a series of novel solvent extraction (SX) processes that will remove and recover all of the major radioisotopes from acidic-dissolved sludge or other acidic high-level wastes. The major focus of this effort during the last 2 years has been the development of a combined cesium-strontium extraction/recovery process, the Combined CSEX-SREX Process. The Combined CSEX-SREX Process relies on a mixture of a strontium-selective macrocyclic polyether and a novel cesium-selective extractant based on dibenzo 18-crown-6. The process offers several potential advantages over possible alternatives in a chemical processing scheme for high-level waste treatment. First, if the process is applied as the first step in chemical pretreatment, the radiation level for all subsequent processing steps (e.g., transuranic extraction/recovery, or TRUEX) will be significantly reduced. Thus, less costly shielding would be required. The second advantage of the Combined CSEX-SREX Process is that the recovered Cs-Sr fraction is non-transuranic, and therefore will decay to low-level waste after only a few hundred years. Finally, combining individual processes into a single process will reduce the amount of equipment required to pretreat the waste and therefore reduce the size and cost of the waste processing facility. In an ongoing collaboration with Lockheed Martin Idaho Technology Company (LMITCO), the authors have successfully tested various segments of the Advanced Integrated Solvent Extraction Systems. Eichrom Industries, Inc. (Darien, IL) synthesizes and markets the Sr extractant and can supply the Cs extractant on a limited basis. Plans are under way to perform a test of the Combined CSEX-SREX Process with real waste at LMITCO in the near future.

  3. Sensor-based auto-focusing system using multi-scale feature extraction and phase correlation matching.

    Science.gov (United States)

    Jang, Jinbeum; Yoo, Yoonjong; Kim, Jongheon; Paik, Joonki

    2015-03-10

    This paper presents a novel auto-focusing system based on a CMOS sensor containing pixels with different phases. Robust extraction of features in a severely defocused image is the fundamental problem of a phase-difference auto-focusing system. In order to solve this problem, a multi-resolution feature extraction algorithm is proposed. Given the extracted features, the proposed auto-focusing system can provide the ideal focusing position using phase correlation matching. The proposed auto-focusing (AF) algorithm consists of four steps: (i) acquisition of left and right images using AF points in the region-of-interest; (ii) feature extraction in the left image under low illumination and out-of-focus blur; (iii) the generation of two feature images using the phase difference between the left and right images; and (iv) estimation of the phase shifting vector using phase correlation matching. Since the proposed system accurately estimates the phase difference in the out-of-focus blurred image under low illumination, it can provide faster, more robust auto focusing than existing systems.
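
    The phase-correlation matching step can be sketched as follows (a generic FFT-based implementation, not the sensor vendor's code; it returns the integer shift between the left- and right-phase feature images):

    ```python
    import numpy as np

    def phase_correlation_shift(left_img, right_img):
        """Estimate the translation between two images via phase correlation."""
        F1 = np.fft.fft2(left_img)
        F2 = np.fft.fft2(right_img)
        cross_power = F1 * np.conj(F2)
        cross_power /= np.abs(cross_power) + 1e-12        # keep only the phase
        corr = np.fft.ifft2(cross_power).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # wrap peaks past the midpoint to negative shifts
        shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
        return tuple(shifts)  # (dy, dx): the phase-shift vector used to drive focus
    ```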

  4. Sensor-Based Auto-Focusing System Using Multi-Scale Feature Extraction and Phase Correlation Matching

    Directory of Open Access Journals (Sweden)

    Jinbeum Jang

    2015-03-01

    Full Text Available This paper presents a novel auto-focusing system based on a CMOS sensor containing pixels with different phases. Robust extraction of features in a severely defocused image is the fundamental problem of a phase-difference auto-focusing system. In order to solve this problem, a multi-resolution feature extraction algorithm is proposed. Given the extracted features, the proposed auto-focusing system can provide the ideal focusing position using phase correlation matching. The proposed auto-focusing (AF) algorithm consists of four steps: (i) acquisition of left and right images using AF points in the region-of-interest; (ii) feature extraction in the left image under low illumination and out-of-focus blur; (iii) the generation of two feature images using the phase difference between the left and right images; and (iv) estimation of the phase shifting vector using phase correlation matching. Since the proposed system accurately estimates the phase difference in the out-of-focus blurred image under low illumination, it can provide faster, more robust auto focusing than existing systems.

  5. Textpresso: an ontology-based information retrieval and extraction system for biological literature.

    Directory of Open Access Journals (Sweden)

    Hans-Michael Müller

    2004-11-01

    Full Text Available We have developed Textpresso, a new text-mining system for scientific literature whose capabilities go far beyond those of a simple keyword search engine. Textpresso's two major elements are a collection of the full text of scientific articles split into individual sentences, and the implementation of categories of terms for which a database of articles and individual sentences can be searched. The categories are classes of biological concepts (e.g., gene, allele, cell or cell group, phenotype, etc.) and classes that relate two objects (e.g., association, regulation, etc.) or describe one (e.g., biological process, etc.). Together they form a catalog of types of objects and concepts called an ontology. After this ontology is populated with terms, the whole corpus of articles and abstracts is marked up to identify terms of these categories. The current ontology comprises 33 categories of terms. A search engine enables the user to search for one or a combination of these tags and/or keywords within a sentence or document, and as the ontology allows word meaning to be queried, it is possible to formulate semantic queries. Full text access increases recall of biological data types from 45% to 95%. Extraction of particular biological facts, such as gene-gene interactions, can be accelerated significantly by ontologies, with Textpresso automatically performing nearly as well as expert curators to identify sentences; in searches for two uniquely named genes and an interaction term, the ontology confers a 3-fold increase of search efficiency. Textpresso currently focuses on Caenorhabditis elegans literature, with 3,800 full text articles and 16,000 abstracts. The lexicon of the ontology contains 14,500 entries, each of which includes all versions of a specific word or phrase, and it includes all categories of the Gene Ontology database. Textpresso is a useful curation tool, as well as search engine for researchers, and can readily be extended to other

  6. Textpresso: an ontology-based information retrieval and extraction system for biological literature.

    Science.gov (United States)

    Müller, Hans-Michael; Kenny, Eimear E; Sternberg, Paul W

    2004-11-01

    We have developed Textpresso, a new text-mining system for scientific literature whose capabilities go far beyond those of a simple keyword search engine. Textpresso's two major elements are a collection of the full text of scientific articles split into individual sentences, and the implementation of categories of terms for which a database of articles and individual sentences can be searched. The categories are classes of biological concepts (e.g., gene, allele, cell or cell group, phenotype, etc.) and classes that relate two objects (e.g., association, regulation, etc.) or describe one (e.g., biological process, etc.). Together they form a catalog of types of objects and concepts called an ontology. After this ontology is populated with terms, the whole corpus of articles and abstracts is marked up to identify terms of these categories. The current ontology comprises 33 categories of terms. A search engine enables the user to search for one or a combination of these tags and/or keywords within a sentence or document, and as the ontology allows word meaning to be queried, it is possible to formulate semantic queries. Full text access increases recall of biological data types from 45% to 95%. Extraction of particular biological facts, such as gene-gene interactions, can be accelerated significantly by ontologies, with Textpresso automatically performing nearly as well as expert curators to identify sentences; in searches for two uniquely named genes and an interaction term, the ontology confers a 3-fold increase of search efficiency. Textpresso currently focuses on Caenorhabditis elegans literature, with 3,800 full text articles and 16,000 abstracts. The lexicon of the ontology contains 14,500 entries, each of which includes all versions of a specific word or phrase, and it includes all categories of the Gene Ontology database. Textpresso is a useful curation tool, as well as search engine for researchers, and can readily be extended to other organisms.

  7. A computational-grid based system for continental drainage network extraction using SRTM digital elevation models

    Science.gov (United States)

    Curkendall, David W.; Fielding, Eric J.; Pohl, Josef M.; Cheng, Tsan-Huei

    2003-01-01

    We describe a new effort for the computation of elevation derivatives using the Shuttle Radar Topography Mission (SRTM) results. Jet Propulsion Laboratory's (JPL) SRTM has produced a near global database of highly accurate elevation data. The scope of this database enables computing precise stream drainage maps and other derivatives on Continental scales. We describe a computing architecture for this computationally very complex task based on NASA's Information Power Grid (IPG), a distributed high performance computing network based on the GLOBUS infrastructure. The SRTM data characteristics and unique problems they present are discussed. A new algorithm for organizing the conventional extraction algorithms [1] into a cooperating parallel grid is presented as an essential component to adapt to the IPG computing structure. Preliminary results are presented for a Southern California test area, established for comparing SRTM and its results against those produced using the USGS National Elevation Data (NED) model.

  8. A Knowledge-Based System Approach for Extracting Abstractions from Service Oriented Architecture Artifacts

    Directory of Open Access Journals (Sweden)

    George Goehring

    2013-03-01

    Full Text Available Rule-based methods have traditionally been applied to develop knowledge-based systems that replicate expert performance on a deep but narrow problem domain. Knowledge engineers capture expert knowledge and encode it as a set of rules for automating the expert’s reasoning process to solve problems in a variety of domains. We describe the development of a knowledge-based system approach to enhance program comprehension of Service Oriented Architecture (SOA) software. Our approach uses rule-based methods to automate the analysis of the set of artifacts involved in building and deploying a SOA composite application. The rules codify expert knowledge to abstract information from these artifacts to facilitate program comprehension and thus assist Software Engineers as they perform system maintenance activities. A main advantage of the knowledge-based approach is its adaptability to the heterogeneous and dynamically evolving nature of SOA environments.

  9. Concept relation extraction using Naïve Bayes classifier for ontology-based question answering systems

    Directory of Open Access Journals (Sweden)

    G. Suresh kumar

    2015-01-01

    Full Text Available Domain ontology is used as a reliable source of knowledge in information retrieval systems such as question answering systems. Automatic ontology construction is possible by extracting concept relations from unstructured large-scale text. In this paper, we propose a methodology to extract concept relations from unstructured text using a syntactic and semantic probability-based Naïve Bayes classifier. We propose an algorithm to iteratively extract a list of attributes and associations for the given seed concept from which the rough schema is conceptualized. A set of hand-coded dependency parsing pattern rules and a binary decision tree-based rule engine were developed for this purpose. This ontology construction process is initiated through a question answering process. For each new query submitted, the required concept is dynamically constructed, and ontology is updated. The proposed relation extraction method was evaluated using benchmark data sets. The performance of the constructed ontology was evaluated using gold standard evaluation and compared with similar well-performing methods. The experimental results reveal that the proposed approach can be used to effectively construct a generic domain ontology with higher accuracy. Furthermore, the ontology construction method was integrated into the question answering framework, which was evaluated using the entailment method.
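
    As a rough illustration of the classification step only (not the authors' dependency patterns, rule engine or data), the sketch below trains a multinomial Naïve Bayes model on bag-of-words features of invented pattern strings with scikit-learn.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        # Hypothetical syntactic patterns between a concept pair, with labels
        # saying whether the pair expresses a valid concept relation.
        patterns = [
            "nsubj has_attribute dobj",
            "nsubj is_part_of pobj",
            "nsubj mentioned_near dobj",
            "nsubj unrelated_to pobj",
        ]
        labels = ["relation", "relation", "no_relation", "no_relation"]

        model = make_pipeline(CountVectorizer(), MultinomialNB())
        model.fit(patterns, labels)
        print(model.predict(["nsubj has_attribute pobj"]))  # -> ['relation'] on this toy data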

  10. An AIL/IL-based liquid/liquid extraction system for the purification of His-tagged proteins.

    Science.gov (United States)

    Xu, Weiyuan; Cao, Huazhen; Ren, Guangwei; Xie, Hujun; Huang, Jianying; Li, Shijun

    2014-06-01

    A sorbent based on an affinity ionic liquid (AIL), triazacyclononane-ionic liquid, was synthesized, characterized, and applied to the extraction of histidine (His)-tagged proteins from aqueous buffer to an ionic liquid (IL) phase. The adsorbed His-tagged proteins could be back-extracted from the IL phase to the aqueous buffer with an imidazole solution. The specific binding of His-tagged proteins with AIL/IL could be affected by a few factors including the ionic strength and coordinated metal ions. In the case of His-tagged enhanced green fluorescent protein (EGFP), the maximum binding capacity of Cu(2+)-AIL/IL reached 2.58 μg/μmol under the optimized adsorption conditions. The eluted His-tagged EGFP remained fluorescent and active through the purification process. Moreover, a tandem extraction process successively using Cu(2+)-AIL/IL and Zn(2+)-AIL/IL systems was developed, which proved very efficient for obtaining the final protein with a purity of about 90%. An effective reclamation method for the AIL/IL extraction system was further established. The sorbent could be easily regenerated by removing the metal ions with EDTA followed by reimmobilization of metal ions. Easy handling of the presented M(2+)-AIL/IL system and its highly specific ability to adsorb His-tagged proteins make it attractive and potentially applicable in biomolecular separation.

  11. PI Passivity-Based Control for Maximum Power Extraction of a Wind Energy System with Guaranteed Stability Properties

    Science.gov (United States)

    Cisneros, Rafael; Gao, Rui; Ortega, Romeo; Husain, Iqbal

    2016-10-01

    The present paper proposes a maximum power extraction control for a wind system consisting of a turbine, a permanent magnet synchronous generator, a rectifier, a load and one constant voltage source, which is used to form the DC bus. We propose a linear PI controller, based on passivity, whose stability is guaranteed under practically reasonable assumptions. PI structures are widely accepted in practice as they are easier to tune and simpler than other existing model-based methods. Real switching based simulations have been performed to assess the performance of the proposed controller.
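
    The abstract gives no equations, but the general shape of a discrete-time PI loop driving the generator speed toward a power-maximizing reference is easy to sketch. Everything below (gains, time step, variable names, the reference value) is illustrative only and not taken from the paper.

        class PIController:
            """Discrete-time PI controller: u = Kp*e + Ki*integral(e)."""
            def __init__(self, kp, ki, dt):
                self.kp, self.ki, self.dt = kp, ki, dt
                self.integral = 0.0

            def update(self, reference, measurement):
                error = reference - measurement
                self.integral += error * self.dt
                return self.kp * error + self.ki * self.integral

        # Hypothetical use: the speed reference would come from the turbine's
        # optimal tip-speed ratio; the output would set the converter command.
        pi = PIController(kp=2.0, ki=0.5, dt=1e-3)
        control_signal = pi.update(reference=150.0, measurement=140.0)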

  12. Shape Effect of Electrochemical Chloride Extraction in Structural Reinforced Concrete Elements Using a New Cement-Based Anodic System

    Directory of Open Access Journals (Sweden)

    Jesús Carmona

    2015-05-01

    Full Text Available This article presents research on how the shape of structural reinforced concrete elements treated with electrochemical chloride extraction can affect the efficiency of this process. Given the current use of different anode systems, the study compares results between conventional anodes based on a Ti-RuO2 wire mesh and a cement-based anodic system consisting of a graphite-cement paste. Reinforced concrete elements one meter in length, with circular and rectangular sections, were molded to serve as laboratory specimens closely representing authentic structural supports. Results confirm almost equal performance for both types of anode systems when electrochemical chloride extraction is applied to isotropic structural elements. In the case of anisotropic ones, such as rectangular sections with non-uniformly distributed rebar, differences in electrical flow density were detected during the treatment. Those differences were more pronounced for the Ti-RuO2 mesh anode system. This shape effect is evidenced by the electrochemical chloride extraction efficiencies obtained at different points of the specimens.

  13. New land-based method for surveying sandy shores and extracting DEMs: the INSHORE system.

    Science.gov (United States)

    Baptista, Paulo; Cunha, Telmo R; Matias, Ana; Gama, Cristina; Bernardes, Cristina; Ferreira, Oscar

    2011-11-01

    The INSHORE system (INtegrated System for High Operational REsolution in shore monitoring) is a land-based survey system designed and developed for the specific task of monitoring the evolution in time of sandy shores. This system was developed with two main objectives: (1) to produce highly accurate 3D coordinates of surface points (in the order of 0.02 to 0.03 m); and (2) to be extremely efficient in surveying a beach stretch of several kilometres. Previous tests have demonstrated that INSHORE systems fulfil such objectives. Now, the usefulness of the INSHORE system as a survey tool for the production of Digital Elevation Models (DEMs) of sandy shores is demonstrated. For this purpose, the comparison of DEMs obtained with the INSHORE system and with other relevant survey techniques is presented. This comparison focuses on the final DEM accuracy and also on the survey efficiency and its impact on the costs associated with regular monitoring programmes. The field survey method of the INSHORE system, based on profile networks, has a productivity of about 30 to 40 ha/h, depending on the beach surface characteristics. The final DEM precision, after interpolation of the global positioning system profile network, is approximately 0.08 to 0.12 m (RMS), depending on the profile network's density. Thus, this is a useful method for 3D representation of sandy shore surfaces and can permit, after interpolation, reliable calculations of volume and other physical parameters.

  14. An overview of the BioExtract Server: a distributed, Web-based system for genomic analysis.

    Science.gov (United States)

    Lushbough, C M; Brendel, V P

    2010-01-01

    Genome research is becoming increasingly dependent on access to multiple, distributed data sources, and bioinformatic tools. The importance of integration across distributed databases and Web services will continue to grow as the number of requisite resources expands. Use of bioinformatic workflows has seen considerable growth in recent years as scientific research becomes increasingly dependent on the analysis of large sets of data and the use of distributed resources. The BioExtract Server (http://bioextract.org) is a Web-based system designed to aid researchers in the analysis of distributed genomic data by providing a platform to facilitate the creation of bioinformatic workflows. Scientific workflows are created within the system by recording the analytic tasks performed by researchers. These steps may include querying multiple data sources, saving query results as searchable data extracts, and executing local and Web-accessible analytic tools. The series of recorded tasks can be saved as a computational workflow simply by providing a name and description.

  15. Extraction of peptide tagged cutinase in detergent-based aqueous two-phase systems

    NARCIS (Netherlands)

    Rodenbrock, A.; Selber, K.; Egmond, M.R.; Kula, M.-R.

    2010-01-01

    Detergent-based aqueous two-phase systems have the advantage of requiring only one auxiliary chemical to induce phase separation above the cloud point. In a systematic study, the efficiency of tryptophan-rich peptide tags in enhancing the partitioning of an enzyme to the detergent-rich phase was investigated.

  16. Maximum Energy Extraction Control for Wind Power Generation Systems Based on the Fuzzy Controller

    Science.gov (United States)

    Kamal, Elkhatib; Aitouche, Abdel; Mohammed, Walaa; Sobaih, Abdel Azim

    2016-10-01

    This paper presents a robust controller for a variable speed wind turbine with a squirrel cage induction generator (SCIG). For a variable speed wind energy conversion system, maximum power point tracking (MPPT) is a very important requirement in order to maximize efficiency. The system is nonlinear with parametric uncertainty and subject to large disturbances. A Takagi-Sugeno (TS) fuzzy logic is used to model the system dynamics. Based on the TS fuzzy model, a controller is developed for MPPT in the presence of disturbances and parametric uncertainties. The proposed technique ensures that the maximum power point (MPP) is determined, the generator speed is controlled and the closed loop system is stable. Robustness of the controller is tested via variation of the model's parameters. Simulation studies clearly indicate the robustness and efficiency of the proposed control scheme compared to other techniques.

  17. EEG-Based BCI System Using Adaptive Features Extraction and Classification Procedures

    Science.gov (United States)

    Mangia, Anna Lisa; Cappello, Angelo

    2016-01-01

    Motor imagery is a common control strategy in EEG-based brain-computer interfaces (BCIs). However, voluntary control of sensorimotor (SMR) rhythms by imagining a movement can be skilful and unintuitive and usually requires a varying amount of user training. To boost the training process, a whole class of BCI systems have been proposed, providing feedback as early as possible while continuously adapting the underlying classifier model. The present work describes a cue-paced, EEG-based BCI system using motor imagery that falls within the category of the previously mentioned ones. Specifically, our adaptive strategy includes a simple scheme based on a common spatial pattern (CSP) method and support vector machine (SVM) classification. The system's efficacy was proved by online testing on 10 healthy participants. In addition, we suggest some features we implemented to improve a system's “flexibility” and “customizability,” namely, (i) a flexible training session, (ii) an unbalancing in the training conditions, and (iii) the use of adaptive thresholds when giving feedback. PMID:27635129
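
    The abstract names CSP spatial filtering followed by SVM classification but gives no code; below is one common way to realize that pipeline with NumPy, SciPy and scikit-learn, using random arrays in place of band-passed EEG epochs. The array shapes, the number of filter pairs and the classifier settings are assumptions, not the authors' configuration.

        import numpy as np
        from scipy.linalg import eigh
        from sklearn.svm import SVC

        def csp_filters(trials_a, trials_b, n_pairs=2):
            """CSP spatial filters for two classes.
            trials_* : arrays of shape (n_trials, n_channels, n_samples)."""
            def mean_cov(trials):
                return np.mean([np.cov(t) for t in trials], axis=0)
            Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
            # Generalized eigenvalue problem Ca w = lambda (Ca + Cb) w
            vals, vecs = eigh(Ca, Ca + Cb)
            order = np.argsort(vals)
            picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
            return vecs[:, picks].T                 # shape (2*n_pairs, n_channels)

        def log_var_features(trials, W):
            """Log-variance of spatially filtered trials, the usual CSP feature."""
            feats = []
            for t in trials:
                v = np.var(W @ t, axis=1)
                feats.append(np.log(v / v.sum()))
            return np.array(feats)

        # Random data standing in for two classes of band-passed EEG epochs.
        rng = np.random.default_rng(0)
        trials_a = rng.standard_normal((20, 8, 250))    # 20 trials, 8 channels, 250 samples
        trials_b = rng.standard_normal((20, 8, 250))
        W = csp_filters(trials_a, trials_b)
        X = np.vstack([log_var_features(trials_a, W), log_var_features(trials_b, W)])
        y = np.array([0] * 20 + [1] * 20)
        clf = SVC(kernel="linear").fit(X, y)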

  18. Testing the Self-Similarity Exponent to Feature Extraction in Motor Imagery Based Brain Computer Interface Systems

    Science.gov (United States)

    Rodríguez-Bermúdez, Germán; Sánchez-Granero, Miguel Ángel; García-Laencina, Pedro J.; Fernández-Martínez, Manuel; Serna, José; Roca-Dorda, Joaquín

    2015-12-01

    A Brain Computer Interface (BCI) system is a tool not requiring any muscle action to transmit information. Acquisition, preprocessing, feature extraction (FE), and classification of electroencephalograph (EEG) signals constitute the main steps of a motor imagery BCI. Among them, FE becomes crucial for BCI, since the underlying EEG knowledge must be properly extracted into a feature vector. Linear approaches have been widely applied to FE in BCI, whereas nonlinear tools are not so common in the literature. Thus, the main goal of this paper is to check whether some Hurst exponent and fractal dimension based estimators become valid indicators for FE in motor imagery BCI. The final results obtained were not optimal as expected, which may be due to the fact that the nature of the analyzed EEG signals in these motor imagery tasks was not self-similar enough.
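
    A rescaled-range (R/S) estimator is one of the simplest Hurst-exponent estimators of the kind evaluated in the paper; the sketch below is a generic illustration with assumed window handling, not the exact estimator set the authors tested.

        import numpy as np

        def hurst_rs(signal, min_window=8):
            """Rough Hurst exponent estimate via rescaled-range (R/S) analysis."""
            signal = np.asarray(signal, dtype=float)
            n = len(signal)
            window_sizes, rs_values = [], []
            w = min_window
            while w <= n // 2:
                rs = []
                for start in range(0, n - w + 1, w):
                    seg = signal[start:start + w]
                    dev = np.cumsum(seg - seg.mean())
                    if seg.std() > 0:
                        rs.append((dev.max() - dev.min()) / seg.std())
                if rs:
                    window_sizes.append(w)
                    rs_values.append(np.mean(rs))
                w *= 2
            # Slope of log(R/S) versus log(window size) approximates H
            slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_values), 1)
            return slope

        # e.g. hurst_rs(np.random.default_rng(1).standard_normal(1024)) is close
        # to 0.5 for uncorrelated noise; persistent signals score above 0.5.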

  19. A continuous-exchange cell-free protein synthesis system based on extracts from cultured insect cells.

    Directory of Open Access Journals (Sweden)

    Marlitt Stech

    Full Text Available In this study, we present a novel technique for the synthesis of complex prokaryotic and eukaryotic proteins by using a continuous-exchange cell-free (CECF) protein synthesis system based on extracts from cultured insect cells. Our approach consists of two basic elements: First, protein synthesis is performed in insect cell lysates which harbor endogenous microsomal vesicles, enabling a translocation of de novo synthesized target proteins into the lumen of the insect vesicles or, in the case of membrane proteins, their embedding into a natural membrane scaffold. Second, cell-free reactions are performed in a two chamber dialysis device for 48 h. The combination of the eukaryotic cell-free translation system based on insect cell extracts and the CECF translation system results in significantly prolonged reaction life times and increased protein yields compared to conventional batch reactions. In this context, we demonstrate the synthesis of various representative model proteins, among them cytosolic proteins, pharmacologically relevant membrane proteins and glycosylated proteins in an endotoxin-free environment. Furthermore, the cell-free system used in this study is well-suited for the synthesis of biologically active tissue-type plasminogen activator, a complex eukaryotic protein harboring multiple disulfide bonds.

  20. Method and system to perform energy-extraction based active noise control

    Science.gov (United States)

    Kelkar, Atul (Inventor); Joshi, Suresh M. (Inventor)

    2009-01-01

    A method to provide active noise control to reduce noise and vibration in reverberant acoustic enclosures such as aircraft, vehicles, appliances, instruments, industrial equipment and the like is presented. A continuous-time multi-input multi-output (MIMO) state space mathematical model of the plant is obtained via analytical modeling and system identification. Compensation is designed to render the mathematical model passive in the sense of mathematical system theory. The compensated system is checked to ensure robustness of the passive property of the plant. The check ensures that the passivity is preserved if the mathematical model parameters are perturbed from nominal values. A passivity-based controller is designed and verified using numerical simulations and then tested. The controller is designed so that the resulting closed-loop response shows the desired noise reduction.

  1. Extraction systems of the SPS

    CERN Multimedia

    CERN PhotoLab

    1973-01-01

    A pair of prototype septum magnets for the extraction systems of the SPS. Each of the two extraction systems will contain eighteen of these septum magnets (eight with a 4 mm septum and ten with a 16 mm septum) mounted in pairs in nine vacuum tanks.

  2. EEG-Based BCI System Using Adaptive Features Extraction and Classification Procedures

    Directory of Open Access Journals (Sweden)

    Valeria Mondini

    2016-01-01

    Full Text Available Motor imagery is a common control strategy in EEG-based brain-computer interfaces (BCIs). However, voluntary control of sensorimotor (SMR) rhythms by imagining a movement can be skilful and unintuitive and usually requires a varying amount of user training. To boost the training process, a whole class of BCI systems have been proposed, providing feedback as early as possible while continuously adapting the underlying classifier model. The present work describes a cue-paced, EEG-based BCI system using motor imagery that falls within the category of the previously mentioned ones. Specifically, our adaptive strategy includes a simple scheme based on a common spatial pattern (CSP) method and support vector machine (SVM) classification. The system’s efficacy was proved by online testing on 10 healthy participants. In addition, we suggest some features we implemented to improve a system’s “flexibility” and “customizability,” namely, (i) a flexible training session, (ii) an unbalancing in the training conditions, and (iii) the use of adaptive thresholds when giving feedback.

  3. FIS/ANFIS Based Optimal Control for Maximum Power Extraction in Variable-speed Wind Energy Conversion System

    Science.gov (United States)

    Nadhir, Ahmad; Naba, Agus; Hiyama, Takashi

    An optimal control for maximizing power extraction in a variable-speed wind energy conversion system is presented. Intelligent gradient detection by a fuzzy inference system (FIS) in maximum power point tracking control is proposed to keep operation on the power curve near the optimal point. The rotor speed reference can be adjusted by the maximum power point tracking fuzzy controller (MPPTFC) such that the turbine operates around maximum power. The power curve can be modelled using an adaptive neuro-fuzzy inference system (ANFIS), which only needs to estimate a small number of maximum power points corresponding to the optimum generator rotor speed under varying wind speed, so its training can be done with little effort. Using the trained fuzzy model, several estimated maximum power points as well as their corresponding generator rotor speeds and wind speeds are determined, from which a linear wind speed feedback controller (LWSFC) capable of producing the optimum generator speed can be obtained. Applied to a squirrel-cage induction generator based wind energy conversion system, the MPPTFC and LWSFC could maximize extraction of the wind energy, as verified by a power coefficient that stays at its maximum almost all the time and an actual power curve close to the maximum power efficiency reference.

  4. Design and development of single stage purification of papain using Ionic Liquid based aqueous two phase extraction system and its Partition coefficient studies

    Directory of Open Access Journals (Sweden)

    Senthilkumar Rathnasamy

    2013-04-01

    Full Text Available As an emerging trend in bioseparation, aqueous two-phase extraction based on a phosphonium ionic liquid has been utilized in this work to extract papain from Carica papaya fruit latex, and the results were compared with a conventional aqueous two-phase extraction system. Factors affecting the partition coefficient of papain, such as ionic liquid concentration, pH of the extraction system and temperature, have been investigated. The optimization studies show that ionic liquid concentration and pH mainly influence phase formation and papain partitioning, revealing the importance of electrostatic and hydrophobic interactions in papain partitioning. Purification studies performed by gel filtration chromatography show that 96% of the papain enzyme could be extracted with the phosphonium-based ionic liquid in a single-stage extraction. The final fraction containing the papain enzyme was confirmed by SDS-PAGE analysis.

  5. Extraction of metals from metal ion-catechol-quaternary base systems.

    Science.gov (United States)

    Vrchlabský, M; Sommer, L

    1968-09-01

    Methods are given for the extraction of iron(III), molybdenum(VI), titanium(IV), niobium(V), vanadium(IV), uranium(VI) and tungsten(VI) as ternary complexes with catechol and a quaternary cation such as n-butyltriphenylphosphonium, n-propyltriphenylphosphonium, tetraphenylarsonium, cetylpyridinium, cetyltrimethylammonium and 2,3,5-triphenyltetrazolium, the solvent being chloroform. By use of masking agents and pH control, some of these elements can be separated from each other by this means.

  6. Ionic Liquid-Based Ultrasonic-Assisted Extraction of Secoisolariciresinol Diglucoside from Flaxseed (Linum usitatissimum L.) with Further Purification by an Aqueous Two-Phase System.

    Science.gov (United States)

    Tan, Zhi-Jian; Wang, Chao-Yun; Yang, Zi-Zhen; Yi, Yong-Jian; Wang, Hong-Ying; Zhou, Wan-Lai; Li, Fen-Fang

    2015-09-30

    In this work, a two-step extraction methodology of ionic liquid-based ultrasonic-assisted extraction (IL-UAE) and ionic liquid-based aqueous two-phase system (IL-ATPS) was developed for the extraction and purification of secoisolariciresinol diglucoside (SDG) from flaxseed. In the IL-UAE step, several kinds of ILs were investigated as the extractants, to identify the IL that affords the optimum extraction yield. The extraction conditions such as IL concentration, ultrasonic irradiation time, and liquid-solid ratio were optimized using response surface methodology (RSM). In the IL-ATPS step, ATPS formed by adding kosmotropic salts to the IL extract was used for further separation and purification of SDG. The most influential parameters (type and concentration of salt, temperature, and pH) were investigated to obtain the optimum extraction efficiency. The maximum extraction efficiency was 93.35% under the optimal conditions of 45.86% (w/w) IL and 8.27% (w/w) Na₂SO₄ at 22 °C and pH 11.0. Thus, the combination of IL-UAE and IL-ATPS makes up a simple and effective methodology for the extraction and purification of SDG. This process is also expected to be highly useful for the extraction and purification of bioactive compounds from other important medicinal plants.

  7. Search Using N-gram Technique Based Statistical Analysis for Knowledge Extraction in Case Based Reasoning Systems

    OpenAIRE

    Karthik, M. N.; Davis, Moshe

    2004-01-01

    Searching techniques for Case Based Reasoning systems involve extensive methods of elimination. In this paper, we look at a new method of arriving at the right solution by performing a series of transformations upon the data. These involve N-gram based comparison and deduction of the input data with the case data, using Morphemes and Phonemes as the deciding parameters. A similar technique for eliminating possible errors using a noise removal function is performed. The error tracking and elim...

  8. Inertial extraction system

    Energy Technology Data Exchange (ETDEWEB)

    Balepin, Vladimir; Castrogiovanni, Anthony; Girlea, Florin; Robertson, Andrew; Sforza, Pasquale

    2016-03-15

    Disclosed herein are supersonic separation systems that can be used for the removal of CO.sub.2 from a mixed gas stream. Also disclosed are methods for the separation and subsequent collection of solidified CO.sub.2 from a gas stream.

  9. An effective method for enhancing metal-ions' selectivity of ionic liquid-based extraction system: Adding water-soluble complexing agent.

    Science.gov (United States)

    Sun, Xiao Qi; Peng, Bo; Chen, Ji; Li, De Qian; Luo, Fang

    2008-01-15

    Selective extraction-separation of yttrium(III) from heavy lanthanides into 1-octyl-3-methylimidazolium hexafluorophosphate ([C(8)mim][PF(6)]) containing Cyanex 923 was achieved by adding a water-soluble complexing agent (EDTA) to the aqueous phase. The simple and environmentally benign complexing method was proved to be an effective strategy for enhancing the selectivity of [C(n)mim][PF(6)]/[Tf(2)N]-based extraction systems without increasing the loss of [C(n)mim](+).

  10. Silica-based ionic liquid coating for 96-blade system for extraction of amino acids from complex matrixes.

    Science.gov (United States)

    Mousavi, Fatemeh; Pawliszyn, Janusz

    2013-11-25

    1-Vinyl-3-octadecylimidazolium bromide ionic liquid [C18VIm]Br was prepared and used for the modification of mercaptopropyl-functionalized silica (Si-MPS) through surface radical chain-transfer addition. The synthesized octadecylimidazolium-modified silica (SiImC18) was characterized by thermogravimetric analysis (TGA), infrared spectroscopy (IR), (13)C NMR and (29)Si NMR spectroscopy and used as an extraction phase for the automated 96-blade solid phase microextraction (SPME) system with thin-film geometry using polyacrylonitrile (PAN) glue. The new proposed extraction phase was applied for the extraction of amino acids from grape pulp, and an LC-MS-MS method was developed for separation of the model compounds. Extraction efficiency, reusability, linearity, limit of detection, limit of quantitation and matrix effect were evaluated. The whole process of sample preparation for the proposed method requires 270 min for 96 samples simultaneously (60 min preconditioning, 90 min extraction, 60 min desorption and 60 min for the carryover step) using the 96-blade SPME system. Inter-blade and intra-blade reproducibility were in the respective ranges of 5-13 and 3-10% relative standard deviation (RSD) for all model compounds. Limits of detection and quantitation of the proposed SPME-LC-MS/MS system for analysis of the analytes were found to range from 0.1 to 1.0 and 0.5 to 3.0 µg L(-1), respectively. Standard addition calibration was applied for quantitative analysis of amino acids from grape juice and the results were validated with the solvent extraction (SE) technique.

  11. Ionic liquid-based ultrasound-assisted extraction and aqueous two-phase system for analysis of caffeoylquinic acids from Flos Lonicerae Japonicae.

    Science.gov (United States)

    Tan, Ting; Lai, Chang-Jiang-Sheng; OuYang, Hui; He, Ming-Zhen; Feng, Yulin

    2016-02-20

    In this work, an ionic liquid-based ultrasonic-assisted extraction (ILUAE) method was developed to extract caffeoylquinic acids (CQAs) from Flos Lonicerae Japonicae (FLJ). ILUAE parameters were optimized by response surface methodology, including IL concentration, ultrasonic time, and liquid-solid ratio. The optimized ILUAE approach gave the highest extraction yields of 28.53, 18.21 and 3.84 mg/g for 3-O-caffeoylquinic acid (C1), 3,5-di-O-caffeoylquinic acid (C2) and 3,4-di-O-caffeoylquinic acid (C3), respectively. C1-C3 are the three most abundant CQA compounds in FLJ. The method showed comparable extraction yield and shorter extraction time compared with conventional extraction techniques. Subsequently, an aqueous two-phase system (ATPS) was applied to the extraction solutions. Two trace CQAs, 5-O-caffeoylquinic acid (C4) and 4,5-di-O-caffeoylquinic acid (C5), were significantly enriched, with signal-to-noise values increasing from less than 10 to higher than 1475. The results indicated that ILUAE and ATPS are efficient and environmentally-friendly sample extraction and enrichment techniques for CQAs from herbal medicines.

  12. Classification of endometrial lesions by nuclear morphometry features extracted from liquid-based cytology samples: a system based on logistic regression model.

    Science.gov (United States)

    Zygouris, Dimitrios; Pouliakis, Abraham; Margari, Niki; Chrelias, Charalampos; Terzakis, Emmanouil; Koureas, Nikolaos; Panayiotides, Ioannis; Karakitsos, Petros

    2014-08-01

    To investigate the potential of a computerized system for the discrimination of benign from malignant endometrial nuclei and lesions. A total of 228 histologically confirmed liquid-based cytological smears were collected: 117 within normal limits cases, 66 malignant cases, 37 hyperplasias without atypia, and 8 cases of hyperplasia with atypia. From each case we extracted nuclear morphometric features from about 100 nuclei using a custom image analysis system. Initially we performed feature selection, and subsequently we applied a logistic regression model that classified each nucleus as benign or malignant. Based on the results of the nucleus classification process, we constructed an algorithm to discriminate endometrium cases as benign or malignant. The proposed system had an overall accuracy for the classification of endometrial nuclei equal to 83.02%, specificity of 85.09%, and sensitivity of 77.01%. For the case classification the overall accuracy was 92.98%, specificity was 92.86%, and sensitivity was 93.24%. The proposed computerized system can be applied for the classification of endometrial nuclei and lesions as it outperformed the standard cytological diagnosis. This study highlights interesting diagnostic features of endometrial nuclear morphology, and the proposed method can be a useful tool in the everyday practice of the cytological laboratory.
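
    To make the two-stage idea concrete (a per-nucleus logistic regression followed by a case-level decision), here is a minimal scikit-learn sketch. The feature names, the random stand-in data and the 0.5/0.3 thresholds are all assumptions, not the study's fitted model or its case-classification algorithm.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical morphometric features per nucleus (e.g. area, perimeter,
        # eccentricity); values are random stand-ins, not study data.
        rng = np.random.default_rng(42)
        X_benign = rng.normal(loc=[40.0, 25.0, 0.4], scale=[5, 3, 0.1], size=(100, 3))
        X_malig = rng.normal(loc=[65.0, 35.0, 0.7], scale=[8, 4, 0.1], size=(100, 3))
        X = np.vstack([X_benign, X_malig])
        y = np.array([0] * 100 + [1] * 100)

        nucleus_model = LogisticRegression().fit(X, y)
        p_malignant = nucleus_model.predict_proba(X)[:, 1]

        # Case-level call: flag a case when the fraction of nuclei classified as
        # malignant exceeds a threshold chosen on training data (illustrative).
        case_flagged = (p_malignant > 0.5).mean() > 0.3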

  13. DESIGNING AN EVENT EXTRACTION SYSTEM

    Directory of Open Access Journals (Sweden)

    Botond BENEDEK

    2017-06-01

    Full Text Available In the Internet world, the amount of available information reaches very high levels. In order to find specific information, tools were created that automatically crawl the existing web pages and update their databases with the latest information on the Internet. In order to systematize the search and obtain a result in a concrete form, another step is needed for processing the information returned by the search engine and generating the response in a more organized form. Centralizing events of a certain type is useful first of all for creating a news service. Through this system we pursue a system for extracting knowledge, in the form of events, from Internet documents. The system will recognize events of a certain type (weather, sports, politics, text data mining, etc.) depending on how it is trained (the concepts it has in its dictionary). These events can be provided to the user, or the system can also extract the context in which the event occurred, to indicate the original form in which the event was embedded.

  14. Design of slow extraction system for therapy synchrotron

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jin-Quan; SONG Ming-Tao; WEI Bao-Wen

    2009-01-01

    Based on the optimized design of the lattice for the therapy synchrotron and considering the requirements of radiation therapy, third order resonant extraction is adopted. Using the momentum-amplitude selection method, the extraction system is designed and optimized. An extraction efficiency of more than 97% and a momentum spread of less than 0.11% are obtained.

  15. Extraction and Analysis of Inter-area Oscillation Using Improved Multi-signal Matrix Pencil Algorithm Based on Data Reduction in Power System

    Science.gov (United States)

    Liu, Cheng; Cai, Guowei; Yang, Deyou; Sun, Zhenglong

    2016-08-01

    In this paper, a robust online approach based on wavelet transform and matrix pencil (WTMP) is proposed to extract the dominant oscillation mode and parameters (frequency, damping, and mode shape) of a power system from wide-area measurements. For accurate and robust extraction of parameters, WTMP is verified as an effective identification algorithm for output-only modal analysis. First, singular value decomposition (SVD) is used to reduce the covariance signals obtained by natural excitation technique. Second, the orders and range of the corresponding frequency are determined by SVD from positive power spectrum matrix. Finally, the modal parameters are extracted from each mode of reduced signals using the matrix pencil algorithm in different frequency ranges. Compared with the original algorithm, the advantage of the proposed method is that it reduces computation data size and can extract mode shape. The effectiveness of the scheme, which is used for accurate extraction of the dominant oscillation mode and its parameters, is thoroughly studied and verified using the response signal data generated from 4-generator 2-area and 16-generator 5-area test systems.
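
    The core matrix pencil step (without the NExT covariance stage, the wavelet filtering or the SVD-based data reduction described above) can be sketched as follows; the synthetic test signal, the pencil parameter and the model order are assumptions for illustration only.

        import numpy as np

        def matrix_pencil_modes(y, dt, model_order=2, pencil_ratio=3):
            """Estimate modal frequencies (Hz) and damping ratios of a ringdown
            signal with a basic matrix pencil method."""
            y = np.asarray(y, dtype=float)
            N = len(y)
            L = N // pencil_ratio                        # pencil parameter
            # Hankel data matrices built from shifted windows of the signal
            Y = np.array([y[i:i + L + 1] for i in range(N - L)])
            Y0, Y1 = Y[:, :-1], Y[:, 1:]
            # Rank-truncated pseudo-inverse keeps only the assumed model order
            U, s, Vt = np.linalg.svd(Y0, full_matrices=False)
            r = model_order
            Y0_pinv = Vt[:r].T @ np.diag(1.0 / s[:r]) @ U[:, :r].T
            z = np.linalg.eigvals(Y0_pinv @ Y1)
            z = z[np.abs(z) > 1e-8][:r]                  # discard numerically zero poles
            lam = np.log(z) / dt                         # continuous-time poles
            freq = np.abs(lam.imag) / (2 * np.pi)
            damping_ratio = -lam.real / np.abs(lam)
            return freq, damping_ratio

        # Synthetic 0.5 Hz, lightly damped oscillation as a stand-in for a
        # wide-area measurement; expect freq of about 0.5 Hz, damping about 0.03.
        t = np.arange(0, 20, 0.02)
        sig = np.exp(-0.1 * t) * np.cos(2 * np.pi * 0.5 * t)
        freq, damping = matrix_pencil_modes(sig, dt=0.02)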

  16. Different methods to select the best extraction system for solid-phase extraction.

    Science.gov (United States)

    Bielicka-Daszkiewicz, Katarzyna

    2015-02-01

    The optimization methods for planning a solid-phase extraction experiment are presented. These methods are based on a study of interactions between different parts of an extraction system. Determination of the type and strength of interaction depends on the physicochemical properties of the individual components of the system. The main parameters that determine the extraction properties are described in this work. The influence of sorbent and solvent polarity on extraction efficiency, Hansen solubility parameters, and breakthrough volume determination for the sorption and desorption extraction steps are discussed.

  17. Automatic Melody Generation System with Extraction Feature

    Science.gov (United States)

    Ida, Kenichi; Kozuki, Shinichi

    In this paper, we propose a melody generation system based on the analysis of an existing melody. In addition, we introduce a mechanism that takes the user's preferences into account. Melody generation is performed by arranging pitches optimally on a given rhythm. The optimality criterion is decided using feature elements extracted from existing music by the proposed method. Moreover, the user's preferences are reflected in the criterion by letting users adjust some of the feature elements. Finally, a genetic algorithm (GA) optimizes the pitch array based on this criterion and realizes the system.

  18. SNS EXTRACTION FAST KICKER SYSTEM DEVELOPMENT.

    Energy Technology Data Exchange (ETDEWEB)

    ZHANG,W.; SANDBERG,J.; LAMBIASE,R.; LEE,Y.Y.; LOCKEY,R.; MI,J.; NEHRING,T.; PAI,C.; TSOUPAS,N.; TUOZZOLO,J.; WARBURTON,D.; WEI,J.; RUST,K.; CUTLER,R.

    2003-06-15

    The SNS Extraction Fast Kicker System is a very high power, high repetition rate pulsed power system. It was designed and developed at Brookhaven National Laboratory. This system will consist of fourteen identical high voltage, high current modulators, and their auxiliary control and charging systems. The modulators will drive fourteen extraction magnet sections located inside of the SNS accumulator ring. The required kicker field rise time is 200 ns, with a pulse flattop of 700 ns and a pulse repetition rate of 60 pulses per second. An output of 2500 Amperes per modulator is required to reach the extraction kicker magnetic field strength. This design features a Blumlein Pulse-Forming-Network based topology, a low beam impedance termination, a fast current switching thyratron, and low inductance capacitor banks. It has a maximum charging voltage of 50 kV, an open circuit output of 100 kV, and a designed maximum pulsed current output of 4 kA per modulator. The overall system output will be multiple GVA at a 60 pulse-per-second repetition rate. A prototype modulator has been successfully built and tested well above the SNS requirement. The modulator system production is in progress.

  19. Liquid carry-over in an injection moulded all-polymer chip system for immiscible phase magnetic bead-based solid-phase extraction

    Energy Technology Data Exchange (ETDEWEB)

    Kistrup, Kasper, E-mail: kkis@nanotech.dtu.dk [Department of Micro- and Nanotechnology, Technical University of Denmark, DTU Nanotech, Building 345 East, DK-2800 Kongens Lyngby (Denmark); Skotte Sørensen, Karen, E-mail: karen@nanotech.dtu.dk [Department of Micro- and Nanotechnology, Technical University of Denmark, DTU Nanotech, Building 345 East, DK-2800 Kongens Lyngby (Denmark); Center for Integrated Point of Care Technologies (CiPoC), DELTA, Venlighedsvej 4, DK-2870 Hørsholm (Denmark); Wolff, Anders, E-mail: anders.wolff@nanotech.dtu.dk [Department of Micro- and Nanotechnology, Technical University of Denmark, DTU Nanotech, Building 345 East, DK-2800 Kongens Lyngby (Denmark); Fougt Hansen, Mikkel, E-mail: mikkel.hansen@nanotech.dtu.dk [Department of Micro- and Nanotechnology, Technical University of Denmark, DTU Nanotech, Building 345 East, DK-2800 Kongens Lyngby (Denmark)

    2015-04-15

    We present an all-polymer, single-use microfluidic chip system produced by injection moulding and bonded by ultrasonic welding. Both techniques are compatible with low-cost industrial mass-production. The chip is produced for magnetic bead-based solid-phase extraction facilitated by immiscible phase filtration and features passive liquid filling and magnetic bead manipulation using an external magnet. In this work, we determine the system compatibility with various surfactants. Moreover, we quantify the volume of liquid co-transported with magnetic bead clusters from Milli-Q water or a lysis-binding buffer for nucleic acid extraction (0.1 (v/v)% Triton X-100 in 5 M guanidine hydrochloride). A linear relationship was found between the liquid carry-over and mass of magnetic beads used. Interestingly, similar average carry-overs of 1.74(8) nL/µg and 1.72(14) nL/µg were found for Milli-Q water and lysis-binding buffer, respectively. - Highlights: • We present an all-polymer mass producible passive filled microfluidic chip system. • Rapid system fabrication is obtained by injection moulding and ultrasonic welding. • The system is made for single-use nucleic acid extraction using magnetic beads. • We systematically map compatibility of the chip system with various surfactants. • We quantify the volume carry-over of magnetic beads in water and 0.1% triton-X solution.

  20. Molecular and supramolecular speciations of solvent extraction systems based on malonamide and/or dialkyl-phosphoric acids for An(III)/Ln(III); Speciations moleculaire et supramoleculaire de systemes d'extraction liquide-liquide a base de malonamide et/ou d'acides dialkylphosphoriques pour la separation An(III)/Ln(III)

    Energy Technology Data Exchange (ETDEWEB)

    Gannaz, B

    2006-06-15

    The solvent extraction system used in the DIAMEX-SANEX process, developed for the actinide(III)/lanthanide(III) separation, is based on the use of mixtures of the malonamide DMDOHEMA and a dialkyl-phosphoric acid (HDEHP or HDHP) in hydrogenated tetra-propylene. The complexity of these systems calls for a novel approach to improve on the conventional methods (thermodynamics, solvent extraction), which hardly explain the macroscopic behaviors observed (third phase, over-stoichiometry). This approach combines studies of both the supramolecular (VPO, SANS, SAXS) and molecular (liquid-liquid extraction, ESI-MS, IR, EXAFS) speciations of single-extractant systems (DMDOHEMA or HDHP in n-dodecane) and of their mixture. In spite of the safety constraints due to the handling of radioactive material, actinides were used in the studies as much as possible, as for the SAXS measurements on americium-containing samples, a worldwide first. In each of the investigated systems, actinides(III) and lanthanides(III) are extracted into the organic phase in the polar cores of reversed micelles, the inner- and outer-sphere compositions of which are proposed. Thus, the 4f and 5f cations are extracted by reversed micelles such as [(DMDOHEMA){sub 2}M(NO{sub 3}){sub 3}]{sub inn} [(DMDOHEMA){sub x}(HNO{sub 3}){sub z}(H{sub 2}O){sub w}]{sub out} and M(DHP){sub 3}(HDHP){sub y-3}(H{sub 2}O){sub w} with y = 3 to 6, for the single-extractant systems. In the case of the two-extractant system, the less concentrated extractant acts like a co-surfactant with respect to the formation of mixed aggregates [(DMDOHEMA){sub 2}M(NO{sub 3}){sub 3-v}(DHP){sub v}]{sub inn} [(DMDOHEMA){sub x}(HDHP){sub y}(HNO{sub 3}){sub z}(H{sub 2}O){sub w}]{sub out}. (author)

  1. Web-Based Information Extraction Technology

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Information extraction techniques on the Web are a current research hotspot. Many information extraction techniques based on different principles have appeared, with different capabilities. We classify the existing information extraction techniques by the principle of information extraction and analyze the methods and principles of semantic information adding, schema defining, rule expression, semantic item locating and object locating in these approaches. Based on the above survey and analysis, several open problems are discussed.

  2. BUEES:a bottom-up event extraction system

    Institute of Scientific and Technical Information of China (English)

    Xiao DING; Bing QIN; Ting LIU

    2015-01-01

    Traditional event extraction systems focus mainly on event type identification and event participant extraction based on pre-specified event type paradigms and manually annotated corpora. However, different domains have different event type paradigms. When transferring to a new domain, we have to build a new event type paradigm and annotate a new corpus from scratch. This kind of conventional event extraction system requires massive human effort, and hence prevents event extraction from being widely applicable. In this paper, we present BUEES, a bottom-up event extraction system, which extracts events from the web in a completely unsupervised way. The system automatically builds an event type paradigm in the input corpus, and then proceeds to extract a large number of instance patterns of these events. Subsequently, the system extracts event arguments according to these patterns. By conducting a series of experiments, we demonstrate the good performance of BUEES and compare it to a state-of-the-art Chinese event extraction system, i.e., a supervised event extraction system. Experimental results show that BUEES performs comparably (5% higher F-measure in event type identification and 3% higher F-measure in event argument extraction), but without any human effort.

  3. Sida tuberculata (Malvaceae): a study based on development of extractive system and in silico and in vitro properties.

    Science.gov (United States)

    da Rosa, H S; Salgueiro, A C F; Colpo, A Z C; Paula, F R; Mendez, A S L; Folmer, V

    2016-07-11

    Sida tuberculata (Malvaceae) is a medicinal plant traditionally used in Brazil as an antimicrobial and anti-inflammatory agent. Here, we aimed to investigate the different extractive techniques on phytochemical parameters, as well as to evaluate the toxicity and antioxidant capacity of S. tuberculata extracts using in silico and in vitro models. Therefore, in order to determine the dry residue content and the main compound 20-hydroxyecdysone (20E) concentration, extracts from leaves and roots were prepared testing ethanol and water in different proportions. Extracts were then assessed by Artemia salina lethality test, and toxicity prediction of 20E was estimated. Antioxidant activity was performed by DPPH and ABTS radical scavenger assays, ferric reducing power assay, nitrogen derivative scavenger, deoxyribose degradation, and TBARS assays. HPLC evaluation detected 20E as main compound in leaves and roots. Percolation method showed the highest concentrations of 20E (0.134 and 0.096 mg/mL of extract for leaves and roots, respectively). All crude extracts presented low toxic potential on A. salina (LD50 >1000 µg/mL). The computational evaluation of 20E showed a low toxicity prediction. For in vitro antioxidant tests, hydroethanolic extracts of leaves were most effective compared to roots. In addition, hydroethanolic extracts presented a higher IC50 antioxidant than aqueous extracts. TBARS formation was prevented by leaves hydroethanolic extract from 0.015 and 0.03 mg/mL and for roots from 0.03 and 0.3 mg/mL on egg yolk and rat tissue, respectively (P<0.05). These findings suggest that S. tuberculata extracts are a considerable source of ecdysteroids and possesses a significant antioxidant property with low toxic potential.

  4. Sida tuberculata (Malvaceae): a study based on development of extractive system and in silico and in vitro properties

    Science.gov (United States)

    da Rosa, H.S.; Salgueiro, A.C.F.; Colpo, A.Z.C.; Paula, F.R.; Mendez, A.S.L.; Folmer, V.

    2016-01-01

    Sida tuberculata (Malvaceae) is a medicinal plant traditionally used in Brazil as an antimicrobial and anti-inflammatory agent. Here, we aimed to investigate the different extractive techniques on phytochemical parameters, as well as to evaluate the toxicity and antioxidant capacity of S. tuberculata extracts using in silico and in vitro models. Therefore, in order to determine the dry residue content and the main compound 20-hydroxyecdysone (20E) concentration, extracts from leaves and roots were prepared testing ethanol and water in different proportions. Extracts were then assessed by Artemia salina lethality test, and toxicity prediction of 20E was estimated. Antioxidant activity was performed by DPPH and ABTS radical scavenger assays, ferric reducing power assay, nitrogen derivative scavenger, deoxyribose degradation, and TBARS assays. HPLC evaluation detected 20E as main compound in leaves and roots. Percolation method showed the highest concentrations of 20E (0.134 and 0.096 mg/mL of extract for leaves and roots, respectively). All crude extracts presented low toxic potential on A. salina (LD50 >1000 µg/mL). The computational evaluation of 20E showed a low toxicity prediction. For in vitro antioxidant tests, hydroethanolic extracts of leaves were most effective compared to roots. In addition, hydroethanolic extracts presented a higher IC50 antioxidant than aqueous extracts. TBARS formation was prevented by leaves hydroethanolic extract from 0.015 and 0.03 mg/mL and for roots from 0.03 and 0.3 mg/mL on egg yolk and rat tissue, respectively (P<0.05). These findings suggest that S. tuberculata extracts are a considerable source of ecdysteroids and possesses a significant antioxidant property with low toxic potential. PMID:27409335

  5. Research on Topic Content Extraction and Storage System Based on Web

    Institute of Scientific and Technical Information of China (English)

    朱林

    2016-01-01

    In order to extract the topic content of web pages quickly, this paper studies a Web-based topic content extraction and storage system, analysing the overall design concept, the architecture design, and the key implementation techniques of each component, so that the relevant topic content can be extracted from the Web accurately and rapidly, greatly improving work efficiency.

  6. A Robust Digital Watermark Extracting Method Based on Neural Network

    Institute of Scientific and Technical Information of China (English)

    GUO Lihua; YANG Shutang; LI Jianhua

    2003-01-01

    Since watermark removal software, such as StirMark, has succeeded in washing watermarks away for most of the known watermarking systems, it is necessary to improve the robustness of watermarking systems. A watermark extracting method based on the error Back propagation (BP) neural network is presented in this paper, which can efficiently improve the robustness of watermarking systems. Experiments show that even if the watermarking systems are attacked by the StirMark software, the extracting method based on neural network can still efficiently extract the whole watermark information.

  7. Driver’s Fatigue Detection Based on Yawning Extraction

    Directory of Open Access Journals (Sweden)

    Nawal Alioua

    2014-01-01

    Driver fatigue is a real danger on the road since it reduces the driver's capacity to react and analyze information. In this paper we propose an efficient and nonintrusive system for monitoring driver fatigue using yawning extraction. The proposed scheme uses face extraction based on a support vector machine (SVM) and a new approach for mouth detection, based on the circular Hough transform (CHT), applied on extracted mouth regions. Our system does not require any training data at any step or special cameras. Some experimental results showing system performance are reported. These experiments are applied over real video sequences acquired by a low cost web camera and recorded in various lighting conditions.
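
    The mouth-opening check can be illustrated with OpenCV's circular Hough transform. The sketch below assumes the face/SVM stage has already produced a cropped grayscale mouth region; the function name and all parameter values are illustrative, not the authors' settings.

        import cv2
        import numpy as np

        def detect_open_mouth(mouth_region_gray, min_radius=10, max_radius=40):
            """Return the strongest circle (x, y, radius) found in a grayscale
            mouth region via the circular Hough transform, or None."""
            blurred = cv2.medianBlur(mouth_region_gray, 5)
            circles = cv2.HoughCircles(
                blurred, cv2.HOUGH_GRADIENT, dp=1.5, minDist=30,
                param1=100, param2=30, minRadius=min_radius, maxRadius=max_radius)
            if circles is None:
                return None
            return np.uint16(np.around(circles))[0, 0]

        # A wide circle detected over several consecutive frames could then be
        # counted as a yawn by the monitoring loop (thresholds are illustrative).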

  8. Interfacial chemistry in solvent extraction systems

    Energy Technology Data Exchange (ETDEWEB)

    Neuman, R.D.

    1993-01-01

    Research this past year continued to emphasize characterization of the physicochemical nature of the microscopic interfaces, i.e., reversed micelles and other association microstructures, which form in both practical and simplified acidic organophosphorus extraction systems associated with Ni, Co, and Na in order to improve on the model for aggregation of metal-extractant complexes. Also, the macroscopic interfacial behavior of model extractant (surfactant) molecules was further investigated. 1 fig.

  9. Idioms-based Business Rule Extraction

    NARCIS (Netherlands)

    Smit, R

    2011-01-01

    This thesis studies the extraction of embedded business rules, using the idioms of the framework used to identify them. Embedded business rules exist as source code in the software system and knowledge about them may get lost. Extraction of those business rules could make them accessible and manageable.

  10. Fingerprint Feature Extraction Based on Macroscopic Curvature

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiong; He Gui-ming; Zhang Yun

    2003-01-01

    In the Automatic Fingerprint Identification System (AFIS), extracting the features of a fingerprint is very important. The local curvature of fingerprint ridges is irregular, so it is difficult to effectively extract fingerprint curve features that describe the fingerprint. This article proposes a novel algorithm; it embraces information from a few nearby fingerprint ridges to extract a new characteristic which can describe the curvature feature of the fingerprint. Experimental results show the algorithm is feasible, and the characteristics extracted by it can clearly show the inner macroscopic curve properties of the fingerprint. The result also shows that this kind of characteristic is robust to noise and pollution.

  11. Fingerprint Feature Extraction Based on Macroscopic Curvature

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiong; He Gui-Ming; et al.

    2003-01-01

    In the Automatic Fingerprint Identification System (AFIS), extracting the features of a fingerprint is very important. The local curvature of fingerprint ridges is irregular, so it is difficult to effectively extract fingerprint curve features that describe the fingerprint. This article proposes a novel algorithm; it embraces information from a few nearby fingerprint ridges to extract a new characteristic which can describe the curvature feature of the fingerprint. Experimental results show the algorithm is feasible, and the characteristics extracted by it can clearly show the inner macroscopic curve properties of the fingerprint. The result also shows that this kind of characteristic is robust to noise and pollution.

  12. Chinese Term Extraction Based on PAT Tree

    Institute of Scientific and Technical Information of China (English)

    ZHANG Feng; FAN Xiao-zhong; XU Yun

    2006-01-01

    A new method of automatic Chinese term extraction is proposed based on the Patricia (PAT) tree. Mutual information is calculated based on prefix searching in a PAT tree of the domain corpus to estimate the internal associative strength between Chinese characters in a string. This can largely improve the speed of term candidate extraction compared with methods based directly on the domain corpus. Common collocation suffix and prefix banks are constructed, and term part-of-speech (POS) composition rules are summarized to improve the precision of term extraction. Experiment results show that the F-measure is 74.97%.
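
    The internal associative strength described above is essentially a pointwise mutual information score over substring frequencies; the PAT tree only makes the frequency look-ups fast. The sketch below replaces the PAT tree with plain substring counting and splits the candidate at its midpoint for simplicity, so it is an approximation of the idea rather than the paper's algorithm; the example corpus variable is hypothetical.

        import math

        def internal_association(corpus, candidate):
            """PMI-style score for how strongly the two halves of a candidate
            term co-occur in the corpus (higher means more term-like)."""
            mid = len(candidate) // 2
            left, right = candidate[:mid], candidate[mid:]
            n = max(len(corpus), 1)

            def count(sub):
                return sum(1 for i in range(len(corpus) - len(sub) + 1)
                           if corpus[i:i + len(sub)] == sub)

            p_xy = count(candidate) / n
            p_x, p_y = count(left) / n, count(right) / n
            if min(p_xy, p_x, p_y) == 0:
                return float("-inf")
            return math.log(p_xy / (p_x * p_y))

        # e.g. internal_association(domain_corpus_text, "数据挖掘") scores how
        # strongly the two halves of the candidate stick together in the corpus.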

  13. Development of a microfluidic-chip system for liquid-phase microextraction based on two immiscible organic solvents for the extraction and preconcentration of some hormonal drugs.

    Science.gov (United States)

    Asl, Yousef Abdossalami; Yamini, Yadollah; Seidi, Shahram

    2016-11-01

    In the present study, for the first time, an on-chip liquid phase microextraction (LPME) coupled with high performance liquid chromatography was introduced for the analysis of levonorgestrel (Levo), dydrogesterone (Dydo) and medroxyprogesterone (Medo) as the model analytes in biological samples. The chip-based LPME set-up was composed of two polymethyl methacrylate (PMMA) plates with microfabricated channels and a microporous membrane sandwiched between them to separate the sample solution and acceptor phase. These channels were used as a flow path for the sample solution and a thin compartment for the acceptor phase, respectively. In this system, two immiscible organic solvents were used as supported liquid membrane (SLM) and acceptor phase, respectively. During extraction, the model analytes in the sample solution were transported through the SLM (n-dodecane) into the acceptor organic solvent (methanol). The new set-up provided effective and reproducible extractions using low volumes of the sample solution. The effective parameters on the extraction efficiency of the model analytes were optimized using the one variable at a time method. Under the optimized conditions, the new set-up provided good linearity in the range of 5.0-500 µg L(-1) for the model analytes with coefficients of determination (r(2)) higher than 0.9909. The relative standard deviations (RSDs%) and limits of detection (LODs) were less than 6.5% (n=5) and 5.0 µg L(-1), respectively. The preconcentration factors (PFs) obtained using 1.0 mL of the sample solution and 20.0 µL of the acceptor solution were higher than 19.9-fold. Finally, the proposed method was successfully applied for the extraction and determination of the model analytes in urine samples.

  14. Titulações espectrofotométricas de sistemas ácido-base utilizando extrato de flores contendo antocianinas Spectrophotometric titrations of acid-base systems using flower extracts containing anthocyanins

    Directory of Open Access Journals (Sweden)

    Mônica Souza Cortes

    2007-08-01

    Full Text Available Considering the attraction of the students' attention by the changes in the colors of vegetable crude extracts caused by the variation of the pH of the medium, the use of these different colors in order to demonstrate principles of spectrophotometric acid-base titrations using the crude extracts as indicators is proposed. The experimental setup consisted of a simple spectrophotometer, a homemade flow cell and a pump to propel the fluids along the system. Students should be stimulated to choose the best wavelength to monitor the changes in color during the titration. Since the pH of the equivalence point depends on the system titrated, the wavelength must be properly chosen to follow these changes, demonstrating the importance of the correct choice of the indicator. When compared with the potentiometric results, errors as low as 2% could be found using Rhododendron simsii (azalea) or Tibouchina granulosa (Glory tree, quaresmeira) as sources of the crude extracts.

  15. Liquid carry-over in an injection moulded all-polymer chip system for immiscible phase magnetic bead-based solid-phase extraction

    DEFF Research Database (Denmark)

    Kistrup, Kasper; Sørensen, Karen Skotte; Wolff, Anders;

    2014-01-01

    We present an all-polymer, single-use microfluidic chip system produced by injection moulding and bonded by ultrasonic welding. Both techniques are compatible with low-cost industrial mass-production. The chip is produced for magnetic bead-based solid-phase extraction facilitated by immiscible phase filtration and features passive liquid filling and magnetic bead manipulation using an external magnet. In this work, we determine the system compatibility with various surfactants. Moreover, we quantify the volume of liquid co-transported with magnetic bead clusters from Milli-Q water or a lysis-binding buffer for nucleic acid extraction (0.1 (v/v)% Triton X-100 in 5 M guanidine hydrochloride). A linear relationship was found between the liquid carry-over and mass of magnetic beads used. Interestingly, similar average carry-overs of 1.74(8) nL/µg and 1.72(14) nL/µg were found for Milli-Q water and lysis-binding buffer.

  16. An Efficient Feature Extraction Method with Pseudo-Zernike Moment in RBF Neural Network-Based Human Face Recognition System

    Directory of Open Access Journals (Sweden)

    Ahmadi Majid

    2003-01-01

    Full Text Available This paper introduces a novel method for the recognition of human faces in digital images using a new feature extraction method that combines the global and local information in frontal view of facial images. Radial basis function (RBF neural network with a hybrid learning algorithm (HLA has been used as a classifier. The proposed feature extraction method includes human face localization derived from the shape information. An efficient distance measure as facial candidate threshold (FCT is defined to distinguish between face and nonface images. Pseudo-Zernike moment invariant (PZMI with an efficient method for selecting moment order has been used. A newly defined parameter named axis correction ratio (ACR of images for disregarding irrelevant information of face images is introduced. In this paper, the effect of these parameters in disregarding irrelevant information in recognition rate improvement is studied. Also we evaluate the effect of orders of PZMI in recognition rate of the proposed technique as well as RBF neural network learning speed. Simulation results on the face database of Olivetti Research Laboratory (ORL indicate that the proposed method for human face recognition yielded a recognition rate of 99.3%.

  17. Automatic Extraction of Metadata from Scientific Publications for CRIS Systems

    Science.gov (United States)

    Kovacevic, Aleksandar; Ivanovic, Dragan; Milosavljevic, Branko; Konjovic, Zora; Surla, Dusan

    2011-01-01

    Purpose: The aim of this paper is to develop a system for automatic extraction of metadata from scientific papers in PDF format for the information system for monitoring the scientific research activity of the University of Novi Sad (CRIS UNS). Design/methodology/approach: The system is based on machine learning and performs automatic extraction…

  19. Data-Wave-Based Features Extraction and Its Application in Symbol Identifier Recognition and Positioning Suitable for Multi-Robot Systems

    National Research Council Canada - National Science Library

    Xilong Liu; Hanbo Qian; Zhiqiang Cao; Chao Zhou; Yuequan Yang; Min Tan

    2012-01-01

    ...‐term which are detected based on ripple and wave filters. Supported by data‐wave, a novel symbol identifier with significant structure features is designed and these features are extracted by constructing pixel chains...

  20. Cryptographic Protocols Based on Root Extracting

    DEFF Research Database (Denmark)

    Koprowski, Maciej

    In this thesis we design new cryptographic protocols, whose security is based on the hardness of root extracting or more specifically the RSA problem. First we study the problem of root extraction in finite Abelian groups, where the group order is unknown. This is a natural generalization of the... complexity of root extraction, even if the algorithm can choose the "public exponent" itself. In other words, both the standard and the strong RSA assumption are provably true w.r.t. generic algorithms. The results hold for arbitrary groups, so security w.r.t. generic attacks follows for any cryptographic... construction based on root extracting. As an example of this, we modify the Cramer-Shoup signature scheme such that it becomes a generic algorithm. We then discuss implementing it in RSA groups without the original restriction that the modulus must be a product of safe primes. It can also be implemented in class...

  1. Application of Magnetic Bead-Based Nucleic Acid Automatic Extraction System in Molecular Biology

    Institute of Scientific and Technical Information of China (English)

    罗英

    2013-01-01

    The magnetic bead-based nucleic acid automatic extraction system can simply, rapidly, efficiently, economically and automatically extract nucleic acid from all kinds of samples. This paper summarizes the principle and classification of automatic nucleic acid extraction systems, and the principle, classification and characteristics of magnetic bead-based nucleic acid automatic extraction systems and their application in the field of molecular biology.

  2. The Lecture Video Scene Extracting System

    OpenAIRE

    石黒, 信啓; 白井,治彦; 黒岩,丈介; 小高, 知宏; 小倉, 久和; ISHIGURO, Nobuhiko; SHIRAI, Haruhiko; KUROIWA, Josuke; ODAKA, Tomohiko; OGURA, Hisakazu

    2011-01-01

    In this paper, we propose a system for extracting scene features from a lecture video. To avoid manual work on a lecture video, we propose a new method that automates the judgment of the importance of a scene. The system uses the TF-IDF method, a technique from natural language processing. Our system has four functions for watching a lecture video efficiently: scene feature extraction, character string selection, keyword search and important scene selection. These functi...
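    A small sketch of the TF-IDF scoring step mentioned above, assuming the scene transcripts are already available as text; scoring a scene by the sum of its TF-IDF weights is an illustrative assumption, not the authors' exact importance measure.

```python
# Minimal sketch: rank lecture-video scenes by TF-IDF weight mass.
from sklearn.feature_extraction.text import TfidfVectorizer

scenes = [
    "today we introduce the fourier transform",
    "the fourier transform maps a signal to the frequency domain",
    "any questions before the break",
]
tfidf = TfidfVectorizer().fit_transform(scenes)
scores = tfidf.sum(axis=1).A1  # one plausible importance score per scene
for rank, idx in enumerate(scores.argsort()[::-1], start=1):
    print(rank, round(scores[idx], 3), scenes[idx])
```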

  3. Configurable impedance matching to maximise power extraction for enabling self-powered system based-on photovoltaic cells

    Science.gov (United States)

    Rahman, Airul Azha Abd; Jamil, Wan Adil Wan; Umar, Akrajas Ali

    2016-07-01

    A multivariate energy harvesting system for solar and thermal energies with configurable impedance matching is presented. The system includes a tuneable mechanism for peak performance tracking. The inputs are voltages ranging from 20 mV to 3.1 V. The matching load is individually tuned so that the photovoltaic and thermoelectric power efficiencies are not less than 80% and 50% of the open-circuit voltage, respectively. From the experimentation and analysis carried out, the time it takes to fully charge up to 3.4 V is 23 minutes, with a charging rate of 1.8 mV/s. Empirical data are presented.

  4. A High-Efficiency Cellular Extraction System for Biological Proteomics.

    Science.gov (United States)

    Dhabaria, Avantika; Cifani, Paolo; Reed, Casie; Steen, Hanno; Kentsis, Alex

    2015-08-07

    Recent developments in quantitative high-resolution mass spectrometry have led to significant improvements in the sensitivity and specificity of the biochemical analyses of cellular reactions, protein-protein interactions, and small-molecule-drug discovery. These approaches depend on cellular proteome extraction that preserves native protein activities. Here, we systematically analyzed mechanical methods of cell lysis and physical protein extraction to identify those that maximize the extraction of cellular proteins while minimizing their denaturation. Cells were mechanically disrupted using Potter-Elvehjem homogenization, probe- or adaptive-focused acoustic sonication, and were in the presence of various detergents, including polyoxyethylene ethers and esters, glycosides, and zwitterions. Using fluorescence spectroscopy, biochemical assays, and mass spectrometry proteomics, we identified the combination of adaptive focused acoustic (AFA) sonication in the presence of a binary poloxamer-based mixture of octyl-β-glucoside and Pluronic F-127 to maximize the depth and yield of the proteome extraction while maintaining native protein activity. This binary poloxamer extraction system allowed for native proteome extraction comparable in coverage to the proteomes extracted using denaturing SDS or guanidine-containing buffers, including the efficient extraction of all major cellular organelles. This high-efficiency cellular extraction system should prove useful for a variety of cell biochemical studies, including structural and functional proteomics.

  5. Smart Extraction and Analysis System for Clinical Research.

    Science.gov (United States)

    Afzal, Muhammad; Hussain, Maqbool; Khan, Wajahat Ali; Ali, Taqdir; Jamshed, Arif; Lee, Sungyoung

    2017-05-01

    With the increasing use of electronic health records (EHRs), there is a growing need to expand the utilization of EHR data to support clinical research. The key challenge in achieving this goal is the unavailability of smart systems and methods to overcome the issue of data preparation, structuring, and sharing for smooth clinical research. We developed a robust analysis system called the smart extraction and analysis system (SEAS) that consists of two subsystems: (1) the information extraction system (IES), for extracting information from clinical documents, and (2) the survival analysis system (SAS), for a descriptive and predictive analysis to compile the survival statistics and predict the future chance of survivability. The IES subsystem is based on a novel permutation-based pattern recognition method that extracts information from unstructured clinical documents. Similarly, the SAS subsystem is based on a classification and regression tree (CART)-based prediction model for survival analysis. SEAS is evaluated and validated on a real-world case study of head and neck cancer. The overall information extraction accuracy of the system for semistructured text is recorded at 99%, while that for unstructured text is 97%. Furthermore, the automated, unstructured information extraction has reduced the average time spent on manual data entry by 75%, without compromising the accuracy of the system. Moreover, around 88% of patients are found in a terminal or dead state for the highest clinical stage of disease (level IV). Similarly, there is an ∼36% probability of a patient being alive if at least one of the lifestyle risk factors was positive. We presented our work on the development of SEAS to replace costly and time-consuming manual methods with smart automatic extraction of information and survival prediction methods. SEAS has reduced the time and energy of human resources spent unnecessarily on manual tasks.
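    A hedged sketch of a CART-style survival classifier in the spirit of the SAS subsystem described above; the feature set and the toy patient records are invented for illustration only.

```python
# Illustrative CART (decision tree) model for survival prediction.
from sklearn.tree import DecisionTreeClassifier

# Each record: [clinical_stage (1-4), lifestyle_risk_factor (0/1), age]
X = [[1, 0, 45], [2, 1, 60], [4, 0, 70], [4, 1, 66], [3, 1, 58], [1, 1, 50]]
y = [1, 1, 0, 0, 1, 1]  # 1 = alive at follow-up, 0 = deceased

model = DecisionTreeClassifier(max_depth=3, random_state=0)  # CART implementation
model.fit(X, y)
print(model.predict([[4, 0, 68]]))  # predicted survival status for a new patient
```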

  6. Enhancement of Twins Fetal ECG Signal Extraction Based on Hybrid Blind Extraction Techniques

    Directory of Open Access Journals (Sweden)

    Ahmed Kareem Abdullah

    2017-07-01

    Full Text Available ECG machines are noninvasive systems used to measure the heartbeat signal. It is very important to monitor fetal ECG signals during pregnancy to check heart activity and to detect any problem early, before birth; therefore the monitoring of ECG signals has clinical significance and importance. For a multi-fetal pregnancy, classical filtering algorithms are not sufficient to separate the ECG signals of the mother and the fetuses. In this paper the mixture consists of three ECG signals: the mother ECG (M-ECG) signal, the Fetal-1 ECG (F1-ECG) signal, and the Fetal-2 ECG (F2-ECG) signal; these signals are extracted using modified blind source extraction (BSE) techniques. The proposed work is based on the hybridization of two BSE techniques to ensure that the extracted signals are well separated. The results demonstrate that the proposed work extracts the useful ECG signals very efficiently.
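    The following sketch separates a synthetic three-signal mixture with FastICA to illustrate the kind of blind source extraction discussed above; the paper's specific hybrid BSE method is not implemented, and the waveforms are stand-ins.

```python
# Toy blind source separation of three mixed signals with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 10, 2000)
mother = np.sin(2 * np.pi * 1.2 * t)            # stand-ins for M-ECG, F1-ECG, F2-ECG
fetal1 = np.sign(np.sin(2 * np.pi * 2.1 * t))
fetal2 = np.sin(2 * np.pi * 2.6 * t + 1.0)
sources = np.c_[mother, fetal1, fetal2]

mixing = np.array([[1.0, 0.5, 0.3], [0.6, 1.0, 0.4], [0.4, 0.3, 1.0]])
observations = sources @ mixing.T                # three abdominal leads (mixtures)

ica = FastICA(n_components=3, random_state=0)
estimated = ica.fit_transform(observations)      # recovered source estimates
print(estimated.shape)                           # (2000, 3)
```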

  7. An integrated rotary microfluidic system with DNA extraction, loop-mediated isothermal amplification, and lateral flow strip based detection for point-of-care pathogen diagnostics.

    Science.gov (United States)

    Park, Byung Hyun; Oh, Seung Jun; Jung, Jae Hwan; Choi, Goro; Seo, Ji Hyun; Kim, Do Hyun; Lee, Eun Yeol; Seo, Tae Seok

    2017-05-15

    Point-of-care (POC) molecular diagnostics plays a pivotal role for the prevention and treatment of infectious diseases. In spite of recent advancement in microfluidic based POC devices, there are still rooms for development to realize rapid, automatic and cost-effective sample-to-result genetic analysis. In this study, we propose an integrated rotary microfluidic system that is capable of performing glass microbead based DNA extraction, loop mediated isothermal amplification (LAMP), and colorimetric lateral flow strip based detection in a sequential manner with an optimized microfluidic design and a rotational speed control. Rotation direction-dependent coriolis force and siphon valving structures enable us to perform the fluidic control and metering, and the use of the lateral flow strip as a detection method renders all the analytical processes for nucleic acid test simplified and integrated without the need of expensive instruments or human intervention. As a proof of concept for point-of-care DNA diagnostics, we identified the food-borne bacterial pathogen which was contaminated in water or milk. Not only monoplex Salmonella Typhimurium but also multiplex Salmonella Typhimurium and Vibrio parahaemolyticus were analysed on the integrated rotary genetic analysis microsystem with a limit of detection of 50 CFU in 80min. In addition, three multiple samples were simultaneously analysed on a single device. The sample-to-result capability of the proposed microdevice provides great usefulness in the fields of clinical diagnostics, food safety and environment monitoring.

  8. Temporally rendered automatic cloud extraction (TRACE) system

    Science.gov (United States)

    Bodrero, Dennis M.; Yale, James G.; Davis, Roger E.; Rollins, John M.

    1999-10-01

    Smoke/obscurant testing requires that 2D cloud extent be extracted from visible and thermal imagery. These data are used alone or in combination with 2D data from other aspects to make 3D calculations of cloud properties, including dimensions, volume, centroid, travel, and uniformity. Determining cloud extent from imagery has historically been a time-consuming manual process. To reduce time and cost associated with smoke/obscurant data processing, automated methods to extract cloud extent from imagery were investigated. The TRACE system described in this paper was developed and implemented at U.S. Army Dugway Proving Ground, UT by the Science and Technology Corporation--Acuity Imaging Incorporated team with Small Business Innovation Research funding. TRACE uses dynamic background subtraction and 3D fast Fourier transform as primary methods to discriminate the smoke/obscurant cloud from the background. TRACE has been designed to run on a PC-based platform using Windows. The PC-Windows environment was chosen for portability, to give TRACE the maximum flexibility in terms of its interaction with peripheral hardware devices such as video capture boards, removable media drives, network cards, and digital video interfaces. Video for Windows provides all of the necessary tools for the development of the video capture utility in TRACE and allows for interchangeability of video capture boards without any software changes. TRACE is designed to take advantage of future upgrades in all aspects of its component hardware. A comparison of cloud extent determined by TRACE with manual method is included in this paper.
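    A rough sketch of the dynamic background-subtraction idea that TRACE builds on, assuming a simple exponential running-average background model and a fixed threshold; these are illustrative choices, not the system's actual parameters, and the 3D FFT stage is not reproduced.

```python
# Running-average background model with thresholded cloud/obscurant mask.
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Exponential running average approximates a slowly changing background."""
    return (1 - alpha) * background + alpha * frame

def cloud_mask(background, frame, threshold=25):
    """Pixels far from the background model are flagged as cloud/obscurant."""
    return np.abs(frame.astype(float) - background) > threshold

frames = [np.random.randint(0, 255, (120, 160)) for _ in range(10)]  # stand-in imagery
bg = frames[0].astype(float)
for f in frames[1:]:
    mask = cloud_mask(bg, f)
    bg = update_background(bg, f)
print(mask.mean())  # fraction of pixels flagged in the last frame
```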

  9. Data-Wave-Based Features Extraction and Its Application in Symbol Identifier Recognition and Positioning Suitable for Multi-Robot Systems

    Directory of Open Access Journals (Sweden)

    Xilong Liu

    2012-12-01

    Full Text Available In this paper, feature extraction based on data-wave is proposed. The concept of data-wave is introduced to describe the rising and falling trends of the data over the long-term which are detected based on ripple and wave filters. Supported by data-wave, a novel symbol identifier with significant structure features is designed and these features are extracted by constructing pixel chains. On this basis, the corresponding recognition and positioning approach is presented. The effectiveness of the proposed approach is verified by experiments.

  10. Cryptographic Protocols Based on Root Extracting

    DEFF Research Database (Denmark)

    Koprowski, Maciej

    In this thesis we design new cryptographic protocols, whose security is based on the hardness of root extracting or more specifically the RSA problem. First we study the problem of root extraction in finite Abelian groups, where the group order is unknown. This is a natural generalization of the... construction based on root extracting. As an example of this, we modify the Cramer-Shoup signature scheme such that it becomes a generic algorithm. We then discuss implementing it in RSA groups without the original restriction that the modulus must be a product of safe primes. It can also be implemented in class..., providing a currently acceptable level of security. This allows us to propose the first practical blind signature scheme provably secure without relying on heuristics called the random oracle model (ROM). We obtain the protocol for issuing blind signatures by implementing our modified Fischlin's signing algorithm...

  11. Interaction of Plant Extracts with Central Nervous System Receptors

    Directory of Open Access Journals (Sweden)

    Kenneth Lundstrom

    2017-02-01

    Full Text Available Background: Plant extracts have been used in traditional medicine for the treatment of various maladies including neurological diseases. Several central nervous system receptors have been demonstrated to interact with plant extracts and components affecting the pharmacology and thereby potentially playing a role in human disease and treatment. For instance, extracts from Hypericum perforatum (St. John’s wort targeted several CNS receptors. Similarly, extracts from Piper nigrum, Stephania cambodica, and Styphnolobium japonicum exerted inhibition of agonist-induced activity of the human neurokinin-1 receptor. Methods: Different methods have been established for receptor binding and functional assays based on radioactive and fluorescence-labeled ligands in cell lines and primary cell cultures. Behavioral studies of the effect of plant extracts have been conducted in rodents. Plant extracts have further been subjected to mood and cognition studies in humans. Results: Mechanisms of action at molecular and cellular levels have been elucidated for medicinal plants in support of standardization of herbal products and identification of active extract compounds. In several studies, plant extracts demonstrated affinity to a number of CNS receptors in parallel indicating the complexity of this interaction. In vivo studies showed modifications of CNS receptor affinity and behavioral responses in animal models after treatment with medicinal herbs. Certain plant extracts demonstrated neuroprotection and enhanced cognitive performance, respectively, when evaluated in humans. Noteworthy, the penetration of plant extracts and their protective effect on the blood-brain-barrier are discussed. Conclusion: The affinity of plant extracts and their isolated compounds for CNS receptors indicates an important role for medicinal plants in the treatment of neurological disorders. Moreover, studies in animal and human models have confirmed a scientific basis for the

  12. Selective electromembrane extraction based on isoelectric point

    DEFF Research Database (Denmark)

    Huang, Chuixiu; Gjelstad, Astrid; Pedersen-Bjergaard, Stig

    2015-01-01

    For the first time, selective isolation of a target peptide based on the isoelectric point (pI) was achieved using a two-step electromembrane extraction (EME) approach with a thin flat membrane-based EME device. In this approach, step #1 was an extraction process, where both the target peptide...... angiotensin II antipeptide (AT2 AP, pI=5.13) and the matrix peptides (pI>5.13) angiotensin II (AT2), neurotensin (NT), angiotensin I (AT1) and leu-enkephalin (L-Enke) were all extracted as net positive species from the sample (pH 3.50), through a supported liquid membrane (SLM) of 1-nonanol diluted with 2......, and the target remained in the acceptor solution. The acceptor solution pH, the SLM composition, the extraction voltage, and the extraction time during the clean-up process (step #2) were important factors influencing the separation performance. An acceptor solution pH of 5.25 for the clean-up process slightly...

  13. Phase extraction based on sinusoidal extreme strip phase shifting method

    Science.gov (United States)

    Hui, Mei; Liu, Ming; Dong, Liquan; Liu, Xiaohua; Zhao, Yuejin

    2015-08-01

    Multiple synthetic aperture imaging can enlarge the pupil diameter of optical systems and increase system resolution. It has been a cutting-edge topic and research focus in recent years, with prospective wide application in fields such as astronomical observation and aerospace remote sensing. In order to achieve good imaging quality, a synthetic aperture imaging system requires phase extraction for each sub-aperture and co-phasing of the whole aperture. In this project, an in-depth study of the basic principles and methods of segment phase extraction was carried out. The study includes the application of a sinusoidal extreme-strip illumination phase-shift method to extract the central dividing line and obtain segment phase-extraction information, and the use of interferometric measurement to obtain the spherical-surface calibration coefficients for aperture phase extraction. The influence of the sinusoidal extreme-strip phase shift on phase extraction is also studied and, based on the sinusoidal stripe phase shift in the image reflected from multiple linear light sources, the phase-shift error is evaluated so that its effect on the extracted phase frame can be suppressed.
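    For orientation, the sketch below uses the standard four-step phase-shifting relation to recover a wrapped phase map from shifted fringe intensities; it only illustrates the general principle, not the sinusoidal extreme-strip variant studied in the paper.

```python
# Standard four-step phase-shifting: phi = atan2(I4 - I2, I1 - I3).
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four intensity maps shifted by pi/2 each."""
    return np.arctan2(i4 - i2, i1 - i3)

x = np.linspace(0, 4 * np.pi, 256)
true_phase = 0.5 * np.sin(x)                       # synthetic surface phase
shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
frames = [1 + np.cos(true_phase + s) for s in shifts]
print(np.allclose(four_step_phase(*frames), true_phase, atol=1e-6))  # True
```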

  14. Rapid, highly efficient extraction and purification of membrane proteins using a microfluidic continuous-flow based aqueous two-phase system.

    Science.gov (United States)

    Hu, Rui; Feng, Xiaojun; Chen, Pu; Fu, Meng; Chen, Hong; Guo, Lin; Liu, Bi-Feng

    2011-01-07

    Membrane proteins play essential roles in regulating various fundamental cellular functions. To investigate membrane proteins, extraction and purification are usually prerequisite steps. Here, we demonstrated a microfluidic aqueous PEG/detergent two-phase system for the purification of membrane proteins from crude cell extract, which replaced the conventional discontinuous agitation method with continuous extraction in laminar flows, resulting in significantly increased extraction speed and efficiency. To evaluate this system, different separation and detection methods were used to identify the purified proteins, such as capillary electrophoresis, SDS-PAGE and nano-HPLC-MS/MS. Swiss-Prot database with Mascot search engine was used to search for membrane proteins from random selected bands of SDS-PAGE. Results indicated that efficient purification of membrane proteins can be achieved within 5-7s and approximately 90% of the purified proteins were membrane proteins (the highest extraction efficiency reported up to date), including membrane-associated proteins and integral membrane proteins with multiple transmembrane domains. Compared to conventional approaches, this new method had advantages of greater specific surface area, minimal emulsification, reduced sample consumption and analysis time. We expect the developed method to be potentially useful in membrane protein purifications, facilitating the investigation of membrane proteomics.

  15. A moving baseline for evaluation of advanced coal extraction systems

    Science.gov (United States)

    Bickerton, C. R.; Westerfield, M. D.

    1981-01-01

    Results from the initial effort to establish baseline economic performance comparators for a program whose intent is to define, develop, and demonstrate advanced systems suitable for coal resource extraction beyond the year 2000 are reported. The systems used were selected from contemporary coal mining technology and from conservative conjectures of year-2000 technology. The analysis was also based on a seam thickness of 6 ft; therefore, the results are specific to the study systems and the selected seam thickness, and extension to other seam thicknesses would require further analysis.

  16. Extracting clinical information to support medical decision based on standards.

    Science.gov (United States)

    Gomoi, Valentin; Vida, Mihaela; Stoicu-Tivadar, Lăcrămioara; Stoicu-Tivadar, Vasile

    2011-01-01

    The paper presents a method connecting medical databases to a medical decision system, and describes a service created to extract the necessary information that is transferred based on standards. The medical decision can be improved based on many inputs from different medical locations. The developed solution is described for a concrete case concerning the management for chronic pelvic pain, based on the information retrieved from diverse healthcare databases.

  17. 21 CFR 172.585 - Sugar beet extract flavor base.

    Science.gov (United States)

    2010-04-01

    21 CFR 172.585 (Food and Drugs, Flavoring Agents and Related Substances): Sugar beet extract flavor base is the concentrated residue of soluble sugar beet extractives from...

  18. Towards Task-Based Temporal Extraction and Recognition

    NARCIS (Netherlands)

    Ahn, D.D.; Fissaha Adafre, S.; de Rijke, M.

    2005-01-01

    We seek to improve the robustness and portability of temporal information extraction systems by incorporating data-driven techniques. We present two sets of experiments pointing us in this direction. The first shows that machine-learning-based recognition of temporal expressions not only achieves

  19. MBA: a literature mining system for extracting biomedical abbreviations

    Directory of Open Access Journals (Sweden)

    Lei YiMing

    2009-01-01

    Full Text Available Abstract Background The exploding growth of the biomedical literature presents many challenges for biological researchers. One such challenge comes from the heavy use of abbreviations. Extracting abbreviations and their definitions accurately is very helpful to biologists and also facilitates biomedical text analysis. Existing approaches fall into four broad categories: rule based, machine learning based, text alignment based and statistically based. State-of-the-art methods either focus exclusively on acronym-type abbreviations or cannot recognize rare abbreviations. We propose a systematic method to extract abbreviations effectively. First, a scoring method is used to classify the abbreviations into acronym-type and non-acronym-type abbreviations; their corresponding definitions are then identified by two different methods: a text alignment algorithm for the former and a statistical method for the latter. Results A literature mining system, MBA, was constructed to extract both acronym-type and non-acronym-type abbreviations. An abbreviation-tagged literature corpus, the Medstract gold-standard corpus, was used to evaluate the system. MBA achieved a recall of 88% at a precision of 91% on the Medstract gold-standard evaluation corpus. Conclusion We present a new literature mining system, MBA, for extracting biomedical abbreviations. Our evaluation demonstrates that the MBA system performs better than the others. It can identify the definitions not only of acronym-type abbreviations, including slightly irregular acronym-type abbreviations (e.g., ...), but also of non-acronym-type abbreviations (e.g., ...).
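    A toy sketch of the text-alignment idea for acronym-type abbreviations (matching acronym letters against the immediately preceding words); this simplification stands in for, and is much weaker than, MBA's scoring and statistical components.

```python
# Align a parenthesised acronym with the words that precede it.
import re

def find_acronym_definitions(sentence: str):
    pairs = []
    for match in re.finditer(r"\(([A-Z][A-Za-z]{1,9})\)", sentence):
        acronym = match.group(1)
        words = sentence[: match.start()].split()
        candidate = words[-len(acronym):]                 # last |acronym| words
        if all(w and w[0].lower() == c.lower() for w, c in zip(candidate, acronym)):
            pairs.append((acronym, " ".join(candidate)))
    return pairs

print(find_acronym_definitions(
    "Extraction used a supported liquid membrane (SLM) in each run."))
```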

  20. Optimisation of the extraction of olive (Olea europaea) leaf phenolics using water/ethanol-based solvent systems and response surface methodology.

    Science.gov (United States)

    Mylonaki, Stefania; Kiassos, Elias; Makris, Dimitris P; Kefalas, Panagiotis

    2008-11-01

    An experimental setup based on a 2³ full-factorial, central-composite design was implemented with the aim of optimising the recovery of polyphenols from olive leaves by employing reusable and nontoxic solutions composed of water/ethanol/citric acid as extracting media. The factors considered were (i) the pH of the medium, (ii) the extraction time and (iii) the ethanol concentration. The model obtained produced a satisfactory fit to the data with regard to total polyphenol extraction (R² = 0.91, p = 0.0139), but not for the antiradical activity of the extracts (R² = 0.67, p = 0.3734). The second-order polynomial equation obtained after analysing the experimental data indicated that ethanol concentration and time mostly affected the extraction yield, but that increased pH values were unfavourable in this regard. The maximum theoretical yield was calculated to be 250.2 ± 76.8 mg gallic acid equivalent per g of dry, chlorophyll-free tissue under optimal conditions (60% EtOH, pH 2 and 5 h). Liquid chromatography-electrospray ionisation mass spectrometry of the optimally obtained extract revealed that the principal phytochemicals recovered were luteolin 7-O-glucoside, apigenin 7-O-rutinoside and oleuropein, accompanied by smaller amounts of luteolin 3',7-O-diglucoside, quercetin 3-O-rutinoside (rutin), luteolin 7-O-rutinoside and luteolin 3'-O-glucoside. Simple linear regression analysis between the total polyphenol and antiradical activity values gave a low and statistically insignificant correlation (R² = 0.273, p > 0.05), suggesting that it is not the sheer amount of polyphenols that provides high antioxidant potency; instead, this potency is probably achieved through interactions among the various phenolic constituents.
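    A hedged sketch of fitting a second-order (quadratic) response-surface model by least squares, in the spirit of the design described above; the design points and yield values below are invented and do not come from the paper.

```python
# Fit a quadratic response surface y ~ linear + interaction + squared terms.
import numpy as np

# Columns: pH, time (h), ethanol fraction; response: total polyphenol yield
X = np.array([[2, 1, 0.2], [2, 5, 0.6], [6, 1, 0.6], [6, 5, 0.2],
              [4, 3, 0.4], [2, 3, 0.4], [6, 3, 0.4], [4, 1, 0.4],
              [4, 5, 0.4], [4, 3, 0.2], [4, 3, 0.6], [2, 1, 0.6]])
y = np.array([180, 245, 160, 150, 205, 230, 140, 175, 235, 170, 240, 220])

def quadratic_design(X):
    """Expand factors into intercept, linear, interaction and squared terms."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

coef, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
print(np.round(coef, 2))  # fitted second-order polynomial coefficients
```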

  1. Aqueous Two-phase Systems with Ultrasonic Extraction Used for Extracting Phenolic Compounds from Inonotus obliquus

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yan-xia; LIU Yu-bing; LIU Feng; ZHENG Wei-fa

    2013-01-01

    Objective: To optimize the extraction technology for obtaining the maximum yield of phenolic compounds (PC) from Inonotus obliquus by single-factor experiments and orthogonal array design methods, using aqueous two-phase systems combined with ultrasonic extraction. Methods: The ranges of the independent variables, namely the levels of acetone and ammonium sulfate and the ultrasonic time, were identified by a first set of single-factor experiments. The actual values of the independent variables, coded at four levels and three factors, were selected based on the results of the single-factor experiments. Subsequently, the levels of acetone and ammonium sulfate and the ultrasonic time were optimized using the orthogonal array method. Results: The optimum conditions for the extraction of PC were found to be 7.0 mL acetone and 5.5 mg ammonium sulfate, with an ultrasonic time of 5 min. Under these optimized conditions, the experimental maximum yield of PC was 37.8 mg/g, much higher than that of traditional ultrasonic extraction (UE, 29.0 mg/g), and the PC obtained by this method had stronger anti-oxidative activities than those obtained by the traditional UE method. Conclusion: These results indicate the suitability of the models developed and the success in optimizing the extraction conditions. This is an economical and efficient method for extracting polyphenols from I. obliquus.

  3. Mechanism of gold solvent extraction from aurocyanide solution by quaternary amines: models of extracting species based on hydrogen bonding

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The mechanism of gold solvent extraction from KAu(CN)2 solution was investigated by means of FTIR, EXAFS, ICP and radioactive tracer methods. Two extraction systems were studied, namely N263-tributyl phosphate (TBP)-n-dodecane and N263-iso-octanol-n-dodecane. High-resolution FTIR spectroscopy indicated that the CN stretching vibrations of the two extraction systems differed greatly. In order to interpret this significant difference in CN stretching vibrations, two models of the extracted species are proposed: supramolecular structures based on the formation of hydrogen bonds between Au(CN)2- and modifiers such as TBP and iso-octanol.

  4. Correlation between aggregation and extracting properties in solvent extraction systems: extraction of actinides (III) and lanthanides (III) by a malonamide in non acidic media

    Energy Technology Data Exchange (ETDEWEB)

    Meridiano, Y.; Berthon, L.; Lagrave, S.; Crozes, X.; Sorel, C.; Testard, F.; Zemb, T. [CEA Marcoule, DEN/DRCP/SCPS/LCSE, 30207Bagnols sur Ceze (France)

    2008-07-01

    The organic phases of the DIAMEX (Diamide Extraction) process, allowing the co-extraction of actinides(III) and lanthanides(III) from high level radioactive wastes using a malonamide extractant molecule (DMDOHEMA) diluted in alkanes, are investigated. The aim of this study is to establish a link between different structures/organizations of diamide extractants and their extracting properties towards An(III) and Ln(III) cations. It is demonstrated that diamide, which are amphiphilic molecules, are organized in different structures (monomers, reverse micelles, lamellar phases..). This study deals with the effect of the composition of the extracting system on the extracting and aggregation properties of the DMDOHEMA solutions. The effects of the extractant (DMDOHEMA diluted in n-heptane) and metal concentrations (for a given extractant concentration) from a LiNO{sub 3} aqueous phase are investigated at two scales: at the supra-molecular scale by characterizing the aggregation by vapor-pressure osmometry (VPO) and small angle neutron and X-ray scattering (SANS and SAXS) experiments, and at the molecular scale by quantifying the extracted solutes (metal nitrate and water) and by determining the stoichiometries of the extracted complexes by electro-spray mass spectrometry (ESI-MS). The extraction equilibria can then be modeled by two approaches: a classical approach in solvent extraction based on mass action laws to determine extraction equilibria and their associated thermodynamic constants, and a physical chemical approach which consists in considering the extracted ions as adsorbed on a specific available surface of the extractant molecule. Thus, the extraction equilibrium can be considered as a sum of Langmuir isotherms corresponding to the different states of aggregation. The resulting constants are representative of both extraction efficiency and organic phase structure. (authors)

  5. McClellan Air Force Base operable unit B, two-phase extraction system demonstration test, work implementation plan for McClellan AFB, California. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-03

    This document is an integrated demonstration and work plan that presents the technical approach for design, implementation, and testing of two-phase extraction as compared with pump and treat technology in Operable Unit B, investigative cluster IC1 at the McClellan Air Force Base. This work is being coordinated with Clean Sites under a cooperative agreement with EPA's Technology Innovation Office and Superfund Innovative Technology Evaluation Program.

  6. Extraction of fetal electrocardiogram (ECG) by extended state Kalman filtering and adaptive neuro-fuzzy inference system (ANFIS) based on single channel abdominal recording

    Indian Academy of Sciences (India)

    D Panigrahy; P K Sahu

    2015-06-01

    Fetal electrocardiogram (ECG) gives information about the health status of fetus and so, an early diagnosis of any cardiac defect before delivery increases the effectiveness of appropriate treatment. In this paper, authors investigate the use of adaptive neuro-fuzzy inference system (ANFIS) with extended Kalman filter for fetal ECG extraction from one ECG signal recorded at the abdominal areas of the mother’s skin. The abdominal ECG is considered to be composite as it contains both mother’s and fetus’ ECG signals. We use extended Kalman filter framework to estimate the maternal component from abdominal ECG. The maternal component in the abdominal ECG signal is a nonlinear transformed version of maternal ECG. ANFIS network has been used to identify this nonlinear relationship, and to align the estimated maternal ECG signal with the maternal component in the abdominal ECG signal. Thus, we extract the fetal ECG component by subtracting the aligned version of the estimated maternal ECG from the abdominal signal. Our results demonstrate the effectiveness of the proposed technique in extracting the fetal ECG component from abdominal signal at different noise levels. The proposed technique is also validated on the extraction of fetal ECG from both actual abdominal recordings and synthetic abdominal recording.

  7. Advanced integrated solvent extraction and ion exchange systems

    Energy Technology Data Exchange (ETDEWEB)

    Horwitz, P. [Argonne National Lab., IL (United States)

    1996-10-01

    Advanced integrated solvent extraction (SX) and ion exchange (IX) systems are a series of novel SX and IX processes that extract and recover uranium and transuranics (TRUs) (neptunium, plutonium, americium) and fission products {sup 90}Sr, {sup 99}Tc, and {sup 137}Cs from acidic high-level liquid waste and that sorb and recover {sup 90}Sr, {sup 99}Tc, and {sup 137}Cs from alkaline supernatant high-level waste. Each system is based on the use of new selective liquid extractants or chromatographic materials. The purpose of the integrated SX and IX processes is to minimize the quantity of waste that must be vitrified and buried in a deep geologic repository by producing raffinates (from SX) and effluent streams (from IX) that will meet the specifications of Class A low-level waste.

  8. A feature extraction technique based on character geometry for character recognition

    CERN Document Server

    Gaurav, Dinesh Dileep

    2012-01-01

    This paper describes a geometry-based technique for feature extraction applicable to segmentation-based word recognition systems. The proposed system extracts the geometric features of the character contour. These features are based on the basic line types that form the character skeleton. The system gives a feature vector as its output. The feature vectors so generated from a training set were then used to train a pattern recognition engine based on neural networks so that the system could be benchmarked.

  9. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Chunhua Li

    2017-01-01

    Full Text Available Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.

  10. A Block-Based Multi-Scale Background Extraction Algorithm

    Directory of Open Access Journals (Sweden)

    Seyed H. Davarpanah

    2010-01-01

    Full Text Available Problem statement: To extract moving objects, vision-based surveillance systems subtract the current image from a predefined background image. The efficiency of these systems mainly depends on the accuracy of the extracted background image, which should adapt to changes continuously. In addition, especially in real-time applications, the time complexity of this adaptation is a critical matter. Approach: In this study, a combination of blocking and multi-scale methods for extracting an adaptive background is presented. Because they are less sensitive to local movements, block-based techniques are well suited to handling the movements of non-stationary objects, especially in outdoor applications, and can reduce the effect of these objects on the extracted background. We also used the blocking method to intelligently select the regions to which temporal filtering has to be applied. In addition, an amended multi-scale algorithm is introduced. This hybrid algorithm combines nonparametric and parametric filters: it uses a nonparametric filter in the spatial domain to initialize two primary backgrounds, and two adapted two-dimensional filters are then used to extract the final background. Results: The qualitative and quantitative results of our experiments certify not only that the quality of the final extracted background is acceptable, but also that its time consumption is approximately half that of similar methods. Conclusion: Using multi-scale filtering and applying the filters only to selected non-overlapping blocks reduces the time consumption of the background extraction algorithm.

  11. Extracting the information backbone in online system.

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society and many solutions such as recommender system have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both of their effectiveness and efficiency.
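    A minimal sketch of the "less can be more" idea: drop a fraction of the user-object links by a time-aware rule and keep the rest as a backbone. The toy links, timestamps and removal fraction are assumptions; the paper's topology-aware criterion and the hybrid combination are not reproduced.

```python
# Keep only the most recent user-object links as a simple time-aware backbone.
links = [  # (user, object, timestamp)
    ("u1", "o1", 1), ("u1", "o2", 9), ("u2", "o1", 2),
    ("u2", "o3", 8), ("u3", "o2", 5), ("u3", "o3", 3),
]

def time_aware_backbone(links, keep_fraction=0.5):
    """Drop the oldest links; the survivors feed the recommender."""
    ranked = sorted(links, key=lambda l: l[2], reverse=True)
    return ranked[: max(1, int(len(ranked) * keep_fraction))]

print(time_aware_backbone(links))
```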

  12. Extracting the information backbone in online system

    CERN Document Server

    Zhang, Qian-Ming; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have mainly been dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while overlooking the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improve both of...

  13. Building Extraction from LIDAR Based Semantic Analysis

    Institute of Scientific and Technical Information of China (English)

    YU Jie; YANG Haiquan; TAN Ming; ZHANG Guoning

    2006-01-01

    Extraction of buildings from LIDAR data has been an active research field in recent years. A scheme for building detection and reconstruction from LIDAR data is presented with an object-oriented method which is based on the buildings' semantic rules. Two key steps are discussed: how to group the discrete LIDAR points into single objects and how to establish the buildings' semantic rules. In the end, the buildings are reconstructed in 3D form and three common parametric building models (flat, gabled, hipped) are implemented.

  14. Phenolic-compound-extraction systems for fruit and vegetable samples.

    Science.gov (United States)

    Garcia-Salas, Patricia; Morales-Soto, Aranzazu; Segura-Carretero, Antonio; Fernández-Gutiérrez, Alberto

    2010-12-03

    This paper reviews the phenolic-compound-extraction systems used to analyse fruit and vegetable samples over the last 10 years. Phenolic compounds are naturally occurring antioxidants, usually found in fruits and vegetables. Sample preparation for analytical studies is necessary to determine the polyphenolic composition in these matrices. The most widely used extraction system is liquid-liquid extraction (LLE), which is an inexpensive method since it involves the use of organic solvents, but it requires long extraction times, giving rise to possible extract degradation. Likewise, solid-phase extraction (SPE) can be used in liquid samples. Modern techniques, which have been replacing conventional ones, include: supercritical fluid extraction (SFE), pressurized liquid extraction (PLE), microwave-assisted extraction (MAE) and ultrasound-assisted extraction (UAE). These alternative techniques reduce considerably the use of solvents and accelerate the extraction process.

  15. Transfiguration of extracting mirror in synchrotron radiation system at SSRF

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The first extracting mirror is very important for synchrotron radiation monitor (SRM). The SRM system of SSRF (Shanghai Synchrotron Radiation Facility) should extract the visible light with low optical distortion. The analysis of SR power spectrum and heat transfiguration based on Matlab is introduced in this paper, which will be used in calibration. One beryllium mirror with water-cooling is used to transmit X-ray and reflect visible light to satisfy the measurement request. The existing system suffers from a dynamic problem in some beam physics study. The system includes optics, image acquisition and interferometers. One of the instruments is a digital camera providing the image of the beam transverse profile. The hardware configuration will be summarized. The synchrotron radiation measurement system has been in operation in SSRF for more than one year.

  16. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the  Deutsche Forschungsgemeinschaft (DFG) approved the  Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program.   Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance.  Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges.   Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  17. Health requirements for advanced coal extraction systems

    Science.gov (United States)

    Zimmerman, W. F.

    1980-01-01

    Health requirements were developed as long range goals for future advanced coal extraction systems which would be introduced into the market in the year 2000. The goal of the requirements is that underground coal miners work in an environment that is as close as possible to the working conditions of the general population, that they do not exceed mortality and morbidity rates resulting from lung diseases that are comparable to those of the general population, and that their working conditions comply as closely as possible to those of other industries as specified by OSHA regulations. A brief technique for evaluating whether proposed advanced systems meet these safety requirements is presented, as well as a discussion of the costs of respiratory disability compensation.

  18. Automated Beam Loss Monitoring System At Extraction From U-70

    CERN Document Server

    Afonin, A G; Baranov, V T; Gres, V N; Shaposhnikov, P A; Terekhov, V I; Uglekov, V Ya

    2004-01-01

    This paper presents a description of the new beam loss monitoring system built and commissioned to detect beam losses in the IHEP extraction area at 16 places of interest. It yields information about possible beam interception by extraction elements for each extraction mode. Multiple measurements with a resolution of 10 ms make it possible to study the dynamic processes during slow extraction. This system is a part of the U-70 Control System.

  19. Distributed Parallel Endmember Extraction of Hyperspectral Data Based on Spark

    Directory of Open Access Journals (Sweden)

    Zebin Wu

    2016-01-01

    Full Text Available Due to the increasing dimensionality and volume of remotely sensed hyperspectral data, the development of acceleration techniques for massive hyperspectral image analysis approaches is a very important challenge. Cloud computing offers many possibilities of distributed processing of hyperspectral datasets. This paper proposes a novel distributed parallel endmember extraction method based on iterative error analysis that utilizes cloud computing principles to efficiently process massive hyperspectral data. The proposed method takes advantage of technologies including the MapReduce programming model, the Hadoop Distributed File System (HDFS), and Apache Spark to realize distributed parallel implementation for hyperspectral endmember extraction, which significantly accelerates the computation of hyperspectral processing and provides high throughput access to large hyperspectral data. The experimental results, which are obtained by extracting endmembers of hyperspectral datasets on a cloud computing platform built on a cluster, demonstrate the effectiveness and computational efficiency of the proposed method.
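    A hedged PySpark sketch of one distributed step of an endmember search: score every pixel in parallel and keep the one farthest from the current reconstruction. It only illustrates the MapReduce pattern on Spark, not the paper's full iterative error analysis or its HDFS setup.

```python
# One map/reduce round of a distributed endmember candidate search on Spark.
import numpy as np
from pyspark import SparkContext

sc = SparkContext(appName="endmember-sketch")

pixels = [np.random.rand(50) for _ in range(10000)]   # toy hyperspectral pixels
current_estimate = np.mean(pixels, axis=0)            # crude current reconstruction
rdd = sc.parallelize(pixels, numSlices=8)

# Map: residual norm per pixel; Reduce: keep the pixel with the largest error.
candidate = rdd.map(lambda p: (float(np.linalg.norm(p - current_estimate)), p)) \
               .reduce(lambda a, b: a if a[0] >= b[0] else b)
print("next endmember candidate error:", candidate[0])
sc.stop()
```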

  20. RF-knockout Extraction System for the CNAO Synchrotron

    CERN Document Server

    Carmignani, Nicola; Serio, Mario; Balbinot, Giovanni; Bressi, Erminia; Caldara, Michele; Pullia, Marco; Bosser, Jacques; Venchi, Giuseppe

    2010-01-01

    The National Centre for Oncological Hadrontherapy (CNAO) is a centre in Italy for the treatment of patients affected by tumours with proton and carbon ions beams accelerated in a synchrotron. The synchrotron extraction method is based on the use of a betatron core. This work aims to verify, through a theoretical study and a simulation, the possibility of using the RF-knockout extraction method exploiting the existing hardware. A simulation program has been written to simulate the extraction system of the synchrotron with the purpose to define the parameters of the radio frequency. Two types of radio frequencies have been compared in order to obtain a constant spill with the minimum ripple: a carrier wave with a frequency and amplitude modulation, and a gaussian narrow band noise modulated in amplitude. Results of the simulation and considerations on the kicker characteristics are presented

  1. Formulation and in vitro release evaluation of newly synthesized palm kernel oil esters-based nanoemulsion delivery system for 30% ethanolic dried extract derived from local Phyllanthus urinaria for skin antiaging

    Directory of Open Access Journals (Sweden)

    Mahdi ES

    2011-10-01

    Full Text Available Elrashid Saleh Mahdi1, Azmin Mohd Noor1, Mohamed Hameem Sakeena1, Ghassan Z Abdullah1, Muthanna F Abdulkarim1, Munavvar Abdul Sattar2 1Department of Pharmaceutical Technology, 2Department of Physiology, School of Pharmaceutical Sciences, Universiti Sains Malaysia, Pulau Pinang, Malaysia Background: Recently there has been a remarkable surge of interest in natural products and their applications in the cosmetic industry. Topical delivery of antioxidants from natural sources is one of the approaches used to reverse signs of skin aging. The aim of this research was to develop a nanoemulsion cream for topical delivery of 30% ethanolic extract derived from local Phyllanthus urinaria (P. urinaria) for skin antiaging. Methods: Palm kernel oil esters (PKOEs)-based nanoemulsions were loaded with P. urinaria extract using a spontaneous method and characterized with respect to particle size, zeta potential, and rheological properties. The release profile of the extract was evaluated using in vitro Franz diffusion cells with an artificial membrane, and the antioxidant activity of the released extract was evaluated using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) method. Results: Formulation F12 consisted of (wt/wt) 0.05% P. urinaria extract, 1% cetyl alcohol, 0.5% glyceryl monostearate, 12% PKOEs, and 27% Tween® 80/Span® 80 (9/1) with a hydrophilic lipophilic balance of 13.9, and a 59.5% phosphate buffer system at pH 7.4. Formulation F36 comprised 0.05% P. urinaria extract, 1% cetyl alcohol, 1% glyceryl monostearate, 14% PKOEs, 28% Tween® 80/Span® 80 (9/1) with a hydrophilic lipophilic balance of 13.9, and 56% phosphate buffer system at pH 7.4, with shear thinning and thixotropy. The droplet size of F12 and F36 was 30.74 nm and 35.71 nm, respectively, and their nanosizes were confirmed by transmission electron microscopy images. Thereafter, 51.30% and 51.02% of the loaded extract was released from F12 and F36 through an artificial cellulose membrane

  2. Knowledge-Driven Event Extraction in Russian: Corpus-Based Linguistic Resources.

    Science.gov (United States)

    Solovyev, Valery; Ivanov, Vladimir

    2016-01-01

    Automatic event extraction from text is an important step in knowledge acquisition and knowledge base population. Manual work in the development of an extraction system is indispensable, either in corpus annotation or in creating vocabularies and patterns for a knowledge-based system. Recent works have focused on the adaptation of existing systems (for extraction from English texts) to new domains. Event extraction in other languages was not studied due to the lack of resources and algorithms necessary for natural language processing. In this paper we define a set of linguistic resources that are necessary for developing a knowledge-based event extraction system in Russian: a vocabulary of subordination models, a vocabulary of event triggers, and a vocabulary of Frame Elements that are basic building blocks for semantic patterns. We propose a set of methods for the creation of such vocabularies in Russian and other languages using the Google Books NGram Corpus. The methods are evaluated in the development of an event extraction system for Russian.

  3. FEATURE EXTRACTION FOR EMG BASED PROSTHESES CONTROL

    Directory of Open Access Journals (Sweden)

    R. Aishwarya

    2013-01-01

    Full Text Available The control of a prosthetic limb would be more effective if it is based on Surface Electromyogram (SEMG) signals from remnant muscles. The analysis of SEMG signals depends on a number of factors, such as amplitude as well as time- and frequency-domain properties. Time series analysis using the Auto Regressive (AR) model and mean frequency, which is tolerant to white Gaussian noise, are used as feature extraction techniques. The EMG histogram is used as another feature vector that was seen to give more distinct classification. The work was done with the SEMG dataset obtained from the NINAPRO DATABASE, a resource for the biorobotics community. Eight classes of hand movements (hand open, hand close, wrist extension, wrist flexion, pointing index, ulnar deviation, thumbs up, thumb opposite to little finger) are taken into consideration and feature vectors are extracted. The feature vectors can be given to an artificial neural network for further classification in controlling the prosthetic arm, which is not dealt with in this paper.
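
    As a rough illustration of the three feature families named in the abstract (AR coefficients, mean frequency, and the EMG histogram), the following sketch computes them for a single SEMG window with NumPy and SciPy; the window length, sampling rate, AR order and bin count are assumed values, not those used in the paper.

```python
# Hedged sketch of SEMG feature extraction for one analysis window `x`.
import numpy as np
from scipy.signal import welch

def ar_coefficients(x, order=4):
    """Least-squares AR fit: predict x[n] from the previous `order` samples."""
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def mean_frequency(x, fs=2000.0):
    """Spectral centroid of the Welch power spectral density."""
    f, pxx = welch(x, fs=fs)
    return float(np.sum(f * pxx) / np.sum(pxx))

def emg_histogram(x, bins=9):
    """Normalized amplitude histogram used as an additional feature vector."""
    counts, _ = np.histogram(x, bins=bins)
    return counts / counts.sum()

window = np.random.randn(400)   # stand-in for one SEMG window
features = np.hstack([ar_coefficients(window), mean_frequency(window), emg_histogram(window)])
print(features.shape)
```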

  4. Mechanism of gold solvent extraction from aurocyanide solution by quaternary amines: models of extracting species based on hydrogen bonding

    Institute of Scientific and Technical Information of China (English)

    马刚; 闫文飞; 陈景; 严纯华; 高宏成; 周维金; 施鼐; 吴谨光; 徐光宪; 黄昆; 余建民; 崔宁

    2000-01-01

    The mechanism of gold solvent extraction from KAu(CN)2 solution was investigated by means of FTIR, EXAFS, ICP and radioactive tracer methods. Two extraction systems were studied, namely N263-tributyl phosphate (TBP)-n-dodecane and N263-iso-octanol-n-dodecane. High-resolution FTIR spectroscopy indicated that the CN stretching vibrations of the two extraction systems differed greatly. In order to interpret this significant difference in CN stretching vibrations, two extracting-species models are proposed: supramolecular structures based on the formation of hydrogen bonds between Au(CN)2- and modifiers such as TBP and iso-octanol.

  5. Web Template Extraction Based on Hyperlink Analysis

    Directory of Open Access Journals (Sweden)

    Julián Alarte

    2015-01-01

    Full Text Available Web templates are one of the main development resources for website engineers. Templates allow them to increase productivity by plugging content into already formatted and prepared pagelets. For the final user, templates are also useful, because they provide uniformity and a common look and feel for all webpages. However, from the point of view of crawlers and indexers, templates are an important problem, because templates usually contain irrelevant information such as advertisements, menus, and banners. Processing and storing this information is likely to lead to a waste of resources (storage space, bandwidth, etc.). It has been measured that templates represent between 40% and 50% of the data on the Web. Therefore, identifying templates is essential for indexing tasks. In this work we propose a novel method for automatic template extraction that is based on similarity analysis between the DOM trees of a collection of webpages that are detected using menu information. Our implementation and experiments demonstrate the usefulness of the technique.

  6. Extracting the information backbone in online system.

    Directory of Open Access Journals (Sweden)

    Qian-Ming Zhang

    Full Text Available Information overload is a serious problem in modern society, and many solutions, such as recommender systems, have been proposed to filter out irrelevant information. In the literature, researchers have mainly been dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms, while they have overlooked the influence of the topology of the online user-object bipartite networks. In this paper, we find that some of the information provided by the bipartite networks is not only redundant but also misleading. With such a "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From a practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency.

  7. Extracting the Information Backbone in Online System

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society, and many solutions, such as recommender systems, have been proposed to filter out irrelevant information. In the literature, researchers have mainly been dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms, while they have overlooked the influence of the topology of the online user-object bipartite networks. In this paper, we find that some of the information provided by the bipartite networks is not only redundant but also misleading. With such a “less can be more” feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From a practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency. PMID:23690946

  8. Analysis of entropy extraction efficiencies in random number generation systems

    Science.gov (United States)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.

  9. Music snippet extraction via melody-based repeated pattern discovery

    Institute of Scientific and Technical Information of China (English)

    XU JiePing; ZHAO Yang; CHEN Zhe; LIU ZiLi

    2009-01-01

    In this paper, we present a complete set of procedures to automatically extract a music snippet, defined as the most representative or the highlighted excerpt of a music clip. We first generate a modified and compact similarity matrix based on selected features and distance metrics, and then several improved techniques for music repeated pattern discovery are utilized because a music snippet is usually a part of the repeated melody, main theme or chorus. During the process, redundant and wrongly detected patterns are discarded, boundaries are corrected using beat information, and final clusters are also further sorted according to the occurrence frequency and energy information. Subsequently, following our methods, we designed a music snippet extraction system which allows users to detect snippets. Experiments performed on the system show the superiority of our proposed approach.

  10. SPEECH/MUSIC CLASSIFICATION USING WAVELET BASED FEATURE EXTRACTION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Thiruvengatanadhan Ramalingam

    2014-01-01

    Full Text Available Audio classification is a fundamental step in handling the rapid growth in audio data volume. Due to the increasing size of multimedia sources, speech and music classification is one of the most important issues for multimedia information retrieval. In this work a speech/music discrimination system is developed which utilizes the Discrete Wavelet Transform (DWT) as the acoustic feature. Multi-resolution analysis is a significant statistical way to extract features from the input signal, and in this study a method is deployed to model the extracted wavelet features. Support Vector Machines (SVM) are based on the principle of structural risk minimization. SVM is applied to classify audio into its classes, namely speech and music, by learning from training data. The proposed method then extends the application of Gaussian Mixture Models (GMM) to estimate the probability density function using maximum likelihood decision methods. The system shows significant results with an accuracy of 94.5%.
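
    A minimal sketch of the kind of pipeline described here, assuming PyWavelets and scikit-learn are available: wavelet sub-band energies serve as multi-resolution features and an RBF-kernel SVM makes the speech/music decision. The frame length, wavelet choice and random stand-in data are assumptions, and the GMM stage is omitted.

```python
# Hedged sketch: DWT sub-band energy features + SVM classifier for speech/music frames.
import numpy as np
import pywt
from sklearn.svm import SVC

def dwt_features(frame, wavelet="db4", level=4):
    """Average energy of each wavelet sub-band as a compact feature vector."""
    coeffs = pywt.wavedec(frame, wavelet, level=level)
    return np.array([np.sum(c ** 2) / len(c) for c in coeffs])

signals = [np.random.randn(1024) for _ in range(200)]   # stand-in audio frames
labels = np.random.randint(0, 2, size=200)               # 0 = speech, 1 = music (stand-in labels)

X = np.vstack([dwt_features(s) for s in signals])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:5]))
```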

  11. Feature Extraction based Face Recognition, Gender and Age Classification

    Directory of Open Access Journals (Sweden)

    Venugopal K R

    2010-01-01

    Full Text Available Face recognition systems for personal identification normally attain good accuracy with large training sets. In this paper, we propose the Feature Extraction based Face Recognition, Gender and Age Classification (FEBFRGAC) algorithm, which requires only small training sets and yields good results even with one image per person. The process involves three stages: pre-processing, feature extraction and classification. The geometric features of facial images, such as eyes, nose and mouth, are located using the Canny edge operator and face recognition is performed. Based on the texture and shape information, gender and age classification is done using the posteriori class probability and an artificial neural network, respectively. It is observed that face recognition accuracy is 100%, while gender and age classification accuracies are around 98% and 94%, respectively.

  12. Safety evaluation methodology for advanced coal extraction systems

    Science.gov (United States)

    Zimmerman, W. F.

    1981-01-01

    Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.

  13. Tea extracts antioxidative potential in emulsified lipid systems

    Directory of Open Access Journals (Sweden)

    Anna Gramza-Michałowska

    2008-09-01

    Full Text Available Tea leaf (Camellia sinensis L.) extracts are a source of polyphenols, i.e. antioxidant components. Research has shown that tea extracts can be used in food technology to improve the stability of the lipids they are added to. The aim of the research was to compare the activity of different tea extracts in an emulsified lipid system. The present research examined aqueous and ethanol extracts of different teas: white, green, yellow, oolong and black. Different tea extract concentrations were chosen in order to evaluate the most potent addition level. Linoleic acid oxidative stability was measured by monitoring the production of linoleic acid conjugated dienes. Emulsions with additives were incubated for 19 hours at 37°C in darkness. Results showed different antioxidant activity of the tea extracts, dependent on their concentration in the examined system. The highest antioxidant activity, comparable to BHT and rosemary extract, was found in the lipid sample with the addition of yellow tea ethanol extract.

  14. PCA Fault Feature Extraction in Complex Electric Power Systems

    Directory of Open Access Journals (Sweden)

    ZHANG, J.

    2010-08-01

    Full Text Available The electric power system is one of the most complex artificial systems in the world. Its complexity is determined by characteristics of its constitution, configuration, operation, organization, etc. Faults in an electric power system cannot be completely avoided. When an electric power system passes from the normal state to a failure or abnormal state, its electric quantities (currents, voltages, angles, etc.) may change significantly. Our research indicates that the variable with the biggest coefficient in a principal component usually corresponds to the fault. Therefore, using real-time measurements from phasor measurement units and principal component analysis, we have successfully extracted the distinct features of the fault component. Of course, because of the complexity of different types of faults in electric power systems, there still exist substantial problems that need close and intensive study.

  15. Web text corpus extraction system for linguistic tasks

    Directory of Open Access Journals (Sweden)

    Héctor Fabio Cadavid Rengifo

    2010-05-01

    Full Text Available Internet content, used as a text corpus for natural language learning, offers important characteristics for such a task, like its huge volume, being permanently up-to-date with linguistic variants, and having low time and resource costs compared with the traditional way text is built for natural language machine learning tasks. This paper describes a system for the automatic extraction of large bodies of text from the Internet as a valuable tool for such learning tasks. A concurrent programming-based, hardware-use optimisation strategy significantly improving extraction performance is also presented. The strategies incorporated into the system for maximising hardware resource exploitation, thereby reducing extraction time, are presented, as are extendibility (supporting digital-content formats) and adaptability (regarding how the system cleanses content to obtain pure natural language samples). The experimental results obtained after processing one of the biggest Spanish domains on the Internet (i.e. es.wikipedia.org) are presented. These results are used to draw initial conclusions about the validity and applicability of corpora directly extracted from the Internet as morphological or syntactical learning input.

  16. A review on solid phase extraction of actinides and lanthanides with amide based extractants.

    Science.gov (United States)

    Ansari, Seraj A; Mohapatra, Prasanta K

    2017-05-26

    Solid phase extraction is gaining attention from separation scientists due to its high chromatographic utility. Though both grafted and impregnated forms of solid phase extraction resins are popular, the latter is easy to make by impregnating a given organic extractant onto an inert solid support. Solid phase extraction on an impregnated support, also known as extraction chromatography, combines the advantages of liquid-liquid extraction and ion exchange chromatography. On the flip side, the impregnated extraction chromatographic resins are less stable against leaching of the organic extractant from the pores of the support material. Grafted resins, on the other hand, have a higher stability, which allows their prolonged use. The goal of this article is a brief literature review of reported actinide and lanthanide separation methods based on solid phase extractants of both types, i.e., (i) ligand impregnation on the solid support or (ii) ligand-functionalized polymers (chemically bonded resins). Though the literature survey reveals an enormous volume of studies on the extraction chromatographic separation of actinides and lanthanides using several extractants, the focus of the present article is limited to work carried out with amide based ligands, viz. monoamides, diamides and diglycolamides. The emphasis will be on reported applied experimental results rather than on data pertaining to fundamental metal complexation. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Analogy between gambling and measurement-based work extraction

    Science.gov (United States)

    Vinkler, Dror A.; Permuter, Haim H.; Merhav, Neri

    2016-04-01

    In information theory, one area of interest is gambling, where mutual information characterizes the maximal gain in wealth growth rate due to knowledge of side information; the betting strategy that achieves this maximum is named the Kelly strategy. In the field of physics, it was recently shown that mutual information can characterize the maximal amount of work that can be extracted from a single heat bath using measurement-based control protocols, i.e. using ‘information engines’. However, to the best of our knowledge, no relation between gambling and information engines has been presented before. In this paper, we briefly review the two concepts and then demonstrate an analogy between gambling, where bits are converted into wealth, and information engines, where bits representing measurements are converted into energy. From this analogy follows an extension of gambling to the continuous-valued case, which is shown to be useful for investments in currency exchange rates or in the stock market using options. Moreover, the analogy enables us to use well-known methods and results from one field to solve problems in the other. We present three such cases: maximum work extraction when the probability distributions governing the system and measurements are unknown, work extraction when some energy is lost in each cycle, e.g. due to friction, and an analysis of systems with memory. In all three cases, the analogy enables us to use known results in order to obtain new ones.

  18. Kernel-Based Learning for Domain-Specific Relation Extraction

    Science.gov (United States)

    Basili, Roberto; Giannone, Cristina; Del Vescovo, Chiara; Moschitti, Alessandro; Naggar, Paolo

    In a specific process of business intelligence, i.e. investigation of organized crime, empirical language processing technologies can play a crucial role. The analysis of transcriptions of investigative activities, such as police interrogations, for the recognition and storage of complex relations among people and locations is a very difficult and time-consuming task, ultimately based on pools of experts. We discuss here an inductive relation extraction platform that opens the way to much cheaper and consistent workflows. The presented empirical investigation shows that accurate results, comparable to those of the expert teams, can be achieved, and parametrization allows fine-tuning the system behavior to fit domain-specific requirements.

  19. Financial Information Extraction Using Pre-defined and User-definable Templates in the LOLITA System

    OpenAIRE

    Costantino, Marco; Morgan, Richard G.; Collingham, Russell J.

    1996-01-01

    This paper addresses the issue of information extraction in the financial domain within the framework of a large Natural Language Processing system: LOLITA. The LOLITA system, Large-scale Object-based Linguistic Interactor Translator and Analyser, is a general purpose natural language processing system. Different kinds of applications have been built around the system's core. One of these is the financial information extraction application, which has been designed in close contact with expert...

  20. CEMS using hot wet extractive method based on DOAS

    Science.gov (United States)

    Sun, Bo; Zhang, Chi; Sun, Changku

    2011-11-01

    A continuous emission monitoring system (CEMS) using a hot wet extractive method based on differential optical absorption spectroscopy (DOAS) is designed. The developed system is applied to retrieving the concentration of SO2 and NOx in flue gas on-site. The flue gas is carried along a heated sample line into the sample pool at a constant temperature above the dew point. In this way, the adverse impact of water vapor on measurement accuracy is greatly reduced, and on-line calibration is implemented. The flue gas is then discharged from the sample pool after the measuring process is complete. The on-site applicability of the system is enhanced by using a Programmable Logic Controller (PLC) to control each valve in the system during the measuring and on-line calibration processes. The concentration retrieving method used in the system is based on the partial least squares (PLS) regression nonlinear method. The relationship between the known concentrations and the differential absorption features gathered by the PLS nonlinear method can be determined after the on-line calibration process. The concentration measurement of SO2 and NOx can then be easily implemented according to this relationship. The concentration retrieving method can identify the information and noise effectively, which improves the measuring accuracy of the system. SO2 at four different concentrations was measured by the system under laboratory conditions. The results showed that the full-scale error of this system is less than 2% FS.
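
    The calibration step described above can be pictured with a small PLS sketch, assuming scikit-learn's PLSRegression and random stand-in spectra; the number of latent components, wavelength pixels and concentration range are illustrative, not the system's actual settings.

```python
# Hedged sketch: PLS regression relating differential absorption spectra to gas concentration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

spectra = np.random.rand(40, 300)        # 40 calibration spectra x 300 wavelength pixels (stand-in data)
conc = np.random.rand(40, 1) * 500.0     # known SO2 concentrations (stand-in data)

pls = PLSRegression(n_components=5)
pls.fit(spectra, conc)                   # calibration: relate spectral features to concentration

unknown = np.random.rand(1, 300)         # one on-line measurement
print(pls.predict(unknown))              # retrieved concentration
```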

  1. Simulation of ion beam extraction and focusing system

    Institute of Scientific and Technical Information of China (English)

    B.A.Soliman; M.M.Abdelrahman; A.G.Helal; F.W.Abdelsalam

    2011-01-01

    The characteristics of ion beam extraction and focusing to as small a volume as possible were investigated with the aid of the computer code SIMION 3D version 7. This code has been used to evaluate the extraction characteristics (accel-decel system) to generate an ion beam with low beam emittance and high brightness. The simulation process provides a good basis for optimizing the extraction and focusing system so that the ion beam is transported to the required target without losses. Also, a simulation model for the extraction system of the ion source was used to describe the possible plasma boundary curvatures during ion extraction that may be affected by a change in the extraction potential at a constant plasma density meniscus.

  2. DNA extraction of birch leaves by improved CTAB method and optimization of its ISSR system

    Institute of Scientific and Technical Information of China (English)

    PAN hua; YANG Chuan-ping; WEI Zhi-gang; JIANG Jing

    2006-01-01

    The basic CTAB method of DNA extraction was improved into a multi-time STE-CTAB extraction method and used to extract the DNA of birch leaves in this experiment. Results showed that the improved method is suitable not only for genomic DNA extraction of birch but also for that of other plants. The purity of genomic DNA extracted by the multi-time STE-CTAB extraction method is higher than that obtained by the one-time STE-CTAB method, and it does not require RNase treatment. The factors influencing the ISSR system were explored based on the genomic DNA of birch extracted by the two methods. The optimal conditions for the ISSR system were determined as follows: cycles of denaturation for 30 s at 94 °C, annealing for 30 s at 51 °C, and extension for 30 s at 72 °C, with a final 7 min extension at 72 °C.

  3. Audio enabled information extraction system for cricket and hockey domains

    CERN Document Server

    Saraswathi, S; B., Sai Vamsi Krishna; S, Suresh Reddy

    2010-01-01

    The proposed system aims at the retrieval of summarized information from documents collected from a web-based search engine according to user queries related to the cricket and hockey domains. The system is designed to take voice commands as keywords for the search. The parts of speech in the query are extracted using a natural language extractor for English. Based on the keywords, the search is categorized into two types: 1. Concept-wise search - information relevant to the query is retrieved based on the keywords and the concept words related to them; the retrieved information is summarized using a probabilistic approach and a weighted means algorithm. 2. Keyword search - extracts the result relevant to the query from the highly ranked documents returned by the search engine. The relevant search results are retrieved and then keywords are used for the summarizing part. During summarization it follows the weighted and probabilistic approaches in order to identify the data comparable to the k...

  4. Comparison of Extraction and Non-extraction Orthodontic Treatment using the Objective Grading System

    OpenAIRE

    N. Farhadian; AF. Miresmaeili; MK. Soltani

    2005-01-01

    Statement of Problem: The extraction versus non-extraction debate is almost as old as the advent of orthodontic practice and up to now, this dilemma remains. Recently, the American Board of Orthodontics (ABO) has developed a method by the name of Objective Grading System (OGS) in order to evaluate the results of orthodontic treatment.Aim: The aim of the present study was to evaluate and compare the patients’ final occlusion after extraction and non-extraction therapy using the OGS.Materials a...

  5. Object Extraction Based on Evolutionary Morphological Processing

    Institute of Scientific and Technical Information of China (English)

    LI Bin; PAN Li

    2004-01-01

    This paper introduces a novel technique for object detection using genetic algorithms and morphological processing. The method employs a kind of object-oriented structuring element, which is derived by genetic algorithms. The population of morphological filters is iteratively evaluated according to a statistical performance index corresponding to object extraction ability, and evolves into an optimal structuring element using the evolution principles of genetic search. Experimental results of road extraction from high resolution satellite images are presented to illustrate the merit and feasibility of the proposed method.

  6. HIGH POWER FAST KICKER SYSTEM FOR SNS BEAM EXTRACTION.

    Energy Technology Data Exchange (ETDEWEB)

    ZHANG,W.; SANDBERG,J.; TSOUPAS,N.; MI,J.; LAMBIASE,R.; LOCKEY,R.; PAI,C.; TUOZZOLO,J.; NEHRING,T.; WARBURTON,D.

    2002-06-30

    A Blumlein-topology-based, high peak power, high repetition rate, and low beam impedance fast extraction kicker system for the ORNL Spallation Neutron Source (SNS) is being developed at Brookhaven National Laboratory. The large magnet window size, large deflecting angle, low beam impedance termination and fast deflecting field rise time demand a very strong pulsed power source to drive the SNS extraction fast kicker magnet. The system consists of fourteen high voltage modulators and fourteen lumped kicker magnet sections. All modulators will be located in a service building outside the beam tunnel, a revised design requirement adopted in mid-2000. The high current pulses generated by the high power modulators will be delivered through high voltage pulsed transmission cables to each kicker magnet section. The designed output capacity of this system is in the multiple-GVA range. Its first article modulator has been constructed and is being tested. In this paper, we present the system overview, project status and the advantages of this new conceptual design.

  7. Modeling of the thyme: Liquid carbon dioxide extraction system

    Directory of Open Access Journals (Sweden)

    Zeković Zoran P.

    2003-01-01

    Full Text Available The extraction of thyme (Thymus vulgaris L.) by liquid carbon dioxide was investigated. The obtained extracts were analyzed by HPLC and GC-MS methods, and their composition was compared with that of the essential oil obtained by steam distillation and by supercritical carbon dioxide at 100 bar and 40°C. To model extraction in the thyme-liquid carbon dioxide system, we used the Reverchon-Sesti Osseo equation, as well as our modified equation.

  8. Comparison of Extraction and Non-extraction Orthodontic Treatment using the Objective Grading System

    Directory of Open Access Journals (Sweden)

    N. Farhadian

    2005-09-01

    Full Text Available Statement of Problem: The extraction versus non-extraction debate is almost as old as the advent of orthodontic practice and, up to now, this dilemma remains. Recently, the American Board of Orthodontics (ABO) has developed a method named the Objective Grading System (OGS) in order to evaluate the results of orthodontic treatment. Aim: The aim of the present study was to evaluate and compare the patients' final occlusion after extraction and non-extraction therapy using the OGS. Materials and Methods: Sixty sex-matched cases with an age range of 15-20 years were selected and evenly divided into 2 groups as follows: 30 patients were treated by extraction of 4 premolars and 30 received non-extraction treatment. All patients had class I malocclusion before treatment and were well treated with the standard edgewise system in a private clinic. With the aid of an ABO measuring gauge, 8 parameters of occlusion were each measured 3 times. Reproducibility of the measurements was evaluated using the Phi correlation coefficient, and the total OGS scores of the two groups were compared using Levene's test and Student's t-test with the significance level at 95%. Results: The mean OGS scores were significantly more negative in the non-extraction group (-6.58 ± 8.63) as compared to the extraction group (-28.65 ± 6.67), p < 0.004. Acceptable occlusion was observed in 73.4% of the extraction and 43.4% of the non-extraction cases. Conclusion: In this study, according to the ABO grading system (OGS), the final occlusion of patients treated with extraction seemed more acceptable than that of non-extraction cases.

  9. Rough Set Theory Based Approach for Fault Diagnosis Rule Extraction of Distribution System

    Institute of Scientific and Technical Information of China (English)

    周永勇; 周湶; 刘佳宾

    2008-01-01

    As the first step of service restoration in a distribution system, rapid fault diagnosis is a significant task for reducing power outage time, decreasing outage loss, and subsequently improving service reliability and safety. This paper analyzes a fault diagnosis approach using rough set theory, in which reducing the decision table of the data set is the main computation-intensive task. Aiming at this reduction problem, a heuristic reduction algorithm based on attribute length and frequency is proposed. At the same time, the corresponding value reduction method is proposed in order to complete the reduction and extract diagnosis rules. Meanwhile, a Euclidean matching method is introduced to solve conflicts among the extracted rules when some information is missing. The principle of the whole algorithm is clear, and the diagnostic rules distilled from the reduction are concise. Moreover, it needs less calculation on the specific discernibility matrix and thus avoids the corresponding NP-hard problem. The whole process is realized by MATLAB programming. A simulation example shows that the method has a fast calculation speed, and the extracted rules can reflect the characteristics of faults in a concise form. The rule database, formed from different reductions of the decision table, can diagnose single faults and multiple faults efficiently, and gives satisfactory results even when the existing information is incomplete. The proposed method has good error-tolerance capability and potential for on-line fault diagnosis.
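
    A toy sketch of the heuristic idea of ranking attributes by how often they discern object pairs with different decisions (attribute frequency) is shown below; the decision table, the greedy loop and the helper name are illustrative assumptions, not the paper's MATLAB implementation.

```python
# Hedged sketch: greedy attribute reduction driven by discernibility frequency.
import numpy as np
from itertools import combinations

def heuristic_reduct(X, y):
    """Greedily pick attributes that discern the most object pairs with different decisions."""
    n_attrs = X.shape[1]
    pairs = [(i, j) for i, j in combinations(range(len(y)), 2) if y[i] != y[j]]
    reduct, undiscerned = [], set(range(len(pairs)))
    while undiscerned:
        counts = {a: sum(X[pairs[p][0], a] != X[pairs[p][1], a] for p in undiscerned)
                  for a in range(n_attrs) if a not in reduct}
        if not counts or max(counts.values()) == 0:
            break
        best = max(counts, key=counts.get)            # most frequently discerning attribute
        reduct.append(best)
        undiscerned = {p for p in undiscerned
                       if X[pairs[p][0], best] == X[pairs[p][1], best]}
    return reduct

# toy decision table: rows = fault cases, columns = protection/breaker signals, y = fault type
X = np.array([[1, 0, 1, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 1, 0, 1]])
y = np.array([1, 1, 0, 0])
print(heuristic_reduct(X, y))
```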

  10. An integrated one-step system to extract, analyze and annotate all relevant information from image-based cell screening of chemical libraries.

    Science.gov (United States)

    Rabal, Obdulia; Link, Wolfgang; Serelde, Beatriz G; Bischoff, James R; Oyarzabal, Julen

    2010-04-01

    Here we report the development and validation of a complete solution to manage and analyze the data produced by image-based phenotypic screening campaigns of small-molecule libraries. In one step, initial crude images are analyzed for multiple cytological features, statistical analysis is performed, and molecules that produce the desired phenotypic profile are identified. A naïve Bayes classifier, integrating the chemical and phenotypic spaces, is built and utilized during the process to assess those images initially classified as "fuzzy" (an automated iterative feedback tuning). Simultaneously, all this information is directly annotated in a relational database containing the chemical data. This novel, fully automated method was validated by conducting a re-analysis of results from a high-content screening campaign involving 33,992 molecules used to identify inhibitors of the PI3K/Akt signaling pathway. Ninety-two percent of the confirmed hits identified by the conventional multistep analysis method were identified using this integrated one-step system, as well as 40 new hits (14.9% of the total) that were originally false negatives. Ninety-six percent of true negatives were properly recognized as well. Web-based access to the database, with customizable data retrieval and visualization tools, facilitates the posterior analysis of annotated cytological features, which allows identification of additional phenotypic profiles; thus, further analysis of the original crude images is not required.
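
    The classifier idea of joining the chemical and phenotypic spaces can be illustrated with a small scikit-learn sketch; the feature dimensions, the Gaussian naive Bayes choice and the random stand-in data are assumptions rather than the authors' actual pipeline.

```python
# Hedged sketch: a naive Bayes model trained on concatenated chemical + phenotypic features,
# then used to re-score a well whose image analysis was ambiguous ("fuzzy").
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
chemical = rng.random((500, 16))          # e.g. fingerprint / physicochemical descriptors
phenotypic = rng.random((500, 12))        # e.g. per-well cytological features
hit = rng.integers(0, 2, size=500)        # confident hit / non-hit calls from the first pass

X = np.hstack([chemical, phenotypic])     # integrate chemical and phenotypic spaces
clf = GaussianNB().fit(X, hit)

fuzzy_well = rng.random((1, 28))          # a "fuzzy" well to be resolved
print(clf.predict_proba(fuzzy_well))      # posterior used to resolve the fuzzy call
```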

  11. New Beam Profile Monitoring System At Extraction From U-70

    CERN Document Server

    Afonin, A G; Gorlov, V N; Gres, V N; Sokolov, S V; Terekhov, V I

    2004-01-01

    In the course of upgrading the instrumentation of extracted beams at IHEP, a new beam profile monitoring system has been developed and commissioned. It incorporates many improvements towards a lower amount of monitor substance placed in the beam path, lower cable expenses, higher radiation and noise resistance, and a wider dynamic range. New circuitry allows one to measure profiles separately for each extraction in one machine cycle. The system has the capability for multiple measurements of the beam profile during slow extraction for exploring dynamic effects. The paper describes the hardware and software of the system.

  12. Integrated Automation System for Rare Earth Countercurrent Extraction Process

    Institute of Scientific and Technical Information of China (English)

    柴天佑; 杨辉

    2004-01-01

    The low automation level in industrial rare-earth extraction processes results in high production cost, inconsistent product quality and great consumption of resources in China. An integrated automation system for the rare earth extraction process is proposed to realize optimal product indices, such as product purity, recycle rate and output. The optimal control strategy for the output component, and the structure and function of the two-grade integrated automation system composed of the process management grade and the process control grade, are discussed. This system has been successfully applied to a HAB yttrium extraction production process and was found to provide optimal control, optimal operation, optimal management and remarkable benefits.

  13. Calculation Of Extraction Optics For Ion System With Plazma Emitter

    CERN Document Server

    Frolov, B A

    2004-01-01

    A 2-D code for simulating the ion optics system for positive ion extraction from a plasma source is described. An example calculation of 100 kV optics for the IHEP extraction ion gun is presented. The trajectories of particles and emittance plots are given. The aberrations strongly influence the ion optics for the considered geometry.

  14. Road marking features extraction using the VIAPIX® system

    Science.gov (United States)

    Kaddah, W.; Ouerhani, Y.; Alfalou, A.; Desthieux, M.; Brosseau, C.; Gutierrez, C.

    2016-07-01

    Precise extraction of road marking features is a critical task for autonomous urban driving, augmented driver assistance, and robotics technologies. In this study, we consider an autonomous system allowing lane detection for marked urban roads and analysis of their features. The task is to georeference road markings from images obtained using the VIAPIX® system. Based on inverse perspective mapping and color segmentation to detect all white objects existing on the road, the present algorithm enables us to examine these images automatically and rapidly and also to obtain information on road markings, their surface conditions, and their georeferencing. The algorithm detects all road markings and identifies some of them by making use of a phase-only correlation filter (POF). We illustrate this algorithm and its robustness by applying it to a variety of relevant scenarios.
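
    The phase-only correlation matching step can be sketched in a few lines of NumPy, assuming the scene and template are equally sized grayscale arrays; this is a generic phase-only correlation formulation, not the VIAPIX® implementation.

```python
# Hedged sketch: phase-only correlation between a scene and a road-marking template.
import numpy as np

def phase_only_correlation(scene, template):
    """Return the correlation surface and the peak location (translation of best match)."""
    F1 = np.fft.fft2(scene)
    F2 = np.fft.fft2(template)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12        # keep phase only, discard magnitude
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return corr, peak

scene = np.random.rand(128, 128)          # stand-in grayscale images
template = np.roll(scene, (5, -3), axis=(0, 1))
print(phase_only_correlation(scene, template)[1])
```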

  15. Automation System in Rare Earths Countercurrent Extraction Processes

    Institute of Scientific and Technical Information of China (English)

    贾江涛; 严纯华; 廖春生; 吴声; 王明文; 李标国

    2001-01-01

    Based on countercurrent extraction theory for the optimized design and simulation of rare earth separation processes, the selection of the detecting points (stages) and on-line analysis of elements, the simulation of the open-loop response and its response speed, and the diagnosis and regulative prescription for running the solvent extraction cascades were studied.

  16. LHC beam dumping system Extraction channel layout and acceptance

    CERN Document Server

    Goddard, B; Uythoven, J; Veness, R; Weterings, W

    2003-01-01

    The LHC beam dumping system must safely abort the LHC beams under all conditions, including those resulting from abnormal behaviour of machine elements or subsystems of the beam dumping system itself. The extraction channels must provide sufficient aperture both for the circulating and extracted beams, over the whole energy range and under various beam parameters. These requirements impose tight constraints on the tolerances of various extraction channel components, and also on the allowed range of beam positions in the region of these components. Operation of the beam dumping system under various fault states has been considered, and the resulting apertures calculated. After describing briefly the beam dumping system and the extraction channel geometry, the various assumptions made in the analysis are presented, before deriving tolerance limits for the relevant equipment and beam parameters.

  17. Interrogating Bronchoalveolar Lavage Samples via Exclusion-Based Analyte Extraction.

    Science.gov (United States)

    Tokar, Jacob J; Warrick, Jay W; Guckenberger, David J; Sperger, Jamie M; Lang, Joshua M; Ferguson, J Scott; Beebe, David J

    2017-06-01

    Although average survival rates for lung cancer have improved, earlier and better diagnosis remains a priority. One promising approach to assisting earlier and safer diagnosis of lung lesions is bronchoalveolar lavage (BAL), which provides a sample of lung tissue as well as proteins and immune cells from the vicinity of the lesion, yet diagnostic sensitivity remains a challenge. Reproducible isolation of lung epithelia and multianalyte extraction have the potential to improve diagnostic sensitivity and provide new information for developing personalized therapeutic approaches. We present the use of a recently developed exclusion-based, solid-phase-extraction technique called SLIDE (Sliding Lid for Immobilized Droplet Extraction) to facilitate analysis of BAL samples. We developed a SLIDE protocol for lung epithelial cell extraction and biomarker staining of patient BALs, testing both EpCAM and Trop2 as capture antigens. We characterized captured cells using TTF1 and p40 as immunostaining biomarkers of adenocarcinoma and squamous cell carcinoma, respectively. We achieved up to 90% (EpCAM) and 84% (Trop2) extraction efficiency of representative tumor cell lines. We then used the platform to process two patient BAL samples in parallel within the same sample plate to demonstrate feasibility and observed that Trop2-based extraction potentially extracts more target cells than EpCAM-based extraction.

  18. Central Nervous System Effects of Ginkgo Biloba, a Plant Extract.

    Science.gov (United States)

    Itil, Turan M.; Eralp, Emin; Tsambis, Elias; Itil, Kurt Z.; Stein, Ulrich

    1996-01-01

    Extracts of Ginkgo biloba (EGb) are among the most prescribed drugs in France and Germany. EGb is claimed to be effective in peripheral arterial disorders and in "cerebral insufficiency." The mechanism of action is not yet well understood. Three of the ingredients of the extract have been isolated and found to be pharmacologically active, but which one alone or in combination is responsible for the clinical effects is unknown. The recommended daily dose (3 x 40 mg extract) is based more on empirical data than on clinical dose-finding studies. Nevertheless, according to double-blind, placebo-controlled clinical trials, EGb has therapeutic effects, at least on the diagnostic entity of "cerebral insufficiency," which is used in Europe as a synonym for early dementia. To determine whether EGb has significant pharmacological effects on the human brain, a pharmacodynamic study was conducted using the Quantitative Pharmacoelectroencephalogram (QPEEG(R)) method. It was established that the pharmacological effects (based on a predetermined 7.5-13.0 Hz alpha frequency band in a computer-analyzed electroencephalogram = CEEG(R)) of EGb on the central nervous system (CNS) are significantly different from placebo, and that the high and low doses could be discriminated from each other. The 120-mg, and particularly the 240-mg, single doses showed the most consistent CNS effects, with an earlier onset (1 h) and longer duration (7 h). Furthermore, it was established that the electrophysiological effects of EGb on the CNS are similar to those of well-known cognitive activators such as "nootropics" as well as tacrine, the only marketed "antidementia" drug currently available in the United States.

  19. Knowledge-based Approach for Event Extraction from Arabic Tweets

    Directory of Open Access Journals (Sweden)

    Mohammad AL-Smadi

    2016-06-01

    Full Text Available Tweets provide a continuous update on current events. However, tweets are short, personalized and noisy, thus raising more challenges for event extraction and representation. Extracting events from Arabic tweets is a new research domain where few examples – if any – of previous work can be found. This paper describes a knowledge-based approach for fostering event extraction from Arabic tweets. The approach uses an unsupervised rule-based technique for event extraction and provides named entity disambiguation of event-related entities (i.e. person, organization, and location). Extracted events and their related entities are populated to the event knowledge base, where tagged tweet entities are linked to their corresponding entities represented in the knowledge base. The proposed approach was evaluated on a dataset of 1K Arabic tweets covering different types of events (i.e. instant events and interval events). Results show that the approach has an accuracy of 75.9% for event trigger extraction, 87.5% for event time extraction, and 97.7% for event type identification.

  20. Extraction of Textual Causal Relationships based on Natural Language Processing

    Directory of Open Access Journals (Sweden)

    Sepideh Jamshidi-Nejad

    2015-11-01

    Full Text Available Natural language processing is a highly important subcategory in the wide area of artificial intelligence. The aim of natural language processing is to apply appropriate computational algorithms to sophisticated linguistic operations in order to extract and create computational theories from languages. To achieve this goal, the knowledge of linguists is needed in addition to computer science. In the field of linguistics, the syntactic and semantic relations of words and phrases and the extraction of causation are very significant; the latter is an information retrieval challenge. Recently, there has been increased attention towards the automatic extraction of causation from textual data sets. Previous research extracted causal relations from uninterrupted data sets by using knowledge-based inference technologies and manual coding. Finding comprehensive approaches for the detection and extraction of causal arguments is now a research area in the field of natural language processing. In this paper, a three-step approach is established through which the position of words within syntax trees is obtained by extracting causation from causal and non-causal sentences of Web text. The arguments of events were extracted according to the dependency tree of phrases, implemented with Python packages. Potential causal relations were then extracted from specific nodes of the tree. In the final step, a statistical model is introduced for scoring the potential causal relations. Experimental results and evaluations with recall, precision and F-measure metrics show the accuracy and efficiency of the suggested model.
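
    A minimal sketch of mining candidate cause-effect pairs from dependency parses is shown below, assuming spaCy with an English model is installed; the marker list, the mark/advcl heuristic and the example sentence are illustrative assumptions, not the paper's exact rules or its statistical scoring step.

```python
# Hedged sketch: find clauses linked by an explicit causal marker in the dependency tree.
import spacy

nlp = spacy.load("en_core_web_sm")
CAUSAL_MARKERS = {"because", "since", "as"}

def candidate_causal_pairs(text):
    pairs = []
    for sent in nlp(text).sents:
        for tok in sent:
            if tok.lower_ in CAUSAL_MARKERS and tok.dep_ == "mark":
                cause_verb = tok.head                     # verb of the subordinate (cause) clause
                effect_verb = cause_verb.head             # verb of the main (effect) clause
                cause_ids = {w.i for w in cause_verb.subtree}
                cause = " ".join(w.text for w in cause_verb.subtree if w.i != tok.i)
                effect = " ".join(w.text for w in effect_verb.subtree if w.i not in cause_ids)
                pairs.append((cause, effect))
    return pairs

print(candidate_causal_pairs("The road was closed because the river flooded."))
```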

  1. A Bayesian approach to extracting meaning from system behavior

    Energy Technology Data Exchange (ETDEWEB)

    Dress, W.B.

    1998-08-01

    The modeling relation and its reformulation to include the semiotic hierarchy is essential for the understanding, control, and successful re-creation of natural systems. This presentation will argue for a careful application of Rosen's modeling relationship to the problems of intelligence and autonomy in natural and artificial systems. To this end, the authors discuss the essential need for a correct theory of induction, learning, and probability; and suggest that modern Bayesian probability theory, developed by Cox, Jaynes, and others, can adequately meet such demands, especially at the operational level of extracting meaning from observations. The methods of Bayesian and maximum-entropy parameter estimation have been applied to measurements of system observables to directly infer the underlying differential equations generating system behavior. This approach bypasses the usual method of parameter estimation based on assuming a functional form for the observable and then estimating the parameters that would lead to the particular observed behavior. The computational savings are great since only location parameters enter into the maximum-entropy calculations; this innovation finesses the need for nonlinear parameters altogether. Such an approach more directly extracts the semantics inherent in a given system by going to the root of system meaning as expressed by abstract form or shape, rather than in syntactic particulars, such as signal amplitude and phase. Examples will be shown of how the form of a system can be followed while ignoring unnecessary details. In this sense, the authors are observing the meaning of the words rather than being concerned with their particular expression or language. For the present discussion, empirical models are embodied by the differential equations underlying, producing, or describing the behavior of a process as measured or tracked by a particular variable set--the observables. The a priori models are probability structures that

  2. Antioxidant Activity of Flaxseed Extracts in Lipid Systems

    Directory of Open Access Journals (Sweden)

    Adriana Slavova-Kazakova

    2015-12-01

    Full Text Available The aim of this work was to compare the antioxidant activity of a flaxseed extract and its alkaline hydrolysate in two model systems: lipid autoxidation of triacylglycerols of sunflower oil (TGSO) in a homogeneous lipid medium, and a β-carotene-linoleate emulsion system. In addition, pure lignans were tested. The material was defatted with hexane and then phenolic compounds were extracted using a dioxane-ethanol (50:50, v/v) mixture. Carbohydrates were removed from the crude extract using Amberlite XAD-16 column chromatography. The content of total phenolic compounds in the crude extract and after alkaline hydrolysis was determined using the Folin-Ciocalteu phenol reagent. Individual phenolic compounds, with nordihydroguaiaretic acid as a reference, were determined by the RP-HPLC method in a gradient system. The alkaline hydrolysis increased the content of total phenolics in the extract by approximately 10%. In the extracts of flaxseed, phenolic compounds were present in the form of a macromolecular complex. In the alkaline hydrolysate, secoisolariciresinol diglucoside (SDG) was found to be the main phenolic compound. Small amounts of p-coumaric and ferulic acids were also determined. SDG and both extracts were not able to inhibit lipid autoxidation effectively. The kinetics of TGSO autoxidation at 80 °C in the absence and presence of the extract before hydrolysis (EBH) and after hydrolysis (EAH) was monitored and compared with known standard antioxidants. Ferulic acid (FA) and butylated hydroxytoluene (BHT) showed much higher antioxidant efficiency and reactivity than both extracts. Secoisolariciresinol (SECO) showed a higher activity in both model systems than SDG. However, the activity of SECO was much lower than that of nordihydroguaiaretic acid (NDGA).

  3. Web entity extraction based on entity attribute classification

    Science.gov (United States)

    Li, Chuan-Xi; Chen, Peng; Wang, Ru-Jing; Su, Ya-Ru

    2011-12-01

    Large amounts of entity data are continuously published on web pages. Extracting these entities automatically for further applications is very significant. Rule-based entity extraction methods yield promising results; however, they are labor-intensive and hard to scale. This paper proposes a web entity extraction method based on entity attribute classification, which avoids manual annotation of samples. First, web pages are segmented into different blocks by the Vision-based Page Segmentation (VIPS) algorithm, and a binary LibSVM classifier is trained to retrieve the candidate blocks which contain the entity contents. Second, the candidate blocks are partitioned into candidate items, LibSVM classifiers are applied for attribute annotation of the items, and the annotation results are then aggregated into an entity. Results show that the proposed method performs well in extracting agricultural supply and demand entities from web pages.
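
    A small sketch of the two-level classification idea (block-level filtering followed by item-level attribute labeling) is shown below, using scikit-learn's LinearSVC in place of LibSVM; the toy training strings, TF-IDF features and attribute labels are purely illustrative assumptions.

```python
# Hedged sketch: level 1 keeps blocks that contain entity content, level 2 labels item attributes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# level 1: does a page block contain entity content at all?
blocks = ["wheat 500 tons for sale contact ...", "site map | about us | login"]
block_labels = [1, 0]
block_clf = make_pipeline(TfidfVectorizer(), LinearSVC()).fit(blocks, block_labels)

# level 2: annotate candidate items with attribute labels (product, quantity, contact, ...)
items = ["wheat", "500 tons", "tel 010-1234567"]
item_labels = ["product", "quantity", "contact"]
item_clf = make_pipeline(TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
                         LinearSVC()).fit(items, item_labels)

print(block_clf.predict(["corn 30 tons wanted, call 138..."]))
print(item_clf.predict(["30 tons"]))
```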

  4. Integrating Information Extraction Agents into a Tourism Recommender System

    Science.gov (United States)

    Esparcia, Sergio; Sánchez-Anguix, Víctor; Argente, Estefanía; García-Fornes, Ana; Julián, Vicente

    Recommender systems face some problems. On the one hand, information needs to be kept up to date, which can become a costly task if it is not performed automatically. On the other hand, it may be interesting to include third-party services in the recommendation since they improve its quality. In this paper, we present an add-on for the Social-Net Tourism Recommender System that uses information extraction and natural language processing techniques in order to automatically extract and classify information from the Web. Its goal is to keep the system updated and to obtain information about third-party services that are not offered by service providers inside the system.

  5. Performance Analysis of Vision-Based Deep Web Data Extraction for Web Document Clustering

    Directory of Open Access Journals (Sweden)

    M. Lavanya

    2013-01-01

    Full Text Available Web data extraction is a critical task that applies various scientific tools across a broad range of application domains. Extracting data from multiple web sites is becoming more difficult, and the design of web information extraction systems is becoming more complex and time-consuming. We also outline the various risks in web data extraction. Identifying the data region of a web page is a noteworthy problem for information extraction from the page. In this paper, the performance of vision-based deep web data extraction for web document clustering is presented together with experimental results. The proposed approach comprises two phases: (1) vision-based web data extraction, whose output is given to the second phase, and (2) web document clustering. In phase 1, the web page information is segmented into various chunks, from which surplus noise and duplicate chunks are removed using three parameters: hyperlink percentage, noise score and cosine similarity. To identify the relevant chunks, three parameters are used: title word relevancy, keyword-frequency-based chunk selection, and position features; a set of keywords is then extracted from those main chunks. Finally, the extracted keywords are subjected to web document clustering using Fuzzy c-means clustering (FCM). The experimentation has been performed on two different datasets and the results showed that the proposed VDEC method can achieve stable and good results, with precision values of about 99.2% and 99.1% on the two datasets.

  6. Direct Sampling and Analysis from Solid Phase Extraction Cards using an Automated Liquid Extraction Surface Analysis Nanoelectrospray Mass Spectrometry System

    Energy Technology Data Exchange (ETDEWEB)

    Walworth, Matthew J [ORNL; ElNaggar, Mariam S [ORNL; Stankovich, Joseph J [ORNL; WitkowskiII, Charles E. [Protein Discovery, Inc.; Norris, Jeremy L [ORNL; Van Berkel, Gary J [ORNL

    2011-01-01

    Direct liquid extraction based surface sampling, a technique previously demonstrated with continuous flow and autonomous pipette liquid microjunction surface sampling probes, has recently been implemented as the Liquid Extraction Surface Analysis (LESA) mode on the commercially available Advion NanoMate chip-based infusion nanoelectrospray ionization system. In the present paper, the LESA mode was applied to the analysis of 96-well format custom solid phase extraction (SPE) cards, with each well consisting of either a 1 or 2 mm diameter monolithic hydrophobic stationary phase. These substrate wells were conditioned, loaded with either single or multi-component aqueous mixtures, and read out using the LESA mode of a TriVersa NanoMate or a NanoMate 100 coupled to an ABI/Sciex 4000QTRAP™ hybrid triple quadrupole/linear ion trap mass spectrometer and a Thermo LTQ XL linear ion trap mass spectrometer. Extraction conditions, including extraction/nanoESI solvent composition, volume, and dwell times, were optimized in the analysis of targeted compounds. Limits of detection and quantitation, as well as analysis reproducibility figures of merit, were measured. Calibration data were obtained for propranolol using a deuterated internal standard, which demonstrated linearity and reproducibility. A 10x increase in signal and cleanup of micromolar Angiotensin II from a concentrated salt solution were demonstrated. Additionally, a multicomponent herbicide mixture at ppb concentration levels was analyzed using MS3 spectra for compound identification in the presence of isobaric interferences.

  7. Semantic information extracting system for classification of radiological reports in radiology information system (RIS)

    Science.gov (United States)

    Shi, Liehang; Ling, Tonghui; Zhang, Jianguo

    2016-03-01

    Radiologists currently use a variety of terminologies and standards in most hospitals in China, and even multiple terminologies are used for different sections within one department. In this presentation, we introduce a medical semantic comprehension system (MedSCS) to extract semantic information about clinical findings and conclusions from free-text radiology reports, so that the reports can be classified correctly based on medical term indexing standards such as RadLex or SNOMED-CT. Our system (MedSCS) is based on both rule-based methods and statistics-based methods, which improves the performance and the scalability of MedSCS. In order to evaluate the overall performance of the system and measure the accuracy of the outcomes, we developed computation methods to calculate precision rate, recall rate, F-score and the exact confidence interval.

  8. Extractive text summarization system to aid data extraction from full text in systematic review development.

    Science.gov (United States)

    Bui, Duy Duc An; Del Fiol, Guilherme; Hurdle, John F; Jonnalagadda, Siddhartha

    2016-12-01

    Extracting data from publication reports is a standard process in systematic review (SR) development. However, the data extraction process still relies too much on manual effort, which is slow, costly, and subject to human error. In this study, we developed a text summarization system aimed at enhancing productivity and reducing errors in the traditional data extraction process. We developed a computer system that used machine learning and natural language processing approaches to automatically generate summaries of full-text scientific publications. The summaries at the sentence and fragment levels were evaluated in finding common clinical SR data elements such as sample size, group size, and PICO values. We compared the computer-generated summaries with human-written summaries (title and abstract) in terms of the presence of necessary information for the data extraction, as presented in the Cochrane review's study characteristics tables. At the sentence level, the computer-generated summaries covered more information than the human-written summaries (recall 91.2% vs. 83.8%). Machine learning and natural language processing are promising approaches to the development of such an extractive summarization system. Copyright © 2016 Elsevier Inc. All rights reserved.
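
    As a generic illustration of extractive summarization (not the system described above), sentences can be scored by their summed TF-IDF weight and the top-scoring ones retained; the example sentences below are invented.

      # Naive extractive summarization: score sentences by summed TF-IDF weight
      # and keep the top-scoring ones. A generic illustration only.
      from sklearn.feature_extraction.text import TfidfVectorizer

      sentences = [
          "A total of 120 patients were randomized to the two study arms.",
          "The trial was registered with the national registry.",
          "The intervention group received 40 mg of the drug daily for 12 weeks.",
          "Funding was provided by a university grant.",
      ]

      tfidf = TfidfVectorizer(stop_words="english").fit_transform(sentences)
      scores = tfidf.sum(axis=1).A1            # one score per sentence
      top = scores.argsort()[::-1][:2]         # keep the two highest-scoring sentences
      print("summary sentences:", [sentences[i] for i in sorted(top)])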

  9. Novel thiosalicylate-based ionic liquids for heavy metal extractions

    Energy Technology Data Exchange (ETDEWEB)

    Leyma, Raphlin; Platzer, Sonja [Institute of Inorganic Chemistry, Faculty of Chemistry, University of Vienna, Waehringer Str. 42, A-1090 Vienna (Austria); Jirsa, Franz [Institute of Inorganic Chemistry, Faculty of Chemistry, University of Vienna, Waehringer Str. 42, A-1090 Vienna (Austria); Department of Zoology, University of Johannesburg, PO Box 524, Auckland Park, 2006, Johannesburg (South Africa); Kandioller, Wolfgang, E-mail: wolfgang.kandioller@univie.ac.at [Institute of Inorganic Chemistry, Faculty of Chemistry, University of Vienna, Waehringer Str. 42, A-1090 Vienna (Austria); Krachler, Regina; Keppler, Bernhard K. [Institute of Inorganic Chemistry, Faculty of Chemistry, University of Vienna, Waehringer Str. 42, A-1090 Vienna (Austria)

    2016-08-15

    Highlights: • Six thiosalicylate-based ammonium and phosphonium ionic liquids (ILs) were newly synthesized. • ILs showed good extraction of cadmium, copper, and zinc. • Phosphonium ILs showed better extraction efficiencies than their ammonium counterparts. - Abstract: This study aims to develop novel ammonium and phosphonium ionic liquids (ILs) with thiosalicylate (TS) derivatives as anions and evaluate their extracting efficiencies towards heavy metals in aqueous solutions. Six ILs were synthesized, characterized, and investigated for their extracting efficacies for cadmium, copper, and zinc. Liquid-liquid extractions of Cu, Zn, or Cd with ILs after 1–24 h using model solutions (pH 7; 0.1 M CaCl₂) were assessed using flame atomic absorption spectroscopy (F-AAS). The phosphonium-based ILs trihexyltetradecylphosphonium 2-(propylthio)benzoate [P₆₆₆₁₄][PTB] and 2-(benzylthio)benzoate [P₆₆₆₁₄][BTB] showed the best extraction efficiency for copper and cadmium, respectively, and zinc was extracted to a high degree by [P₆₆₆₁₄][BTB] exclusively.

  10. Efficient sparse kernel feature extraction based on partial least squares.

    Science.gov (United States)

    Dhanjal, Charanpal; Gunn, Steve R; Shawe-Taylor, John

    2009-08-01

    The presence of irrelevant features in training data is a significant obstacle for many machine learning tasks. One approach to this problem is to extract appropriate features and, often, one selects a feature extraction method based on the inference algorithm. Here, we formalize a general framework for feature extraction, based on Partial Least Squares, in which one can select a user-defined criterion to compute projection directions. The framework draws together a number of existing results and provides additional insights into several popular feature extraction methods. Two new sparse kernel feature extraction methods are derived under the framework, called Sparse Maximal Alignment (SMA) and Sparse Maximal Covariance (SMC), respectively. Key advantages of these approaches include simple implementation and a training time which scales linearly in the number of examples. Furthermore, one can project a new test example using only k kernel evaluations, where k is the output dimensionality. Computational results on several real-world data sets show that SMA and SMC extract features which are as predictive as those found using other popular feature extraction methods. Additionally, on large text retrieval and face detection data sets, they produce features which match the performance of the original ones in conjunction with a Support Vector Machine.
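
    The general idea of projecting data onto a few supervised directions can be illustrated with ordinary linear PLS (scikit-learn's PLSRegression); note this is plain PLS, not the sparse kernel SMA/SMC variants derived in the paper, and the data are synthetic.

      # Supervised dimensionality reduction with plain linear PLS (not the
      # sparse kernel SMA/SMC variants of the paper); data are synthetic.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 20))                    # 20 features, mostly irrelevant
      y = X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=100)

      pls = PLSRegression(n_components=2)               # extract 2 projection directions
      X_scores, _ = pls.fit_transform(X, y)             # projected data: (100, 2)

      print("reduced shape:", X_scores.shape)
      print("first projection direction:", np.round(pls.x_weights_[:, 0], 2))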

  11. Knowledge-Driven Event Extraction in Russian: Corpus-Based Linguistic Resources

    Directory of Open Access Journals (Sweden)

    Valery Solovyev

    2016-01-01

    Full Text Available Automatic event extraction from text is an important step in knowledge acquisition and knowledge base population. Manual work in the development of an extraction system is indispensable, either in corpus annotation or in the creation of vocabularies and patterns for a knowledge-based system. Recent works have focused on the adaptation of existing systems (for extraction from English texts) to new domains. Event extraction in other languages has not been studied due to the lack of resources and algorithms necessary for natural language processing. In this paper we define a set of linguistic resources that are necessary in the development of a knowledge-based event extraction system in Russian: a vocabulary of subordination models, a vocabulary of event triggers, and a vocabulary of Frame Elements that are basic building blocks for semantic patterns. We propose a set of methods for the creation of such vocabularies in Russian and other languages using the Google Books NGram Corpus. The methods are evaluated in the development of an event extraction system for Russian.

  12. Behavior Based Social Dimensions Extraction for Multi-Label Classification.

    Science.gov (United States)

    Li, Le; Xu, Junyi; Xiao, Weidong; Ge, Bin

    2016-01-01

    Classification based on social dimensions is commonly used to handle the multi-label classification task in heterogeneous networks. However, traditional methods, which mostly rely on the community detection algorithms to extract the latent social dimensions, produce unsatisfactory performance when community detection algorithms fail. In this paper, we propose a novel behavior based social dimensions extraction method to improve the classification performance in multi-label heterogeneous networks. In our method, nodes' behavior features, instead of community memberships, are used to extract social dimensions. By introducing Latent Dirichlet Allocation (LDA) to model the network generation process, nodes' connection behaviors with different communities can be extracted accurately, which are applied as latent social dimensions for classification. Experiments on various public datasets reveal that the proposed method can obtain satisfactory classification results in comparison to other state-of-the-art methods on smaller social dimensions.
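
    A much-simplified version of the idea, in which each node's interaction counts are treated as a document and LDA yields latent social dimensions, might look as follows; the interaction matrix is invented and the sketch omits the network-specific modelling of the paper.

      # Treat each node's neighbour-interaction counts as a "document" and use
      # LDA to obtain latent social dimensions (a simplification of the
      # behaviour model above; the interaction matrix is invented).
      import numpy as np
      from sklearn.decomposition import LatentDirichletAllocation

      # Rows = nodes, columns = neighbour ids, entries = interaction counts.
      interactions = np.array([
          [5, 3, 0, 0, 1, 0],
          [4, 2, 1, 0, 0, 0],
          [0, 0, 6, 4, 0, 1],
          [0, 1, 5, 3, 0, 0],
          [1, 0, 0, 0, 4, 5],
      ])

      lda = LatentDirichletAllocation(n_components=2, random_state=0)
      social_dims = lda.fit_transform(interactions)     # node-by-dimension matrix

      # These latent dimensions would then feed a multi-label classifier.
      print(np.round(social_dims, 2))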

  13. Perceptual Object Extraction Based on Saliency and Clustering

    Directory of Open Access Journals (Sweden)

    Qiaorong Zhang

    2010-08-01

    Full Text Available Object-based visual attention has received increasing interest in recent years. The perceptual object is the basic attention unit of object-based visual attention, and the definition and extraction of perceptual objects is one of the key technologies in object-based visual attention computation models. A novel perceptual object definition and extraction method is proposed in this paper. Based on Gestalt theory and visual feature integration theory, a perceptual object is defined using homogeneity regions, salient regions and edges. An improved saliency map generating algorithm is employed first. Based on the saliency map, salient edges are extracted. Then a graph-based clustering algorithm is introduced to obtain homogeneity regions in the image. Finally, an integration strategy is adopted to combine salient edges and homogeneity regions to extract perceptual objects. The proposed perceptual object extraction method has been tested on a large set of natural images. Experimental results and analysis are also presented in this paper; they show that the proposed method is reasonable and valid.

  14. Moving Target Information Extraction Based on Single Satellite Image

    Directory of Open Access Journals (Sweden)

    ZHAO Shihu

    2015-03-01

    Full Text Available The spatial and time variant effects in high resolution satellite push-broom imaging are analyzed, and a spatial and time variant imaging model is established. A moving target information extraction method based on a single satellite remote sensing image is proposed. The experiment computes the flying speeds of two airplanes using a ZY-3 multispectral image and proves the validity of the spatial and time variant model and the moving-target information extraction method.

  15. Background Knowledge in Learning-Based Relation Extraction

    Science.gov (United States)

    Do, Quang Xuan

    2012-01-01

    In this thesis, we study the importance of background knowledge in relation extraction systems. We not only demonstrate the benefits of leveraging background knowledge to improve the systems' performance but also propose a principled framework that allows one to effectively incorporate knowledge into statistical machine learning models for…

  17. THERMODYNAMIC MODEL OF CAPITAL EXTRACTION IN ECONOMIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. A. Ahremenkov

    2005-10-01

    Full Text Available In this paper the properties of the wealth function of an economic system are studied. An economic analog of the Gibbs-Duhem equation is derived. Equilibrium states and limiting profit extraction regimes in non-equilibrium economic systems are obtained for the Cobb-Douglas wealth function.

  18. A micro hot test of the Chalmers-GANEX extraction system on used nuclear fuel

    Energy Technology Data Exchange (ETDEWEB)

    Bauhn, L.; Hedberg, M.; Aneheim, E.; Ekberg, C.; Loefstroem-Engdahl, E.; Skarnemark, G. [Department of Chemical and Biological Engineering, Nuclear Chemistry, Chalmers University of Technology, Kemivaegen 4, SE-412 96 Goeteborg (Sweden)

    2013-07-01

    In the present study, a 'micro hot test' has been performed using the Chalmers-GANEX (Group Actinide Extraction) system for partitioning of used nuclear fuel. The test included a pre-extraction step using N,N-di-2-ethylhexyl-butyramide (DEHBA) in n-octanol to remove the bulk part of the uranium. This pre-extraction was followed by a group extraction of actinides using the mixture of TBP and CyMe₄-BTBP in cyclohexanone as suggested in the Chalmers-GANEX process, and a three-stage stripping of the extracted actinides. Distribution ratios for the extractions and stripping were determined based on a combination of γ- and α-spectrometry, as well as ICP-MS measurements. Successful extraction of uranium, plutonium and the minor actinides neptunium, americium and curium was achieved. However, measurements also indicated that co-extraction of europium occurs to some extent during the separation. These results were expected based on previous experiments using trace concentrations of actinides and lanthanides. Since this test was only performed in one stage with respect to the group actinide extraction, it is expected that multi stage tests will give even better results. (authors)

  19. Knowledge Extraction based on Evolutionary Learning (KEEL

    Directory of Open Access Journals (Sweden)

    Manju Narwal

    2012-07-01

    Full Text Available The purpose of this paper is to describe some basic concepts of KEEL, including evolutionary learning, the KEEL structure with datasets, and the analysis of genetic fuzzy systems. The aim of this section is to present the KEEL framework, describing some guidelines to help a potential developer build new methods inside the KEEL environment. The next sections deal with the formats of the configuration files of KEEL, which include dataset files, method descriptions and so on. Finally, the last section describes the KEEL dataset API, which is used to handle and check the dataset files.

  20. GLODAPv2 data exploration and extraction system

    Science.gov (United States)

    Krassovski, Misha; Kozyr, Alex; Boden, Thomas

    2016-04-01

    The Global Ocean Data Analysis Project (GLODAP) is a cooperative effort of investigators funded for ocean synthesis and modeling projects by the U.S. National Oceanic and Atmospheric Administration (NOAA), Department of Energy (DOE), and National Science Foundation (NSF). Cruises conducted as part of the WOCE, JGOFS, and NOAA Ocean-Atmosphere Carbon Exchange Study (OACES) over the decade of the 1990s generated oceanographic data of unparalleled quality and quantity. GLODAPv2 is a uniformly calibrated open-ocean data product containing inorganic carbon and carbon-relevant variables. This new product includes data from approximately one million individual seawater samples collected from over 700 cruises during the period 1972-2013. Extensive quality control and subsequent calibration were carried out for salinity, oxygen, nutrient, carbon dioxide, total alkalinity, pH, and chlorofluorocarbon data. The Carbon Dioxide Information and Analysis Center (CDIAC), serving as the primary DOE disseminator for climate data and information, developed database and web accessible systems that permit users worldwide to query and retrieve data from the GLODAPv2 collection. This presentation will showcase this new system, discuss technologies used to build the GLODAPv2 resource, and describe integration with a metadata search engine provided by CDIAC as well.

  1. Design of guanidinium ionic liquid based microwave-assisted extraction for the efficient extraction of Praeruptorin A from Radix peucedani.

    Science.gov (United States)

    Ding, Xueqin; Li, Li; Wang, Yuzhi; Chen, Jing; Huang, Yanhua; Xu, Kaijia

    2014-12-01

    A series of novel tetramethylguanidinium ionic liquids and hexaalkylguanidinium ionic liquids have been synthesized based on 1,1,3,3-tetramethylguanidine. The structures of the ionic liquids were confirmed by ¹H NMR spectroscopy and mass spectrometry. A green guanidinium ionic liquid based microwave-assisted extraction method has been developed with these guanidinium ionic liquids for the effective extraction of Praeruptorin A from Radix peucedani. After extraction, reversed-phase high-performance liquid chromatography with UV detection was employed for the analysis of Praeruptorin A. Several significant operating parameters were systematically optimized by single-factor and L9(3⁴) orthogonal array experiments. The amount of Praeruptorin A extracted by [1,1,3,3-tetramethylguanidine]CH2CH(OH)COOH is the highest, reaching 11.05 ± 0.13 mg/g. Guanidinium ionic liquid based microwave-assisted extraction presents unique advantages in Praeruptorin A extraction compared with guanidinium ionic liquid based maceration extraction, guanidinium ionic liquid based heat reflux extraction and guanidinium ionic liquid based ultrasound-assisted extraction. The precision, stability, and repeatability of the process were investigated. The mechanisms of guanidinium ionic liquid based microwave-assisted extraction were investigated by scanning electron microscopy and IR spectroscopy. All the results show that guanidinium ionic liquid based microwave-assisted extraction has huge potential in the extraction of bioactive compounds from complex samples.

  2. Level Sets and Voronoi based Feature Extraction from any Imagery

    DEFF Research Database (Denmark)

    Sharma, O.; Anton, François; Mioc, Darka

    2012-01-01

    Polygon features are of interest in many GEOProcessing applications like shoreline mapping, boundary delineation, change detection, etc. This paper presents a unique new GPU-based methodology to automate feature extraction combining level sets, or mean shift based segmentation together with Voronoi...

  3. Surgical removal of infected pacemaker leads without cardiopulmonary bypass after failed extraction using the Excimer Laser Sheath Extraction System.

    Science.gov (United States)

    Tokunaga, Chiho; Enomoto, Yoshiharu; Sato, Fujio; Kanemoto, Shinya; Matsushita, Shonosuke; Hiramatsu, Yuji; Aonuma, Kazutaka; Sakakibara, Yuzuru

    2012-03-01

    With the growing number of cardiac pacemaker and internal cardioverter defibrillator implantations, problems with endocardial lead infection have been increasing. The newly developed Excimer Laser Sheath Lead Extraction System has been recognized as highly useful for removing chronically infected leads. However, serious bleeding complications are a concern when this system is used. Here we report our experience with a 67-year-old man who was diagnosed with pacemaker endocarditis. Initially, lead removal was attempted using the Excimer Laser Sheath Extraction System, though this was abandoned because of severe adhesion of the leads at the junction of the superior vena cava (SVC) with the right atrium. Surgical removal of the leads was performed without using cardiopulmonary bypass, and the leads were removed without any complications. During surgery, we found there was a silent perforation of the innominate vein brought about by the Excimer Laser Sheath System. Also, the junction of the SVC with the right atrium was thought to be an area potentially at high risk of perforation, because of a lack of surrounding tissue. It is our opinion that those who carry out procedures with the Excimer Laser Sheath System should understand the potential risk of perforation based on cardiac anatomy and should be prepared for lethal bleeding complications. Also, for emergent situations, we believe that close backup by a cardiovascular surgical team should be considered essential for performing Excimer Laser Sheath Lead Extraction safely.

  4. Optimization of protein extraction process from jackfruit seed flour by reverse micelle system

    Directory of Open Access Journals (Sweden)

    Maycon Fagundes Teixeira Reis

    2016-06-01

    Full Text Available The extraction of protein from jackfruit seed flour by reverse micelles was evaluated. The reverse micelle system was composed of sodium dodecyl sulfate (SDS) as surfactant, butanol as solvent, and water. The effects of stirring time, temperature, H2O/SDS molar ratio, butanol concentration (mass percentage) and flour mass were tested in batch systems. Based on the adjusted linear regression model, only the butanol concentration provided optimum extraction conditions (41.16%). Based on the response surface analysis, the best extraction yield could be obtained at 25°C, a stirring time of 120 min, a flour mass of 100 mg, and an H2O/SDS molar ratio of 50. Experimental results showed that a 79.00% extraction yield could be obtained.

  5. Feature extraction for deep neural networks based on decision boundaries

    Science.gov (United States)

    Woo, Seongyoun; Lee, Chulhee

    2017-05-01

    Feature extraction is a process used to reduce data dimensions using various transforms while preserving the discriminant characteristics of the original data. Feature extraction has been an important issue in pattern recognition since it can reduce the computational complexity and provide a simplified classifier. In particular, linear feature extraction has been widely used. This method applies a linear transform to the original data to reduce the data dimensions. The decision boundary feature extraction method (DBFE) retains only informative directions for discriminating among the classes. DBFE has been applied to various parametric and non-parametric classifiers, which include the Gaussian maximum likelihood classifier (GML), the k-nearest neighbor classifier, support vector machines (SVM) and neural networks. In this paper, we apply DBFE to deep neural networks. This algorithm is based on the nonparametric version of DBFE, which was developed for neural networks. Experimental results with the UCI database show improved classification accuracy with reduced dimensionality.
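
    A crude nonparametric approximation of the decision-boundary idea is sketched below: vectors joining nearest opposite-class samples are taken as approximate boundary normals, and the eigenvectors of their scatter matrix give candidate discriminant directions. This is a simplification for illustration only, not the exact DBFE procedure used with the deep networks in the paper, and the data are synthetic.

      # Crude nonparametric stand-in for decision-boundary feature extraction:
      # vectors joining nearest opposite-class samples approximate boundary
      # normals; eigenvectors of their scatter give discriminant directions.
      import numpy as np

      rng = np.random.default_rng(1)
      X0 = rng.normal(loc=[0.0, 0.0, 0.0], scale=0.5, size=(50, 3))  # class 0
      X1 = rng.normal(loc=[2.0, 0.0, 0.0], scale=0.5, size=(50, 3))  # class 1

      normals = []
      for x in X0:
          nearest = X1[np.argmin(np.linalg.norm(X1 - x, axis=1))]
          n = nearest - x
          normals.append(n / np.linalg.norm(n))
      M = np.mean([np.outer(n, n) for n in normals], axis=0)  # boundary feature matrix

      eigvals, eigvecs = np.linalg.eigh(M)
      # The classes differ only along axis 0, so the dominant eigenvector
      # should point (approximately) along that axis.
      print("dominant direction:", np.round(eigvecs[:, -1], 2))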

  6. Skeleton extraction based on the topology and Snakes model

    Science.gov (United States)

    Cai, Yuanxue; Ming, Chengguo; Qin, Yueting

    A new skeleton line extraction method based on topology and flux is proposed by analyzing the distribution characteristics of the gradient vector field in the Snakes model. The distribution characteristics of the skeleton line are accurately obtained by calculating the eigenvalues of the critical points and the flux of the gradient vector field. Then the skeleton lines can be effectively extracted. The results also show that there is no need for the pretreatment or binarization of the target image. The skeleton lines of complex gray images such as optical interference patterns can be effectively extracted by using this method. Compared to traditional methods, this method has many advantages, such as high extraction accuracy and fast processing speed.

  7. Hexafluoroisopropanol-induced catanionic-surfactants-based coacervate extraction for analysis of lysozyme.

    Science.gov (United States)

    Xu, Jia; Niu, Manli; Xiao, Yuxiu

    2017-02-01

    A coacervate extraction method, based on hexafluoroisopropanol (HFIP)-induced catanionic surfactants and coupled with a back-extraction procedure, was developed for the separation and purification of proteins, using sodium dodecyl sulfate (SDS) and dodecyltrimethyl ammonium bromide (DTAB) as representative catanionic surfactants and lysozyme as a model protein. After the coacervate extraction and back extraction, the obtained lysozyme solutions were examined in terms of quantitative analysis by capillary electrophoresis, bacteriolytic activity, and circular dichroism (CD). The effects of several parameters, including back-extraction solvent, HFIP content, total surfactant concentration, and SDS/DTAB molar ratio, on the extraction efficiency and activity of lysozyme were investigated in detail. Under the optimized extraction conditions (66 mM KH2PO4 buffer with pH 6.2 as back-extraction solvent, SDS/DTAB molar ratio = 1:1 mol/mol, total surfactant concentration = 30 mM, HFIP concentration = 8 % v/v), the extraction recovery was 89.8 % (±4.7, n = 3), the limit of detection was 2.2 (±0.3, n = 3) μg mL⁻¹, and nearly 65 % of the native lysozyme activity was retained. In addition, the activity and CD assays showed that the SDS/DTAB molar ratio had a significant influence on the activity and structure of lysozyme after extraction. The DTAB-rich extraction systems, in which the DTAB mole fraction was equal to or larger than 70 %, could keep the activity and structure of lysozyme almost in the native state. Graphical Abstract Procedure of HFIP-induced SDS/DTAB coacervate extraction and back extraction of lysozyme.

  8. Support patient search on pathology reports with interactive online learning based data extraction

    Directory of Open Access Journals (Sweden)

    Shuai Zheng

    2015-01-01

    Full Text Available Background: Structural reporting enables semantic understanding and prompt retrieval of clinical findings about patients. While synoptic pathology reporting provides templates for data entries, information in pathology reports remains primarily in narrative free text form. Extracting data of interest from narrative pathology reports could significantly improve the representation of the information and enable complex structured queries. However, manual extraction is tedious and error-prone, and automated tools are often constructed with a fixed training dataset and not easily adaptable. Our goal is to extract data from pathology reports to support advanced patient search with a highly adaptable semi-automated data extraction system, which can adjust and self-improve by learning from a user's interaction with minimal human effort. Methods: We have developed an online machine learning based information extraction system called IDEAL-X. With its graphical user interface, the system's data extraction engine automatically annotates values for users to review upon loading each report text. The system analyzes users' corrections regarding these annotations with online machine learning, and incrementally enhances and refines the learning model as reports are processed. The system also takes advantage of customized controlled vocabularies, which can be adaptively refined during the online learning process to further assist the data extraction. As the accuracy of automatic annotation improves over time, the effort of human annotation is gradually reduced. After all reports are processed, a built-in query engine can be applied to conveniently define queries based on extracted structured data. Results: We have evaluated the system with a dataset of anatomic pathology reports from 50 patients. Extracted data elements include demographical data, diagnosis, genetic marker, and procedure. The system achieves F-1 scores of around 95% for the majority of
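
    The online-learning loop described above can be approximated generically with an incremental classifier that is updated after each reviewer correction, for example scikit-learn's SGDClassifier with partial_fit; the labels and sentences below are invented, and this is not IDEAL-X's actual model or feature set.

      # Generic incremental-learning loop: predict, then update from the
      # reviewer's correction with partial_fit. Not IDEAL-X's actual model.
      from sklearn.feature_extraction.text import HashingVectorizer
      from sklearn.linear_model import SGDClassifier

      vectorizer = HashingVectorizer(n_features=2**12)
      model = SGDClassifier(random_state=0)
      classes = ["diagnosis", "procedure"]

      def process_report(sentence, user_label):
          """Predict a label, then learn from the reviewer's correction."""
          x = vectorizer.transform([sentence])
          if hasattr(model, "coef_"):
              print("predicted:", model.predict(x)[0], "| corrected to:", user_label)
          model.partial_fit(x, [user_label], classes=classes)

      process_report("Invasive ductal carcinoma, grade 2", "diagnosis")
      process_report("Right breast lumpectomy performed", "procedure")
      process_report("Consistent with ductal carcinoma in situ", "diagnosis")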

  9. Upgrade of the ITUR extraction system at ESS-Bilbao

    Science.gov (United States)

    Izaola, Zunbeltz; Zugazaga, Aitor; Feuchtwanger, Jorge; Fernández-Cañoto, David; Bustinduy, Ibon; Munoz, Juan Luis; Faircloth, Dan; Lawrie, Scott R.

    2013-02-01

    The first beam measurements on our modified version of the ISIS Penning source show a beam of relatively low current. As a result, the actual extraction system was simulated using IBSimu, and it was found that the configuration is far from optimal. We present a simpler post-acceleration extraction system that avoids the use of a long (~100 mm) Cs trap. Due to space and budget constraints, the new extraction is composed of only one electrostatic einzel lens. The same configuration as in the ISIS source is maintained up to the puller electrode; the changes come afterwards, where two circular electrodes with rectangular apertures make up the einzel lens. This configuration lacks the bending magnet found at ISIS because the permanent magnets used in this version of the source provide the Penning field. This difference results in a low-angle beam extraction that is compensated by tilting the source to angles near 15°. In addition to the beam dynamics simulations, the mechanical and electrostatic simulations for the extraction system are presented.

  10. Generating Unstable Resonances for Extraction Schemes Based on Transverse Splitting

    CERN Document Server

    Giovannozzi, M; Turchetti, G

    2009-01-01

    A few years ago, a novel multi-turn extraction scheme was proposed, based on particle trapping inside stable resonances. Numerical simulations and experimental tests have confirmed the feasibility of such a scheme for low order resonances. While the third-order resonance is generically unstable and those higher than fourth-order are generically stable, the fourth-order resonance can be either stable or unstable depending on the specifics of the system under consideration. By means of the Normal Form a general approach to control the stability of the fourth-order resonance has been derived. This approach is based on the control of the amplitude detuning and the general form for a lattice with an arbitrary number of sextupole and octupole families is derived in this paper. Numerical simulations have confirmed the analytical results and have shown that, when crossing the unstable fourth-order resonance, the region around the centre of the phase space is depleted and particles are trapped in only the four stable ...

  11. Ontology-based Knowledge Extraction from Hidden Web

    Institute of Scientific and Technical Information of China (English)

    SONG Hui; MA Fan-yuan; LIU Xiao-qiang

    2004-01-01

    The Hidden Web provides a great amount of domain-specific data for constructing knowledge services. Most previous knowledge extraction research ignores the valuable data hidden in Web databases, and related works do not address how to make the extracted information available for knowledge systems. This paper describes a novel approach to building a domain-specific knowledge service with data retrieved from the Hidden Web. An ontology serves to model the domain knowledge. Query forms of different Web sites are translated into a machine-understandable format defined by knowledge concepts, so that they can be accessed automatically. Knowledge data are also extracted from Web pages and organized as ontology-format knowledge. The experiment proves that the algorithm achieves high accuracy and that the system greatly facilitates the construction of knowledge services.

  12. PURIFICATION OF COBALT ANOLYTE USING THE NOVEL SOLVENT EXTRACTION SYSTEM

    Institute of Scientific and Technical Information of China (English)

    Y.F. Shen; W.Y. Xue; W. Y. Niu

    2003-01-01

    In the present research, a novel extractant system (D2EHPA + naphthenic acid + pyridine-ester) was used to purify cobalt anolyte, and a simulated industrial production run was carried out. This novel extraction system can extract Cu and/or Ni against Co from chloride medium solutions in the pH range of 2.5-4.5. About 2 g/l nickel and 0.2 g/l copper were removed from the cobalt chloride anolyte containing about 100 g/l cobalt and 200 g/l chloride ions, respectively; the raffinate contains less than 0.03 g/l nickel and 0.0003 g/l copper, respectively, and can be used to electrolyze high-purity cobalt. About 5.5 t of cobalt anolyte was purified in the simulated industrial experiment, and kilogram quantities of cobalt of 99.98% purity at about 95% recovery have been produced.

  13. Silica-Based Solid Phase Extraction of DNA on a Microchip

    Institute of Scientific and Technical Information of China (English)

    陈晓芳; 沈科跃; 刘鹏; 郭旻; 程京; 周玉祥

    2004-01-01

    Micro total analysis systems for chemical and biological analysis have attracted much attention. However, microchips for sample preparation, and especially DNA purification, are still underdeveloped. This work describes a solid phase extraction chip for purifying DNA from biological samples based on the adsorption of DNA on bare silica beads prepacked in a microchannel. The chip was fabricated with polydimethylsiloxane. The silica beads were packed in the channel on the chip with a tapered microchannel to form the packed bed. Fluorescence detection was used to evaluate the DNA adsorbing efficiency of the solid phase. The polymerase chain reaction was used to evaluate the quality of the purified DNA for further use. The extraction efficiency for the DNA extraction chip is approximately 50% with a 150-nL extraction volume. Successful amplification of DNA extracted from human whole blood indicates that this method is compatible with the polymerase chain reaction.

  14. Central nervous system activity of Illicium verum fruit extracts.

    Science.gov (United States)

    Chouksey, Divya; Upmanyu, Neeraj; Pawar, R S

    2013-11-01

    To investigate the acute toxicity of Illicium verum (I. verum) fruit extracts and their action on the central nervous system. TLC and HPTLC techniques were used as fingerprints to determine the chemical components present in I. verum. Male albino rats and mice were used for the study. The powdered material was successively extracted with n-hexane, ethyl acetate and methanol using a Soxhlet extractor. Acute toxicity studies were performed as per OECD guidelines. The CNS activity was evaluated on parameters of general behavior, sleeping pattern, locomotor activity, anxiety and myocoordination activity. The animals were trained for seven days prior to the experiments and then divided into five groups of six animals each. The drug was administered by the intraperitoneal route according to body weight, with dosing as prescribed in each protocol. Toxicity studies established 2 000 mg/kg as the toxicological dose, and 1/10 of this dose was taken as the therapeutic dose. Intraperitoneal injection of all extracts at a dose of 200 mg prolonged phenobarbitone-induced sleeping time, produced alterations in the general behavior pattern, reduced locomotor activity and produced anxiolytic effects, but the extracts did not significantly alter muscle coordination activity. Of the three extracts of I. verum at the dose of 200 mg, the methanol extract produced the most prominent effects, followed by the hexane and ethyl acetate extracts. These observations suggest that the extracts of I. verum possess potent CNS depressant action and an anxiolytic effect without interfering with motor coordination. Copyright © 2013 Hainan Medical College. Published by Elsevier B.V. All rights reserved.

  15. Central nervous system activity of Illicium verum fruit extracts

    Institute of Scientific and Technical Information of China (English)

    Divya Chouksey; Neeraj Upmanyu; RS Pawar

    2013-01-01

    Objective: To investigate the acute toxicity of Illicium verum (I. verum) fruit extracts and their action on the central nervous system. Methods: TLC and HPTLC techniques were used as fingerprints to determine the chemical components present in I. verum. Male albino rats and mice were used for the study. The powdered material was successively extracted with n-hexane, ethyl acetate and methanol using a Soxhlet extractor. Acute toxicity studies were performed as per OECD guidelines. The CNS activity was evaluated on parameters of general behavior, sleeping pattern, locomotor activity, anxiety and myocoordination activity. The animals were trained for seven days prior to the experiments and then divided into five groups of six animals each. The drug was administered by the intraperitoneal route according to body weight. The dosing was done as prescribed in each protocol. Results: Toxicity studies established 2 000 mg/kg as the toxicological dose, and 1/10 of this dose was taken as the therapeutic dose. Intraperitoneal injection of all extracts at a dose of 200 mg prolonged phenobarbitone-induced sleeping time, produced alterations in the general behavior pattern, reduced locomotor activity and produced anxiolytic effects, but the extracts did not significantly alter muscle coordination activity. Of the three extracts of I. verum at the dose of 200 mg, the methanol extract produced the most prominent effects, followed by the hexane and ethyl acetate extracts. Conclusions: These observations suggest that the extracts of I. verum possess potent CNS depressant action and an anxiolytic effect without interfering with motor coordination.

  16. Model-based Bayesian signal extraction algorithm for peripheral nerves

    Science.gov (United States)

    Eggers, Thomas E.; Dweiri, Yazan M.; McCallum, Grant A.; Durand, Dominique M.

    2017-10-01

    Objective. Multi-channel cuff electrodes have recently been investigated for extracting fascicular-level motor commands from mixed neural recordings. Such signals could provide volitional, intuitive control over a robotic prosthesis for amputee patients. Recent work has demonstrated success in extracting these signals in acute and chronic preparations using spatial filtering techniques. These extracted signals, however, had low signal-to-noise ratios and thus limited their utility to binary classification. In this work a new algorithm is proposed which combines previous source localization approaches to create a model based method which operates in real time. Approach. To validate this algorithm, a saline benchtop setup was created to allow the precise placement of artificial sources within a cuff and interference sources outside the cuff. The artificial source was taken from five seconds of chronic neural activity to replicate realistic recordings. The proposed algorithm, hybrid Bayesian signal extraction (HBSE), is then compared to previous algorithms, beamforming and a Bayesian spatial filtering method, on this test data. An example chronic neural recording is also analyzed with all three algorithms. Main results. The proposed algorithm improved the signal to noise and signal to interference ratio of extracted test signals two to three fold, as well as increased the correlation coefficient between the original and recovered signals by 10-20%. These improvements translated to the chronic recording example and increased the calculated bit rate between the recovered signals and the recorded motor activity. Significance. HBSE significantly outperforms previous algorithms in extracting realistic neural signals, even in the presence of external noise sources. These results demonstrate the feasibility of extracting dynamic motor signals from a multi-fascicled intact nerve trunk, which in turn could extract motor command signals from an amputee for the end goal of

  17. Hydrogel based occlusion systems

    NARCIS (Netherlands)

    Stam, F.A.; Jackson, N.; Dubruel, P.; Adesanya, K.; Embrechts, A.; Mendes, E.; Neves, H.P.; Herijgers, P.; Verbrugghe, Y.; Shacham, Y.; Engel, L.; Krylov, V.

    2013-01-01

    A hydrogel based occlusion system, a method for occluding vessels, appendages or aneurysms, and a method for hydrogel synthesis are disclosed. The hydrogel based occlusion system includes a hydrogel having a shrunken and a swollen state and a delivery tool configured to deliver the hydrogel to a tar

  18. Extraction of Micronutrient Metals from Peat-based Media Using Various Chelate-ligand and Iron-source Extractants

    Science.gov (United States)

    Objectives of the study were to determine effects of chelate-ligand (experiment 1) and iron-source (experiment 2) unbuffered extractant solutions on substrate pH and Cu, Fe, Mn, and Zn extraction from peat-based media. Chelate-ligand extractants consisted of 5 mM solutions of ethylenediaminedisucc...

  19. Development of a flat membrane based device for electromembrane extraction

    DEFF Research Database (Denmark)

    Huang, Chuixiu; Eibak, Lars Erik Eng; Gjelstad, Astrid

    2014-01-01

    In this work, a single-well electromembrane extraction (EME) device was developed based on a thin (100μm) and flat porous membrane of polypropylene supporting a liquid membrane. The new EME device was operated with a relatively large acceptor solution volume to promote a high recovery. Using this...

  20. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...

  1. Evaluation of Sorghum bicolor leaf base extract for gastrointestinal ...

    African Journals Online (AJOL)

    PRECIOUS

    2009-11-02

    Nov 2, 2009 ... The effects of acetylcholine and the leaf base extract were tested on the strips of ..... (0.04 - 5.12 mg/ml) contracted the smooth muscles of rat stomach ... motility drugs such as loperamide block the actions of castor oil and are ...

  2. DNA Extraction from Eriocaulon Plants and Construction of RAPD System

    Institute of Scientific and Technical Information of China (English)

    Xue Xian; Lin Shanzhi; Zhang Zhixiang

    2004-01-01

    There have been many arguments about the classification of Eriocaulon Linn. by morphology so far, and little is known about the use of molecular markers for studying the genetic diversity of Eriocaulon plants. To apply molecular marker techniques to research on the genetic diversity of Eriocaulon plants, a method for extracting DNA from Eriocaulon plants and a RAPD system are essential for researchers. In this paper, the extraction of genomic DNA from the silica-gel-dried leaves of several species of Eriocaulon distributed in China was studied, and the best conditions for RAPD analysis of Eriocaulon plants were determined.

  3. Applicability of an in-House Saponin-Based Extraction Method in Bruker Biotyper Matrix-Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry System for Identification of Bacterial and Fungal Species in Positively Flagged Blood Cultures

    Directory of Open Access Journals (Sweden)

    Jung-Yien Chien

    2016-09-01

    Full Text Available We used an in-house saponin-based extraction method to evaluate the performance of the Bruker Biotyper matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF/MS) system for the identification of bacteria and fungi in 405 positively flagged blood culture bottles. Results obtained from MALDI-TOF/MS were compared with those obtained using conventional phenotypic identification methods. Of the 405 positively flagged blood culture bottles, 365 showed monomicrobial growth and were correctly identified to the species (72.1%) or genus (89.6%) level using the Bruker Biotyper system. The remaining 40 positively flagged blood culture bottles showed polymicrobial growth. Of them, 82.5% (n=33) of the isolates were correctly identified to the species level and 92.5% (n=37) to the genus level using the Bruker Biotyper system. The overall accuracy of identification to the genus level in flagged blood cultures was 89.5% for Gram-positive organisms, 93.5% for Gram-negative pathogens and 71.9% for fungi. Confidence scores were ≥1.500 for 307 (75.8%) bottles, ≥1.700 for 249 (61.5%) bottles and ≥2.000 for 142 (35.1%) bottles. None of the yeast cultures yielded scores ≥1.700. Using an identification-score cutoff of 1.500, the MALDI Biotyper correctly identified 99.2% of Gram-positive bacteria, 97.6% of Gram-negative bacteria and 100% of yeast isolates to the genus level, and 77.6% of Gram-positive bacteria, 87.1% of Gram-negative bacteria and 100.0% of yeast isolates to the species level. The overall rate of identification using our protocol was 89.9% (364/405) for genus level identification and 73.1% (296/405) for species level identification. Yeast isolates yielded the lowest confidence scores, which compromised the accuracy of identification. Further optimization of the protein extraction procedure in positive blood cultures is needed to improve the rate of identification.

  4. Dynamic characteristics extraction of vehicle-track vertical coupling system based on improved EMD

    Institute of Scientific and Technical Information of China (English)

    陈双喜; 林建辉; 陈建政

    2011-01-01

    In order to extract the dynamic characteristics of the vehicle-track coupling system, an improved empirical mode decomposition (IEMD) method is presented. In this method, the field mean of the extrema is adopted instead of the mean of the extreme-point envelopes, and a boundary wave matching algorithm is used to restrain the end effect. Based on the theory of vehicle-track coupling dynamics, a vertically coupled dynamics model of a vehicle on ballastless track was established. With the rail corrugation and rail irregularity excitation model, the dynamic responses of the vehicle-track coupling system were calculated at different driving speeds. The dynamic responses were decomposed by IEMD, and the intrinsic mode functions of the vertical contact force, bogie acceleration and car body acceleration were analyzed and compared. The results show that IEMD can adaptively decompose the vibration signals into intrinsic mode functions and thus effectively extract the dynamic characteristics of the vehicle-track coupling system.
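
    For orientation, a standard (unimproved) empirical mode decomposition of a simulated vibration signal can be run with the PyEMD package as sketched below; the improved steps described above (field mean of extrema, boundary wave matching) are not part of this sketch, and the signal is synthetic.

      # Standard EMD of a simulated vibration response with PyEMD
      # (pip install EMD-signal); this is not the improved IEMD variant.
      import numpy as np
      from PyEMD import EMD

      t = np.linspace(0, 1, 2000)
      # Toy response: low-frequency car-body motion + high-frequency corrugation.
      signal = np.sin(2 * np.pi * 3 * t) + 0.3 * np.sin(2 * np.pi * 80 * t)

      imfs = EMD()(signal)          # rows are intrinsic mode functions
      print("number of IMFs:", imfs.shape[0])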

  5. Extraction of Spatial-Temporal Features for Vision-Based Gesture Recognition

    Institute of Scientific and Technical Information of China (English)

    HUANG YU; XU Guangyou; ZHU Yuanxin

    2000-01-01

    One of the key problems in a vision-based gesture recognition system is the extraction of spatial-temporal features of gesturing. In this paper, an approach based on motion segmentation is proposed to realize this task. A direct method, combined with a robust M-estimator, is used to estimate the affine parameters of the gesturing motion, and based on the dominant motion model the gesturing region, i.e., the dominant object, is extracted. In this way the spatial-temporal features of gestures can be extracted. Finally, the dynamic time warping (DTW) method is used directly to match 12 control gestures (6 for "translation" orders, 6 for "rotation" orders). A small demonstration system has been set up to verify the method, in which a panorama image viewer (built by mosaicing a sequence of standard "Garden" images) can be controlled with recognized gestures instead of a 3-D mouse tool.
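
    The DTW matching step can be illustrated with a plain dynamic-programming implementation; the two feature sequences below are invented stand-ins for gesture trajectories.

      # Plain dynamic-time-warping distance between two feature sequences,
      # as used for gesture template matching; sequences are invented.
      import numpy as np

      def dtw_distance(a, b):
          """Classic DTW with Euclidean local cost between feature vectors."""
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = np.linalg.norm(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      template = np.array([[0.0, 0.0], [0.5, 0.1], [1.0, 0.2], [1.5, 0.2]])
      observed = np.array([[0.0, 0.0], [0.4, 0.1], [0.9, 0.1], [1.4, 0.3], [1.6, 0.2]])
      print("DTW distance:", round(dtw_distance(template, observed), 3))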

  6. Sequential Clustering based Facial Feature Extraction Method for Automatic Creation of Facial Models from Orthogonal Views

    CERN Document Server

    Ghahari, Alireza

    2009-01-01

    Multiview 3D face modeling has attracted increasing attention recently and has become one of the potential avenues in future video systems. We aim to make automatic feature extraction and natural 3D feature construction from 2D features, detected on a pair of frontal and profile view face images, more reliable and robust. We propose several heuristic algorithms to minimize possible errors introduced by the prevalent non-perfect orthogonal condition and non-coherent luminance. In our approach, we first extract the 2D features that are visible to both cameras in both views. Then, we estimate the coordinates of the features in the hidden profile view based on the visible features extracted in the two orthogonal views. Finally, based on the coordinates of the extracted features, we deform a 3D generic model to perform the desired 3D clone modeling. The present study demonstrates the applicability of the resulting facial models for practical applications like face recognition and facial animation.

  7. Weld Defect Extraction Based on Adaptive Morphology Filtering and Edge Detection by Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    WANGDonghua; ZHOUYuanhua; GANGTie

    2003-01-01

    One of the key steps in X-ray automatic inspection and intelligent recognition systems is how to extract defects and detect their edges effectively. In this paper, a novel method of defect extraction based on adaptive morphology filtering (DEAMF) is proposed, whose structuring elements change adaptively with the sizes of the defects. By this method, defects in X-ray weld inspection images are extracted with well-preserved shapes and at high speed. Then, according to the theory of edge detection based on wavelet transform modulus maxima, a locally supported wavelet with good antisymmetry is developed to extract the edges of the defects, and the results are satisfactory.
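
    A very small sketch of morphological defect highlighting is given below using a fixed structuring element and a top-hat transform in OpenCV; the adaptive choice of structuring-element size that defines DEAMF is not reproduced, and the image is synthetic.

      # Morphological top-hat defect highlighting on a synthetic X-ray image
      # with a fixed structuring element (the adaptive sizing of DEAMF is not
      # reproduced here).
      import numpy as np
      import cv2

      # Smooth background gradient plus a small bright "defect".
      img = np.tile(np.linspace(60, 120, 256, dtype=np.uint8), (256, 1))
      cv2.circle(img, (128, 128), 4, 200, -1)

      kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
      tophat = cv2.morphologyEx(img, cv2.MORPH_TOPHAT, kernel)   # removes background
      _, defects = cv2.threshold(tophat, 40, 255, cv2.THRESH_BINARY)

      print("defect pixels found:", int(np.count_nonzero(defects)))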

  8. Ensemble Feature Extraction Modules for Improved Hindi Speech Recognition System

    Directory of Open Access Journals (Sweden)

    Malay Kumar

    2012-05-01

    Full Text Available Speech is the most natural way of communication between human beings. The field of speech recognition raises the intriguing prospect of man-machine conversation, and due to its versatile applications, automatic speech recognition (ASR) systems have been designed. In this paper we present a novel approach to Hindi speech recognition that ensembles the feature extraction modules of ASR systems and combines their outputs using the voting technique ROVER. Experimental results show that the proposed system produces better results than traditional ASR systems.
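
    A toy illustration of the voting idea behind ROVER is sketched below, assuming equal-length hypotheses (real ROVER first aligns hypotheses of differing length); the Hindi words are invented examples.

      # Word-level majority voting in the spirit of ROVER over aligned,
      # equal-length hypotheses; the alignment step of real ROVER is omitted.
      from collections import Counter

      hypotheses = [
          ["namaste", "aap", "kaise", "hain"],
          ["namaste", "aap", "kaisa", "hain"],
          ["namaste", "ap",  "kaise", "hain"],
      ]

      voted = [Counter(words).most_common(1)[0][0] for words in zip(*hypotheses)]
      print("voted transcript:", " ".join(voted))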

  9. Development and experiment of a blooming-process video monitoring system based on a key frame extraction method

    Institute of Scientific and Technical Information of China (English)

    高林; 王璐; 闫磊; 张军国

    2014-01-01

    To overcome the drawbacks of redundant video data and large data volumes in traditional monitoring of the blooming process, a blooming-process video monitoring system based on a key frame extraction method was designed. After the original images of the blooming process are acquired, the system extracts key frames from them using an optical flow method and an entropy statistics algorithm; the user selects either the key-frame-number mode or the directional-information-entropy-threshold mode, sets the related parameters, and the system finally composes a key frame video that characterizes the blooming process. Key-frame-based video monitoring was demonstrated on the blooming process of a lily. Experimental results show that, under the test conditions, the amount of video data in the composed key-frame video is reduced by more than 84.6%, and the playing time is reduced to less than 15.4% of that of the original video. The video retains the detailed information of the blooming process and plays naturally and smoothly, providing a time-saving and convenient monitoring platform for researchers studying plants and flowers. System hardware included one personal computer (CPU: Intel® T2300 @ 1.66 GHz, 1.24 GB memory), one Microsoft HD-3000 high-definition camera, one shading carton box and one DC LED lamp; the software development environment was the Windows XP operating system, Microsoft Visual Studio 2008 Professional and OpenCV 2.0. The system can be divided into five function modules: an image acquisition module, a core algorithm module, a key frame judgment module, a data storage and examination module, and a video composition preview module. The core of the system is the key frame retrieval method, which is based on the flower growth characteristics. For example, the background for
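
    The key frame selection idea, keeping a frame only when enough motion has accumulated since the last kept frame, might be sketched as follows with OpenCV's Farneback optical flow; the video path ("bloom.avi") and the motion threshold are illustrative assumptions, and the entropy-based mode is not shown.

      # Keep a frame only when the mean optical-flow magnitude since the last
      # kept frame exceeds a threshold; "bloom.avi" and the threshold are
      # illustrative assumptions.
      import cv2
      import numpy as np

      cap = cv2.VideoCapture("bloom.avi")        # hypothetical time-lapse video
      ok, prev = cap.read()
      if not ok:
          raise SystemExit("could not open bloom.avi")
      prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

      key_frames, threshold, index = [0], 0.5, 0
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          index += 1
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                              0.5, 3, 15, 3, 5, 1.2, 0)
          motion = np.linalg.norm(flow, axis=2).mean()
          if motion > threshold:                 # enough petal movement accumulated
              key_frames.append(index)
              prev_gray = gray                   # compare against the last key frame
      cap.release()
      print("selected key frames:", key_frames)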

  10. A harmonic linear dynamical system for prominent ECG feature extraction.

    Science.gov (United States)

    Thi, Ngoc Anh Nguyen; Yang, Hyung-Jeong; Kim, SunHee; Do, Luu Ngoc

    2014-01-01

    Unsupervised mining of electrocardiography (ECG) time series is a crucial task in biomedical applications. To have efficiency of the clustering results, the prominent features extracted from preprocessing analysis on multiple ECG time series need to be investigated. In this paper, a Harmonic Linear Dynamical System is applied to discover vital prominent features via mining the evolving hidden dynamics and correlations in ECG time series. The discovery of the comprehensible and interpretable features of the proposed feature extraction methodology effectively represents the accuracy and the reliability of clustering results. Particularly, the empirical evaluation results of the proposed method demonstrate the improved performance of clustering compared to the previous main stream feature extraction approaches for ECG time series clustering tasks. Furthermore, the experimental results on real-world datasets show scalability with linear computation time to the duration of the time series.

  11. A Harmonic Linear Dynamical System for Prominent ECG Feature Extraction

    Directory of Open Access Journals (Sweden)

    Ngoc Anh Nguyen Thi

    2014-01-01

    Full Text Available Unsupervised mining of electrocardiography (ECG time series is a crucial task in biomedical applications. To have efficiency of the clustering results, the prominent features extracted from preprocessing analysis on multiple ECG time series need to be investigated. In this paper, a Harmonic Linear Dynamical System is applied to discover vital prominent features via mining the evolving hidden dynamics and correlations in ECG time series. The discovery of the comprehensible and interpretable features of the proposed feature extraction methodology effectively represents the accuracy and the reliability of clustering results. Particularly, the empirical evaluation results of the proposed method demonstrate the improved performance of clustering compared to the previous main stream feature extraction approaches for ECG time series clustering tasks. Furthermore, the experimental results on real-world datasets show scalability with linear computation time to the duration of the time series.

  12. EXTRACTION SYSTEM DESIGN FOR THE BSNS/RCS.

    Energy Technology Data Exchange (ETDEWEB)

    WEI, J.; CHEN, Y.; CHI, Y.L.; JIANG, Y.L.; KANG, W.; PANG, J.B.; QIN, Q.; WANG, S.; WANG, W.

    2006-06-23

    The BSNS extraction system uses one of the four dispersion-free straight sections. Five vertical kickers and one Lambertson septum magnet are used for the one-turn extraction. A rise time of less than 250 ns and a total kicking angle of 20 mrad are required for the kickers, which are grouped into two tanks. The design of the kicker magnets and the PFN is also given. To reduce beam loss in the extraction channels due to the large halo emittance, large apertures are used for both the kickers and the septum. The stray magnetic field inside and at the two ends of the circulating path of the Lambertson magnet and its effect on the beam have been studied.

  13. Solvent extraction of gold using ionic liquid based process

    Science.gov (United States)

    Makertihartha, I. G. B. N.; Zunita, Megawati; Rizki, Z.; Dharmawijaya, P. T.

    2017-01-01

    For decades, research groups and mineral processing industries have used solvent extraction technology for metal ion separation. Solvent extraction has been applied to the purification of precious metals such as Au and Pd, and of base metals such as Cu, Zn and Cd. The process uses organic compounds as the solvent. Organic solvents, however, have several undesirable properties: they are toxic, volatile, flammable, used in excess, difficult to recycle, and give low reusability and low Au recovery; in addition, the disposal of spent extractants and diluents is problematic and the associated costs are relatively high. Much research has therefore been directed at developing safe and environmentally friendly processes for Au separation. Ionic liquids (ILs) are a potential alternative for gold extraction because they possess several desirable properties, such as the ability to operate at temperatures up to 300°C, good solvent properties for a wide range of metal ions, high selectivity, low vapor pressure, stability up to 200°C, easy preparation, environmental friendliness (they are commonly called "green solvents"), and relatively low cost. This review focuses on ILs with potential as solvents for the extraction of Au from minerals or metal alloys under various conditions (pH, temperature, and pressure). The performance of IL-based Au extraction is examined in depth, in particular the relationship between IL structure and the ability to separate Au from aggregates of other metal ions. The optimal extraction conditions for obtaining a high percentage of Au in mineral processing are also discussed.

  14. Surface Electromyography Feature Extraction Based on Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Farzaneh Akhavan Mahdavi

    2012-12-01

    Full Text Available Considering the wide variety of EMG signal applications, such as rehabilitation of people with mobility limitations, scientists have done much research on EMG control systems. In this regard, feature extraction of the EMG signal is highly valued as a technique for extracting the desired information from the EMG signal and removing unnecessary parts. In this study, the Wavelet Transform (WT) has been applied as the main technique for extracting Surface EMG (SEMG) features, because WT is consistent with the nonstationary nature of EMG. Furthermore, two evaluation criteria, namely the RES index (the ratio of a Euclidean distance to a standard deviation) and the scatter plot, are used to investigate the efficiency of wavelet feature extraction. The results show an improvement in the class separability of hand movements in feature space. Accordingly, the SEMG features extracted from the first and second decomposition levels of the second-order Daubechies wavelet (db2) yielded the best class separability.
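    A minimal sketch of the db2 wavelet feature idea described above, assuming the PyWavelets package; the chosen statistics (mean absolute value of the level-1 and level-2 detail coefficients), the RES-style index and the synthetic windows are illustrative, not the authors' exact feature set.

        # db2 wavelet features for a surface-EMG window plus a simple RES-style
        # separability index.  Feature choice and data are illustrative only.
        import numpy as np
        import pywt

        def semg_wavelet_features(window, wavelet="db2", level=2):
            cA2, cD2, cD1 = pywt.wavedec(window, wavelet, level=level)
            return np.array([np.mean(np.abs(cD1)), np.mean(np.abs(cD2))])

        def res_index(class_a, class_b):
            """Euclidean distance between class means over a pooled standard deviation."""
            d = np.linalg.norm(class_a.mean(axis=0) - class_b.mean(axis=0))
            s = 0.5 * (class_a.std() + class_b.std())
            return d / s if s > 0 else np.inf

        rng = np.random.default_rng(0)
        feats_a = np.array([semg_wavelet_features(rng.normal(size=256)) for _ in range(20)])
        feats_b = np.array([semg_wavelet_features(rng.normal(scale=2.0, size=256)) for _ in range(20)])
        print("RES index between the two synthetic classes:", res_index(feats_a, feats_b))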

  15. Approach to extracting hot topics based on network traffic content

    Institute of Scientific and Technical Information of China (English)

    Yadong ZHOU; Xiaohong GUAN; Qindong SUN; Wei LI; Jing TAO

    2009-01-01

    This article presents a formal definition and description of popular topics on the Internet, analyzes the relationship between popular words and topics, and introduces a method that uses the statistics and correlation of popular words in traffic content, together with network flow characteristics, as input for extracting popular topics on the Internet. Based on this, the article adapts a clustering algorithm to extract popular topics and gives formalized results. The test results show that this method achieves an accuracy of 16.7% in extracting popular topics on the Internet. Compared with web mining and topic detection and tracking (TDT), it can provide a more suitable data source for the effective recovery of Internet public opinion.

  16. Overlaid caption extraction in news video based on SVM

    Science.gov (United States)

    Liu, Manman; Su, Yuting; Ji, Zhong

    2007-11-01

    Overlaid captions in news video often carry condensed semantic information that provides key cues for content-based video indexing and retrieval. However, extracting captions from video remains challenging because of complex backgrounds and low resolution. In this paper, we propose an effective overlaid caption extraction approach for news video. We first scan the video key frames with a small window and then classify the blocks into text and non-text ones via a support vector machine (SVM), using statistical features extracted from the gray level co-occurrence matrices, the LH and HL sub-band wavelet coefficients, and the oriented edge intensity ratios. Finally, morphological filtering and projection profile analysis are employed to localize and refine the candidate caption regions. Experiments show its high performance on four 30-minute news video programs.
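    A hedged sketch of the block-classification step, assuming recent scikit-image (graycomatrix/graycoprops) and scikit-learn; the window size, GLCM properties and SVM settings are illustrative, and the wavelet and edge-ratio features of the record are omitted for brevity.

        # Block-wise text/non-text classification with GLCM statistics and an SVM.
        # Window size, GLCM settings and kernel are illustrative assumptions.
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops
        from sklearn.svm import SVC

        def glcm_features(block):
            glcm = graycomatrix(block, distances=[1], angles=[0, np.pi / 2],
                                levels=256, symmetric=True, normed=True)
            return np.hstack([graycoprops(glcm, prop).ravel()
                              for prop in ("contrast", "homogeneity", "energy")])

        def classify_blocks(gray_frame, clf, win=16):
            """Slide a win x win window over a uint8 frame; return a text/non-text mask."""
            h, w = gray_frame.shape
            mask = np.zeros((h // win, w // win), dtype=bool)
            for i in range(h // win):
                for j in range(w // win):
                    block = gray_frame[i * win:(i + 1) * win, j * win:(j + 1) * win]
                    mask[i, j] = clf.predict([glcm_features(block)])[0] == 1
            return mask

        # clf = SVC(kernel="rbf").fit(training_features, training_labels)  # 1 = text block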

  17. Scan Line Based Road Marking Extraction from Mobile LiDAR Point Clouds.

    Science.gov (United States)

    Yan, Li; Liu, Hua; Tan, Junxiang; Li, Zan; Xie, Hong; Chen, Changjun

    2016-06-17

    Mobile Mapping Technology (MMT) is one of the most important 3D spatial data acquisition technologies. State-of-the-art mobile mapping systems, equipped with laser scanners and known as Mobile LiDAR Scanning (MLS) systems, have been widely used in a variety of areas, especially in road mapping and road inventory. With the commercialization of Advanced Driving Assistance Systems (ADASs) and self-driving technology, there will be a great demand for lane-level detailed 3D maps, and MLS is the most promising technology for generating such maps. Road markings and road edges are necessary information for creating such lane-level detailed 3D maps. This paper proposes a scan line based method to extract road markings from mobile LiDAR point clouds in three steps: (1) preprocessing; (2) road points extraction; (3) road markings extraction and refinement. In the preprocessing step, isolated LiDAR points in the air are removed from the point clouds and the point clouds are organized into scan lines. In the road points extraction step, seed road points are first extracted by the Height Difference (HD) between trajectory data and the road surface, then full road points are extracted from the point clouds by moving least squares line fitting. In the road markings extraction and refinement step, the intensity values of road points in a scan line are first smoothed by a dynamic window median filter to suppress intensity noise, then road markings are extracted by the Edge Detection and Edge Constraint (EDEC) method, and Fake Road Marking Points (FRMPs) are eliminated from the detected road markings by segment- and dimensionality-feature-based refinement. The performance of the proposed method is evaluated on three data samples and the experimental results indicate that road points are well extracted from MLS data and road markings are well extracted from road points by the applied method. A quantitative study shows that the proposed method achieves an average
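    The per-scan-line intensity step lends itself to a very small sketch: smooth the intensities with a moving median and keep points that rise above the local road background. It assumes SciPy; the window length and margin are illustrative and do not reproduce the EDEC parameters of the record.

        # Median-smoothed intensity thresholding along one scan line.
        # Kernel size and margin are illustrative assumptions.
        import numpy as np
        from scipy.signal import medfilt

        def marking_candidates(intensity, kernel=9, margin=10.0):
            smoothed = medfilt(np.asarray(intensity, dtype=float), kernel_size=kernel)
            background = np.median(smoothed)
            return np.where(smoothed > background + margin)[0]  # candidate point indices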

  18. Scan Line Based Road Marking Extraction from Mobile LiDAR Point Clouds†

    Science.gov (United States)

    Yan, Li; Liu, Hua; Tan, Junxiang; Li, Zan; Xie, Hong; Chen, Changjun

    2016-01-01

    Mobile Mapping Technology (MMT) is one of the most important 3D spatial data acquisition technologies. State-of-the-art mobile mapping systems, equipped with laser scanners and known as Mobile LiDAR Scanning (MLS) systems, have been widely used in a variety of areas, especially in road mapping and road inventory. With the commercialization of Advanced Driving Assistance Systems (ADASs) and self-driving technology, there will be a great demand for lane-level detailed 3D maps, and MLS is the most promising technology for generating such maps. Road markings and road edges are necessary information for creating such lane-level detailed 3D maps. This paper proposes a scan line based method to extract road markings from mobile LiDAR point clouds in three steps: (1) preprocessing; (2) road points extraction; (3) road markings extraction and refinement. In the preprocessing step, isolated LiDAR points in the air are removed from the point clouds and the point clouds are organized into scan lines. In the road points extraction step, seed road points are first extracted by the Height Difference (HD) between trajectory data and the road surface, then full road points are extracted from the point clouds by moving least squares line fitting. In the road markings extraction and refinement step, the intensity values of road points in a scan line are first smoothed by a dynamic window median filter to suppress intensity noise, then road markings are extracted by the Edge Detection and Edge Constraint (EDEC) method, and Fake Road Marking Points (FRMPs) are eliminated from the detected road markings by segment- and dimensionality-feature-based refinement. The performance of the proposed method is evaluated on three data samples and the experimental results indicate that road points are well extracted from MLS data and road markings are well extracted from road points by the applied method. A quantitative study shows that the proposed method achieves an average

  19. Controlled release of an extract of Calendula officinalis flowers from a system based on the incorporation of gelatin-collagen microparticles into collagen I scaffolds: design and in vitro performance.

    Science.gov (United States)

    Jiménez, Ronald A; Millán, Diana; Suesca, Edward; Sosnik, Alejandro; Fontanilla, Marta R

    2015-06-01

    Aiming to develop biological skin dressings with improved performance in the treatment of skin wounds, acellular collagen I scaffolds were modified with polymeric microparticles and subsequently loaded with a hydroglycolic extract of Calendula officinalis flowers. Microparticles made of gelatin-collagen were produced by a water-in-oil emulsion/cross-linking method. These microparticles were then mixed with collagen suspensions at three increasing concentrations and the resulting mixtures were lyophilized to make microparticle-loaded porous collagen scaffolds. Resistance to enzymatic degradation, ability to associate with the C. officinalis extract, and the extract release profile of the three gelatin-collagen microparticle-scaffold prototypes were assessed in vitro and compared with collagen scaffolds without microparticles used as a control. The data indicated that the incorporation of gelatin-collagen microparticles increased the resistance of the scaffolds to in vitro enzymatic degradation, as well as their association with the C. officinalis flower extract. In addition, a sharp decrease in cytotoxicity, as well as a more prolonged release of the extract, was attained. Overall, the results support the potential of these systems for developing innovative dermal substitutes with improved features. Furthermore, the gelatin-collagen mixture represents a low-cost and scalable alternative with high clinical transferability, which is especially appealing in developing countries.

  20. Design of a soft sensor system for Puerarin extraction based on SVM%基于SVM的葛根素提取软测量系统的设计

    Institute of Scientific and Technical Information of China (English)

    齐岩磊; 陈娟; 杨祺; 祁欣

    2012-01-01

    Since high-efficiency, on-line detection of the active components extracted by ultrasound from the roots and stems of medicinal plants is difficult, this paper adopts a soft sensor modeling method based on particle swarm optimization combined with a support vector machine in order to estimate and infer the values of the measured variable. A soft sensor prediction model is built from measurements of auxiliary variables; a microcontroller unit is used for auxiliary-variable data acquisition and processing, for computing and building the soft sensor model, and for the hardware realization of the model, thereby implementing on-line measurement of the rate of ultrasonic extraction of puerarin from medicinal plant roots. Compared with the currently used method of off-line sampling and detection by UV spectrophotometry, the soft sensor system overcomes the drawbacks of a large workload and the inability to measure the extraction rate directly. The new system offers high accuracy (relative errors within 3%), good generality, fast response, and good real-time performance.
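    A minimal soft-sensor sketch in the spirit of the record, assuming scikit-learn: an SVM regressor maps easily measured auxiliary variables to the hard-to-measure extraction rate. The auxiliary variables and data are synthetic placeholders, and the particle swarm optimization of the SVM hyperparameters is omitted.

        # Soft sensor: regress the extraction rate on auxiliary variables with an SVR.
        # Variables, data and hyperparameters are placeholders.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)
        X = rng.uniform(size=(200, 3))   # e.g. temperature, ultrasonic power, elapsed time
        y = 0.2 + 0.4 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.01, 200)

        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
        model.fit(X[:150], y[:150])
        pred = model.predict(X[150:])
        print("mean relative error:", np.mean(np.abs(pred - y[150:]) / y[150:]))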

  1. Field—Based Supercritical Fluid Extraction of Hydrocarbons at Industrially Contaminated Sites

    Directory of Open Access Journals (Sweden)

    Peggy Rigou

    2002-01-01

    Full Text Available Examination of organic pollutants in groundwaters should also consider the source of the pollution, which is often a solid matrix such as soil, landfill waste, or sediment. This premise should be viewed alongside the growing trend towards field-based characterisation of contaminated sites for reasons of speed and cost. Field-based methods for the extraction of organic compounds from solid samples are generally cumbersome, time consuming, or inefficient. This paper describes the development of a field-based supercritical fluid extraction (SFE) system for the recovery of organic contaminants (benzene, toluene, ethylbenzene, and xylene) and polynuclear aromatic hydrocarbons from soils. A simple, compact, and robust SFE system has been constructed and was found to offer the same extraction efficiency as a well-established laboratory SFE system. Extraction optimisation was statistically evaluated using a factorial analysis procedure. Under optimised conditions, the device yielded recovery efficiencies of >70% with RSD values of 4% against the standard EPA Soxhlet method, compared with a mean recovery efficiency of 48% for a commercially available field-extraction kit. The device will next be evaluated with real samples prior to field deployment.

  2. An investigation of paper based microfluidic devices for size based separation and extraction applications.

    Science.gov (United States)

    Zhong, Z W; Wu, R G; Wang, Z P; Tan, H L

    2015-09-01

    Conventional microfluidic devices are typically complex and expensive; they require pneumatic control systems or highly precise pumps to control the flow in the devices. This work investigates an alternative method using paper-based microfluidic devices to replace conventional microfluidic devices. Size-based separation and extraction experiments were able to separate free dye from a mixed protein and dye solution. Experimental results showed that pure fluorescein isothiocyanate could be separated from a solution of mixed fluorescein isothiocyanate and fluorescein isothiocyanate-labeled bovine serum albumin. The readings obtained from a spectrophotometer clearly show that the extracted tartrazine sample did not contain any Blue-BSA, because its absorbance measured at a wavelength of 590 nm, which corresponds to Blue-BSA, was 0.000. These results demonstrate that paper-based microfluidic devices, which are inexpensive and easy to implement, can potentially replace their conventional counterparts through simple geometry designs and capillary action. These findings will potentially help future developments of paper-based microfluidic devices.

  3. Hydrogel based occlusion systems

    OpenAIRE

    Stam, F.A.; Jackson, N.; Dubruel, P.; Adesanya, K.; Embrechts, A; Mendes, E.; Neves, H.P.; Herijgers, P; Verbrugghe, Y.; Shacham, Y.; Engel, L.; Krylov, V

    2013-01-01

    A hydrogel based occlusion system, a method for occluding vessels, appendages or aneurysms, and a method for hydrogel synthesis are disclosed. The hydrogel based occlusion system includes a hydrogel having a shrunken and a swollen state and a delivery tool configured to deliver the hydrogel to a target occlusion location. The hydrogel is configured to permanently occlude the target occlusion location in the swollen state. The hydrogel may be an electro-activated hydrogel (EAH) which could be ...

  4. SNS EXTRACTION KICKER SYSTEM AND FIRST ARTICLE BPFN TEST.

    Energy Technology Data Exchange (ETDEWEB)

    MI,J.; PAI,C.; DAVINO,D.; HAHN,H.; LAMBIASE,R.; LEE,Y.Y.; MENG,W.; SANDBERG,J.; TSOUPAS,N.; ZHANG,W.; WARBURTON,D.

    2002-06-03

    The Spallation Neutron Source (SNS) extraction kicker system brings the proton beam from the accumulator ring through a beam transfer line into the target area. The 14 kicker magnets are located in one straight section. The kicker magnets are energized by 14 Blumlein type Pulse Forming Networks (BPFN). The first article of the SNS extraction kicker BPFN was assembled and tested at this laboratory. This paper describes the kicker BPFN system arrangement and parameters. The first article BPFN design and its main components used are explained. High voltage BPFN test results and the load current waveform are illustrated in this paper. Temperature measurements of the kicker ferrite blocks at full power showed only small or no heating. This paper discusses the modifications to the BPFN design, such as a saturating inductor and 25 Ω termination, to minimize the transverse coupling impedance.

  5. Inhibition Effect of Mace Extract Microemulsion on Vitamin C Photooxidation in Aqueous Systems

    Directory of Open Access Journals (Sweden)

    Hasbullah Hasbullah

    2014-01-01

    Full Text Available Photooxidation in food systems causes nutritional losses and produces undesirable flavors, toxic compounds and color changes, which make foods less acceptable or unacceptable to consumers. The objective of this research was to determine the effectiveness of a mace extract microemulsion in inhibiting vitamin C photooxidation in aqueous systems. The aqueous food systems used were a beverage model system and an apple juice beverage, each enriched with 100 ppm vitamin C as the substrate and 20 ppm erythrosine as the photosensitiser. Microemulsions at one and two percent, containing 0, 500 and 750 ppm of mace extract, were added to each aqueous food system. The inhibition effect of the mace extract microemulsion on vitamin C photooxidation was assessed from the rate of vitamin C degradation in the aqueous food systems illuminated by fluorescent light with an intensity of 2000 lux for eight hours. The results indicated that the mace extract microemulsion has anti-photooxidation activity and the ability to inhibit vitamin C photooxidation in aqueous systems.

  6. Curvelet Transform-Based Denoising Method for Doppler Frequency Extraction

    Institute of Scientific and Technical Information of China (English)

    HOU Shu-juan; WU Si-liang

    2007-01-01

    A novel image denoising method based on the curvelet transform is proposed in order to improve the performance of Doppler frequency extraction in low signal-to-noise-ratio (SNR) environments. The echo can be represented as a gray image, with spectral intensity as its gray values, by a time-frequency transform, and the curvelet coefficients of the image are computed. An adaptive soft-threshold scheme based on a dual-median operation is then applied in the curvelet domain. After that, the image is reconstructed by the inverse curvelet transform and the Doppler curve is extracted by a curve detection scheme. Experimental results show that the proposed method can improve the detection of the Doppler frequency in low-SNR environments.
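    A hedged sketch of the soft-threshold idea. Since a curvelet implementation is less commonly installed, a wavelet transform from PyWavelets stands in for the curvelet transform, and a simple median-based universal threshold stands in for the dual-median scheme of the record.

        # Transform-domain soft-threshold denoising, with wavelets standing in for
        # curvelets.  Wavelet, level and threshold rule are illustrative assumptions.
        import numpy as np
        import pywt

        def soft_threshold_denoise(image, wavelet="db4", level=3):
            coeffs = pywt.wavedec2(image, wavelet, level=level)
            out = [coeffs[0]]
            for detail in coeffs[1:]:
                bands = []
                for band in detail:
                    sigma = np.median(np.abs(band)) / 0.6745         # robust noise estimate
                    t = sigma * np.sqrt(2 * np.log(band.size))       # universal threshold
                    bands.append(pywt.threshold(band, t, mode="soft"))
                out.append(tuple(bands))
            return pywt.waverec2(out, wavelet)

        noisy = np.random.default_rng(5).normal(size=(128, 128))
        denoised = soft_threshold_denoise(noisy)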

  7. Hierarchical graph-based segmentation for extracting road networks from high-resolution satellite images

    Science.gov (United States)

    Alshehhi, Rasha; Marpu, Prashanth Reddy

    2017-04-01

    Extraction of road networks in urban areas from remotely sensed imagery plays an important role in many urban applications (e.g. road navigation, geometric correction of urban remote sensing images, updating geographic information systems, etc.). It is normally difficult to accurately differentiate road from its background due to the complex geometry of the buildings and the acquisition geometry of the sensor. In this paper, we present a new method for extracting roads from high-resolution imagery based on hierarchical graph-based image segmentation. The proposed method consists of: 1. Extracting features (e.g., using Gabor and morphological filtering) to enhance the contrast between road and non-road pixels, 2. Graph-based segmentation consisting of (i) Constructing a graph representation of the image based on initial segmentation and (ii) Hierarchical merging and splitting of image segments based on color and shape features, and 3. Post-processing to remove irregularities in the extracted road segments. Experiments are conducted on three challenging datasets of high-resolution images to demonstrate the proposed method and compare with other similar approaches. The results demonstrate the validity and superior performance of the proposed method for road extraction in urban areas.

  8. Compressive sensing-based feature extraction for bearing fault diagnosis using a heuristic neural network

    Science.gov (United States)

    Yuan, Haiying; Wang, Xiuyu; Sun, Xun; Ju, Zijian

    2017-06-01

    Bearing fault diagnosis collects massive amounts of vibration data about a rotating machinery system, whose fault classification largely depends on feature extraction. Features reflecting bearing work states are directly extracted using time-frequency analysis of vibration signals, which leads to high dimensional feature data. To address the problem of feature dimension reduction, a compressive sensing-based feature extraction algorithm is developed to construct a concise fault feature set. Next, a heuristic PSO-BP neural network, whose learning process perfectly combines particle swarm optimization and the Levenberg-Marquardt algorithm, is constructed for fault classification. Numerical simulation experiments are conducted on four datasets sampled under different severity levels and load conditions, which verify that the proposed fault diagnosis method achieves efficient feature extraction and high classification accuracy.
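    A sketch of the two stages named in the record above, assuming scikit-learn: a random measurement matrix compresses the high-dimensional feature vector, and a small neural network classifies the result. The L-BFGS-trained MLP stands in for the PSO-BP hybrid, and the data are synthetic placeholders.

        # Compressive-sensing-style dimension reduction followed by a neural classifier.
        # Feature dimension, class count and data are placeholders.
        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.random_projection import GaussianRandomProjection

        rng = np.random.default_rng(2)
        X = rng.normal(size=(400, 1024))   # e.g. spectral features per vibration sample
        y = rng.integers(0, 4, size=400)   # four synthetic fault classes

        model = make_pipeline(GaussianRandomProjection(n_components=64, random_state=0),
                              MLPClassifier(hidden_layer_sizes=(32,), solver="lbfgs",
                                            max_iter=500, random_state=0))
        model.fit(X[:300], y[:300])
        print("held-out accuracy on synthetic data:", model.score(X[300:], y[300:]))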

  9. Extraction of L-Aspartic Acid with Reverse Micelle System

    Directory of Open Access Journals (Sweden)

    Özlem AYDOĞAN

    2009-02-01

    Full Text Available The aim of this study is to investigate the extraction of L-aspartic acid, a hydrophobic amino acid, with a reverse micelle system. Production of amino acids by fermentation has become more important in recent years. These amino acids are obtained in dilute aqueous solutions and have to be separated from excess substrate, inorganic salts and by-products. Recently, the separation of amino acids from fermentation media by reverse micelle extraction has received a great deal of attention. In this study, the reverse micelle phase included Aliquat-336 as a surfactant, 1-decanol as a co-surfactant and isooctane as an apolar solvent. Experiments were performed at a 150 rpm stirring rate, at 30 °C, for a 30 min extraction time, with equal volumes of the reverse micelle and aqueous phases. The concentration of L-aspartic acid was analyzed by liquid chromatography (HPLC). The extraction yield increased with increasing pH and Aliquat-336 concentration, and with decreasing initial amino acid concentration. The maximum extraction yield (68%) was obtained at a pH of 12, a surfactant concentration of 200 mM and an initial amino acid concentration of 5 mM.

  10. Evaluating a Pivot-Based Approach for Bilingual Lexicon Extraction

    Directory of Open Access Journals (Sweden)

    Jae-Hoon Kim

    2015-01-01

    Full Text Available A pivot-based approach to bilingual lexicon extraction is based on the similarity of context vectors represented by words in a pivot language such as English. In this paper, in order to show the validity and usability of the pivot-based approach, we evaluate it together with two different methods for estimating context vectors: one estimates them from two parallel corpora based on word association between source words (resp., target words) and pivot words, and the other estimates them from two parallel corpora based on word alignment tools for statistical machine translation. Empirical results on two language pairs (Korean-Spanish and Korean-French) show that the pivot-based approach is very promising for resource-poor languages and confirm its validity and usability. Furthermore, for words with low frequency, our method also performs well.
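    The core scoring step of the pivot-based approach can be sketched with a few lines of NumPy: source and target words are compared through context vectors over shared pivot words. The tiny vectors below are hand-made placeholders; real vectors would be estimated from parallel corpora as the record describes.

        # Cosine similarity of context vectors expressed over shared pivot words.
        # Pivot list and vectors are illustrative placeholders.
        import numpy as np

        PIVOTS = ["water", "house", "music", "road"]

        def cosine(u, v):
            denom = np.linalg.norm(u) * np.linalg.norm(v)
            return float(u @ v / denom) if denom else 0.0

        source_vectors = {"agua": np.array([0.9, 0.0, 0.1, 0.0])}        # source-language side
        target_vectors = {"mul": np.array([0.8, 0.1, 0.0, 0.1]),         # candidate translations
                          "eumak": np.array([0.0, 0.1, 0.9, 0.0])}

        for src, sv in source_vectors.items():
            best = max(target_vectors, key=lambda t: cosine(sv, target_vectors[t]))
            print(src, "->", best)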

  11. Facial Feature Extraction Method Based on Coefficients of Variances

    Institute of Scientific and Technical Information of China (English)

    Feng-Xi Song; David Zhang; Cai-Kou Chen; Jing-Yu Yang

    2007-01-01

    Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are two popular feature extraction techniques in the statistical pattern recognition field. Due to the small sample size problem, LDA cannot be directly applied to appearance-based face recognition tasks. As a consequence, many LDA-based facial feature extraction techniques have been proposed to deal with this problem. The Nullspace Method is one of the most effective among them. It tries to find a set of discriminant vectors that maximize the between-class scatter in the null space of the within-class scatter matrix. The calculation of its discriminant vectors involves performing a singular value decomposition on a high-dimensional matrix, which is generally memory- and time-consuming. Borrowing the key idea of the Nullspace Method and the concept of the coefficient of variance from statistical analysis, we present a novel facial feature extraction method, Discriminant based on Coefficient of Variance (DCV), in this paper. Experimental results on the FERET and AR face image databases demonstrate that DCV is a promising technique in comparison with Eigenfaces, the Nullspace Method, and other state-of-the-art facial feature extraction methods.

  12. CD-REST: a system for extracting chemical-induced disease relation in literature

    Science.gov (United States)

    Xu, Jun; Wu, Yonghui; Zhang, Yaoyun; Wang, Jingqi; Lee, Hee-Jin; Xu, Hua

    2016-01-01

    Mining chemical-induced disease relations embedded in the vast biomedical literature could facilitate a wide range of computational biomedical applications, such as pharmacovigilance. The BioCreative V organized a Chemical Disease Relation (CDR) Track regarding chemical-induced disease relation extraction from biomedical literature in 2015. We participated in all subtasks of this challenge. In this article, we present our participation system Chemical Disease Relation Extraction SysTem (CD-REST), an end-to-end system for extracting chemical-induced disease relations in biomedical literature. CD-REST consists of two main components: (1) a chemical and disease named entity recognition and normalization module, which employs the Conditional Random Fields algorithm for entity recognition and a Vector Space Model-based approach for normalization; and (2) a relation extraction module that classifies both sentence-level and document-level candidate drug–disease pairs by support vector machines. Our system achieved the best performance on the chemical-induced disease relation extraction subtask in the BioCreative V CDR Track, demonstrating the effectiveness of our proposed machine learning-based approaches for automatic extraction of chemical-induced disease relations in biomedical literature. The CD-REST system provides web services using HTTP POST request. The web services can be accessed from http://clinicalnlptool.com/cdr. The online CD-REST demonstration system is available at http://clinicalnlptool.com/cdr/cdr.html. Database URL: http://clinicalnlptool.com/cdr; http://clinicalnlptool.com/cdr/cdr.html PMID:27016700

  13. The validation of forensic DNA extraction systems to utilize soil contaminated biological evidence.

    Science.gov (United States)

    Kasu, Mohaimin; Shires, Karen

    2015-07-01

    The production of full DNA profiles from biological evidence found in soil has a high failure rate due largely to the inhibitory substance humic acid (HA). Abundant in various natural soils, HA co-extracts with DNA during extraction and inhibits DNA profiling by binding to the molecular components of the genotyping assay. To successfully utilize traces of soil contaminated evidence, such as that found at many murder and rape crime scenes in South Africa, a reliable HA removal extraction system would often be selected based on previous validation studies. However, for many standard forensic DNA extraction systems, peer-reviewed publications detailing the efficacy on soil evidence is either lacking or is incomplete. Consequently, these sample types are often not collected or fail to yield suitable DNA material due to the use of unsuitable methodology. The aim of this study was to validate the common forensic DNA collection and extraction systems used in South Africa, namely DNA IQ, FTA elute and Nucleosave for processing blood and saliva contaminated with HA. A forensic appropriate volume of biological evidence was spiked with HA (0, 0.5, 1.5 and 2.5 mg/ml) and processed through each extraction protocol for the evaluation of HA removal using QPCR and STR-genotyping. The DNA IQ magnetic bead system effectively removed HA from highly contaminated blood and saliva, and generated consistently acceptable STR profiles from both artificially spiked samples and crude soil samples. This system is highly recommended for use on soil-contaminated evidence over the cellulose card-based systems currently being preferentially used for DNA sample collection. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. EXTRACT

    DEFF Research Database (Denmark)

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, sample manual annotation is a highly labor intensive process and requires familiarity with the terminologies used. We have the...... and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed.Database URL: https://extract.hcmr.gr/....

  15. Improvement of extraction system geometry with suppression of possible Penning discharge ignition

    Science.gov (United States)

    Delferrière, O.; Gobin, R.; Harrault, F.; Nyckees, S.; Tuske, O.

    2014-02-01

    During the past two years, a new 2.45 GHz ECR ion source has been developed, dedicated especially to intense light ion injector projects like IPHI (Injecteur Proton Haute Intensité) and IFMIF (International Fusion Materials Irradiation Facility), in order to reduce the beam emittance at the RFQ entrance by shortening the LEBT. This new ALISES concept (Advanced Light Ion Source Extraction System) is based on the use of an additional short LEBT solenoid placed very close to the extraction aperture. The fringe field of this new solenoid produces the magnetic field needed to create the ECR resonance in the plasma chamber. Such a geometry first allows the solenoid to be put at ground potential, while saving space in front of the extraction so that the first LEBT solenoid can be moved closer and the intense extracted beam focused earlier. During the commissioning of the source in 2011-2012, ALISES produced about 20 mA extracted from a 6 mm diameter plasma extraction hole at 23 kV. However, the magnetic configuration combined with the new extraction system geometry led to significant Penning discharge conditions in the accelerator column. Many of these were eliminated by inserting glass pieces between electrodes to modify the equipotential lines in unfavorable ExB vacuum zones where particles were produced and trapped. To study the Penning discharge locations, several 3D calculations have been performed with the OPERA-3D/TOSCA code to simulate the possible production and trapping of electrons in the extraction system. The results obtained on different sources already built show very good agreement with the spark locations observed experimentally on the electrodes. The simulation results as well as experimental measurements are presented, and solutions to prevent possible Penning discharges in future source geometries are established.

  16. EEG signal features extraction based on fractal dimension.

    Science.gov (United States)

    Finotello, Francesca; Scarpa, Fabio; Zanon, Mattia

    2015-01-01

    The spread of electroencephalography (EEG) in countless applications has fostered the development of new techniques for extracting synthetic and informative features from EEG signals. However, the definition of an effective feature set depends on the specific problem to be addressed and is currently an active field of research. In this work, we investigated the application of features based on fractal dimension to a problem of sleep identification from EEG data. We demonstrated that features based on fractal dimension, including two novel indices defined in this work, add valuable information to standard EEG features and significantly improve sleep identification performance.
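    The record does not name the fractal-dimension estimators it adds, so the sketch below uses Higuchi's algorithm, one common choice for EEG epochs; the maximum scale k_max and the synthetic signal are illustrative.

        # Higuchi fractal dimension of a 1-D signal (a common EEG feature).
        # k_max and the test signal are illustrative assumptions.
        import numpy as np

        def higuchi_fd(x, k_max=8):
            x = np.asarray(x, dtype=float)
            n = x.size
            lengths = []
            for k in range(1, k_max + 1):
                lk = []
                for m in range(k):
                    idx = np.arange(m, n, k)
                    if idx.size < 2:
                        continue
                    dist = np.abs(np.diff(x[idx])).sum()
                    norm = (n - 1) / ((idx.size - 1) * k)
                    lk.append(dist * norm / k)
                lengths.append(np.mean(lk))
            slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(lengths), 1)
            return slope

        rng = np.random.default_rng(3)
        print(higuchi_fd(rng.normal(size=1000)))   # white noise gives a value close to 2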

  17. Key frame extraction based on spatiotemporal motion trajectory

    Science.gov (United States)

    Zhang, Yunzuo; Tao, Ran; Zhang, Feng

    2015-05-01

    The spatiotemporal motion trajectory can accurately reflect changes in motion state. Motivated by this observation, this letter proposes a method for key frame extraction based on the motion trajectory on the spatiotemporal slice. Different from well-known motion-related methods, the proposed method utilizes the inflexions of the motion trajectories of all moving objects on the spatiotemporal slice. Experimental results show that, although a similar performance is achieved on single-object footage, the proposed method performs better on multi-object video than the state-of-the-art methods based on motion energy or acceleration.

  18. Catalytic membrane reactor for tritium extraction system from He purge

    Energy Technology Data Exchange (ETDEWEB)

    Santucci, Alessia, E-mail: alessia.santucci@enea.it [ENEA for EUROfusion, Via E. Fermi 45, 00044 Frascati, Roma (Italy); Incelli, Marco [ENEA for EUROfusion, Via E. Fermi 45, 00044 Frascati, Roma (Italy); DEIM, University of Tuscia, Via del Paradiso 47, 01100 Viterbo (Italy); Sansovini, Mirko; Tosti, Silvano [ENEA for EUROfusion, Via E. Fermi 45, 00044 Frascati, Roma (Italy)

    2016-11-01

    Highlights: • In the HCPB blanket, the produced tritium is recovered by purging with helium; membrane technologies are able to separate tritium from helium. • The paper presents the results of two experimental campaigns. • In the first, a Pd–Ag diffuser for hydrogen separation is tested at several operating conditions. • In the second, the ability of a Pd–Ag membrane reactor for water decontamination is assessed by performing isotopic swamping and water gas shift reactions. - Abstract: In the Helium Cooled Pebble Bed (HCPB) blanket concept, the produced tritium is recovered by purging the breeder with helium at low pressure, so a tritium extraction system (TES) is foreseen to separate the produced tritium (which contains impurities like water) from the helium purge gas. Several R&D activities are running in parallel to experimentally identify the most promising TES technologies: in particular, Pd-based membrane reactors (MR) are under investigation because of their large hydrogen selectivity, continuous operation capability, reliability and compactness. The construction and operation under DEMO-relevant conditions (which presently foresee a He purge flow rate of about 10,000 Nm³/h and a H₂/He ratio of 0.1%) of a medium-scale MR is scheduled for next year, while preliminary experiments on a small-scale reactor are presently performed to identify the most suitable operating conditions and catalyst materials. This work presents the results of an experimental campaign carried out on a Pd-based membrane aimed at measuring the capability of this device in separating hydrogen from helium. Many operating conditions have been investigated by considering different He/H₂ feed flow ratios, several lumen pressures and reactor temperatures. Moreover, the performances of a membrane reactor (composed of a Pd–Ag tube having a wall thickness of about 113 μm, length 500 mm and diameter 10 mm) in processing the water contained in the purge gas have been

  19. Extraction and Spatio-temporal Analysis System of Criminal Cases Based on Web News%基于 Web 新闻的案(事)件抽取与时空分析系统

    Institute of Scientific and Technical Information of China (English)

    吴镇城; 卢毅敏

    2016-01-01

    News reporting has been an important way to understand social dynamics since ancient times; in the big data era, the objectivity and authenticity of Web news give it considerable data value. Exploiting the richness and easy collection of criminal case information in Web news, this study developed an event extraction and spatio-temporal analysis system based on Web information. Reports on criminal cases occurring in Fuzhou are captured from news websites, the Web news items are screened and parsed, and a support vector machine is used to classify the case texts, achieving a multi-class classification accuracy of 75%. Finally, the spatio-temporal information of the cases is extracted and analyzed. Taking drug cases as an example, kernel density estimation was applied to both the extraction results and data from the public security department to estimate where crimes concentrate. The results show that drug cases in Fuzhou cluster mainly in the Chayuan and Xiangyuan police districts, which is useful to the public security department and for social dynamics analysis in Fuzhou.
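    A sketch of the kernel-density step, assuming SciPy: event coordinates extracted from the news text are turned into a smooth hot-spot surface. The coordinates, bandwidth and grid below are placeholders, not the Fuzhou data of the record.

        # Kernel density estimation of event coordinates (lon, lat).
        # Event locations and evaluation grid are synthetic placeholders.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(4)
        events = rng.normal(loc=[[119.30, 26.08]], scale=0.01, size=(200, 2))

        kde = gaussian_kde(events.T)
        lon, lat = np.meshgrid(np.linspace(119.25, 119.35, 100),
                               np.linspace(26.03, 26.13, 100))
        density = kde(np.vstack([lon.ravel(), lat.ravel()])).reshape(lon.shape)
        print("grid cell with peak density:", np.unravel_index(density.argmax(), density.shape))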

  20. Tree kernel-based protein-protein interaction extraction from biomedical literature.

    Science.gov (United States)

    Qian, Longhua; Zhou, Guodong

    2012-06-01

    There is a surge of research interest in protein-protein interaction (PPI) extraction from biomedical literature. While most of the state-of-the-art PPI extraction systems focus on dependency-based structured information, the rich structured information inherent in constituent parse trees has not been extensively explored for PPI extraction. In this paper, we propose a novel approach to tree kernel-based PPI extraction, where the tree representation generated from a constituent syntactic parser is further refined using the shortest dependency path between two proteins derived from a dependency parser. Specifically, all the constituent tree nodes associated with the nodes on the shortest dependency path are kept intact, while other nodes are removed safely to make the constituent tree concise and precise for PPI extraction. Compared with previously used constituent tree setups, our dependency-motivated constituent tree setup achieves the best results across five commonly used PPI corpora. Moreover, our tree kernel-based method outperforms other single kernel-based ones and performs comparably with some multiple kernel ones on the most commonly tested AIMed corpus.

  1. Rare earth element enrichment using membrane based solvent extraction

    Science.gov (United States)

    Makertiharta, I. G. B. N.; Dharmawijaya, P. T.; Zunita, M.; Wenten, I. G.

    2017-01-01

    The chemical, catalytic, electrical, magnetic, and optical properties of rare earth elements are required in broad applications. Rare earth elements have similar physical and chemical properties, so it is difficult to separate them from one another. Rare earth elements are relatively abundant in the earth's crust but rarely occur in highly concentrated deposits. Traditionally, ion-exchange and solvent extraction techniques have been developed to separate and purify single rare earth solutions or compounds. Recently, membranes have started to gain attention for rare earth separation by combining membranes with proven technologies such as solvent extraction. Membrane-based processes offer selective, reliable, energy-efficient and easily scalable separation. During a membrane-based separation process, one phase passes through the membrane pores while the other phase is rejected. There is no direct mixing of the two phases, so the solvent loss is very low. Membranes can also relax the requirements on solvent physical properties (viscosity, density), reduce backmixing, eliminate flooding and provide a large interfacial area for mass transfer. This paper summarizes research efforts in developing membrane technology for rare earth element separation. Special attention is given to solvent extraction related processes, as solvent extraction is the commonly used method for rare earth element separation. Furthermore, membrane configurations and their potential are also discussed.

  2. Region-Based Building Rooftop Extraction and Change Detection

    Science.gov (United States)

    Tian, J.; Metzlaff, L.; d'Angelo, P.; Reinartz, P.

    2017-09-01

    Automatic extraction of building changes is important for many applications such as disaster monitoring and city planning. Although a lot of research work is available based on 2D as well as 3D data, an improvement in accuracy and efficiency is still needed. The introduction of digital surface models (DSMs) into building change detection has strongly improved the resulting accuracy. In this paper, a post-classification approach is proposed for building change detection using satellite stereo imagery. Firstly, DSMs are generated from satellite stereo imagery and further refined by using a segmentation result obtained from the Sobel gradients of the panchromatic image. Besides the refined DSMs, the panchromatic image and the pansharpened multispectral image are used as input features for mean-shift segmentation. The DSM is used to calculate the nDSM, out of which the initial building candidate regions are extracted. The candidate mask is further refined by morphological filtering and by excluding shadow regions. Following this, all segments that overlap with a building candidate region are determined. A building-oriented segment merging procedure is introduced to generate a final building rooftop mask. As the last step, object-based change detection is performed by directly comparing the building rooftops extracted from the pre- and post-event imagery and by fusing the change indicators with the rooftop region map. A quantitative and qualitative assessment of the proposed approach is provided using WorldView-2 satellite data from Istanbul, Turkey.
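    A hedged sketch of the nDSM-based candidate step, assuming SciPy: subtract a crude ground surface from the DSM, threshold by a minimum building height, and clean the mask morphologically. The minimum-filter ground model and the 3 m threshold are illustrative simplifications of the workflow in the record.

        # Building candidate mask from a DSM via nDSM thresholding and morphology.
        # Ground model, height threshold and structuring element are illustrative.
        import numpy as np
        from scipy import ndimage

        def building_candidates(dsm, ground_window=51, min_height=3.0, open_size=3):
            dtm = ndimage.minimum_filter(dsm, size=ground_window)   # crude ground surface
            ndsm = dsm - dtm
            mask = ndsm > min_height
            return ndimage.binary_opening(mask, structure=np.ones((open_size, open_size)))

        dsm = np.zeros((200, 200))
        dsm[80:120, 80:120] = 10.0      # a 10 m block on flat ground
        mask = building_candidates(dsm)
        print("candidate pixels:", int(mask.sum()))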

  3. REGION-BASED BUILDING ROOFTOP EXTRACTION AND CHANGE DETECTION

    Directory of Open Access Journals (Sweden)

    J. Tian

    2017-09-01

    Full Text Available Automatic extraction of building changes is important for many applications such as disaster monitoring and city planning. Although a lot of research work is available based on 2D as well as 3D data, an improvement in accuracy and efficiency is still needed. The introduction of digital surface models (DSMs) into building change detection has strongly improved the resulting accuracy. In this paper, a post-classification approach is proposed for building change detection using satellite stereo imagery. Firstly, DSMs are generated from satellite stereo imagery and further refined by using a segmentation result obtained from the Sobel gradients of the panchromatic image. Besides the refined DSMs, the panchromatic image and the pansharpened multispectral image are used as input features for mean-shift segmentation. The DSM is used to calculate the nDSM, out of which the initial building candidate regions are extracted. The candidate mask is further refined by morphological filtering and by excluding shadow regions. Following this, all segments that overlap with a building candidate region are determined. A building-oriented segment merging procedure is introduced to generate a final building rooftop mask. As the last step, object-based change detection is performed by directly comparing the building rooftops extracted from the pre- and post-event imagery and by fusing the change indicators with the rooftop region map. A quantitative and qualitative assessment of the proposed approach is provided using WorldView-2 satellite data from Istanbul, Turkey.

  4. Motion feature extraction scheme for content-based video retrieval

    Science.gov (United States)

    Wu, Chuan; He, Yuwen; Zhao, Li; Zhong, Yuzhuo

    2001-12-01

    This paper proposes a scheme for extracting global motion and object trajectory in a video shot for content-based video retrieval. Motion is the key feature representing the temporal information of videos, and it is more objective and consistent compared with other features such as color and texture. Efficient motion feature extraction is an important step for content-based video retrieval. Some approaches have been taken to extract camera motion and motion activity in video sequences; when dealing with object tracking, algorithms are usually proposed on the basis of a known object region in the frames. In this paper, a complete picture of the motion information in the video shot is obtained by analyzing the motion of the background and the foreground respectively and automatically. A 6-parameter affine model is utilized as the motion model of the background, and a fast and robust global motion estimation algorithm is developed to estimate the parameters of the motion model. The object region is obtained by means of global motion compensation between two consecutive frames. The center of the object region is then calculated and tracked to obtain the object motion trajectory in the video sequence. Global motion and object trajectory are described with the MPEG-7 parametric motion and motion trajectory descriptors, and valid similarity measures are defined for the two descriptors. Experimental results indicate that the proposed scheme is reliable and efficient.
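    A minimal sketch of the global-motion stage, assuming OpenCV: corners are tracked between two consecutive grayscale frames and a 6-parameter affine model is fitted to the matches with RANSAC. Frame variables are placeholders, and the paper's own fast estimation algorithm is not reproduced.

        # Fit a 6-parameter affine global-motion model between two frames.
        # prev_gray / curr_gray are placeholder uint8 grayscale frames.
        import cv2
        import numpy as np

        def estimate_global_motion(prev_gray, curr_gray):
            pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                          qualityLevel=0.01, minDistance=7)
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
            good = status.ravel() == 1
            affine, _ = cv2.estimateAffine2D(pts[good], nxt[good], method=cv2.RANSAC)
            return affine   # 2 x 3 matrix holding the six affine parameters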

  5. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    Science.gov (United States)

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interactions between proteins or RNA production. This information could enrich current knowledge on drug reactions or the development of certain diseases. Nevertheless, due to its lack of explicit structure, the life science literature, one of the most important sources of this information, is difficult for computer-based systems to access. Therefore, biomedical event extraction, automatically acquiring knowledge of molecular events from research articles, has recently attracted community-wide efforts. Most approaches are based on statistical models and require large-scale annotated corpora to precisely estimate the models' parameters; however, such corpora are usually difficult to obtain in practice. Therefore, employing un-annotated data through semi-supervised learning for biomedical event extraction is a feasible solution that has attracted growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are elaborately and automatically assigned event annotations based on their distances to the sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in the sentences are used for describing the distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system and the similarity between sentences might be precisely

  6. Extract Rules by Using Rough Set and Knowledge—Based NN

    Institute of Scientific and Technical Information of China (English)

    王士同; E.Scott; 等

    1998-01-01

    In this paper, rough set theory is used to extract roughly-correct inference rules from information systems. Based on this idea, the learning algorithm ERCR is presented. In order to refine the learned roughly-correct inference rules, a knowledge-based neural network is used. The method presented here effectively combines the advantages of rough set theory and neural networks.

  7. Fourier transform profilometry based on mean envelope extraction

    Science.gov (United States)

    Zhang, Xiaoxuan; Huang, Shujun; Gao, Nan; Zhang, Zonghua

    2017-02-01

    Based on an image pre-processing algorithm, a three-dimensional (3D) object measurement method is proposed that combines time-domain and frequency-domain analysis. Firstly, the extreme points of sinusoidal fringes under the disturbance of noise are accurately extracted. Secondly, the mean envelope of the fringe is obtained through an appropriate interpolation method and then removed. Thirdly, phase information is extracted by specific filtering in the Fourier spectrum of the pre-processed fringe pattern. Finally, simulated and experimental results show that the proposed method has good accuracy and measurement range. The proposed method can obtain the 3D shape of objects with large slopes and/or discontinuous surfaces from a one-shot acquisition by using the color fringe projection technique, and will have wide applications in the field of fast measurement.
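    The envelope-removal step can be sketched with SciPy: locate the fringe extrema, interpolate upper and lower envelopes with cubic splines, and subtract their mean so the fringe is centred before Fourier analysis. The interpolation choice and the synthetic fringe are illustrative.

        # Mean-envelope removal for a 1-D fringe signal.
        # Spline interpolation and the synthetic fringe are illustrative assumptions.
        import numpy as np
        from scipy.interpolate import CubicSpline
        from scipy.signal import argrelextrema

        def remove_mean_envelope(fringe):
            x = np.arange(fringe.size)
            maxima = argrelextrema(fringe, np.greater)[0]
            minima = argrelextrema(fringe, np.less)[0]
            upper = CubicSpline(maxima, fringe[maxima])(x)
            lower = CubicSpline(minima, fringe[minima])(x)
            return fringe - 0.5 * (upper + lower)

        x = np.linspace(0, 4 * np.pi, 400)
        fringe = 50 + 0.02 * x**2 + 30 * np.cos(8 * x)   # fringe riding on a slowly varying mean
        centred = remove_mean_envelope(fringe)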

  8. Spindle extraction method for ISAR image based on Radon transform

    Science.gov (United States)

    Wei, Xia; Zheng, Sheng; Zeng, Xiangyun; Zhu, Daoyuan; Xu, Gaogui

    2015-12-01

    In this paper, a method for extracting the spindle of a target in an inverse synthetic aperture radar (ISAR) image is proposed, based on the Radon transform. Firstly, the Radon transform is used to detect all straight lines that are collinear with the line segments in the image. Then, the Sobel operator is used to detect the image contour. Finally, all intersections of each straight line with the image contour are found; the two intersections with the maximum distance between them are the two ends of the corresponding line segment, and the longest of all line segments is the spindle of the target. Using the proposed spindle extraction method, one hundred simulated ISAR images, rotated counterclockwise by 0, 10, 20, 30 and 40 degrees respectively, were used in experiments, and the detection results are closer to the real spindle of the target than those of the method based on the Hough transform.
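    A hedged sketch of the line-detection step, assuming scikit-image: the Radon transform of a toy binary image is searched for its strongest peak, whose angle gives the orientation of the dominant straight line. The synthetic image and angle grid are placeholders.

        # Dominant straight-line orientation via the Radon transform.
        # The toy image and angle grid are illustrative placeholders.
        import numpy as np
        from skimage.transform import radon

        img = np.zeros((128, 128))
        rows = np.arange(20, 108)
        img[rows, (0.7 * rows + 20).astype(int)] = 1.0   # a synthetic straight "spindle"

        theta = np.arange(180.0)
        sinogram = radon(img, theta=theta, circle=False)
        _, peak_t = np.unravel_index(sinogram.argmax(), sinogram.shape)
        print("dominant line orientation (degrees):", theta[peak_t])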

  9. Ground extraction from airborne laser data based on wavelet analysis

    Science.gov (United States)

    Xu, Liang; Yang, Yan; Jiang, Bowen; Li, Jia

    2007-11-01

    With the advantages of high resolution and accuracy, airborne laser scanning data are widely used in topographic mapping. In order to generate a DTM, measurements from object features such as buildings, vehicles and vegetation have to be classified and removed. However, the automatic extraction of bare earth from point clouds acquired by airborne laser scanning equipment remains a problem in LIDAR data filtering nowadays. In this paper, a filter algorithm based on wavelet analysis is proposed. Relying on the capability of detecting discontinuities of continuous wavelet transform and the feature of multi-resolution analysis, the object points can be removed, while ground data are preserved. In order to evaluate the performance of this approach, we applied it to the data set used in the ISPRS filter test in 2003. 15 samples have been tested by the proposed approach. Results showed that it filtered most of the objects like vegetation and buildings, and extracted a well defined ground model.

  10. Modification of evidence theory based on feature extraction

    Institute of Scientific and Technical Information of China (English)

    DU Feng; SHI Wen-kang; DENG Yong

    2005-01-01

    Although evidence theory has been widely used in information fusion because of its effectiveness in uncertainty reasoning, the classical DS evidence theory exhibits counter-intuitive behaviors when highly conflicting information exists. Many modification methods have been developed, which can be classified into two kinds of ideas: modifying the combination rules or modifying the evidence sources. In order to make the modification more reasonable and more effective, this paper first gives a thorough analysis of some typical existing modification methods and then extracts the intrinsic features of the evidence sources by using evidence distance theory. Based on the extracted features, two modified schemes of evidence theory, corresponding to the two modification ideas, are proposed. The results of numerical examples demonstrate the good performance of the schemes when combining evidence sources with highly conflicting information.
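    For reference, the classical Dempster combination that these modification methods start from can be written in a few lines; the two mass functions below are illustrative numbers over a two-hypothesis frame.

        # Dempster's rule of combination for two mass functions given as
        # {frozenset: mass} dictionaries.  The masses are illustrative only.
        from itertools import product

        def dempster_combine(m1, m2):
            combined, conflict = {}, 0.0
            for (a, wa), (b, wb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb
            return {k: v / (1.0 - conflict) for k, v in combined.items()}, conflict

        m1 = {frozenset({"A"}): 0.6, frozenset({"A", "B"}): 0.4}
        m2 = {frozenset({"B"}): 0.7, frozenset({"A", "B"}): 0.3}
        print(dempster_combine(m1, m2))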

  11. Features Extraction for Object Detection Based on Interest Point

    Directory of Open Access Journals (Sweden)

    Amin Mohamed Ahsan

    2013-05-01

    Full Text Available In computer vision, object detection is an essential process for further processes such as object tracking and analysis. In the same context, extracted features play an important role in detecting the object correctly. In this paper we present a method to extract local features based on interest points, which are used to detect key points within an image; then a histogram of gradients (HOG) is computed for the region surrounding each point. The proposed method uses the speeded-up robust features (SURF) method as the interest point detector and excludes its descriptor. The new descriptor is computed using the HOG method, so the proposed method gains the advantages of both mentioned methods. To evaluate the proposed method, we used the well-known Caltech101 dataset. The initial results are encouraging in spite of using a small amount of data for training.

  12. Ontology-Based Information Extraction for Business Intelligence

    Science.gov (United States)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  13. Remote Sensing Image Feature Extracting Based Multiple Ant Colonies Cooperation

    Directory of Open Access Journals (Sweden)

    Zhang Zhi-long

    2014-02-01

    Full Text Available This paper presents a novel feature extraction method for remote sensing imagery based on the cooperation of multiple ant colonies. First, multiresolution expression of the input remote sensing imagery is created, and two different ant colonies are spread on different resolution images. The ant colony in the low-resolution image uses phase congruency as the inspiration information, whereas that in the high-resolution image uses gradient magnitude. The two ant colonies cooperate to detect features in the image by sharing the same pheromone matrix. Finally, the image features are extracted on the basis of the pheromone matrix threshold. Because a substantial amount of information in the input image is used as inspiration information of the ant colonies, the proposed method shows higher intelligence and acquires more complete and meaningful image features than those of other simple edge detectors.

  14. Location Fingerprint Extraction for Magnetic Field Magnitude Based Indoor Positioning

    Directory of Open Access Journals (Sweden)

    Wenhua Shao

    2016-01-01

    Full Text Available Smartphone-based indoor positioning has greatly helped people in finding their positions in complex and unfamiliar buildings. One popular positioning method utilizes the indoor magnetic field, because this feature is stable and infrastructure-free. In this method, the magnetometer embedded in the smartphone measures the indoor magnetic field and queries its position. However, the operating environment of the magnetometer is rather harsh, mainly because of coarse-grained hard/soft-iron calibrations and sensor electronic noise. These two kinds of interference decrease the position distinguishability of the magnetic field. Therefore, it is important to extract location features from magnetic fields so as to reduce these interferences. This paper analyzes the main interference sources of the magnetometer embedded in the smartphone. In addition, we present a feature distinguishability measurement technique to evaluate the performance of different feature extraction methods. Experiments revealed that the selected fingerprints improve position distinguishability.

  15. Work extraction and thermodynamics for individual quantum systems

    Science.gov (United States)

    Skrzypczyk, Paul; Short, Anthony J.; Popescu, Sandu

    2014-06-01

    Thermodynamics is traditionally concerned with systems comprised of a large number of particles. Here we present a framework for extending thermodynamics to individual quantum systems, including explicitly a thermal bath and work-storage device (essentially a ‘weight’ that can be raised or lowered). We prove that the second law of thermodynamics holds in our framework, and give a simple protocol to extract the optimal amount of work from the system, equal to its change in free energy. Our results apply to any quantum system in an arbitrary initial state, in particular including non-equilibrium situations. The optimal protocol is essentially reversible, similar to classical Carnot cycles, and indeed, we show that it can be used to construct a quantum Carnot engine.
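
    The "optimal work equals the change in free energy" statement quoted above can be written out in the standard form below; the notation (state rho, Hamiltonian H, thermal state tau_beta) is ours, not necessarily the paper's.

```latex
% Standard free-energy form of the optimal extractable work, consistent with the
% abstract: \rho is the initial state, H its Hamiltonian, and
% \tau_\beta = e^{-\beta H}/\mathrm{Tr}\,e^{-\beta H} the thermal state at bath temperature T.
\[
  W_{\max} \;=\; F(\rho) - F(\tau_\beta),
  \qquad
  F(\rho) \;=\; \mathrm{Tr}(\rho H) - k_B T\, S(\rho),
  \qquad
  S(\rho) \;=\; -\mathrm{Tr}(\rho \ln \rho).
\]
```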

  16. Performance of a demand controlled mechanical extract ventilation system for dwellings

    Directory of Open Access Journals (Sweden)

    I. Pollet

    2013-10-01

    Full Text Available The main aim of ventilation is to guarantee good indoor air quality in relation to the energy consumed for heating and for the fan(s). Active or passive heat recovery systems tend to reduce heating consumption at the expense of fan electricity consumption and maintenance. In this study, demand-controlled mechanical extract ventilation systems from Renson (DCV1 and DCV2), based on natural supply in the habitable rooms and mechanical extraction in the wet rooms (or even the bedrooms), were analysed for one year by means of multi-zone Contam simulations on a reference detached house and compared with standard MEV and mechanical extract ventilation systems with heat recovery (MVHR). To this end, the IAQ, total energy consumption, CO2 emissions and total cost of the systems were determined. The results show that DCV systems with increased supply air flow rates or direct mechanical extraction from the bedrooms can significantly improve IAQ while reducing total energy consumption compared with MEV. Applying DCV reduces primary heating energy consumption and yearly fan electricity consumption by at most 65% and 50%, respectively, compared with MEV. Total operational energy costs and CO2 emissions of DCV are similar to those of MVHR. Total costs of DCV systems over 15 years are lower than those of MVHR owing to lower investment and maintenance costs.

  17. Computerized lung nodule detection using 3D feature extraction and learning based algorithms.

    Science.gov (United States)

    Ozekes, Serhat; Osman, Onur

    2010-04-01

    In this paper, a Computer Aided Detection (CAD) system based on three-dimensional (3D) feature extraction is introduced to detect lung nodules. First, an eight-directional search was applied in order to extract regions of interest (ROIs). Then, 3D feature extraction was performed, which includes 3D connected component labeling, straightness calculation, thickness calculation, determination of the middle slice, vertical and horizontal width calculation, regularity calculation, and calculation of vertical and horizontal black pixel ratios. To make a decision for each ROI, feed-forward neural networks (NN), support vector machines (SVM), naive Bayes (NB) and logistic regression (LR) methods were used. These methods were trained and tested via k-fold cross validation, and the results were compared. To test the performance of the proposed system, 11 cases taken from the Lung Image Database Consortium (LIDC) dataset were used. ROC curves were given for all methods, and 100% detection sensitivity was reached by all of them except naive Bayes.
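
    The decision stage described above (four classifier families compared with k-fold cross-validation) can be sketched as follows. The feature matrix and labels are synthetic placeholders, not the LIDC features or the paper's ROC protocol.

```python
# Sketch of the classifier comparison step with stratified k-fold cross-validation.
import numpy as np
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 9))        # 9 hypothetical 3D features per ROI
y = rng.integers(0, 2, size=200)     # 1 = nodule, 0 = non-nodule (synthetic stand-in)

models = {
    "NN":  MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000),
    "SVM": SVC(kernel="rbf"),
    "NB":  GaussianNB(),
    "LR":  LogisticRegression(max_iter=1000),
}
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    print(f"{name}: AUC = {scores.mean():.3f} +/- {scores.std():.3f}")
```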

  18. Nanofiltration, bipolar electrodialysis and reactive extraction hybrid system for separation of fumaric acid from fermentation broth.

    Science.gov (United States)

    Prochaska, Krystyna; Staszak, Katarzyna; Woźniak-Budych, Marta Joanna; Regel-Rosocka, Magdalena; Adamczak, Michalina; Wiśniewski, Maciej; Staniewski, Jacek

    2014-09-01

    A novel approach based on a hybrid system combining nanofiltration, bipolar electrodialysis and reactive extraction was proposed to remove fumaric acid from the fermentation broth left after bioconversion of glycerol. Fumarate salts can be concentrated by nanofiltration with high yield (80-95%, depending on pressure), fumaric acid can be selectively separated from the other fermentation components, and sodium fumarate can be converted into the acid form by bipolar electrodialysis (the stack consists of bipolar and anion-exchange membranes). Reactive extraction with quaternary ammonium chloride (Aliquat 336) or alkylphosphine oxides (Cyanex 923) solutions (yields between 60% and 98%) was applied as the final step for fumaric acid recovery from the aqueous streams leaving the membrane techniques. The hybrid system combining nanofiltration, bipolar electrodialysis and reactive extraction was found effective for recovery of fumaric acid from the fermentation broth. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications.

    Science.gov (United States)

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik; Rand, Kasper D; Honoré Hansen, Steen; Petersen, Nickolaj Jacob

    2016-07-05

    The current work describes the implementation of electro membrane extraction (EME) into an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to an HTC PAL autosampler syringe. As the autosampler drew sample solution, analytes were extracted into the lumen of the extraction probe and transferred to an LC-MS system for further analysis. Various parameters affecting extraction efficacy were investigated, including syringe fill strokes, syringe pull-up volume, pull-up delay and volume in the sample vial. The system was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME syringe with acidic wash buffer and reverting the applied electric potential, carry-over between samples can be reduced to below 1%. The performance of the system was characterized in terms of repeatability (RSD), and owing to the high extraction speed of EME, a complete analytical workflow of purification, separation, and analysis could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME autosampler a powerful tool for a wide range of applications where high-throughput extractions are required before sample analysis.

  20. A hybrid system for emotion extraction from suicide notes.

    Science.gov (United States)

    Nikfarjam, Azadeh; Emadzadeh, Ehsan; Gonzalez, Graciela

    2012-01-01

    The reasons that drive someone to commit suicide are complex, and their study has attracted the attention of scientists in different domains. Analyzing this phenomenon could significantly improve preventive efforts. In this paper we present a method for sentiment analysis of suicide notes submitted to the i2b2/VA/Cincinnati Shared Task 2011, in which the sentences of 900 suicide notes were labeled with the possible emotions that they reflect. In order to label the sentences with emotions, we propose a hybrid approach which utilizes both rule-based and machine learning techniques. To solve the multi-class problem, a rule-based engine and an SVM model are used for each category. A set of syntactic and semantic features is selected for each sentence to build the rules and train the classifier. The rules are generated manually based on a set of lexical and emotional clues. We propose a new approach to extract each sentence's clauses and constitutive grammatical elements and to use them in syntactic and semantic feature generation. The method utilizes a novel way of measuring the polarity of the sentence based on the extracted grammatical elements, reaching a precision of 41.79 with a recall of 55.03, for an f-measure of 47.50. The overall mean f-measure of all submissions was 48.75% with a standard deviation of 7%.
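
    As a quick consistency check, the reported f-measure is simply the harmonic mean of the quoted precision and recall:

```latex
\[
  F_1 \;=\; \frac{2PR}{P+R}
      \;=\; \frac{2 \times 41.79 \times 55.03}{41.79 + 55.03}
      \;\approx\; 47.5 .
\]
```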

  1. Investigation on the extraction of strontium ions from aqueous phase using crown ether-ionic liquid systems

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The extraction of strontium ions using DCH18C6 as the extractant and various ionic liquids (ILs) as solvents has been investigated. The distribution ratio of Sr2+ can reach as high as 10^3 under certain conditions, much larger than that in the DCH18C6/n-octanol system. The extraction capacity depends greatly on the structure of the ionic liquid. In IL-based extraction systems, the extraction efficiency of strontium ions is reduced by increasing the concentration of nitric acid and can also be influenced directly by the presence of Na+ and K+ in the aqueous phase. It is confirmed that the extraction proceeds mainly via a cation-exchange mechanism.

  2. Investigation on the extraction of strontium ions from aqueous phase using crown ether-ionic liquid systems

    Institute of Scientific and Technical Information of China (English)

    XU Chao; SHEN XingHai; CHEN QingDe; GAO HongCheng

    2009-01-01

    The extraction of strontium ions using DCH18C6 as the extractant and various ionic liquids (ILs) as solvents has been investigated. The distribution ratio of Sr2+ can reach as high as 10^3 under certain conditions, much larger than that in the DCH18C6/n-octanol system. The extraction capacity depends greatly on the structure of the ionic liquid. In IL-based extraction systems, the extraction efficiency of strontium ions is reduced by increasing the concentration of nitric acid and can also be influenced directly by the presence of Na+ and K+ in the aqueous phase. It is confirmed that the extraction proceeds mainly via a cation-exchange mechanism.

  3. Development of blood extraction system designed by female mosquito's blood sampling mechanism for bio-MEMS

    Science.gov (United States)

    Tsuchiya, Kazuyoshi; Nakanishi, Naoyuki; Nakamachi, Eiji

    2005-02-01

    A compact and wearable wristwatch-type bio-MEMS, a health monitoring system (HMS) to detect the blood sugar level of diabetic patients, was newly developed. The HMS consists of (1) an indentation unit with a microneedle that generates the skin penetration force using a shape memory alloy (SMA) actuator, (2) a pumping unit using a bimorph PZT piezoelectric actuator to extract the blood, and (3) a gold (Au) electrode biosensor with immobilized GOx, attached to the gate electrode of a MOSFET to detect the amount of glucose in the extracted blood. GOx was immobilized on a self-assembled spacer combined with the Au electrode by the cross-linking method using BSA as an additional bonding material. The device can extract a few microliters of blood through a painless microneedle using the negative pressure produced in the blood chamber by deflection of the bimorph PZT piezoelectric actuator, in much the same way that the female mosquito extracts human blood by flexing and relaxing its muscles. The liquid sampling ability of the pumping unit through a microneedle (3.8 mm length, 100 μm internal diameter) driven by the bimorph PZT piezoelectric microactuator was measured. The blood extraction microdevice could extract human blood at a rate of 2 μl/min, a sufficient volume for measuring the glucose level compared with the amount required by commercial glucose monitors. The electrode embedded in the chamber of the blood extraction device could detect the electrons generated by the hydrolysis of hydrogen peroxide produced by the reaction between GOx and glucose in a few microliters of extracted blood, using the constant-current measurement system of the MOSFET-type hybrid biosensor. The output voltage for the glucose diluted in the chamber increased linearly with the glucose concentration.

  4. Ionic liquids for two-phase systems and their application for purification, extraction and biocatalysis.

    Science.gov (United States)

    Oppermann, Sebastian; Stein, Florian; Kragl, Udo

    2011-02-01

    The development of biotechnological processes using novel two-phase systems based on molten salts known as ionic liquids (ILs) has come into the focus of interest. Many new approaches for the beneficial application of these interesting solvents have been published over the last years. ILs offer beneficial properties compared to organic solvents, such as nonflammability and nonvolatility. There are two possible ways to use ILs: first, the hydrophobic ones as a substitute for organic solvents in pure two-phase systems with water, and second, the hydrophilic ones in aqueous two-phase systems (ATPS). To effectively utilise IL-based two-phase systems or IL-based ATPS in biotechnology, extensive experimental work is required to find the optimal system parameters that ensure selective extraction of the product of interest. This review focuses on the most recent findings on the basic driving forces for target extraction in IL-based ATPS and presents some selected examples of the beneficial application of ILs as a substitute for organic solvents. Besides the research focusing on IL-based two-phase systems, the "green aspect" of ILs, due to their negligible vapour pressure, is widely discussed. We present the newest results concerning the ecotoxicity of ILs to give an overview of the state of the art concerning ILs and their utilisation in novel two-phase systems in biotechnology.

  5. Feature Extraction for Facial Expression Recognition based on Hybrid Face Regions

    Directory of Open Access Journals (Sweden)

    LAJEVARDI, S.M.

    2009-10-01

    Full Text Available Facial expression recognition has numerous applications, including psychological research, improved human-computer interaction, and sign language translation. A novel facial expression recognition system based on hybrid face regions (HFR) is investigated. The expression recognition system is fully automatic and consists of the following modules: face detection, facial region detection, feature extraction, optimal feature selection, and classification. The features are extracted from both the whole face image and face regions (eyes and mouth) using log-Gabor filters. Then, the most discriminative features are selected based on a mutual information criterion. The system can automatically recognize six expressions: anger, disgust, fear, happiness, sadness and surprise. The selected features are classified using the Naive Bayesian (NB) classifier. The proposed method has been extensively assessed using the Cohn-Kanade and JAFFE databases. The experiments highlight the efficiency of the proposed HFR method in enhancing the classification rate.
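
    The selection and classification stages can be sketched with a standard pipeline: rank features by mutual information with the expression label, keep the top k, and classify with Naive Bayes. The arrays below are synthetic stand-ins, and the value k=64 is an assumption, not the paper's setting.

```python
# Sketch: mutual-information feature selection followed by a Naive Bayes classifier.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 512))       # log-Gabor magnitudes (whole face + eye/mouth regions)
y = rng.integers(0, 6, size=300)      # six expression classes

clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=64),   # keep the most informative features
    GaussianNB(),
)
print(cross_val_score(clf, X, y, cv=5).mean())
```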

  6. ILs-based microwave-assisted extraction coupled with aqueous two-phase for the extraction of useful compounds from Chinese medicine.

    Science.gov (United States)

    Lin, Xiao; Wang, Yuzhi; Liu, Xiaojie; Huang, Songyun; Zeng, Qun

    2012-09-07

    Ionic liquid-based microwave-assisted extraction (ILs-MAE) of medicinal or useful compounds from plants was investigated as an alternative to conventional organic solvent extractions. Extraction was integrated with preconcentration in an aqueous two-phase (ATP) system. Various operating parameters were systematically studied by single-factor and L9(3^4) orthogonal array experiments. 1-Butyl-3-methylimidazolium tetrafluoroborate ([bmim][BF4]) was selected to extract Apocynum venetum. The extract was then transferred to the top phase of the [bmim][BF4]/NaH2PO4 system, which was suitable for preconcentration. Reversed-phase high performance liquid chromatography (RP-HPLC) with ultraviolet detection was employed for the analysis of hyperin and isoquercitrin in Apocynum venetum. The optimized procedure provided detection limits for hyperin and isoquercitrin of 3.82 μg L^-1 and 3.00 μg L^-1 in Apocynum venetum. The recoveries of hyperin and isoquercitrin from aqueous samples of Apocynum venetum by the proposed method were 97.29% (RSD = 1.02%) and 99.40% (RSD = 1.13%), respectively. Moreover, the extraction mechanism of ILs-MAE and the microstructures and chemical structures of the herb before and after extraction were also investigated. The method shows potential applicability to other complicated samples.

  7. Leaf Vein Extraction Based on Gray-scale Morphology

    Directory of Open Access Journals (Sweden)

    Xiaodong Zheng

    2010-12-01

    Full Text Available Leaf features play an important role in plant species identification and plant taxonomy. The type of leaf venation is an important morphological feature of the leaf in botany, and the leaf veins must be extracted from the leaf image before their type can be discriminated. In this paper a new method of leaf vein extraction based on gray-scale morphology is proposed. Firstly, the color image of the plant leaf is transformed into a gray image according to the hue and intensity information. Secondly, gray-scale morphology processing is applied to the image to eliminate the color overlap between the leaf vein and the background. Thirdly, a linear intensity adjustment is adopted to enlarge the gray-value difference between the leaf vein and its background. Fourthly, a threshold is calculated with the Otsu method to segment the leaf vein from its background. Finally, the leaf vein is obtained after some processing of details. Experiments have been conducted with several images, and the results show the effectiveness of the method. The idea of the method is also applicable to the extraction of other linear objects.
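
    One possible reading of this pipeline in OpenCV is sketched below; the hue/intensity weighting, the structuring-element size and the file names are assumptions rather than the authors' exact settings.

```python
# Sketch: gray conversion from hue/intensity, gray-scale morphology, linear
# contrast stretch, and Otsu thresholding of the leaf veins.
import cv2

bgr = cv2.imread("leaf.jpg")                       # hypothetical input image
if bgr is None:
    raise SystemExit("leaf.jpg not found")

hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
h, s, v = cv2.split(hsv)
gray = cv2.addWeighted(h, 0.5, v, 0.5, 0)          # gray image from hue + intensity

# Gray-scale morphology: a bottom-hat emphasises thin dark veins over the blade.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
veins = cv2.morphologyEx(gray, cv2.MORPH_BLACKHAT, kernel)

# Linear intensity adjustment to stretch the vein/background contrast.
veins = cv2.normalize(veins, None, 0, 255, cv2.NORM_MINMAX)

# Otsu threshold separates the vein pixels from the background.
_, mask = cv2.threshold(veins, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
cv2.imwrite("veins.png", mask)
```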

  8. Birch Bark Dry Extract by Supercritical Fluid Technology: Extract Characterisation and Use for Stabilisation of Semisolid Systems

    Directory of Open Access Journals (Sweden)

    Markus Armbruster

    2017-03-01

    Full Text Available Triterpene compounds like betulin, betulinic acid, erythrodiol, oleanolic acid and lupeol are known for many pharmacological effects. All these substances are found in the outer bark of birch. Apart from its pharmacological effects, birch bark extract can be used to stabilise semisolid systems. Normally, birch bark extract is produced for this purpose by extraction with organic solvents. Employing supercritical fluid technology, our aim was to develop a birch bark dry extract suitable for the stabilisation of lipophilic gels, with improved properties, while avoiding the use of toxic solvents. With supercritical carbon dioxide, three different particle formation methods from supercritical solutions were tested. First, particle deposition was performed from a supercritical solution in an expansion chamber. Second, the Rapid Expansion of Supercritical Solutions (RESS) method was used for particle generation. Third, a modified RESS procedure, forming the particles directly into the thereby gelated liquid, was developed. All three methods gave yields from 1% to 5.8%, depending on the technique employed. The triterpene composition of the three extracts was comparable: all three gave more stable oleogels than an extract obtained by organic solvent extraction. Characterizing the rheological behaviour of these gels, a faster gelling effect was seen together with a lower concentration of extract required for gel formation with the supercritical fluid (SCF) extracts. This confirms the superiority of the supercritical-fluid-produced extracts with regard to their oleogel-forming properties.

  9. Spoken Language Understanding Systems for Extracting Semantic Information from Speech

    CERN Document Server

    Tur, Gokhan

    2011-01-01

    Spoken language understanding (SLU) is an emerging field in between speech and language processing, investigating human/ machine and human/ human communication by leveraging technologies from signal processing, pattern recognition, machine learning and artificial intelligence. SLU systems are designed to extract the meaning from speech utterances and its applications are vast, from voice search in mobile devices to meeting summarization, attracting interest from both commercial and academic sectors. Both human/machine and human/human communications can benefit from the application of SLU, usin

  10. An optimal control method for maximizing the efficiency of direct drive ocean wave energy extraction system.

    Science.gov (United States)

    Chen, Zhongxian; Yu, Haitao; Wen, Cheng

    2014-01-01

    The goal of a direct drive ocean wave energy extraction system is to convert ocean wave energy into electricity. The problem explored in this paper is the design and optimal control of the direct drive ocean wave energy extraction system. An optimal control method based on internal model proportional-integral-derivative (IM-PID) control is proposed in this paper, although most ocean wave energy extraction systems are optimized through their structure, weight, and material. With this control method, the heave speed of the outer heavy buoy of the energy extraction system is brought into resonance with the incident wave, and the system efficiency is largely improved. The validity of the proposed optimal control method is verified in both regular and irregular ocean waves, and it is shown that the IM-PID control method is optimal in that it maximizes the energy conversion efficiency. In addition, the anti-interference ability of the IM-PID control method has been assessed, and the results show that it has good robustness, high precision, and strong anti-interference ability.
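
    For illustration only, the sketch below shows a generic discrete PID loop driving a toy one-degree-of-freedom buoy's heave velocity toward a reference in phase with the incident wave, which is the resonance condition described above. It is not the authors' IM-PID design; the gains, time step, buoy mass and damping are all assumed values.

```python
# Generic PID velocity-tracking sketch for a toy buoy model (illustrative only).
import math

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, reference, measurement):
        error = reference - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

dt = 0.01
pid = PID(kp=800.0, ki=50.0, kd=5.0, dt=dt)          # gains are illustrative
velocity = 0.0                                       # buoy heave velocity (m/s)
for k in range(1000):
    wave_velocity = 0.5 * math.sin(2 * math.pi * 0.1 * k * dt)   # incident wave (assumed)
    force = pid.step(wave_velocity, velocity)        # generator force command
    velocity += (force - 200.0 * velocity) * dt / 1000.0         # toy 1-DOF buoy dynamics
print(velocity)
```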

  11. Multispectral Image Road Extraction Based Upon Automated Map Conflation

    Science.gov (United States)

    Chen, Bin

    Road network extraction from remotely sensed imagery enables many important and diverse applications such as vehicle tracking, drone navigation, and intelligent transportation studies. There are, however, a number of challenges to road detection from an image. Road pavement material, width, direction, and topology vary across a scene. Complete or partial occlusions caused by nearby buildings, trees, and the shadows cast by them, make maintaining road connectivity difficult. The problems posed by occlusions are exacerbated with the increasing use of oblique imagery from aerial and satellite platforms. Further, common objects such as rooftops and parking lots are made of materials similar or identical to road pavements. This problem of common materials is a classic case of a single land cover material existing for different land use scenarios. This work addresses these problems in road extraction from geo-referenced imagery by leveraging the OpenStreetMap digital road map to guide image-based road extraction. The crowd-sourced cartography has the advantages of worldwide coverage that is constantly updated. The derived road vectors follow only roads and so can serve to guide image-based road extraction with minimal confusion from occlusions and changes in road material. On the other hand, the vector road map has no information on road widths and misalignments between the vector map and the geo-referenced image are small but nonsystematic. Properly correcting misalignment between two geospatial datasets, also known as map conflation, is an essential step. A generic framework requiring minimal human intervention is described for multispectral image road extraction and automatic road map conflation. The approach relies on the road feature generation of a binary mask and a corresponding curvilinear image. A method for generating the binary road mask from the image by applying a spectral measure is presented. The spectral measure, called anisotropy-tunable distance (ATD

  12. Procedure for extraction of disparate data from maps into computerized data bases

    Science.gov (United States)

    Junkin, B. G.

    1979-01-01

    A procedure is presented for extracting disparate sources of data from geographic maps and for the conversion of these data into a suitable format for processing on a computer-oriented information system. Several graphic digitizing considerations are included and related to the NASA Earth Resources Laboratory's Digitizer System. Current operating procedures for the Digitizer System are given in a simplified and logical manner. The report serves as a guide to those organizations interested in converting map-based data by using a comparable map digitizing system.

  13. Contour extraction of echocardiographic images based on pre-processing

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, Zinah Rajab; Rahmat, Rahmita Wirza; Abdullah, Lili Nurliyana [Department of Multimedia, Faculty of Computer Science and Information Technology, Department of Computer and Communication Systems Engineering, Faculty of Engineering University Putra Malaysia 43400 Serdang, Selangor (Malaysia); Zamrin, D M [Department of Surgery, Faculty of Medicine, National University of Malaysia, 56000 Cheras, Kuala Lumpur (Malaysia); Saripan, M Iqbal

    2011-02-15

    In this work we present a technique to extract heart contours from noisy echocardiograph images. Our technique is based on improving the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To achieve this, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiograph images; after applying these techniques we obtain a legible detection of the heart boundaries and valve movement with traditional edge detection methods.
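
    A minimal sketch of such a pre-processing chain followed by a traditional edge detector is given below; the specific filters, kernel sizes, Canny thresholds and file names are assumptions, not the authors' parameters.

```python
# Sketch: speckle filtering, morphological smoothing, contrast adjustment, Canny edges.
import cv2

img = cv2.imread("echo_frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frame
if img is None:
    raise SystemExit("echo_frame.png not found")

filtered = cv2.medianBlur(img, 5)                               # suppress speckle noise
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
smoothed = cv2.morphologyEx(filtered, cv2.MORPH_CLOSE, kernel)  # close small gaps
equalized = cv2.createCLAHE(clipLimit=2.0).apply(smoothed)      # boost low contrast
edges = cv2.Canny(equalized, 40, 120)                           # heart boundary candidates
cv2.imwrite("contours.png", edges)
```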

  14. Phase shift extraction algorithm based on Euclidean matrix norm.

    Science.gov (United States)

    Deng, Jian; Wang, Hankun; Zhang, Desi; Zhong, Liyun; Fan, Jinping; Lu, Xiaoxu

    2013-05-01

    In this Letter, the character of the Euclidean matrix norm (EMN) of the intensity difference between phase-shifting interferograms, which varies sinusoidally with the phase shift, is presented. Based on this character, an EMN phase-shift extraction algorithm is proposed. Both the simulation calculation and experimental research show that phase shifts can be determined easily and with high precision using the proposed EMN algorithm. Importantly, the proposed EMN algorithm supplies a powerful tool for the rapid calibration of phase shifts.
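
    One way to see the sinusoidal character claimed above is the standard two-beam interference model, written here in our own notation (the paper's exact derivation may differ):

```latex
% Assume I(x;\delta) = a(x) + b(x)\cos(\varphi(x) + \delta) for a frame with phase shift \delta.
\[
  I(x;\delta) - I(x;0)
    = -2\, b(x)\,\sin\!\Big(\varphi(x) + \tfrac{\delta}{2}\Big)\,\sin\tfrac{\delta}{2},
\]
\[
  \mathrm{EMN}(\delta) = \big\lVert I(\cdot;\delta) - I(\cdot;0) \big\rVert_F
    \;\approx\; \sqrt{2}\,\lVert b \rVert_F\,\Big|\sin\tfrac{\delta}{2}\Big| ,
\]
% where the approximation assumes the fringe phase \varphi is close to uniformly
% distributed over the frame, so the unknown shift \delta can be read off the norm.
```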

  15. Automatic Foreground Extraction Based on Difference of Gaussian

    Directory of Open Access Journals (Sweden)

    Yubo Yuan

    2014-01-01

    Full Text Available A novel algorithm for automatic foreground extraction based on the difference of Gaussian (DoG) is presented. In our algorithm, DoG is employed to find the candidate keypoints of an input image in different color layers. Then, a keypoint filtering algorithm is proposed to obtain the final keypoints by removing pseudo-keypoints and rebuilding the important keypoints. Finally, Normalized cut (Ncut) is used to segment the image into several regions, and the foreground is located using the number of keypoints in each region. Experiments on the given image data set demonstrate the effectiveness of our algorithm.
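
    The keypoint-candidate stage can be sketched with a plain difference of Gaussians; the two smoothing scales and the percentile threshold below are assumptions, and the keypoint filtering and Ncut stages are not reproduced here.

```python
# Minimal difference-of-Gaussian sketch for finding candidate keypoints.
import numpy as np
from scipy import ndimage

def dog_candidates(gray, sigma1=1.0, sigma2=2.0, percentile=99.0):
    """Return (row, col) coordinates whose |DoG| response is in the top percentile."""
    dog = ndimage.gaussian_filter(gray, sigma1) - ndimage.gaussian_filter(gray, sigma2)
    threshold = np.percentile(np.abs(dog), percentile)
    rows, cols = np.nonzero(np.abs(dog) >= threshold)
    return np.column_stack([rows, cols])

gray = np.random.default_rng(2).random((128, 128))   # stand-in for one color layer
print(len(dog_candidates(gray)))
```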

  16. Probabilistic contour extraction based on shape prior model

    Institute of Scientific and Technical Information of China (English)

    FAN Xin; LIANG De-qun

    2005-01-01

    A statistical shape prior model is employed to construct the dynamics in probabilistic contour estimation. By applying principal component analysis, plausible shape samples are efficiently generated to predict contour samples. Based on the shape-dependent dynamics and a probabilistic image model, a particle filter is used to estimate the contour with a specific shape. Compared with the deterministic approach with shape information, the proposed method is simple yet more effective in extracting contours from images with shape variations and occlusion.

  17. A window-based time series feature extraction method.

    Science.gov (United States)

    Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife

    2017-08-09

    This study proposes a robust similarity-score-based time series feature extraction method, termed Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and involves significantly lower computational complexity, rendering it useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and to three precordial leads from a publicly available electrocardiogram (ECG) dataset. WTC is then compared, in terms of predictive accuracy and computational complexity, with the shapelet transform and the fast shapelet transform (an accelerated variant of the shapelet transform). The results indicate that WTC achieves slightly higher classification performance with significantly lower execution time than its shapelet-based alternatives. With respect to its interpretable features, WTC has the potential to enable medical experts to explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. VISUAL ATTENTION BASED KEYFRAMES EXTRACTION AND VIDEO SUMMARIZATION

    Directory of Open Access Journals (Sweden)

    P.Geetha

    2012-05-01

    Full Text Available Recent developments in digital video and the drastic increase in internet use have increased the number of people searching for and watching videos online. To make the search for videos easy, a summary may be provided along with each video. The video summary should be effective, so that the user learns the content of the video without having to watch it fully. The summary should consist of key frames that effectively express the content and context of the video. This work suggests a method to extract key frames that express most of the information in the video. This is achieved by quantifying the visual attention each frame commands. The visual attention of each frame is quantified using a descriptor called the attention quantifier. This quantification of visual attention is based on the human attention mechanism, which indicates that color conspicuousness and motion attract more attention. So, based on the color conspicuousness and the motion involved, each frame is given an attention parameter. Based on the attention quantifier value, the key frames are extracted and summarized adaptively. This framework thus provides a method for producing a meaningful video summary.

  19. Dynamic intensity control system with RF-knockout slow-extraction in the HIMAC synchrotron

    Science.gov (United States)

    Sato, Shinji; Furukawa, Takuji; Noda, Koji

    2007-05-01

    We have developed a dynamic beam intensity control system intended for use in three-dimensional pencil-beam scanning irradiation. In this system, which controls the spill structure and intensity of the beams extracted from the synchrotron, the amplitude of the RF-knockout is controlled with a 10 kHz response. Its amplitude modulation function is generated based on an analytical one-dimensional model of RF-knockout slow extraction. Using proportional and integral feedback control, this system allows us to control the beam current dynamically in response to the request, as was experimentally verified in the HIMAC synchrotron. In this paper, we describe the system for controlling the amplitude modulation through the inclusion of feedback, and we provide some experimental results.

  20. The effects of citrus extract (Citrox©) on the naturally occurring microflora and inoculated pathogens, Bacillus cereus and Salmonella enterica, in a model food system and the traditional Greek yogurt-based salad Tzatziki.

    Science.gov (United States)

    Tsiraki, Maria I; Savvaidis, Ioannis N

    2016-02-01

    The antimicrobial effect of citrus extract (at 1 mL/kg [TC1] and 2 mL/kg [TC2]) on the naturally occurring microflora and inoculated pathogens (Bacillus cereus and Salmonella enterica, at ca. 6 log cfu/g) in the traditional Greek yogurt-based salad Tzatziki during storage under vacuum at 4 or 10 °C was examined. We also examined the effect of citrus extract (Citrox(©)) against the two aforementioned pathogens in tryptic soy broth (TSB). Of the two treatments, TC2 yielded the lowest yeast counts, irrespective of temperature, resulting in approximately 2 (4 °C) and 3 (10 °C) log reductions on the final day of storage (70 and 30 days, respectively). Although panelists preferred the TC1-treated salad, the TC2-treated product was sensorily acceptable. Therefore, at the concentrations used, Citrox had no negative sensorial effect on the Tzatziki. During storage, the Bacillus populations in the Citrox-treated Tzatziki samples progressively decreased, showing major declines from days 12 and 28 (at 10 and 4 °C, respectively). Citrox, especially at 2 mL/kg, had a significant effect on the survival of B. cereus. S. enterica showed major declines in all untreated Tzatziki samples from day 0-70 (4 °C) and from day 0-30 (10 °C), with averages of 2.5 and 2.8 log cfu/g, respectively. The results indicate that Citrox (at 1 and 2 mL/kg) is effective, from a safety standpoint, for reducing Bacillus and Salmonella spp. in Tzatziki. In addition, 2% citrus extract also showed a higher inhibitory effect against B. cereus and S. enterica grown in TSB than 1% citrus extract. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. EXTRACTION AND SORPTION BENZOIC ACID FROM AQUEOUS SOLUTIONS OF POLYMERS BASED ON N-VINYLAMIDES

    Directory of Open Access Journals (Sweden)

    A. G. Savvina

    2015-01-01

    Full Text Available The widespread use of aromatic acids (benzoic acid, salicylic acid) as preservatives necessitates their qualitative and quantitative determination in food. An effective and common way to separate and concentrate aromatic acids is liquid extraction. Biphasic systems of water-soluble polymers based on poly-N-vinylpyrrolidone and poly-N-vinylcaprolactam satisfy the requirements of the extraction system. With sorption-based preconcentration, the metrological characteristics of the determination are improved, the requirements for sensitivity and selectivity can be met, and inexpensive and readily available analytical equipment can be used. In studying the sorption of benzoic acid, a crosslinked polymer based on N-vinylpyrrolidone, obtained by radical polymerisation of a functional monomer and a crosslinker, was used as the sorbent. In the extraction of benzoic acid, salt solutions with concentrations close to saturation were used to maximize the separation of the aqueous and polymer-rich organic phases. Regardless of the nature of the salt anion used as the salting-out agent, the sorption of aromatic acids increases with the size of the cation. In the experiment, the maximum recovery (80%) of benzoic acid was obtained in the PVP (0.2 wt%)-ammonium sulphate system. The dependence of the degree of benzoic acid extraction on the sorption time, sorbent mass and pH of the aqueous phase was studied; equilibrium in the system is established within 20 minutes. The dependence of the degree of extraction on pH indicates that the acid is extracted in its molecular form. The maximum sorption is reached at pH 3.5, and its efficiency decreases in parallel with the decrease in the amount of undissociated acid molecules in solution.

  2. User-centered evaluation of Arizona BioPathway: an information extraction, integration, and visualization system.

    Science.gov (United States)

    Quiñones, Karin D; Su, Hua; Marshall, Byron; Eggers, Shauna; Chen, Hsinchun

    2007-09-01

    Explosive growth in biomedical research has made automated information extraction, knowledge integration, and visualization increasingly important and critically needed. The Arizona BioPathway (ABP) system extracts and displays biological regulatory pathway information from the abstracts of journal articles. This study uses relations extracted from more than 200 PubMed abstracts presented in a tabular and graphical user interface with built-in search and aggregation functionality. This paper presents a task-centered assessment of the usefulness and usability of the ABP system focusing on its relation aggregation and visualization functionalities. Results suggest that our graph-based visualization is more efficient in supporting pathway analysis tasks and is perceived as more useful and easier to use as compared to a text-based literature-viewing method. Relation aggregation significantly contributes to knowledge-acquisition efficiency. Together, the graphic and tabular views in the ABP Visualizer provide a flexible and effective interface for pathway relation browsing and analysis. Our study contributes to pathway-related research and biological information extraction by assessing the value of a multiview, relation-based interface that supports user-controlled exploration of pathway information across multiple granularities.

  3. An Adequate Approach to Image Retrieval Based on Local Level Feature Extraction

    Directory of Open Access Journals (Sweden)

    Sumaira Muhammad Hayat Khan

    2010-10-01

    Full Text Available Image retrieval based on text annotation has become obsolete and is no longer interesting for scientists because of its high time complexity and low precision of results. Alternatively, the increase in the amount of digital images has generated an excessive need for an accurate and efficient retrieval system. This paper proposes a content based image retrieval technique at a local level incorporating all the rudimentary features. The image initially undergoes a segmentation process and each segment is then directed to the feature extraction process. The proposed technique is based on the image's content, which primarily includes texture, shape and color. Besides these three basic features, Fourier descriptors (FD) and edge histogram descriptors are also calculated to enhance the feature extraction process by capturing information at the boundary. Performance of the proposed method is found to be quite adequate when compared with the results of one of the best local level CBIR (Content Based Image Retrieval) techniques.

  4. Soil Vapor Extraction System Optimization, Transition, and Closure Guidance

    Energy Technology Data Exchange (ETDEWEB)

    Truex, Michael J.; Becker, Dave; Simon, Michelle A.; Oostrom, Martinus; Rice, Amy K.; Johnson, Christian D.

    2013-02-08

    Soil vapor extraction (SVE) is a prevalent remediation approach for volatile contaminants in the vadose zone. A diminishing rate of contaminant extraction over time is typically observed due to 1) diminishing contaminant mass, and/or 2) slow rates of removal for contamination in low-permeability zones. After a SVE system begins to show indications of diminishing contaminant removal rate, SVE performance needs to be evaluated to determine whether the system should be optimized, terminated, or transitioned to another technology to replace or augment SVE. This guidance specifically addresses the elements of this type of performance assessment. While not specifically presented, the approach and analyses in this guidance could also be applied at the onset of remediation selection for a site as a way to evaluate current or future impacts to groundwater from vadose zone contamination. The guidance presented here builds from existing guidance for SVE design, operation, optimization, and closure from the U.S. Environmental Protection Agency, U.S. Army Corps of Engineers, and the Air Force Center for Engineering and the Environment. The purpose of the material herein is to clarify and focus on the specific actions and decisions related to SVE optimization, transition, and/or closure.

  5. Advanced Recovery and Integrated Extraction System (ARIES). Preconceptual design report

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, T.O.; Bronson, M.C.; Dennison, D.K.; Flamm, B. [and others]

    1996-09-01

    This document describes the preliminary conceptual design of the Advanced Recovery and Integrated Extraction System (ARIES). ARIES is an overall processing system for the dismantlement of nuclear weapon primaries. The program will demonstrate dismantlement of nuclear weapons and retrieval of the plutonium into a form that is compatible with long-term storage and that is inspectable in an unclassified form appropriate for the application of traditional international safeguards. The purpose of the ARIES process is to receive weapon pits, disassemble them, and provide a product of either a plutonium metal button or plutonium oxide powder, appropriately canned to meet all requirements for long-term storage. This demonstration is a 24-month program, with full operation planned during the last three to six months to gain confidence in the system's flexibility and reliability. The ARIES system is modular in design to offer credible scaling and the ability to incorporate modifications or new concepts. This report describes the preconceptual design of each of the ARIES modules, as well as the integration of the overall system.

  6. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    Science.gov (United States)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  7. Extracting features for power system vulnerability assessment from wide-area measurements

    Energy Technology Data Exchange (ETDEWEB)

    Kamwa, I. [Hydro-Quebec, Varennes, PQ (Canada). IREQ; Pradhan, A.; Joos, G. [McGill Univ., Montreal, PQ (Canada)

    2006-07-01

    Many power systems now operate close to their stability limits as a result of deregulation. Some utilities have chosen to install phasor measurement units (PMUs) to monitor power system dynamics. The synchronized phasors from different areas of the power system, made available through a wide-area measurement system (WAMS), are expected to provide an effective security assessment tool, a stabilizing control action for inter-area oscillations, and a system protection scheme (SPS) to evade possible blackouts. This paper presented a tool for extracting features for vulnerability assessment from WAMS data. A Fourier-transform based technique was proposed for monitoring inter-area oscillations. FFT, wavelet transform and curve fitting approaches were investigated to analyze oscillatory signals. A dynamic voltage stability prediction algorithm was proposed for control action. An integrated framework was then proposed to assess a power system through features extracted from WAMS data on first-swing stability, voltage stability and inter-area oscillations. The centre of inertia (COI) concept was applied to the angle of the voltage phasor. Prony analysis was applied to filtered signals to extract the damping coefficients. The minimum post-fault voltage of an area was considered for voltage stability, and an algorithm was used to monitor voltage stability issues. A data clustering technique was applied to classify the features into groups for improved system visualization. The overall performance of the technique was examined using a 67-bus system with 38 PMUs. The method used to extract features from both frequency- and time-domain analysis was provided. The test power system was described. The results of 4 case studies indicated that adoption of the method would be beneficial for system operators. 13 refs., 2 tabs., 13 figs.
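
    The frequency-domain part of such a feature extractor can be sketched very simply: window a synchrophasor trace and pick the dominant spectral peak as the inter-area mode. The 30 samples/s reporting rate and the synthetic 0.6 Hz damped ringdown below are illustrative assumptions; the paper's full pipeline also uses wavelets and Prony analysis for damping.

```python
# Sketch: FFT-based estimate of the dominant inter-area oscillation frequency.
import numpy as np

fs = 30.0                                    # phasor reporting rate, samples/s (assumed)
t = np.arange(0, 20, 1 / fs)
signal = np.exp(-0.1 * t) * np.sin(2 * np.pi * 0.6 * t)   # damped inter-area swing

window = np.hanning(len(signal))
spectrum = np.abs(np.fft.rfft(signal * window))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(f"dominant oscillation mode ~ {dominant:.2f} Hz")
```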

  8. Response Surface Optimization of Rotenone Using Natural Alcohol-Based Deep Eutectic Solvent as Additive in the Extraction Medium Cocktail

    Directory of Open Access Journals (Sweden)

    Zetty Shafiqa Othman

    2017-01-01

    Full Text Available Rotenone is a biopesticide with a remarkable effect on aquatic life and insect pests. In Asia, it can be isolated from the roots of Derris species (Derris elliptica and Derris malaccensis). A previous study revealed that an alcohol-based deep eutectic solvent (DES) was comparable in efficiency, for extracting a high yield of rotenone (an isoflavonoid), to a binary ionic liquid solvent system ([BMIM]OTf) and an organic solvent (acetone). Therefore, this study analyzes the optimum parameters (solvent ratio, extraction time, and agitation rate) for extracting the highest yield of rotenone at a much lower cost and in a more environmentally friendly manner, using response surface methodology (RSM) based on a central composite rotatable design (CCRD). Using RSM, linear polynomial equations were obtained for predicting the concentration and yield of extracted rotenone. The verification experiment confirmed the validity of both predicted models. The results revealed that the optimum solvent ratio, extraction time, and agitation rate were 2:8 (DES:acetonitrile), 19.34 hours, and 199.32 rpm, respectively. At the optimum conditions, the rotenone extraction process using the DES binary solvent system gave a 3.5-fold increase, with a rotenone concentration of 0.49 ± 0.07 mg/ml and a yield of 0.35 ± 0.06% (w/w), compared to the control extract (acetonitrile only). In fact, the rotenone concentration and yield were significantly influenced by the binary solvent ratio and extraction time (P<0.05) but not by the agitation rate. For that reason, the optimal extraction condition using an alcohol-based deep eutectic solvent (DES) as a green additive in the extraction medium cocktail increases the potential for enhancing the extracted rotenone concentration and yield.
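
    The RSM step amounts to fitting a second-order polynomial response surface over the design factors and locating its optimum. The sketch below illustrates that step on a toy design matrix; the runs, the synthetic response and the grid search are assumptions and are not the paper's CCRD data.

```python
# Sketch: fit a quadratic response surface and locate the predicted optimum.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# columns: DES:acetonitrile ratio, extraction time (h), agitation rate (rpm)
rng = np.random.default_rng(3)
X = rng.uniform([0.1, 6.0, 100.0], [0.5, 30.0, 250.0], size=(20, 3))
y = (0.5 - 2.0 * (X[:, 0] - 0.25) ** 2 - 0.001 * (X[:, 1] - 19.0) ** 2
     + rng.normal(scale=0.02, size=20))              # synthetic rotenone response

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)

# Evaluate the fitted surface on a coarse grid to find the best settings.
grid = np.array(np.meshgrid(np.linspace(0.1, 0.5, 9),
                            np.linspace(6, 30, 9),
                            np.linspace(100, 250, 7))).reshape(3, -1).T
best = grid[np.argmax(model.predict(grid))]
print("predicted optimum (ratio, hours, rpm):", best)
```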

  9. Extracting Communities of Interests for Semantics-Based Graph Searches

    Science.gov (United States)

    Nakatsuji, Makoto; Tanaka, Akimichi; Uchiyama, Toshio; Fujimura, Ko

    Users increasingly find new interests by checking the content published or mentioned by their immediate neighbors in social networking services. We propose semantics-based link navigation: links guide the active user to potential neighbors who may provide new interests. Our method first creates a graph that has users as nodes and shared interests as links. Then it divides the graph by link pruning to extract a practical number of interest-sharing groups, i.e. communities of interests (COIs), that the active user can navigate. It then attaches a different semantic tag to the link to each representative user, which best reflects the interests of the COIs they belong to, and to the link to each immediate neighbor of the active user. Finally, it calculates link attractiveness by analyzing the semantic tags on the links. The active user can select which link to follow by checking the semantic tags and link attractiveness. User interests extracted from large-scale actual blog entries are used to confirm the efficiency of our proposal. Results show that navigation based on link attractiveness and representative users allows the user to find new interests much more accurately than is otherwise possible.

  10. Object-based Analysis for Extraction of Dominant Tree Species

    Institute of Scientific and Technical Information of China (English)

    Meiyun SHAO; Xia JING; Lu WANG

    2015-01-01

    Forests are of great significance for overall development and are a focus of sustainability planning, so it is urgent to obtain their distribution, stock volume and other related information; a forest inventory program is therefore on the schedule. Aiming at the problem of extracting dominant tree species, we tested the currently popular object-based analysis method. Based on ALOS image data, we combined multi-resolution segmentation in the eCognition software with a fuzzy classification algorithm. Through analyzing the segmentation results, we extracted the spruce, pine, birch and oak of the study area. Both spectral and spatial characteristics were derived from those objects, and with the help of the GLCM we obtained the differences between the species. We used a confusion matrix to carry out the classification accuracy assessment against the actual ground data, and the method showed a comparatively good precision of 87% with a kappa coefficient of 0.837.

  11. Data Clustering Analysis Based on Wavelet Feature Extraction

    Institute of Scientific and Technical Information of China (English)

    QIAN Yuntao; TANG Yuanyan

    2003-01-01

    A novel wavelet-based data clustering method is presented in this paper, which includes wavelet feature extraction and a cluster growing algorithm. The wavelet transform can provide rich and diversified information for representing the global and local inherent structures of a dataset; therefore, it is a very powerful tool for clustering feature extraction. As clustering is an unsupervised classification, the target of clustering analysis depends on the specific clustering criteria. Several criteria that should be considered for a general-purpose clustering algorithm are proposed, and the cluster growing algorithm is constructed to connect the clustering criteria with the wavelet features. Compared with other popular clustering methods, our clustering approach provides multi-resolution clustering results, needs few prior parameters, correctly deals with irregularly shaped clusters, and is insensitive to noise and outliers. As this wavelet-based clustering method is aimed at solving two-dimensional data clustering problems, for high-dimensional datasets the self-organizing map and U-matrix method are applied to transform them into a two-dimensional Euclidean space, so that high-dimensional data clustering analysis can also be performed. Results on some simulated data and standard test data are reported to illustrate the power of our method.
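
    A very reduced sketch of the underlying idea, wavelet features feeding a clustering step, is shown below. PyWavelets is assumed to be available, the 'db4' wavelet, three decomposition levels and k-means are our choices, and the synthetic signals are not the paper's data; the paper's own cluster growing algorithm is not reproduced.

```python
# Sketch: per-sub-band wavelet energies as clustering features.
import numpy as np
import pywt
from sklearn.cluster import KMeans

def wavelet_energy_features(series, wavelet="db4", level=3):
    coeffs = pywt.wavedec(series, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])       # one energy per sub-band

rng = np.random.default_rng(4)
data = np.vstack([np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * rng.normal(size=256)
                  for _ in range(30)] +
                 [rng.normal(size=256) for _ in range(30)])  # two synthetic classes

features = np.array([wavelet_energy_features(x) for x in data])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)
```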

  12. The use of some nanoemulsions based on aqueous propolis and lycopene extract in the skin's protective mechanisms against UVA radiation

    OpenAIRE

    Giuchici Camelia V; Butnariu Monica V

    2011-01-01

    Abstract Background The use of natural products based on aqueous extract of propolis and lycopene in the skin's protective mechanisms against UVA radiation was evaluated by means of experimental acute inflammation on rat paw edema. The aim of the present study was to evaluate the harmlessness of propolis - lycopene system through evaluation of skin level changes and anti-inflammatory action. The regenerative and protective effect of the aqueous propolis and lycopene extract is based on its ri...

  13. Liquid-phase extraction coupled with metal-organic frameworks-based dispersive solid phase extraction of herbicides in peanuts.

    Science.gov (United States)

    Li, Na; Wang, Zhibing; Zhang, Liyuan; Nian, Li; Lei, Lei; Yang, Xiao; Zhang, Hanqi; Yu, Aimin

    2014-10-01

    Liquid-phase extraction coupled with metal-organic frameworks-based dispersive solid phase extraction was developed and applied to the extraction of pesticides in high fatty matrices. The herbicides were ultrasonically extracted from peanut using ethyl acetate as extraction solvent. The separation of the analytes from a large amount of co-extractive fat was achieved by dispersive solid-phase extraction using MIL-101(Cr) as sorbent. In this step, the analytes were adsorbed on MIL-101(Cr) and the fat remained in bulk. The herbicides were separated and determined by high-performance liquid chromatography. The experimental parameters, including type and volume of extraction solvent, ultrasonication time, volume of hexane and eluting solvent, amount of MIL-101(Cr) and dispersive solid phase extraction time, were optimized. The limits of detection for herbicides range from 0.98 to 1.9 μg/kg. The recoveries of the herbicides are in the range of 89.5-102.7% and relative standard deviations are equal or lower than 7.0%. The proposed method is simple, effective and suitable for treatment of the samples containing high content of fat.

  14. Evaluation of degree of readsorption of radionuclides during sequential extraction in soil: comparison between batch and dynamic extraction systems

    DEFF Research Database (Denmark)

    Petersen, Roongrat; Hansen, Elo Harald; Hou, Xiaolin

    Sequential extraction techniques have been widely used to fractionate metals in solid samples (soils, sediments, solid wastes, etc.) according to their leachability. The results are useful for obtaining information about the bioavailability, potential mobility and transport of elements in natural environments. However, the techniques have an important problem with redistribution as a result of readsorption of dissolved analytes onto the remaining solid phases during extraction, and many authors have demonstrated this readsorption problem and the inaccuracy it causes. In our previous work, a dynamic extraction system developed in our laboratory for heavy metal fractionation has shown a reduction of the readsorption problem in comparison with the batch techniques. Moreover, the dynamic system shows many advantages over the batch system, such as speed of extraction, a simple procedure, full automation and less risk of contamination.

  15. The Data Extraction Using Distributed Crawler Inside Multi-Agent System

    Directory of Open Access Journals (Sweden)

    Karel Tomala

    2013-01-01

    Full Text Available The paper discusses the use of web crawler technology. We created an application based on a standard web crawler, designed for data extraction. Primarily, the application was designed to extract data containing given keywords from the social network Twitter. First, we created a standard crawler, which went through a predefined list of URLs and gradually downloaded the page content of each URL. The page content was then parsed, and the important text and metadata were stored in a database. Recently, the application was modified into the form of a multi-agent system. The system was developed in the C# language, which is used to create web applications, sites, etc. The obtained data were evaluated graphically. The system was created within the INDECT project at the VSB-Technical University of Ostrava.
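
    The crawl-parse-store loop described above can be sketched in a few lines; this is a language-agnostic illustration in Python rather than the C# system itself, and the URL list, keyword and table layout are purely hypothetical.

```python
# Sketch of a single crawler agent: fetch each URL, parse it, store keyword matches.
import sqlite3
import requests
from bs4 import BeautifulSoup

URLS = ["https://example.org/page1", "https://example.org/page2"]   # predefined list
KEYWORD = "extraction"

conn = sqlite3.connect("crawl.db")
conn.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT, title TEXT, text TEXT)")

for url in URLS:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue                                   # skip unreachable pages
    soup = BeautifulSoup(html, "html.parser")
    text = soup.get_text(" ", strip=True)
    if KEYWORD in text.lower():                    # keep only keyword matches
        title = soup.title.string if soup.title else ""
        conn.execute("INSERT INTO pages VALUES (?, ?, ?)", (url, title, text))

conn.commit()
conn.close()
```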

  16. Performance Analysis of Vision-Based Deep Web Data Extraction for Web Document Clustering

    OpenAIRE

    M. Lavanya; Usha Rani

    2013-01-01

    Web data extraction is a critical task, applying various scientific tools across a broad range of application domains. Extracting data from multiple web sites is becoming increasingly difficult, and the design of web information extraction systems is becoming more complex and time-consuming. We also present in this paper the various risks identified so far in web data extraction. Identifying the data region of a web page is a notable problem for information extraction from the page. In this paper, performance of visio...

  17. Evaluation of various solvent systems for lipid extraction from wet microalgal biomass and its effects on primary metabolites of lipid-extracted biomass.

    Science.gov (United States)

    Ansari, Faiz Ahmad; Gupta, Sanjay Kumar; Shriwastav, Amritanshu; Guldhe, Abhishek; Rawat, Ismail; Bux, Faizal

    2017-06-01

    Microalgae have tremendous potential to grow rapidly, synthesize, and accumulate lipids, proteins, and carbohydrates. The effects of solvent extraction of lipids on other metabolites such as proteins and carbohydrates in lipid-extracted algal (LEA) biomass are crucial aspects of the algal biorefinery approach. An effective and economically feasible algae-based oil industry will depend on the selection of suitable solvent/s for lipid extraction, which has minimal effect on metabolites in lipid-extracted algae. In the current study, six solvent systems were employed to extract lipids from dry and wet biomass of Scenedesmus obliquus. To explore the biorefinery concept, dichloromethane/methanol (2:1 v/v) was a suitable solvent for dry biomass; it gave 18.75% lipids (dry cell weight) in whole algal biomass, 32.79% proteins, and 24.73% carbohydrates in LEA biomass. In the case of wet biomass, in order to exploit all three metabolites, isopropanol/hexane (2:1 v/v) is an appropriate solvent system, which gave 7.8% lipids (dry cell weight) in whole algal biomass, 20.97% proteins, and 22.87% carbohydrates in LEA biomass. Graphical abstract: Lipid extraction from wet microalgal biomass and biorefinery approach.

  18. FEATURE EXTRACTION OF BONES AND SKIN BASED ON ULTRASONIC SCANNING

    Institute of Scientific and Technical Information of China (English)

    Zheng Shuxian; Zhao Wanhua; Lu Bingheng; Zhao Zhao

    2005-01-01

    In prosthetic socket design, CT scanning is the routine technique for obtaining cross-sectional images of the residual limb, but it suffers from high cost and radiation exposure. To address this, a new ultrasonic scanning method is developed to acquire the bone and skin contours of the residual limb. Using a pig fore-leg as the scanning object, an overlapping algorithm is designed to reconstruct the 2D cross-sectional image, the contours of the bone and skin are extracted using an edge detection algorithm, and the 3D model of the pig fore-leg is reconstructed by using reverse engineering technology. Checking the accuracy of the image by scanning a cylindrical workpiece shows that the extracted contours of the cylinder are quite close to the standard circumference, so it is feasible to obtain the contours of bones and skin by ultrasonic scanning. The ultrasonic scanning system, featuring no radiation and low cost, is a new means of cross-section scanning for medical imaging.

  19. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Directory of Open Access Journals (Sweden)

    Kim Jung-jae

    2011-10-01

    Full Text Available Abstract Background The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. Results We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Conclusions Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  20. Design of a Web Information Extraction System

    Institute of Scientific and Technical Information of China (English)

    刘斌; 张晓婧

    2013-01-01

    In order to obtain the scattered information hidden in Web pages, a Web information extraction system was designed. The system first uses a modified HITS topic-selection algorithm for information collection; it then performs data preprocessing on the HTML document structure of the Web pages; finally, an XPath generation algorithm based on the DOM tree obtains the absolute XPath expression of each annotated node, and the XPath language combined with XSLT technology is used to write extraction rules, producing a structured database or XML file and thus achieving the location and extraction of Web information. An extraction experiment on a shopping web site shows that the system works well and can batch-extract similar Web pages.
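
    To make the XPath step above concrete, here is a minimal Python sketch using lxml; the sample HTML and the XPath expressions are illustrative placeholders, not the rules generated by the described system (which derives them from annotated DOM-tree nodes).

    # Minimal sketch of XPath-based extraction from a product-style page, assuming
    # lxml is available; the markup and expressions below are illustrative only.
    from lxml import html

    SAMPLE_HTML = """
    <html><body>
      <div class="item"><span class="name">Kettle</span><span class="price">19.90</span></div>
      <div class="item"><span class="name">Toaster</span><span class="price">24.50</span></div>
    </body></html>
    """

    def extract_items(page_source):
        tree = html.fromstring(page_source)
        items = []
        # XPath expressions locate the data region and the fields inside it
        for node in tree.xpath('//div[@class="item"]'):
            name = node.xpath('./span[@class="name"]/text()')
            price = node.xpath('./span[@class="price"]/text()')
            items.append({"name": name[0] if name else None,
                          "price": price[0] if price else None})
        return items

    print(extract_items(SAMPLE_HTML))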

  1. Research on the method of extracting DEM based on GBInSAR

    Science.gov (United States)

    Yue, Jianping; Yue, Shun; Qiu, Zhiwei; Wang, Xueqin; Guo, Leping

    2016-05-01

    Precise topographic information plays a very important role in geology, hydrology, natural resource surveys and deformation monitoring. DEM extraction based on synthetic aperture radar interferometry (InSAR) obtains the three-dimensional elevation of the target area from the phase information of radar image data, and offers large-scale, high-precision, all-weather capability. By moving the ground-based radar system up and down along its track, a spatial baseline is formed, and the DEM of the target area can be obtained by acquiring image data from different angles. Three-dimensional laser scanning can quickly, efficiently and accurately obtain the DEM of a target area, which can be used to verify the accuracy of the DEM extracted by GBInSAR; however, research on extracting DEMs with GBInSAR is still limited. To address the lack of theory and the low accuracy of DEM extraction based on GBInSAR, this article analyzes its principle in depth. The DEM of the target area was extracted from GBInSAR data, compared with the DEM obtained from three-dimensional laser scan data, and subjected to statistical analysis and a normal distribution test. The results show that the DEM obtained by GBInSAR is broadly consistent with the DEM obtained by three-dimensional laser scanning, its accuracy is high, and the difference between the two DEMs approximately follows a normal distribution. This indicates that extracting the DEM of a target area based on GBInSAR is feasible and provides a foundation for the promotion and application of GBInSAR.

  2. Machine learning based sample extraction for automatic speech recognition using dialectal Assamese speech.

    Science.gov (United States)

    Agarwalla, Swapna; Sarma, Kandarpa Kumar

    2016-06-01

    Automatic Speaker Recognition (ASR) and related issues are continuously evolving as inseparable elements of Human Computer Interaction (HCI). With the assimilation of emerging concepts like big data and the Internet of Things (IoT) as extended elements of HCI, ASR techniques are passing through a paradigm shift. Of late, learning-based techniques have started to receive greater attention from research communities related to ASR, owing to the fact that they possess a natural ability to mimic biological behavior and thereby aid ASR modeling and processing. Current learning-based ASR techniques are evolving further with the incorporation of big data and IoT-like concepts. Here, in this paper, we report certain approaches based on machine learning (ML) used for the extraction of relevant samples from a big data space, and apply them to ASR using certain soft computing techniques for Assamese speech with dialectal variations. A class of ML techniques comprising the basic Artificial Neural Network (ANN) in feedforward (FF) and Deep Neural Network (DNN) forms, using raw speech, extracted features and frequency-domain forms, is considered. The Multi Layer Perceptron (MLP) is configured with inputs in several forms to learn class information obtained using clustering and manual labeling. DNNs are also used to extract specific sentence types. Initially, relevant samples are selected and assimilated from a large store. Next, a few conventional methods are used for feature extraction of a few selected types. The features comprise both spectral and prosodic types. These are applied to Recurrent Neural Network (RNN) and Fully Focused Time Delay Neural Network (FFTDNN) structures to evaluate their performance in recognizing mood, dialect, speaker and gender variations in dialectal Assamese speech. The system is tested under several background noise conditions by considering the recognition rates (obtained using confusion matrices and manually) and computation time
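
    As a small illustration of the spectral side of such feature extraction, the following Python sketch computes MFCC statistics per utterance, assuming librosa is installed; the file name is hypothetical, and the prosodic features and neural classifiers described above are not reproduced.

    # Minimal sketch of spectral feature extraction for speech samples; MFCCs
    # stand in for the spectral features mentioned above.
    import numpy as np
    import librosa

    def spectral_features(wav_path, n_mfcc=13):
        y, sr = librosa.load(wav_path, sr=None)            # keep the native sampling rate
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
        # summarize each coefficient over time so every utterance yields a fixed-length vector
        return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

    # features = spectral_features("utterance.wav")  # hypothetical file name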

  3. System and method for preparing near-surface heavy oil for extraction using microbial degradation

    Science.gov (United States)

    Busche, Frederick D.; Rollins, John B.; Noyes, Harold J.; Bush, James G.

    2011-04-12

    A system and method for enhancing the recovery of heavy oil in an oil extraction environment by feeding nutrients to a preferred microbial species (bacteria and/or fungi). A method is described that includes the steps of: sampling and identifying microbial species that reside in the oil extraction environment; collecting fluid property data from the oil extraction environment; collecting nutrient data from the oil extraction environment; identifying a preferred microbial species from the oil extraction environment that can transform the heavy oil into a lighter oil; identifying a nutrient from the oil extraction environment that promotes a proliferation of the preferred microbial species; and introducing the nutrient into the oil extraction environment.

  4. Transportation-related data bases extracted from the national index of energy and environmental data bases. Part II. Detailed data base descriptions

    Energy Technology Data Exchange (ETDEWEB)

    Birss, E.W.; Yeh, J.W.

    1976-11-15

    Lawrence Livermore Laboratory (LLL) extracted a set of 135 transportation-related data bases from a computerized national index of energy and environmental data bases. LLL had produced the national index for the Division of Biomedical and Environmental Research of the Energy Research and Development Administration (ERDA). The detailed transportation-related data base descriptions presented are part of a LLL ongoing research contract with the Information Division of the Transportation Systems Center of the U. S. Department of Transportation (DOT/TSC).

  5. A Novel Technique for Shape Feature Extraction Using Content Based Image Retrieval

    Directory of Open Access Journals (Sweden)

    Dhanoa Jaspreet Singh

    2016-01-01

    Full Text Available With the advent of technology and multimedia information, digital images are increasing very quickly. Various techniques are being developed to retrieve or search the digital information or data contained in images. Traditional text-based image retrieval is not sufficient, since it is time consuming, requires manual image annotation, and the annotation differs from person to person. An alternative is a Content Based Image Retrieval (CBIR) system, which retrieves or searches for images using their contents rather than text, keywords, etc. A lot of exploration has been carried out in the area of Content Based Image Retrieval (CBIR) with various feature extraction techniques. Shape is a significant image feature as it reflects human perception. Moreover, shape is quite simple for the user to use to define objects in an image, compared with other features such as color and texture. Over and above, no descriptor applied alone will give fruitful results; by combining it with an improved classifier, one can use the positive features of both the descriptor and the classifier. So, an attempt is made to establish an algorithm for accurate shape feature extraction in Content Based Image Retrieval (CBIR). The main objectives of this work are: (a) to propose an algorithm for shape feature extraction using CBIR, (b) to evaluate the performance of the proposed algorithm and (c) to compare the proposed algorithm with state-of-the-art techniques.
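
    As a hedged illustration of one possible shape descriptor for CBIR, the sketch below computes log-scaled Hu moments with OpenCV; this is a generic stand-in rather than the descriptor proposed in the paper, and the image paths are hypothetical.

    # Minimal sketch of shape feature extraction for CBIR using Hu moments.
    import cv2
    import numpy as np

    def shape_descriptor(image_path):
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        # Hu moments are invariant to translation, scale and rotation
        hu = cv2.HuMoments(cv2.moments(binary)).flatten()
        # log-scale the moments so they are comparable in magnitude
        return -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)

    # query = shape_descriptor("query.png")      # hypothetical image paths
    # db_vec = shape_descriptor("db_item.png")
    # distance = np.linalg.norm(query - db_vec)  # smaller distance = more similar shape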

  6. Information extraction approaches to unconventional data sources for "Injury Surveillance System": the case of newspapers clippings.

    Science.gov (United States)

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Rahim, Yousif; Gregori, Dario

    2012-04-01

    Injury Surveillance Systems based on traditional hospital records or clinical data have the advantage of being a well established, highly reliable source of information for conducting active surveillance of specific injuries, such as choking in children. However, they suffer the drawback of delays in making data available for analysis, due to inefficiencies in data collection procedures. In this sense, the integration of clinical registries with unconventional data sources like newspaper articles has the advantage of making the system more useful for early alerting. Using such sources is difficult, since information is only available in the form of free natural-language documents rather than the structured databases required by traditional data mining techniques. Information Extraction (IE) addresses the problem of transforming a corpus of textual documents into a more structured database. In this paper, on a corpus of Italian newspaper articles related to choking in children due to ingestion/inhalation of a foreign body, we compared the performance of three IE algorithms: (a) a classical rule-based system which requires manual annotation of the rules; (b) a rule-based system which allows for the automatic building of rules; and (c) a machine learning method based on Support Vector Machines. Although some useful indications are extracted from the newspaper clippings, this approach is at present far from being routinely implemented for injury surveillance purposes.
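
    A minimal sketch of the third, machine-learning route is shown below, assuming scikit-learn; the two-document corpus and labels are placeholders standing in for the Italian newspaper clippings, and no claim is made that this mirrors the authors' exact feature set.

    # Illustrative tf-idf + linear SVM pipeline for classifying clippings.
    from sklearn.pipeline import make_pipeline
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.svm import LinearSVC

    docs = ["child chokes on toy part", "city council approves new budget"]  # placeholder texts
    labels = [1, 0]  # 1 = injury-related clipping, 0 = unrelated

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
    clf.fit(docs, labels)
    print(clf.predict(["boy inhales foreign body while playing"]))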

  7. Vision-Based Faint Vibration Extraction Using Singular Value Decomposition

    Directory of Open Access Journals (Sweden)

    Xiujun Lei

    2015-01-01

    Full Text Available Vibration measurement is important for understanding the behavior of engineering structures. Unlike conventional contact-type measurements, vision-based methodologies have attracted a great deal of attention because of the advantages of remote measurement, their nonintrusive character, and the absence of added mass; they act as a new type of displacement sensor that is convenient and reliable. This study introduces singular value decomposition (SVD) methods for video image processing and presents a vibration-extraction algorithm. The algorithm can realize noncontact displacement measurements without undesirable influence on the structural behavior. The SVD-based algorithm decomposes a matrix assembled from the previous frames to obtain a set of orthonormal image bases, while the projections of all video frames onto these bases describe the vibration information. By means of simulation, the parameter selection of the SVD-based algorithm is discussed in detail. To validate the algorithm performance in practice, sinusoidal motion tests are performed, and the results indicate that the proposed technique provides fairly accurate displacement measurement. Moreover, a sound barrier experiment showing how high-speed rail trains affect a nearby sound barrier is carried out; owing to the challenging measurement environment, such a measurement is realized here for the first time.
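
    The following Python sketch illustrates the SVD idea described above on synthetic frames, assuming NumPy; it only shows how projections onto the leading image bases trace a vibration signal and omits the camera calibration and displacement scaling of the real system.

    # Minimal sketch of SVD-based vibration extraction from a frame stack.
    import numpy as np

    def vibration_signal(frames, n_bases=2):
        """frames: array of shape (n_frames, height, width)."""
        n_frames = frames.shape[0]
        X = frames.reshape(n_frames, -1).astype(float)   # one flattened frame per row
        X -= X.mean(axis=0)                               # remove the static background
        # rows of Vt are orthonormal image bases; U*S gives the per-frame projections
        U, S, Vt = np.linalg.svd(X, full_matrices=False)
        return U[:, :n_bases] * S[:n_bases]               # per-frame vibration trace

    # synthetic example: a small bright patch whose intensity oscillates at 5 Hz
    t = np.linspace(0, 1, 200)
    frames = np.zeros((200, 16, 16))
    frames[:, 8, 7] = 0.5 + 0.5 * np.sin(2 * np.pi * 5 * t)
    frames[:, 8, 8] = 0.5 - 0.5 * np.sin(2 * np.pi * 5 * t)
    print(vibration_signal(frames).shape)  # (200, 2)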

  8. Ionic Liquid-salt Aqueous Two-phase System, a Novel System for the Extraction of Abused Drugs

    Institute of Scientific and Technical Information of China (English)

    She Hong LI; Chi Yang HE; Hu Wei LIU; Ke An LI; Feng LIU

    2005-01-01

    A 1-butyl-3-methylimidazolium chloride-salt aqueous two-phase system was studied for the extraction of abused drugs. The effects of the type of salt, temperature, and the concentrations of salt and drugs on the system were investigated systematically. A satisfactory extraction efficiency of 93% was obtained for papaverine, while that of morphine was 65%. The extraction mechanism was discussed preliminarily.

  9. Novel extraction strategy of ribosomal RNA and genomic DNA from cheese for PCR-based investigations.

    Science.gov (United States)

    Bonaïti, Catherine; Parayre, Sandrine; Irlinger, Françoise

    2006-03-15

    Cheese microorganisms, such as bacteria and fungi, constitute a complex ecosystem that plays a central role in cheese ripening. The molecular study of cheese microbial diversity and activity is essential, but the extraction of high-quality nucleic acids may be problematic: cheese samples are characterised by a strong buffering capacity, which negatively influences the yield of the extracted rRNA. The objective of this study is to develop an effective method for the direct and simultaneous isolation of yeast and bacterial ribosomal RNA and genomic DNA from the same cheese samples. DNA isolation was based on a protocol used for nucleic acid isolation from an anaerobic digestor, without a preliminary washing step, with the combined use of a chaotropic agent (acid guanidinium thiocyanate), detergents (SDS, N-lauroylsarcosine), a chelating agent (EDTA) and a mechanical method (bead beating system). The DNA was purified by two phenol-chloroform washing steps. RNA was isolated successfully after the second acid extraction step by recovering it from the phenolic phase of the first acid extraction. The novel method yielded pure preparations of undegraded RNA accessible for reverse transcription-PCR. The extraction protocol for genomic DNA and rRNA was applicable to the complex ecosystems of different cheese matrices.

  10. Layering-based Breakpoint Handling in Contour Line Extraction

    Institute of Scientific and Technical Information of China (English)

    CHEN Dan; LONG Yi; CAI Jinhua

    2003-01-01

    This paper deals with the automatic connection of contour lines extracted from a scanned brown geographical map. For the variety of topographical elements contained on a map, the factors causing the interruption of contour lines are also multiform, which makes the connection task very difficult. On the basis of separating those elements always making the contours break and regarding them as referent layers, a layering-based method is presented. The purpose is to take into account property information (like inclination and configuration) of contour lines when they come across other different symbols, such as gully, cliff, dry land and elevation annotation, etc. In this paper, the authors propose that it should be far more effective and direct to adopt different algorithmic operators for different factors than to use a single operator for all.

  11. An answer summarization method based on keyword extraction

    Directory of Open Access Journals (Sweden)

    Fan Qiaoqing

    2017-01-01

    Full Text Available In order to reduce the redundancy of answer summaries generated from community Q&A datasets without topic tags, we propose an answer summarization algorithm based on keyword extraction. We combine tf-idf with word vectors to modify the influence-transfer ratio equation in TextRank. Then, during summarization, we take the ratio of the number of sentences containing any keyword to the total number of candidate sentences as an adaptive factor for AMMR, and we reuse the keyword scores generated by TextRank as a weight factor in sentence similarity computation. Experimental results show that the proposed answer summarization is better than the traditional MMR and AMMR.
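
    For orientation, the sketch below shows a plain MMR-style sentence selection in Python with scikit-learn; the relevance scores here come from ordinary tf-idf rather than the word-vector-enhanced TextRank keyword scores used in the paper, and the tiny sentence list is a placeholder.

    # Simplified MMR sentence selection balancing relevance against redundancy.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    def mmr_summary(sentences, query, n_select=2, lam=0.7):
        vec = TfidfVectorizer().fit(sentences + [query])
        S = vec.transform(sentences)
        q = vec.transform([query])
        relevance = cosine_similarity(S, q).ravel()
        selected, remaining = [], list(range(len(sentences)))
        while remaining and len(selected) < n_select:
            def mmr_score(i):
                redundancy = max(cosine_similarity(S[i], S[selected]).ravel()) if selected else 0.0
                return lam * relevance[i] - (1 - lam) * redundancy
            best = max(remaining, key=mmr_score)
            selected.append(best)
            remaining.remove(best)
        return [sentences[i] for i in selected]

    sents = ["Reboot the router to clear the cache.",
             "Routers should be rebooted when the cache is stale.",
             "The weather was nice yesterday."]
    print(mmr_summary(sents, query="how to fix router cache"))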

  12. Transmission line icing prediction based on DWT feature extraction

    Science.gov (United States)

    Ma, T. N.; Niu, D. X.; Huang, Y. L.

    2016-08-01

    Transmission line icing prediction is the premise of ensuring the safe operation of the network as well as the very important basis for the prevention of freezing disasters. In order to improve the prediction accuracy of icing, a transmission line icing prediction model based on discrete wavelet transform (DWT) feature extraction was built. In this method, a group of high and low frequency signals were obtained by DWT decomposition, and were fitted and predicted by using partial least squares regression model (PLS) and wavelet least square support vector model (w-LSSVM). Finally, the final result of the icing prediction was obtained by adding the predicted values of the high and low frequency signals. The results showed that the method is effective and feasible in the prediction of transmission line icing.
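
    A minimal sketch of the DWT decomposition step is given below, assuming PyWavelets; the icing-load series is synthetic, and the PLS and w-LSSVM predictors applied to the resulting bands are not reproduced.

    # Minimal sketch of DWT decomposition of a load series into frequency bands.
    import numpy as np
    import pywt

    signal = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.2 * np.random.randn(256)  # placeholder icing series

    # decompose into one low-frequency approximation and several high-frequency details
    coeffs = pywt.wavedec(signal, wavelet="db4", level=3)
    approx, details = coeffs[0], coeffs[1:]

    # reconstruct the low-frequency band alone so each component can be fitted separately
    low_freq = pywt.waverec([approx] + [np.zeros_like(d) for d in details], "db4")
    print(len(details), low_freq.shape)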

  13. An image segmentation based method for iris feature extraction

    Institute of Scientific and Technical Information of China (English)

    XU Guang-zhu; ZHANG Zai-feng; MA Yi-de

    2008-01-01

    In this article, the local anomalistic blocks such as crypts, furrows, and so on in the iris are initially used directly as iris features. A novel image segmentation method based on the intersecting cortical model (ICM) neural network was introduced to segment these anomalistic blocks. First, the normalized iris image was put into the ICM neural network after enhancement. Second, the iris features were segmented out perfectly and were output in binary image form by the ICM neural network. Finally, the fourth output pulse image produced by the ICM neural network was chosen as the iris code for the convenience of real-time processing. To estimate the performance of the presented method, an iris recognition platform was produced and the Hamming distance between two iris codes was computed to measure the dissimilarity between them. The experimental results on the CASIA v1.0 and Bath iris image databases show that the proposed iris feature extraction algorithm has promising potential in iris recognition.

  14. Reverse micelles extraction of nattokinase: From model system to real system

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Nattokinase is a novel fibrinolytic enzyme, which is homologous to Subtilisin Carlsberg. In this paper, Subtilisin Carlsberg was taken as a model protein of nattokinase. Effects of pH, ionic strength, concentration of isopropanol on the extraction of Subtilisin Carlsberg with AOT/isooctane reverse micelles system were investigated. Further, the process of reverse micelles extraction of nattokinase from fermentation broth was studied. By taking the reverse micelles of AOT/isooctane as extractant to perform a full extraction cycle, it was found that about eighty percent of the total activity of nattokinase in the fermentation broth could be recovered and the purification factor was about 2.5. Homologous protein could be reasonably used as model protein of a target protein.

  15. Analytical and simulation studies for diode and triode ion beam extraction systems

    Institute of Scientific and Technical Information of China (English)

    M. M. Abdelrahman; N. I. Basal; S. G. Zakhary

    2012-01-01

    This work is concerned with ion beam dynamics and compares the emittance-to-aberration ratios of two- and three-electrode extraction systems. The study is conducted with the aid of the SIMION 3D ray-tracing software, Version 7.0. The dependence of the beam on various parameters of the extraction systems is studied, and the numerical results lead to qualitative conclusions. The diode (two-electrode) and triode (three-electrode, acceleration-deceleration) extraction systems are designed and optimized with different geometric parameters of the electrode system, the voltage applied to the extraction electrode, and the plasma parameters inside the ion source chamber, as well as the ion beam space charge. This work attempts to describe the importance of the acceleration-deceleration extraction system: besides an increase of the beam energy, the ion beam has lower emittance than in the two-electrode extraction system. Ion beams of the highest quality are extracted when the half-angular divergence is minimal, for which the perveance, current intensity and extraction gap have optimum values. Knowing the electron temperature of the plasma is necessary to determine the plasma potential and the exact beam energy.

  16. Studying Sensing-Based Systems

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun

    2013-01-01

    Recent sensing-based systems involve a multitude of users, devices, and places. These types of systems challenge existing approaches for conducting valid system evaluations. Here, the author discusses such evaluation challenges and revisits existing system evaluation methodologies....

  17. Vaccine adverse event text mining system for extracting features from vaccine safety reports.

    Science.gov (United States)

    Botsis, Taxiarchis; Buttolph, Thomas; Nguyen, Michael D; Winiecki, Scott; Woo, Emily Jane; Ball, Robert

    2012-01-01

    To develop and evaluate a text mining system for extracting key clinical features from vaccine adverse event reporting system (VAERS) narratives to aid in the automated review of adverse event reports. Based upon clinical significance to VAERS reviewing physicians, we defined the primary (diagnosis and cause of death) and secondary features (e.g., symptoms) for extraction. We built a novel vaccine adverse event text mining (VaeTM) system based on a semantic text mining strategy. The performance of VaeTM was evaluated using a total of 300 VAERS reports in three sequential evaluations of 100 reports each. Moreover, we evaluated the VaeTM contribution to case classification; an information retrieval-based approach was used for the identification of anaphylaxis cases in a set of reports and was compared with two other methods: a dedicated text classifier and an online tool. The performance metrics of VaeTM were text mining metrics: recall, precision and F-measure. We also conducted a qualitative difference analysis and calculated sensitivity and specificity for classification of anaphylaxis cases based on the above three approaches. VaeTM performed best in extracting diagnosis, second level diagnosis, drug, vaccine, and lot number features (lenient F-measure in the third evaluation: 0.897, 0.817, 0.858, 0.874, and 0.914, respectively). In terms of case classification, high sensitivity was achieved (83.1%); this was equal to that of the text classifier (83.1%) and better than that of the online tool (40.7%). Our VaeTM implementation of a semantic text mining strategy shows promise in providing accurate and efficient extraction of key features from VAERS narratives.

  18. [Realization of Heart Sound Envelope Extraction Implemented on LabVIEW Based on Hilbert-Huang Transform].

    Science.gov (United States)

    Tan, Zhixiang; Zhang, Yi; Zeng, Deping; Wang, Hua

    2015-04-01

    We propose a heart sound envelope extraction system in this paper. The system was implemented in LabVIEW based on the Hilbert-Huang transform (HHT). We first used the sound card to collect the heart sound, and then implemented the complete system program of signal acquisition, pretreatment and envelope extraction in LabVIEW based on the theory of the HHT. Finally, we used a case to show that the system could easily collect the heart sound, preprocess it and extract the envelope. The system retains and displays the characteristics of the heart sound envelope well, and its program and methods are relevant to other research, such as studies on vibration and voice.
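
    As a rough illustration of the envelope-extraction idea, the Python sketch below uses the Hilbert transform from SciPy on a synthetic burst; it stands in for the LabVIEW implementation and does not reproduce the empirical mode decomposition stage of the full Hilbert-Huang transform.

    # Minimal sketch of envelope extraction via the Hilbert transform.
    import numpy as np
    from scipy.signal import hilbert, butter, filtfilt

    fs = 2000                                                     # assumed sampling rate in Hz
    t = np.arange(0, 1.0, 1 / fs)
    heart_sound = np.sin(2 * np.pi * 60 * t) * np.exp(-5 * t)    # placeholder S1-like burst

    analytic = hilbert(heart_sound)                               # analytic signal
    envelope = np.abs(analytic)                                   # instantaneous amplitude

    # smooth the envelope with a low-pass filter for display
    b, a = butter(4, 20 / (fs / 2))
    smooth_envelope = filtfilt(b, a, envelope)
    print(smooth_envelope.max())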

  19. Ontology-Based Classification System Development Methodology

    Directory of Open Access Journals (Sweden)

    Grabusts Peter

    2015-12-01

    Full Text Available The aim of the article is to analyse and develop an ontology-based classification system methodology that uses decision tree learning with statement propositionalized attributes. Classical decision tree learning algorithms, as well as decision tree learning with taxonomy and propositionalized attributes have been observed. Thus, domain ontology can be extracted from the data sets and can be used for data classification with the help of a decision tree. The use of ontology methods in decision tree-based classification systems has been researched. Using such methodologies, the classification accuracy in some cases can be improved.

  20. Linear systems a measurement based approach

    CERN Document Server

    Bhattacharyya, S P; Mohsenizadeh, D N

    2014-01-01

    This brief presents recent results obtained on the analysis, synthesis and design of systems described by linear equations. It is well known that linear equations arise in most branches of science and engineering as well as social, biological and economic systems. The novelty of this approach is that no models of the system are assumed to be available, nor are they required. Instead, a few measurements made on the system can be processed strategically to directly extract design values that meet specifications without constructing a model of the system, implicitly or explicitly. These new concepts are illustrated by applying them to linear DC and AC circuits, mechanical, civil and hydraulic systems, signal flow block diagrams and control systems. These applications are preliminary and suggest many open problems. The results presented in this brief are the latest effort in this direction and the authors hope these will lead to attractive alternatives to model-based design of engineering and other systems.

  1. Ontology-Based Classification System Development Methodology

    OpenAIRE

    2015-01-01

    The aim of the article is to analyse and develop an ontology-based classification system methodology that uses decision tree learning with statement propositionalized attributes. Classical decision tree learning algorithms, as well as decision tree learning with taxonomy and propositionalized attributes have been observed. Thus, domain ontology can be extracted from the data sets and can be used for data classification with the help of a decision tree. The use of ontology methods in decision ...

  2. Auditory-model-based Feature Extraction Method for Mechanical Faults Diagnosis

    Institute of Scientific and Technical Information of China (English)

    LI Yungong; ZHANG Jinping; DAI Li; ZHANG Zhanyi; LIU Jie

    2010-01-01

    It is well known that the human auditory system possesses remarkable capabilities to analyze and identify signals. Therefore, it would be significant to build an auditory model based on the mechanism of the human auditory system, which may improve mechanical signal analysis and enrich the methods of mechanical fault feature extraction. However, the existing methods are all based on explicit mathematical or physical formulations and have shortcomings in distinguishing different faults, in stability, and in suppressing disturbance noise. For the purpose of improving feature extraction performance, an auditory model, the early auditory (EA) model, is introduced here for the first time. This auditory model transforms a time-domain signal into an auditory spectrum via band-pass filtering, nonlinear compression, and lateral inhibition, simulating the principle of the human auditory system. The EA model is developed with a Gammatone filterbank as the basilar membrane. According to the characteristics of vibration signals, a method is proposed for determining the parameters of the inner hair cell model of the EA model. The performance of the EA model is evaluated through experiments on four rotor faults, including misalignment, rotor-to-stator rubbing, oil film whirl, and pedestal looseness. The results show that the auditory spectrum, the output of the EA model, can effectively distinguish different faults with satisfactory stability and has the ability to suppress disturbance noise. It is therefore feasible to apply the auditory model, as a new method, to feature extraction for mechanical fault diagnosis.

  3. The extraction of motion-onset VEP BCI features based on deep learning and compressed sensing.

    Science.gov (United States)

    Ma, Teng; Li, Hui; Yang, Hao; Lv, Xulin; Li, Peiyang; Liu, Tiejun; Yao, Dezhong; Xu, Peng

    2017-01-01

    Motion-onset visual evoked potentials (mVEP) can provide a softer stimulus with reduced fatigue, and they have potential applications for brain computer interface (BCI) systems. However, the mVEP waveform is seriously masked by strong background EEG activity, and an effective approach is needed to extract the corresponding mVEP features to perform task recognition for BCI control. In the current study, we combine deep learning with compressed sensing to mine discriminative mVEP information to improve mVEP BCI performance. The deep learning and compressed sensing approach can generate multi-modality features which effectively improve the BCI performance, with approximately 3.5% accuracy improvement over all 11 subjects, and it is more effective for those subjects with relatively poor performance when using the conventional features. Compared with the conventional amplitude-based mVEP feature extraction approach, the deep learning and compressed sensing approach has a higher classification accuracy and is more effective for subjects with relatively poor performance. According to the results, the deep learning and compressed sensing approach is more effective for extracting mVEP features to construct the corresponding BCI system, and the proposed feature extraction framework is easy to extend to other types of BCIs, such as motor imagery (MI), steady-state visual evoked potential (SSVEP) and P300. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Hybrid Feature Extraction-based Approach for Facial Parts Representation and Recognition

    Science.gov (United States)

    Rouabhia, C.; Tebbikh, H.

    2008-06-01

    Face recognition is a specialized form of image processing which has attracted considerable attention in computer vision. In this article, we develop a new facial recognition system, working from video sequence images, dedicated to the identification of persons whose faces are partly occluded. This system is based on a hybrid image feature extraction technique called ACPDL2D (Rouabhia et al. 2007), which combines two-dimensional principal component analysis and two-dimensional linear discriminant analysis with a neural network. We performed the feature extraction task on the eye and nose images separately, and then a Multi-Layer Perceptron classifier was used. Compared to the whole face, the simulation results are in favor of the facial parts in terms of memory capacity and recognition rate (99.41% for the eyes part, 98.16% for the nose part and 97.25% for the whole face).

  5. MutationFinder: a high-performance system for extracting point mutation mentions from text.

    Science.gov (United States)

    Caporaso, J Gregory; Baumgartner, William A; Randolph, David A; Cohen, K Bretonnel; Hunter, Lawrence

    2007-07-15

    Discussion of point mutations is ubiquitous in biomedical literature, and manually compiling databases or literature on mutations in specific genes or proteins is tedious. We present an open-source, rule-based system, MutationFinder, for extracting point mutation mentions from text. On blind test data, it achieves nearly perfect precision and a markedly improved recall over a baseline. MutationFinder, along with a high-quality gold standard data set, and a scoring script for mutation extraction systems have been made publicly available. Implementations, source code and unit tests are available in Python, Perl and Java. MutationFinder can be used as a stand-alone script, or imported by other applications. http://bionlp.sourceforge.net.

  6. Rapid antioxidant capacity screening in herbal extracts using a simple flow injection-spectrophotometric system.

    Science.gov (United States)

    Mrazek, Nookrai; Watla-iad, Kanchana; Deachathai, Suwanna; Suteerapataranon, Siripat

    2012-05-01

    A simple flow injection (FI)-spectrophotometric system for the screening of antioxidant capacity in herbal extracts was developed. The analysis was based on the color disappearance due to the scavenging of the 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical by antioxidant compounds. DPPH and ascorbic acid were used as reagent and antioxidant standard, respectively. Effects of the DPPH concentration, DPPH flow rate, and reaction coil length on sensitivity were studied. The optimized condition provided the linear range of 0.010-0.300 mM ascorbic acid with less than 5% RSD (n=10). Detection limit and quantitation limit were 0.004 and 0.013 mM, respectively. Comparison of antioxidant capacity in some herbal extracts determined by the FI system and a standard method was carried out and no significant difference was obtained.

  7. Moon base reactor system

    Science.gov (United States)

    Chavez, H.; Flores, J.; Nguyen, M.; Carsen, K.

    1989-01-01

    The objective of our reactor design is to supply a lunar-based research facility with 20 MW(e). The fundamental layout of this lunar-based system includes the reactor, power conversion devices, and a radiator. The additional aim of this reactor is a longevity of 12 to 15 years. The reactor is a liquid metal fast breeder that has a breeding ratio very close to 1.0. The geometry of the core is cylindrical. The metallic fuel rods are of beryllium oxide enriched with varying degrees of uranium, with a beryllium core reflector. The liquid metal coolant chosen was natural lithium. After the liquid metal coolant leaves the reactor, it goes directly into the power conversion devices. The power conversion devices are Stirling engines. The heated coolant acts as a hot reservoir to the device. It then enters the radiator to be cooled and reenters the Stirling engine acting as a cold reservoir. The engines' operating fluid is helium, a highly conductive gas. These Stirling engines are hermetically sealed. Although natural lithium produces a lower breeding ratio, it does have a larger temperature range than sodium. It is also corrosive to steel. This is why the container material must be carefully chosen. One option is to use an expensive alloy of cerbium and zirconium. The radiator must be made of a highly conductive material whose melting point temperature is not exceeded in the reactor and whose structural strength can withstand meteor showers.

  8. Analysis on the Physicochemical Properties of Ginkgo biloba Leaves after Enzymolysis Based Ultrasound Extraction and Soxhlet Extraction.

    Science.gov (United States)

    Zhang, Chang-Wei; Wang, Cheng-Zhang; Tao, Ran

    2016-01-15

    In this study, high performance liquid chromatography (HPLC), ultraviolet (UV) spectroscopy, a thermogravimetric analyzer (TGA), pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS), and a scanning electron microscope (SEM) were used as measurement techniques; the contents of chemical composition, pyrolytic products, thermal stability and morphological characterization of Ginkgo biloba leaves (GBL) acted as the indices; and the physicochemical properties of GBL after enzymolysis-based ultrasound extraction (EBUE) and Soxhlet extraction were studied. The detection results of the chemical composition revealed that the contents of general flavone, soluble protein, soluble total sugar and protein in the GBL declined significantly after EBUE, and the contents of polyprenols and crude fat were obviously reduced as well after Soxhlet extraction. Py-GC-MS results indicated that the total GC content of micromolecules with fewer than 12 carbons declined from 54.0% before EBUE to 8.34% after EBUE, and the total GC content of long-chain fatty acids with fewer than 20 carbons decreased from 43.0% before EBUE to 27.0% after Soxhlet extraction. Thermal stability results showed that GBL after Soxhlet extraction was easier to decompose than GBL before EBUE. SEM results illustrated that the surface structure of GBL was damaged severely after EBUE compared with GBL before EBUE, while organic solvent extraction had little influence on the morphological characterization of GBL after Soxhlet extraction compared with GBL after EBUE.

  9. Analysis on the Physicochemical Properties of Ginkgo biloba Leaves after Enzymolysis Based Ultrasound Extraction and Soxhlet Extraction

    Directory of Open Access Journals (Sweden)

    Chang-Wei Zhang

    2016-01-01

    Full Text Available In this study, high performance liquid chromatography (HPLC), ultraviolet (UV) spectroscopy, a thermogravimetric analyzer (TGA), pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS), and a scanning electron microscope (SEM) were used as measurement techniques; the contents of chemical composition, pyrolytic products, thermal stability and morphological characterization of Ginkgo biloba leaves (GBL) acted as the indices; and the physicochemical properties of GBL after enzymolysis-based ultrasound extraction (EBUE) and Soxhlet extraction were studied. The detection results of the chemical composition revealed that the contents of general flavone, soluble protein, soluble total sugar and protein in the GBL declined significantly after EBUE, and the contents of polyprenols and crude fat were obviously reduced as well after Soxhlet extraction. Py-GC-MS results indicated that the total GC content of micromolecules with fewer than 12 carbons declined from 54.0% before EBUE to 8.34% after EBUE, and the total GC content of long-chain fatty acids with fewer than 20 carbons decreased from 43.0% before EBUE to 27.0% after Soxhlet extraction. Thermal stability results showed that GBL after Soxhlet extraction was easier to decompose than GBL before EBUE. SEM results illustrated that the surface structure of GBL was damaged severely after EBUE compared with GBL before EBUE, while organic solvent extraction had little influence on the morphological characterization of GBL after Soxhlet extraction compared with GBL after EBUE.

  10. Estimation of multiply digital process control system extractive distillation stability

    Directory of Open Access Journals (Sweden)

    V. S. Kudryashov

    2016-01-01

    Full Text Available An approach to the stability analysis of digital control systems associated with a non-stationary object is illustrated by the example of the rectification process. The object model with cross-connections and the control scheme of the described system are given as discrete transfer functions in shift operators, together with the coupling equations for each output of the closed-loop system. To solve this problem, an algorithm for estimating the stability margin of multivariable digital control systems based on the discrete root criterion was developed, comprising the following main stages: obtaining the characteristic polynomial of the closed-loop system for each output; computing the eigenvalues of the system matrix in state space to determine the roots of the characteristic equation and the stability of the system; and determining the stability and the stability margin from the deviation of the maximum modulus of the roots from the stability boundary. To obtain the characteristic polynomial, first-order transfer functions with transport delay are used as discrete models of the controllers and object channels. The simulation was performed for different parameters of the control object, characterizing both stable and unstable states of the system. An analysis of the numerical values of the roots and of their location on the complex plane shows whether the system is stable or unstable. To confirm the obtained results, dynamic characteristics of the closed-loop system under different conditions were calculated and presented, which confirm the initial assessment by the root criterion. To determine the stability margin of multivariable digital systems, it is proposed to use the deviation of the maximum root of the characteristic equation from the stability boundary. The obtained results apply to the class of symmetric multivariable control objects. The proposed approach to assessing the stability of multivariable control systems can be applied effectively.
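
    For a concrete sense of the root-criterion step, the short Python sketch below checks discrete-time stability from the eigenvalues of a state matrix; the matrix is an illustrative placeholder, not the extractive-distillation model discussed above.

    # Minimal numeric sketch of the discrete root criterion.
    import numpy as np

    A = np.array([[0.7, 0.2],
                  [0.1, 0.8]])                 # placeholder closed-loop state matrix

    eigvals = np.linalg.eigvals(A)             # roots of the characteristic equation
    max_modulus = np.max(np.abs(eigvals))

    # a discrete system is stable if every root lies inside the unit circle;
    # the distance of the largest modulus from 1 serves as the stability margin
    print("stable:", max_modulus < 1.0, "margin:", 1.0 - max_modulus)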

  11. Neutral Complex Extraction and Synergistic Extraction of Macrolide Antibiotics

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the theory of reactive extraction, new solvent systems were developed to replace butyl acetate for the extraction of macrolide antibiotics (erythromycin, kitasamycin, spiramycin, meleumycin, etc.). A new neutral complex solvent extraction system, fatty alcohol-kerosene (denoted E1), was used for the extraction of erythromycin, one of the macrolide antibiotics, and the extraction equilibrium equation and the extraction distribution were obtained. The effects of several parameters on the extraction equilibrium were investigated. Furthermore, a new synergistic extraction system (denoted E2) was developed, in which another solvent was used as a synergistic agent to replace the diluent kerosene in the neutral complex extraction system. Based on these new extraction systems, an improved process for the extraction of erythromycin was developed, showing remarkable advantages in technology and economics owing to its low solvent consumption of 3 kg per billion units, compared with 9-10 kg for butyl acetate. The recovery process of solvent from the raffinate may be eliminated.

  12. Extraction and purification methods in downstream processing of plant-based recombinant proteins.

    Science.gov (United States)

    Łojewska, Ewelina; Kowalczyk, Tomasz; Olejniczak, Szymon; Sakowicz, Tomasz

    2016-04-01

    During the last two decades, the production of recombinant proteins in plant systems has been receiving increased attention. Currently, proteins are considered as the most important biopharmaceuticals. However, high costs and problems with scaling up the purification and isolation processes make the production of plant-based recombinant proteins a challenging task. This paper presents a summary of the information regarding the downstream processing in plant systems and provides a comprehensible overview of its key steps, such as extraction and purification. To highlight the recent progress, mainly new developments in the downstream technology have been chosen. Furthermore, besides most popular techniques, alternative methods have been described.

  13. Feature extraction from time domain acoustic signatures of weapons systems fire

    Science.gov (United States)

    Yang, Christine; Goldman, Geoffrey H.

    2014-06-01

    The U.S. Army is interested in developing algorithms to classify weapons systems fire based on their acoustic signatures. To support this effort, an algorithm was developed to extract features from acoustic signatures of weapons systems fire and applied to over 1300 signatures. The algorithm filtered the data using standard techniques then estimated the amplitude and time of the first five peaks and troughs and the location of the zero crossing in the waveform. The results were stored in Excel spreadsheets. The results are being used to develop and test acoustic classifier algorithms.
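
    A minimal Python sketch of this kind of time-domain feature extraction is given below, assuming SciPy; the band-pass settings and the synthetic waveform are placeholders for the real weapon-fire recordings, and the features are printed rather than written to a spreadsheet.

    # Minimal sketch: filter a waveform, then collect early peaks, troughs and a zero crossing.
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    fs = 10000
    t = np.arange(0, 0.1, 1 / fs)
    waveform = np.exp(-50 * t) * np.sin(2 * np.pi * 300 * t)   # placeholder muzzle-blast-like signal

    b, a = butter(4, [50 / (fs / 2), 2000 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, waveform)

    peaks, _ = find_peaks(filtered)
    troughs, _ = find_peaks(-filtered)

    features = {
        "peak_times": t[peaks[:5]], "peak_amps": filtered[peaks[:5]],
        "trough_times": t[troughs[:5]], "trough_amps": filtered[troughs[:5]],
        # time of the first sign change in the filtered waveform
        "zero_crossing": t[np.where(np.diff(np.sign(filtered)) != 0)[0][:1]],
    }
    print(features["peak_amps"])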

  14. GB-SAR Experiment On Deformation Extraction And System Error Analysis

    Science.gov (United States)

    Qu, Shibo; Wang, Yanping; Tan, Weixian; Hong, Wen

    2010-10-01

    Ground Based Synthetic Aperture Radar (GB-SAR) provides a new method for monitoring deformation in relatively small regions. In this paper, we present the GB-SAR imaging geometry and analyze the interferometric phase for the purpose of deformation monitoring. Deformation monitoring error sources are also analyzed through sensitivity equations, including frequency instability and its influence on the interferometric phase and deformation extraction, the incident angle and the monitoring distance. Finally, a deformation monitoring experiment is carried out using ASTRO (Advanced Scannable Two-dimensional Rail Observation system), a GB-SAR system constructed by the Institute of Electronics, Chinese Academy of Sciences (IECAS). The deformation monitoring results show good consistency with the metal objects' movement.

  15. Extracting Superquadric-based Geon Description for 3D Object Recognition

    Institute of Scientific and Technical Information of China (English)

    XING Weiwei; LIU Weibin; YUAN Baozong

    2005-01-01

    Geon recognition is a key issue in developing 3D object recognition systems based on Recognition-by-components (RBC) theory. In this paper, we present a novel approach for extracting superquadric-based geon descriptions of 3D volumetric primitives from real shape data, which integrates the advantages of deformable superquadric model reconstruction and SVM-based classification. First, a real-coded genetic algorithm (RCGA) is used for superquadric fitting to the 3D data, and the quantitative parametric information is obtained; then a new sophisticated feature set is derived from the obtained superquadric parameters; and SVM-based classification is proposed and implemented for geon recognition, yielding the qualitative geometric information. Furthermore, knowledge-based feedback of the SVM network is introduced to improve the classification performance. Experimental results show that our approach is efficient and precise for extracting superquadric-based geon descriptions from real shape data in 3D object recognition. The results are very encouraging and have significant benefit for developing a general 3D object recognition system.

  16. PCR-based typing of DNA extracted from cigarette butts.

    Science.gov (United States)

    Hochmeister, M N; Budowle, B; Jung, J; Borer, U V; Comey, C T; Dirnhofer, R

    1991-01-01

    Limited genetic marker information can be obtained from saliva typed by conventional serological means. Thus, the application of PCR-based DNA typing methods was investigated as a potential approach for typing genetic markers in saliva. DNA was isolated from 200 cigarettes smoked by 10 different individuals (20 cigarettes per individual) and from 3 cigarette butts recovered from 2 crime scenes (adjudicated cases) using a Chelex 100 extraction procedure. The amount of recovered human DNA was quantified by slot-blot analysis and ranged from less than 2 ng to approximately 160 ng of DNA per cigarette butt for the 200 samples, and was 8 ng, 50 ng, and 100 ng for the cigarette butts from the adjudicated cases. The DNA was successfully amplified by the polymerase chain reaction (PCR) for the HLA-DQ alpha locus (99 out of 100 samples) as well as for the variable number of tandem repeat (VNTR) locus D1S80 (99 out of 100 samples). Amplification and typing of DNA was successful for all samples recovered from the crime scenes. The results suggest that PCR-based typing of DNA offers a potential method for genetically characterizing traces of saliva on cigarette butts.

  17. A high-efficiency cellular extraction system for biological proteomics

    OpenAIRE

    2015-01-01

    Recent developments in quantitative high-resolution mass spectrometry have led to significant improvements in the sensitivity and specificity of biochemical analyses of cellular reactions, protein-protein interactions, and small molecule drug discovery. These approaches depend on cellular proteome extraction that preserves native protein activities. Here, we systematically analyzed mechanical methods of cell lysis and physical protein extraction to identify those that maximize the extraction ...

  18. Validation of a DNA IQ-based extraction method for TECAN robotic liquid handling workstations for processing casework.

    Science.gov (United States)

    Frégeau, Chantal J; Lett, C Marc; Fourney, Ron M

    2010-10-01

    A semi-automated DNA extraction process for casework samples based on the Promega DNA IQ™ system was optimized and validated on TECAN Genesis 150/8 and Freedom EVO robotic liquid handling stations configured with fixed tips and a TECAN TE-Shake™ unit. The use of an orbital shaker during the extraction process promoted efficiency with respect to DNA capture, magnetic bead/DNA complex washes and DNA elution. Validation studies determined the reliability and limitations of this shaker-based process. Reproducibility with regards to DNA yields for the tested robotic workstations proved to be excellent and not significantly different than that offered by the manual phenol/chloroform extraction. DNA extraction of animal:human blood mixtures contaminated with soil demonstrated that a human profile was detectable even in the presence of abundant animal blood. For exhibits containing small amounts of biological material, concordance studies confirmed that DNA yields for this shaker-based extraction process are equivalent or greater to those observed with phenol/chloroform extraction as well as our original validated automated magnetic bead percolation-based extraction process. Our data further supports the increasing use of robotics for the processing of casework samples. Crown Copyright © 2009. Published by Elsevier Ireland Ltd. All rights reserved.

  19. A vision-based approach for tramway rail extraction

    Science.gov (United States)

    Zwemer, Matthijs H.; van de Wouw, Dennis W. J. M.; Jaspers, Egbert; Zinger, Sveta; de With, Peter H. N.

    2015-03-01

    The growing traffic density in cities fuels the desire for collision assessment systems on public transportation. For this application, video analysis is broadly accepted as a cornerstone. For trams, the localization of tramway tracks is an essential ingredient of such a system, in order to estimate a safety margin for crossing traffic participants. Tramway-track detection is a challenging task due to the urban environment with clutter, sharp curves and occlusions of the track. In this paper, we present a novel and generic system to detect the tramway track in advance of the tram position. The system incorporates an inverse perspective mapping and a-priori geometry knowledge of the rails to find possible track segments. The contribution of this paper involves the creation of a new track reconstruction algorithm which is based on graph theory. To this end, we define track segments as vertices in a graph, in which edges represent feasible connections. This graph is then converted to a max-cost arborescence graph, and the best path is selected according to its location and additional temporal information based on a maximum a-posteriori estimate. The proposed system clearly outperforms a railway-track detector. Furthermore, the system performance is validated on 3,600 manually annotated frames. The obtained results are promising, where straight tracks are found in more than 90% of the images and complete curves are still detected in 35% of the cases.
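
    To illustrate the graph-based reconstruction idea, the Python sketch below builds a tiny segment graph and extracts a maximum-weight spanning arborescence with networkx; the segments and edge weights are invented and far simpler than the detector output described above.

    # Illustrative sketch of picking track segments with a maximum-weight arborescence.
    import networkx as nx

    G = nx.DiGraph()
    # vertices are candidate rail segments; edge weights score how plausibly
    # one segment continues into the next (geometry plus temporal consistency)
    G.add_edge("seg0", "seg1", weight=0.9)
    G.add_edge("seg0", "seg2", weight=0.3)
    G.add_edge("seg1", "seg2", weight=0.8)

    # the maximum-weight spanning arborescence keeps, for every segment, its single
    # best predecessor, yielding a tree from which the best path can then be selected
    track_tree = nx.maximum_spanning_arborescence(G, attr="weight")
    print(sorted(track_tree.edges(data=True)))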

  20. The Feature Extraction Based on Texture Image Information for Emotion Sensing in Speech

    Directory of Open Access Journals (Sweden)

    Kun-Ching Wang

    2014-09-01

    Full Text Available In this paper, we present a novel texture image feature for Emotion Sensing in Speech (ESS). This idea is based on the fact that texture images carry emotion-related information. The feature extraction is derived from the time-frequency representation of spectrogram images. First, we transform the spectrogram into a recognizable image. Next, we use a cubic curve to enhance the image contrast. Then, the texture image information (TII) derived from the spectrogram image can be extracted by using Laws' masks to characterize the emotional state. In order to evaluate the effectiveness of the proposed emotion recognition in different languages, we use two open emotional databases, the Berlin Emotional Speech Database (EMO-DB) and the eNTERFACE corpus, and one self-recorded database (KHUSC-EmoDB), to evaluate the cross-corpus performance. The results of the proposed ESS system are presented using a support vector machine (SVM) as the classifier. Experimental results show that the proposed TII-based feature extraction, inspired by visual perception, can provide significant classification for ESS systems. The two-dimensional (2-D) TII feature can provide discrimination between different emotions in visual expressions beyond what the pitch and formant tracks convey. In addition, de-noising in 2-D images can be completed more easily than de-noising in 1-D speech.

  1. Extraction of alkaloids for NMR-based profiling

    DEFF Research Database (Denmark)

    Yilmaz, Ali; Nyberg, Nils; Jaroszewski, Jerzy W.

    2012-01-01

    A museum collection of Cinchona cortex samples (n = 117), from the period 1850–1950, were extracted with a mixture of chloroform-d1, methanol-d4, water-d2, and perchloric acid in the ratios 5:5:1:1. The extracts were directly analyzed using 1H NMR spectroscopy (600 MHz) and the spectra evaluated ...

  2. Sample-based XPath Ranking for Web Information Extraction

    NARCIS (Netherlands)

    Jundt, Oliver; van Keulen, Maurice

    Web information extraction typically relies on a wrapper, i.e., program code or a configuration that specifies how to extract some information from web pages at a specific website. Manually creating and maintaining wrappers is a cumbersome and error-prone task. It may even be prohibitive as some

  3. BEaST: brain extraction based on nonlocal segmentation technique.

    NARCIS (Netherlands)

    Eskildsen, S.F.; Coupe, P.; Fonov, V.; Manjon, J.V.; Leung, K.K.; Guizard, N.; Wassef, S.N.; Ostergaard, L.R.; Collins, D.L.; Olde Rikkert, M.

    2012-01-01

    Brain extraction is an important step in the analysis of brain images. The variability in brain morphology and the difference in intensity characteristics due to imaging sequences make the development of a general purpose brain extraction algorithm challenging. To address this issue, we propose a ne

  4. Cobra: A Content-Based Video Retrieval System

    NARCIS (Netherlands)

    Petkovic, M.; Jonker, W.

    2002-01-01

    An increasing number of large publicly available video libraries results in a demand for techniques that can manipulate the video data based on content. In this paper, we present a content-based video retrieval system called Cobra. The system supports automatic extraction and retrieval of high-level

  5. Simulation of ion beam extraction and focusing system

    Institute of Scientific and Technical Information of China (English)

    B. A. Soliman; M. M. Abdelrahman; A. G. Helal; F. W. Abdelsalam

    2011-01-01

    The characteristics of ion beam extraction and focusing to as small a volume as possible were investigated with the aid of the computer code SIMION 3D version 7. This has been used to evaluate the extraction characteristics (accel-decel system) to generate an

  6. Micro-Lid For Sealing Sample Reservoirs of micro-Extraction Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed µLid system is in effect an attempt to miniaturize an extraction system to a chip-cup system with integrated heaters capable of extremely hot...

  7. Study on Extracting Rare Earth from Sulfate System by Long-Chain Fatty Acid

    Institute of Scientific and Technical Information of China (English)

    Xu Yanhui; Zhao Zengqi; Liu Quansheng

    2004-01-01

    The extraction of rare earths by a long-chain fatty acid in kerosene from a sulphate system is described. The experimental results demonstrated an optimum ratio of kerosene:fatty acid:isooctanol of 55:30:15 (V/V). The extraction reaction mechanism was studied by the saturation capacity method and the slope method, and it is shown that the extraction follows a cation-exchange reaction mechanism. The extraction sequence of the rare earths in this system was determined; no tetrad effect was observed, and the position of yttrium is between lanthanum and cerium.

  8. Classification of Textures Using Filter Based Local Feature Extraction

    Directory of Open Access Journals (Sweden)

    Bocekci Veysel Gokhan

    2016-01-01

    Full Text Available In this work, local features are used in the feature extraction process for texture images. The local binary pattern feature extraction method for textures is introduced. Filtering is also used during feature extraction to obtain discriminative features. To show the effectiveness of the algorithm, three different types of noise are added to both training and test images before extraction. A Wiener filter and a median filter are used to remove the noise from the images. We evaluate the performance of the method with a Naïve Bayesian classifier and conduct a comparative analysis on a benchmark dataset with different filters and sizes. Our experiments demonstrate that the feature extraction process combined with filtering gives promising results on noisy images.
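
    The pipeline in the abstract (denoising filter, local binary pattern features, Naïve Bayesian classifier) can be sketched as follows; the synthetic textures, noise level and LBP parameters are assumptions rather than the benchmark setup.

```python
# Minimal sketch of the described pipeline: denoise, LBP histogram,
# Naive Bayes (toy data and parameter choices are assumptions).
import numpy as np
from scipy.ndimage import median_filter
from skimage.feature import local_binary_pattern
from sklearn.naive_bayes import GaussianNB

def lbp_histogram(image, p=8, r=1):
    lbp = local_binary_pattern(image, P=p, R=r, method="uniform")
    hist, _ = np.histogram(lbp, bins=p + 2, range=(0, p + 2), density=True)
    return hist

rng = np.random.default_rng(1)
textures, labels = [], []
for label in (0, 1):
    for _ in range(10):
        img = rng.random((64, 64)) + label * 0.5       # toy "textures"
        img += rng.normal(scale=0.1, size=img.shape)   # additive noise
        img = median_filter(img, size=3)               # noise removal
        textures.append(lbp_histogram(img))
        labels.append(label)

clf = GaussianNB().fit(textures, labels)
print("training accuracy:", clf.score(textures, labels))
```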

  9. Simulation and optimization of continuous extractive fermentation with recycle system

    Science.gov (United States)

    Widjaja, Tri; Altway, Ali; Rofiqah, Umi; Airlangga, Bramantyo

    2017-05-01

    Extractive fermentation is a continuous fermentation method which is believed to be able to replace the conventional (batch) fermentation method, as it simplifies the recovery system and ethanol refining. Continuous fermentation increases productivity, although the unconverted sugar remains at a high concentration. To make the process more efficient, a recycle stream was used: increasing the recycle flow enhances the probability that sugar is re-fermented, but it also carries ethanol back into the fermentation column, and the accumulated ethanol inhibits the growth of the microorganisms. This research aims to find the optimum solvent-to-broth ratio (S:B) and recycle-to-fresh-feed ratio (R:F) that give the best yield and productivity. The study employed optimization by the Hooke–Jeeves method using Matlab 7.8 software. The results indicated that the optimum condition occurred at S:B = 2.615 and R:F = 1.495 with a yield of 50.2439%.
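
    The Hooke–Jeeves pattern search used for the optimization can be sketched as below. The objective is a toy surrogate centred on the reported optimum (S:B = 2.615, R:F = 1.495) purely to show the search mechanics; it is not the extractive-fermentation model from the study.

```python
# Minimal Hooke-Jeeves pattern search sketch; the objective function is
# a placeholder for the extractive-fermentation model, not the real one.
import numpy as np

def objective(x):
    sb, rf = x                              # solvent:broth, recycle:feed
    return -((sb - 2.615) ** 2 + (rf - 1.495) ** 2)   # toy surrogate

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-4):
    def explore(base_point, s):
        pt = base_point.copy()
        for i in range(len(pt)):            # exploratory moves per axis
            for delta in (s, -s):
                trial = pt.copy()
                trial[i] += delta
                if f(trial) > f(pt):
                    pt = trial
                    break
        return pt

    x = np.asarray(x0, float)
    while step > tol:
        x_new = explore(x, step)
        if f(x_new) > f(x):
            # pattern move: accelerate along the successful direction
            x_pattern = explore(x_new + (x_new - x), step)
            x = x_pattern if f(x_pattern) > f(x_new) else x_new
        else:
            step *= shrink                   # no improvement: shrink step
    return x

best = hooke_jeeves(objective, x0=[1.0, 1.0])
print("optimum S:B = %.3f, R:F = %.3f" % tuple(best))
```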

  10. Live Forensics – Extracting Credentials on Windows and Linux Systems

    Directory of Open Access Journals (Sweden)

    Liviu Itoafa

    2012-09-01

    Full Text Available ’Post-mortem’ analysis of a system can be greatly simplified if the correct information is gathered in the live analysis stage. In this paper I describe Windows’ data protection APIs available to developers, some simplified versions of the API (LSA Secrets, Protected Storage), the different methods used by applications to store their passwords safely, and comparisons between them. As an example, I have built tools to dump passwords saved by browsers (Chrome, IE, Firefox) and an extractor of the login password (if available) from the registry. The basic concepts of how passwords may be stored apply to the majority of applications that run on Windows and store passwords (protected or not), and understanding this also makes possible the recovery of other credentials (messaging software, mail clients ...). On the Linux side, I analyze a general method of storing passwords – keyrings – and the methods adopted by the Chrome browser, and have built extraction command-line tools for both of them, in the form of a Python script and a C++ application.

  12. Binary solvent extraction system and extraction time effects on phenolic antioxidants from kenaf seeds (Hibiscus cannabinus L.) extracted by a pulsed ultrasonic-assisted extraction.

    Science.gov (United States)

    Wong, Yu Hua; Lau, Hwee Wen; Tan, Chin Ping; Long, Kamariah; Nyam, Kar Lin

    2014-01-01

    The aim of this study was to determine the best parameter for extracting phenolic-enriched kenaf (Hibiscus cannabinus L.) seeds by a pulsed ultrasonic-assisted extraction. The antioxidant activities of ultrasonic-assisted kenaf seed extracts (KSE) were determined by a 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging capacity assay, 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid) (ABTS) radical scavenging assay, β-carotene bleaching inhibition assay, and ferric reducing antioxidant power (FRAP) assay. Total phenolic content (TPC) and total flavonoid content (TFC) evaluations were carried out to determine the phenolic and flavonoid contents in KSE. The KSE from the best extraction parameter was then subjected to high performance liquid chromatography (HPLC) to quantify the phenolic compounds. The optimised extraction condition employed 80% ethanol for 15 min, with the highest values determined for the DPPH, ABTS, and FRAP assay. KSE contained mainly tannic acid (2302.20 mg/100 g extract) and sinapic acid (1198.22 mg/100 g extract), which can be used as alternative antioxidants in the food industry.

  13. Binary Solvent Extraction System and Extraction Time Effects on Phenolic Antioxidants from Kenaf Seeds (Hibiscus cannabinus L. Extracted by a Pulsed Ultrasonic-Assisted Extraction

    Directory of Open Access Journals (Sweden)

    Yu Hua Wong

    2014-01-01

    Full Text Available The aim of this study was to determine the best parameter for extracting phenolic-enriched kenaf (Hibiscus cannabinus L.) seeds by a pulsed ultrasonic-assisted extraction. The antioxidant activities of ultrasonic-assisted kenaf seed extracts (KSE) were determined by a 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging capacity assay, 2,2′-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid) (ABTS) radical scavenging assay, β-carotene bleaching inhibition assay, and ferric reducing antioxidant power (FRAP) assay. Total phenolic content (TPC) and total flavonoid content (TFC) evaluations were carried out to determine the phenolic and flavonoid contents in KSE. The KSE from the best extraction parameter was then subjected to high performance liquid chromatography (HPLC) to quantify the phenolic compounds. The optimised extraction condition employed 80% ethanol for 15 min, with the highest values determined for the DPPH, ABTS, and FRAP assay. KSE contained mainly tannic acid (2302.20 mg/100 g extract) and sinapic acid (1198.22 mg/100 g extract), which can be used as alternative antioxidants in the food industry.

  14. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  15. Morphological operation based dense houses extraction from DSM

    Science.gov (United States)

    Li, Y.; Zhu, L.; Tachibana, K.; Shimamura, H.

    2014-08-01

    This paper presents a method for reshaping the DSM and extracting markers and masks of dense houses based on mathematical morphology (MM). In high-density housing areas, houses in a digital surface model (DSM) are almost joined together, and most segmentation methods cannot completely separate them. We propose to first label the markers of the buildings and then segment them into masks by watershed. To avoid detecting more than one marker for a house, or no marker at all because of a higher neighbour, the DSM is morphologically reshaped. This is carried out by an MM operation using a disk-shaped structuring element (SE) of a size similar to the houses, so the sizes of the houses need to be estimated before reshaping. A granulometry generated by opening-by-reconstruction of the nDSM is proposed to detect the scales of the off-terrain objects; it is a histogram of the global volume of the top hats of the convex objects over continuous scales. An obvious step change in this profile means that many objects of a similar size occur at that scale. In the reshaping procedure, the slices of each object are derived by morphological filtering at the detected continuous scales and stacked to reconstruct a dome. The markers are then detected on the basis of the domes.
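
    A compressed illustration of the scale detection and reshaping idea is sketched below with scikit-image: a granulometry over disk openings picks a dominant object scale, opening-by-reconstruction at that scale forms the domes, and a marker-based watershed splits the joined houses. The synthetic nDSM, thresholds and scale range are assumptions, not the paper's data.

```python
# Minimal sketch (assumed parameters, synthetic nDSM): granulometry to
# find a dominant object scale, reshape by opening-by-reconstruction,
# then marker-based watershed to split joined houses.
import numpy as np
from scipy import ndimage as ndi
from skimage.morphology import disk, opening, reconstruction
from skimage.segmentation import watershed

rng = np.random.default_rng(2)
ndsm = np.zeros((120, 120))
ndsm[20:50, 20:80] = 8.0          # a block of joined "houses"
ndsm[60:100, 30:60] = 10.0
ndsm += rng.normal(scale=0.2, size=ndsm.shape)

# Granulometry: total volume removed by openings of increasing radius.
radii = range(2, 20, 2)
volumes = [float((ndsm - opening(ndsm, disk(r))).sum()) for r in radii]
steps = np.diff(volumes)
scale = list(radii)[int(np.argmax(steps)) + 1]   # largest step change

# Reshape: opening by reconstruction at the detected scale forms "domes".
seed = opening(ndsm, disk(scale))
domes = reconstruction(seed, ndsm, method="dilation")

# Markers from the domes, watershed on the inverted height to get masks.
markers, _ = ndi.label(domes > 0.5 * domes.max())
masks = watershed(-ndsm, markers, mask=ndsm > 1.0)
print("markers found:", markers.max(), "mask labels:", masks.max())
```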

  16. PDGI-BASED REGULAR SWEPT SURFACE EXTRACTION FROM POINT CLOUD

    Institute of Scientific and Technical Information of China (English)

    LI Jiangxiong; KE Yinglin; LI An; ZHU Weidong

    2006-01-01

    A principal direction Gaussian image (PDGI)-based algorithm is proposed to extract the regular swept surface from point cloud. Firstly, the PDGI of the regular swept surface is constructed from point cloud, then the bounding box of the Gaussian sphere is uniformly partitioned into a number of small cubes (3D grids) and the PDGI points on the Gaussian sphere are associated with the corresponding 3D grids. Secondly, cluster analysis technique is used to sort out a group of 3D grids containing more PDGI points among the 3D grids. By the connected-region growing algorithm, the congregation point or the great circle is detected from the 3D grids. Thus the translational direction is determined by the congregation point and the direction of the rotational axis is determined by the great circle. In addition, the positional point of the rotational axis is obtained by the intersection of all the projected normal lines of the rotational surface on the plane being perpendicular to the estimated direction of the rotational axis. Finally, a pattern search method is applied to optimize the translational direction and the rotational axis. Some experiments are used to illustrate the feasibility of the above algorithm.

  17. Apriori and N-gram Based Chinese Text Feature Extraction Method

    Institute of Scientific and Technical Information of China (English)

    王晔; 黄上腾

    2004-01-01

    Feature extraction, which means extracting representative words from a text, is an important issue in the text mining field. This paper presents a new Apriori and N-gram based Chinese text feature extraction method and analyzes its correctness and performance. Our method solves the problem that existing extraction methods cannot find frequent words of arbitrary length in Chinese texts. The experimental results show that this method is feasible.
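
    The Apriori-style pruning of character n-grams can be sketched as follows: an n-gram is only counted when both of its (n-1)-gram parts are already frequent. The toy text and the minimum-support threshold are assumptions for illustration only.

```python
# Minimal sketch of Apriori-pruned n-gram extraction: an n-gram is only
# counted if both of its (n-1)-gram parts were frequent. Toy text and the
# minimum-support threshold are assumptions.
from collections import Counter

def frequent_ngrams(chars, max_n=4, min_support=2):
    frequent = {1: Counter(chars)}
    frequent[1] = Counter({g: c for g, c in frequent[1].items()
                           if c >= min_support})
    for n in range(2, max_n + 1):
        counts = Counter()
        for i in range(len(chars) - n + 1):
            gram = "".join(chars[i:i + n])
            left, right = gram[:-1], gram[1:]
            # Apriori pruning: both (n-1)-gram subparts must be frequent.
            if left in frequent[n - 1] and right in frequent[n - 1]:
                counts[gram] += 1
        frequent[n] = Counter({g: c for g, c in counts.items()
                               if c >= min_support})
        if not frequent[n]:
            break
    return frequent

text = "数据挖掘数据挖掘文本挖掘"          # toy Chinese text
result = frequent_ngrams(list(text))
for n, grams in result.items():
    print(n, dict(grams))
```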

  18. A robust infrared dim target detection method based on template filtering and saliency extraction

    Science.gov (United States)

    Wang, Wenguang; Li, Chenming; Shi, Jianing

    2015-11-01

    Dim target detection in infrared image with complex background and low signal-clutter ratio (SCR) is a significant and difficult task in the infrared target tracking system. A robust infrared dim target detection method based on template filtering and saliency extraction is proposed in this paper. The weighted gray map is obtained from the infrared image to highlight the target which is brighter than its neighbors and has weak correlation with its background. The target saliency map is then calculated by phase spectrum of Fourier Transform, so that the dim target detection could be converted to salient region extraction. The potential targets are finally extracted by combining the two maps. Moreover, position discrimination between targets in the two maps is used to exclude the false alarms and extract the targets. Experimental results on measured images indicate that our method is feasible, adaptable and robust in different backgrounds. The ROC (Receiver Operating Characteristic) curves obtained from the simulated images demonstrate the proposed method outperforms some existing typical methods in both detection rate and false alarm rate, for target detection with low SCR.
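
    The saliency-extraction step, computed from the phase spectrum of the Fourier transform, can be sketched as below; the synthetic frame, smoothing width and detection threshold are assumptions rather than the evaluated configuration.

```python
# Minimal sketch of the saliency-extraction step via the phase spectrum
# of the Fourier transform (PFT); the synthetic frame, smoothing sigma
# and threshold are assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def phase_spectrum_saliency(image, sigma=3.0):
    f = np.fft.fft2(image)
    phase_only = np.exp(1j * np.angle(f))        # keep phase, drop magnitude
    saliency = np.abs(np.fft.ifft2(phase_only)) ** 2
    return gaussian_filter(saliency, sigma)

rng = np.random.default_rng(3)
frame = rng.normal(loc=100.0, scale=5.0, size=(128, 128))   # clutter
frame[60:63, 80:83] += 40.0                                  # dim target

sal = phase_spectrum_saliency(frame)
mask = sal > sal.mean() + 4.0 * sal.std()        # assumed threshold
ys, xs = np.nonzero(mask)
print("candidate target pixels:", list(zip(ys, xs))[:5])
```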

  19. Automatic extraction of semantic relations between medical entities: a rule based approach.

    Science.gov (United States)

    Ben Abacha, Asma; Zweigenbaum, Pierre

    2011-10-06

    Information extraction is a complex task which is necessary to develop high-precision information retrieval tools. In this paper, we present the platform MeTAE (Medical Texts Annotation and Exploration). MeTAE allows (i) extracting and annotating medical entities and relationships from medical texts and (ii) exploring semantically the produced RDF annotations. Our annotation approach relies on linguistic patterns and domain knowledge and consists of two steps: (i) recognition of medical entities and (ii) identification of the correct semantic relation between each pair of entities. The first step is achieved by an enhanced use of MetaMap which improves the precision obtained by MetaMap by 19.59% in our evaluation. The second step relies on linguistic patterns which are built semi-automatically from a corpus selected according to semantic criteria. We evaluate our system's ability to identify medical entities of 16 types. We also evaluate the extraction of treatment relations between a treatment (e.g. medication) and a problem (e.g. disease): we obtain 75.72% precision and 60.46% recall. According to our experiments, using an external sentence segmenter and noun phrase chunker may improve the precision of MetaMap-based medical entity recognition. Our pattern-based relation extraction method obtains good precision and recall with respect to related work. A more precise comparison with related approaches remains difficult, however, given the differences in corpora and in the exact nature of the extracted relations. The selection of MEDLINE articles through queries related to known drug-disease pairs enabled us to obtain a more focused corpus of relevant examples of treatment relations than a more general MEDLINE query.

  20. Optimization of the beam extraction systems for the Linac4 H{sup −} ion source

    Energy Technology Data Exchange (ETDEWEB)

    Fink, D. A.; Lettry, J.; Scrivens, R.; Steyaert, D. [CERN, 1211 Geneva 23 (Switzerland); Midttun, Ø. [University of Oslo, P.O. Box 1048, 0316 Oslo (Norway); CERN, 1211 Geneva 23 (Switzerland); Valerio-Lizarraga, C. A. [Departamento de Investigación en Fisica, Universidad de Sonora, Hermosillo (Mexico); CERN, 1211 Geneva 23 (Switzerland)

    2015-04-08

    The development of the Linac 4 and its integration into CERN’s acceleration complex is part of the foreseen luminosity upgrade of the Large Hadron Collider (LHC). The goal is to inject a 160 MeV H{sup −} beam into the CERN PS Booster (PSB) in order to increase the beam brightness by a factor of 2 compared to the 50 MeV proton linac, Linac 2, that is currently in operation. The requirements for the ion source are a 45 keV H{sup −} beam of 80 mA intensity, 2 Hz repetition rate and 0.5 ms pulse length within a normalized rms-emittance of 0.25 mm· mrad. The previously installed beam extraction system has been designed for an H{sup −} ion beam intensity of 20 mA produced by an RF-volume source with an electron to H{sup −} ratio of up to 50. For the required intensity upgrades of the Linac4 ion source, a new beam extraction system is being produced and tested; it is optimized for a cesiated surface RF-source with a nominal beam current of 40 mA and an electron to H{sup −} ratio of 4. The simulations, based on the IBSIMU code, are presented. At the Brookhaven National Laboratory (BNL), a peak beam current of more than 100 mA was demonstrated with a magnetron H{sup −} source at an energy of 35 keV and a repetition rate of 2 Hz. A new extraction system is required to operate at an energy of 45 keV; simulation of a two stage extraction system dedicated to the magnetron is presented.

  1. Optimization of the beam extraction systems for the Linac4 H- ion source

    Science.gov (United States)

    Fink, D. A.; Lettry, J.; Midttun, Ø.; Scrivens, R.; Steyaert, D.; Valerio-Lizarraga, C. A.

    2015-04-01

    The development of the Linac 4 and its integration into CERN's acceleration complex is part of the foreseen luminosity upgrade of the Large Hadron Collider (LHC). The goal is to inject a 160 MeV H- beam into the CERN PS Booster (PSB) in order to increase the beam brightness by a factor of 2 compared to the 50 MeV proton linac, Linac 2, that is currently in operation. The requirements for the ion source are a 45 keV H- beam of 80 mA intensity, 2 Hz repetition rate and 0.5 ms pulse length within a normalized rms-emittance of 0.25 mm. mrad. The previously installed beam extraction system has been designed for an H- ion beam intensity of 20 mA produced by an RF-volume source with an electron to H- ratio of up to 50. For the required intensity upgrades of the Linac4 ion source, a new beam extraction system is being produced and tested; it is optimized for a cesiated surface RF-source with a nominal beam current of 40 mA and an electron to H- ratio of 4. The simulations, based on the IBSIMU code, are presented. At the Brookhaven National Laboratory (BNL), a peak beam current of more than 100 mA was demonstrated with a magnetron H- source at an energy of 35 keV and a repetition rate of 2 Hz. A new extraction system is required to operate at an energy of 45 keV; simulation of a two stage extraction system dedicated to the magnetron is presented.

  2. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    Science.gov (United States)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.

  3. Electromagnetic Coupling Between High Intensity LHC Beams and the Synchrotron Radiation Monitor Light Extraction System

    CERN Document Server

    Andreazza, W; Bravin, E; Caspers, F; Garlaschè, M; Gras, J; Goldblatt, A; Lefevre, T; Jones, R; Metral, E; Nosych, A; Roncarolo, F; Salvant, B; Trad, G; Veness, R; Vollinger, C; Wendt, M

    2013-01-01

    The CERN LHC is equipped with two Synchrotron Radiation Monitor (BSRT) systems used to characterise transverse and longitudinal beam distributions. Since the end of the 2011 LHC run the light extraction system, based on a retractable mirror, has suffered deformation and mechanical failure that is correlated to the increase in beam intensity. Temperature probes have associated these observations to a strong heating of the mirror support with a dependence on the longitudinal bunch length and shape, indicating the origin as electromagnetic coupling between the beam and the structure. This paper combines all this information with the aim of characterising and improving the system in view of its upgrade during the current LHC shutdown. Beam-based observations are presented along with electromagnetic and thermomechanical simulations and complemented by laboratory measurements, including the study of the RF properties of different mirror bulk and coating materials.

  4. An improved approach for flow-based cloud point extraction.

    Science.gov (United States)

    Frizzarin, Rejane M; Rocha, Fábio R P

    2014-04-11

    Novel strategies are proposed to circumvent the main drawbacks of flow-based cloud point extraction (CPE). The surfactant-rich phase (SRP) was directly retained in the optical path of the spectrophotometric cell, thus avoiding its dilution prior to the measurement and yielding higher sensitivity. Solenoid micro-pumps were exploited to improve mixing by the pulsed flow and also to modulate the flow-rate for retention and removal of the SRP, thus avoiding the elution step, often carried out with organic solvents. The heat released and the increase of the salt concentration provided by an on-line neutralization reaction were exploited to induce the cloud point without an external heating device. These innovations were demonstrated by the spectrophotometric determination of iron, yielding a linear response from 10 to 200 μg L(-1) with a coefficient of variation of 2.3% (n=7). Detection limit and sampling rate were estimated at 5 μg L(-1) (95% confidence level) and 26 samples per hour, respectively. The enrichment factor was 8.9 and the procedure consumed only 6 μg of TAN and 390 μg of Triton X-114 per determination. At the 95% confidence level, the results obtained for freshwater samples agreed with the reference procedure and those obtained for digests of bovine muscle, rice flour, brown bread and tort lobster agreed with the certified reference values. The proposed procedure thus shows advantages in relation to previously proposed approaches for flow-based CPE, being a fast and environmentally friendly alternative for on-line separation and pre-concentration.

  5. Intelligence-based systems engineering

    CERN Document Server

    Tolk, Andreas

    2011-01-01

    The International Council on Systems Engineering (INCOSE) defines Systems Engineering as an interdisciplinary approach and means to enable the realization of successful systems. Researchers are using intelligence-based techniques to support the practices of systems engineering in an innovative way. This research volume includes a selection of contributions by subject experts to design better systems.

  6. Identification of threats using linguistics-based knowledge extraction.

    Energy Technology Data Exchange (ETDEWEB)

    Chew, Peter A.

    2008-09-01

    One of the challenges increasingly facing intelligence analysts, along with professionals in many other fields, is the vast amount of data which needs to be reviewed and converted into meaningful information, and ultimately into rational, wise decisions by policy makers. The advent of the world wide web (WWW) has magnified this challenge. A key hypothesis which has guided us is that threats come from ideas (or ideology), and ideas are almost always put into writing before the threats materialize. While in the past the 'writing' might have taken the form of pamphlets or books, today's medium of choice is the WWW, precisely because it is a decentralized, flexible, and low-cost method of reaching a wide audience. However, a factor which complicates matters for the analyst is that material published on the WWW may be in any of a large number of languages. In 'Identification of Threats Using Linguistics-Based Knowledge Extraction', we have sought to use Latent Semantic Analysis (LSA) and other similar text analysis techniques to map documents from the WWW, in whatever language they were originally written, to a common language-independent vector-based representation. This then opens up a number of possibilities. First, similar documents can be found across language boundaries. Secondly, a set of documents in multiple languages can be visualized in a graphical representation. These alone offer potentially useful tools and capabilities to the intelligence analyst whose knowledge of foreign languages may be limited. Finally, we can test the over-arching hypothesis--that ideology, and more specifically ideology which represents a threat, can be detected solely from the words which express the ideology--by using the vector-based representation of documents to predict additional features (such as the ideology) within a framework based on supervised learning. In this report, we present the results of a three-year project of the same name. We believe
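
    The vector-based representation at the core of this approach can be illustrated with a small LSA sketch (TF-IDF followed by truncated SVD); the toy English-only corpus stands in for the multilingual web documents, and the component count is an arbitrary assumption.

```python
# Minimal sketch of mapping documents into a shared latent space with
# LSA (TF-IDF + truncated SVD); the toy corpus stands in for multilingual
# web documents and is not from the report.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "the movement publishes pamphlets describing its ideology",
    "pamphlets and websites spread the ideology to a wide audience",
    "the weather today is sunny with a light breeze",
]

tfidf = TfidfVectorizer().fit_transform(docs)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Documents with related content end up close in the latent space.
print(cosine_similarity(lsa[:1], lsa[1:]))
```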

  7. Novel Doppler Frequency Extraction Method Based on Time-Frequency Analysis and Morphological Operation

    Institute of Scientific and Technical Information of China (English)

    HOU Shu-juan; WU Si-liang

    2006-01-01

    A novel method of Doppler frequency extraction is proposed for Doppler radar scoring systems. The idea is that the time-frequency map shows how the Doppler frequency varies along the time-line, so Doppler frequency extraction becomes curve detection in the image view. A set of morphological operations is used to implement the curve detection, and a map fusion scheme is presented to eliminate the influence of the strong direct current (DC) component of the echo signal during curve detection. Real-life radar data are used to illustrate the performance of the new approach. Experimental results show that the proposed method can overcome the shortcomings of the piecewise-processing-based FFT method and can improve the measuring precision of miss distance.
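
    A simplified rendering of the idea (time-frequency map, DC suppression, morphological cleaning, ridge picking) is sketched below; the synthetic echo, thresholds and structuring element are assumptions, not the radar data or the paper's map-fusion scheme.

```python
# Minimal sketch: time-frequency map, DC suppression, binarisation and
# morphological cleaning, then ridge picking as the Doppler curve.
# The synthetic echo and thresholds are assumptions, not radar data.
import numpy as np
from scipy.signal import spectrogram
from scipy.ndimage import binary_closing

fs = 8000.0
t = np.arange(0, 1.0, 1.0 / fs)
doppler = 400.0 + 300.0 * t                       # drifting Doppler tone
echo = np.sin(2 * np.pi * np.cumsum(doppler) / fs)
echo += 2.0 + 0.3 * np.random.default_rng(4).standard_normal(t.size)  # DC + noise

f, tt, sxx = spectrogram(echo, fs=fs, nperseg=256, noverlap=192)
sxx[f < 50.0, :] = 0.0                            # suppress the strong DC band

binary = sxx > 0.2 * sxx.max()                    # assumed threshold
binary = binary_closing(binary, structure=np.ones((3, 3)))

ridge = f[np.argmax(np.where(binary, sxx, 0.0), axis=0)]
print("estimated Doppler at start/end: %.0f Hz, %.0f Hz" % (ridge[0], ridge[-1]))
```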

  8. Recent Application of Solid Phase Based Techniques for Extraction and Preconcentration of Cyanotoxins in Environmental Matrices.

    Science.gov (United States)

    Mashile, Geaneth Pertunia; Nomngongo, Philiswa N

    2017-03-04

    Cyanotoxins are toxic and are found in eutrophic, municipal, and residential water supplies. For this reason, their occurrence in drinking water systems has become a global concern. Therefore, monitoring, control, risk assessment, and prevention of these contaminants in the environmental bodies are important subjects associated with public health. Thus, rapid, sensitive, selective, simple, and accurate analytical methods for the identification and determination of cyanotoxins are required. In this paper, the sampling methodologies and applications of solid phase-based sample preparation methods for the determination of cyanotoxins in environmental matrices are reviewed. The sample preparation techniques mainly include solid phase micro-extraction (SPME), solid phase extraction (SPE), and solid phase adsorption toxin tracking technology (SPATT). In addition, advantages and disadvantages and future prospects of these methods have been discussed.

  9. When cholesterol is not cholesterol: a note on the enzymatic determination of its concentration in model systems containing vegetable extracts

    Directory of Open Access Journals (Sweden)

    Pamplona Reinald

    2010-06-01

    Full Text Available Background: Experimental evidence demonstrates that vegetable-derived extracts inhibit cholesterol absorption in the gastrointestinal tract. To further explore the mechanisms behind this, we modeled duodenal contents with several vegetable extracts. Results: Employing a widely used cholesterol quantification method based on a cholesterol oxidase-peroxidase coupled reaction, we analyzed the effects on cholesterol partition. The evidenced interferences were analyzed by studying specific and unspecific inhibitors of the cholesterol oxidase-peroxidase coupled reaction. Cholesterol was also quantified by LC/MS. We found a significant interference of diverse (cocoa- and tea-derived) extracts with this method. The interference was strongly dependent on the model matrix: in phosphate-buffered saline, the development of unspecific fluorescence was inhibitable by catalase (but not by heat denaturation), suggesting vegetable-extract-derived H2O2 production, whereas in bile-containing model systems this interference also comprised cholesterol-oxidase inhibition. Several strategies, such as cholesterol standard addition and the use of suitable blanks containing vegetable extracts, were tested. When those failed, the use of a mass-spectrometry-based chromatographic assay allowed quantification of cholesterol in models of duodenal contents in the presence of vegetable extracts. Conclusions: We propose that the use of cholesterol-oxidase- and/or peroxidase-based systems for cholesterol analyses in foodstuffs should be accurately monitored, as important interferences in all the components of the enzymatic chain were evident.

  10. Extraction of MHD Signal Based on Wavelet Transform

    Institute of Scientific and Technical Information of China (English)

    赵晴初; 赵彤; 李旻; 黄胜华; 徐佩霞

    2002-01-01

    Mirnov signals mixed with interference are a kind of non-stationary signal, and satisfactory results cannot be obtained by extracting MHD signals from Mirnov signals with the Fourier transform alone. This paper suggests that the wavelet transform can be used to process Mirnov signals. Theoretical analysis and experimental results indicate that using the time-frequency analysis characteristics of the wavelet transform to filter Mirnov signals can effectively remove interference and extract the useful MHD signals.
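
    A minimal sketch of wavelet-based filtering of a noisy signal is given below: decompose, soft-threshold the detail coefficients, reconstruct. The synthetic "Mirnov-like" burst, the db4 wavelet and the universal threshold are assumptions, not the experimental setup.

```python
# Minimal sketch of wavelet-based filtering: decompose, threshold the
# detail coefficients, reconstruct. The synthetic signal, wavelet
# and threshold rule are assumptions.
import numpy as np
import pywt

rng = np.random.default_rng(5)
t = np.linspace(0, 0.1, 4096)
mhd = np.sin(2 * np.pi * 5e3 * t) * np.exp(-((t - 0.05) / 0.02) ** 2)
signal = mhd + 0.5 * rng.standard_normal(t.size)      # MHD burst + interference

coeffs = pywt.wavedec(signal, "db4", level=6)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
thr = sigma * np.sqrt(2.0 * np.log(signal.size))      # universal threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                 for c in coeffs[1:]]
extracted = pywt.waverec(denoised_coeffs, "db4")[:signal.size]

print("residual RMS vs. clean burst:",
      float(np.sqrt(np.mean((extracted - mhd) ** 2))))
```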

  11. The determination of organochlorine pesticides based on dynamic microwave-assisted extraction coupled with on-line solid-phase extraction of high-performance liquid chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Chen Ligang [College of Chemistry, Jilin University, 2699 Qianjin Street, Changchun 130012 (China); Ding Lan [College of Chemistry, Jilin University, 2699 Qianjin Street, Changchun 130012 (China)]. E-mail: analchem@jlu.edu.cn; Jin Haiyan [College of Chemistry, Jilin University, 2699 Qianjin Street, Changchun 130012 (China); Song Daqian [College of Chemistry, Jilin University, 2699 Qianjin Street, Changchun 130012 (China); Zhang Huarong [College of Chemistry, Jilin University, 2699 Qianjin Street, Changchun 130012 (China); Li Jiantao [College of Chemistry, Jilin University, 2699 Qianjin Street, Changchun 130012 (China); Zhang Kun [College of Chemistry, Jilin University, 2699 Qianjin Street, Changchun 130012 (China); Wang Yutang [College of Chemistry, Jilin University, 2699 Qianjin Street, Changchun 130012 (China); Zhang Hanqi [College of Chemistry, Jilin University, 2699 Qianjin Street, Changchun 130012 (China)

    2007-04-25

    A rapid technique based on dynamic microwave-assisted extraction coupled with on-line solid-phase extraction of high-performance liquid chromatography (DMAE-SPE-HPLC) has been developed. A TM{sub 010} microwave resonance cavity built in the laboratory was applied to concentrate the microwave energy. The sample placed in the zone of microwave irradiation was extracted with 95% acetonitrile (ACN) aqueous solution, which was driven by a peristaltic pump at a flow rate of 1.0 mL min{sup -1}. The extraction can be completed in a recirculating system in 10 min. When a number of extraction cycles were completed, the extract (1 mL) was diluted on-line with water. The extract was then loaded onto an SPE column, where the analytes were retained while the unretained matrix components were washed away. Subsequently, the analytes were automatically transferred from the SPE column to the analytical column and determined by a UV detector at 238 nm. The technique was used for the determination of organochlorine pesticides (OCPs) in grains, including wheat, rice, corn and bean. The limits of detection of the OCPs are in the range of 19-37 ng g{sup -1}. The recoveries obtained by analyzing the four spiked grain samples are in the range of 86-105%, and the relative standard deviation (R.S.D.) values range from 1.2 to 8.7%. Our method was demonstrated to be fast, accurate, and precise. In addition, only small quantities of solvent and sample were required.

  12. Extraction and separation of tungsten (VI) from aqueous media with Triton X-100-ammonium sulfate-water aqueous two-phase system without any extractant.

    Science.gov (United States)

    Yongqiang Zhang; Tichang Sun; Tieqiang Lu; Chunhuan Yan

    2016-11-25

    An aqueous two-phase system composed of Triton X-100-(NH4)2SO4-H2O was proposed for the extraction and separation of tungsten(VI) from aqueous solution without using any extractant. The effects of aqueous pH, ammonium sulfate, Triton X-100 and tungsten concentrations, and extraction temperature on the extraction of tungsten were investigated. The extraction of tungsten depends strongly on aqueous pH: it is above 90% at pH 1.0-3.0 within the studied pH range (1.0-7.0), increases gradually with increasing Triton X-100 concentration, and decreases slightly with increasing ammonium sulfate concentration. The extraction percentage of tungsten is hardly affected by temperature, but its distribution coefficient increases linearly with increasing temperature within 303.15-343.15 K. The distribution coefficient of tungsten also increases with the initial tungsten concentration (0.1-3%) and temperature (303.15-333.15 K). The solubilization capacity of tungsten in the Triton X-100 micellar phase is independent of temperature. FT-IR analysis reveals no evident interaction between polytungstate anions and the ether oxygen units in Triton X-100, and DLS analysis indicates that the zeta potential of the Triton X-100 micellar phase changes slightly from positive to negative after extracting tungsten. Based on these results, it can be deduced that polytungstate anions are solubilized in the hydrophilic outer shell of Triton X-100 micelles by electrostatic attraction, depending on their relatively high hydrophobic nature. The stripping of tungsten is mainly influenced by temperature and can easily reach 95% in a single stripping stage. Tungsten(VI) was separated from a solution containing Fe(III), Co(II), Ni(II), Cu(II), Zn(II), Al(III), Cr(III) and Mn(II) under suitable conditions.

  13. AutoMate Express™ forensic DNA extraction system for the extraction of genomic DNA from biological samples.

    Science.gov (United States)

    Liu, Jason Y; Zhong, Chang; Holt, Allison; Lagace, Robert; Harrold, Michael; Dixon, Alan B; Brevnov, Maxim G; Shewale, Jaiprakash G; Hennessy, Lori K

    2012-07-01

    The AutoMate Express™ Forensic DNA Extraction System was developed for automatic isolation of DNA from a variety of forensic biological samples. The performance of the system was investigated using a wide range of biological samples. Depending on the sample type, either PrepFiler™ lysis buffer or PrepFiler BTA™ lysis buffer was used to lyse the samples. After lysis and removal of the substrate using the LySep™ column, the lysates in the sample tubes were loaded onto the AutoMate Express™ instrument and DNA was extracted using one of the two instrument extraction protocols. Our study showed that DNA was recovered from as little as 0.025 μL of blood. DNA extracted from casework-type samples was free of detectable PCR inhibitors, and the short tandem repeat profiles were complete, conclusive, and devoid of any PCR artifacts. The system also showed consistent performance in day-to-day operation. 2012 American Academy of Forensic Sciences. Published 2012. This article is a U.S. Government work and is in the public domain in the U.S.A.

  14. Modeling and tissue parameter extraction challenges for free space broadband fNIR brain imaging systems

    Science.gov (United States)

    Sultan, E.; Manseta, K.; Khwaja, A.; Najafizadeh, L.; Gandjbakhche, A.; Pourrezaei, K.; Daryoush, A. S.

    2011-02-01

    Fiber-based functional near-infrared (fNIR) spectroscopy has been considered a cost-effective imaging modality. To achieve better spatial resolution and greater accuracy in the extraction of the optical parameters (i.e., μa and μ's), broadband frequency-modulated systems covering multi-octave frequencies of 10-1000 MHz are considered. A helmet-mounted broadband free-space fNIR system is considered a significant improvement over bulky commercial fiber fNIR realizations, which are inherently uncomfortable and dispersive for broadband operation. Accurate measurements of amplitude and phase of the frequency-modulated NIR signals (670 nm, 795 nm, and 850 nm) are reported here using free-space optical transmitters and receivers realized as small-size, low-cost modules. The tri-wavelength optical transmitter is based on vertical cavity surface-emitting lasers (VCSEL), whereas the sensitive optical receiver is based on either PIN or APD photodiodes combined with transimpedance amplifiers. This paper also considers brain phantoms for optical parameter extraction experiments using broadband modulated light for separations of up to 5 cm. Forward (transmittance) and backward (reflectance) scattering of modulated photons in diffuse media has been modeled analytically using the diffusion equation (DE). The robustness of the DE modeling and parameter extraction algorithm was studied by experimental verification on multi-layer diffuse media phantoms. In particular, a comparison between analytical and experimental models for narrowband and broadband operation has been performed to analyze the advantages of our broadband fNIR system.

  15. Antimicrobial thin films based on ayurvedic plants extracts embedded in a bioactive glass matrix

    Science.gov (United States)

    Floroian, L.; Ristoscu, C.; Candiani, G.; Pastori, N.; Moscatelli, M.; Mihailescu, N.; Negut, I.; Badea, M.; Gilca, M.; Chiesa, R.; Mihailescu, I. N.

    2017-09-01

    Ayurvedic medicine is one of the oldest medical systems. It is an example of a coherent traditional system with a time-tested and precise algorithm for medicinal plant selection, based on several ethnopharmacophore descriptors, knowledge of which enables the user to choose the optimal plant for the treatment of a certain pathology. This work aims to link traditional knowledge with biomedical science by using traditional ayurvedic plant extracts with an antimicrobial effect in the form of thin films for implant protection. We report on the transfer of novel composites of bioactive glass mixed with antimicrobial plant extracts and polymer by matrix-assisted pulsed laser evaporation into uniform thin layers on stainless steel implant-like surfaces. The comprehensive characterization of the deposited films was performed by complementary analyses: Fourier transform infrared spectroscopy, glow discharge optical emission spectroscopy, scanning electron microscopy, atomic force microscopy, electrochemical impedance spectroscopy, UV-VIS absorption spectroscopy and antimicrobial tests. The results emphasize the multifunctionality of these coatings, which halt the leakage of metal and metal oxides into the biological fluids and eventually to inner organs (by the use of the polymer), speed up osseointegration (due to the bioactive glass), exert antimicrobial effects (by the ayurvedic plant extracts) and decrease the implant price (by the use of cheaper stainless steel).

  16. A New Augmentation Based Algorithm for Extracting Maximal Chordal Subgraphs.

    Science.gov (United States)

    Bhowmick, Sanjukta; Chen, Tzu-Yi; Halappanavar, Mahantesh

    2015-02-01

    A graph is chordal if every cycle of length greater than three contains an edge between non-adjacent vertices. Chordal graphs are of interest both theoretically, since they admit polynomial time solutions to a range of NP-hard graph problems, and practically, since they arise in many applications including sparse linear algebra, computer vision, and computational biology. A maximal chordal subgraph is a chordal subgraph that is not a proper subgraph of any other chordal subgraph. Existing algorithms for computing maximal chordal subgraphs depend on dynamically ordering the vertices, which is an inherently sequential process and therefore limits the algorithms' parallelizability. In this paper we explore techniques to develop a scalable parallel algorithm for extracting a maximal chordal subgraph. We demonstrate that an earlier attempt at developing a parallel algorithm may induce a non-optimal vertex ordering and is therefore not guaranteed to terminate with a maximal chordal subgraph. We then give a new algorithm that first computes and then repeatedly augments a spanning chordal subgraph. After proving that the algorithm terminates with a maximal chordal subgraph, we then demonstrate that this algorithm is more amenable to parallelization and that the parallel version also terminates with a maximal chordal subgraph. That said, the complexity of the new algorithm is higher than that of the previous parallel algorithm, although the earlier algorithm computes a chordal subgraph which is not guaranteed to be maximal. We experimented with our augmentation-based algorithm on both synthetic and real-world graphs. We provide scalability results and also explore the effect of different choices for the initial spanning chordal subgraph on both the running time and on the number of edges in the maximal chordal subgraph.
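
    The augmentation idea can be illustrated with a short sequential sketch: start from a spanning tree (which is chordal) and repeatedly add edges that keep the graph chordal until none fits. This is only the sequential skeleton, not the paper's parallel algorithm, and the example graph is arbitrary.

```python
# Minimal sketch of the augmentation idea: start from a spanning chordal
# subgraph (a spanning tree/forest is chordal) and repeatedly add edges
# that keep the graph chordal until no edge can be added.
import networkx as nx

def maximal_chordal_subgraph(g):
    sub = nx.Graph()
    sub.add_nodes_from(g.nodes)
    sub.add_edges_from(nx.minimum_spanning_edges(g, data=False))
    remaining = set(g.edges()) - set(sub.edges())
    changed = True
    while changed:                       # augment until no edge fits
        changed = False
        for u, v in list(remaining):
            sub.add_edge(u, v)
            if nx.is_chordal(sub):
                remaining.discard((u, v))
                changed = True
            else:
                sub.remove_edge(u, v)
    return sub

g = nx.cycle_graph(6)                    # C6: a chordless 6-cycle
g.add_edges_from([(0, 3), (1, 4)])
chordal = maximal_chordal_subgraph(g)
print(sorted(chordal.edges()), nx.is_chordal(chordal))
```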

  17. Comparison of solvent/derivatization agent systems for determination of extractable toluene diisocyanate from flexible polyurethane foam.

    Science.gov (United States)

    Vangronsveld, Erik; Berckmans, Steven; Spence, Mark

    2013-06-01

    Flexible polyurethane foam (FPF) is produced from the reaction of toluene diisocyanate (TDI) and polyols. Limited and conflicting results exist in the literature concerning the presence of unreacted TDI remaining in FPF as determined by various solvent extraction and analysis techniques. This study reports investigations into the effect of several solvent/derivatization agent combinations on extractable TDI results and suggests a preferred method. The suggested preferred method employs a syringe-based multiple extraction of foam samples with a toluene solution of 1-(2-methoxyphenyl)-piperazine. Extracts are analyzed by liquid chromatography using an ion trap mass spectrometry detection technique. Detection limits of the method are ~10ng TDI g(-1) foam (10 ppb, w/w) for each TDI isomer (i.e. 2,4-TDI and 2,6-TDI). The method was evaluated by a three-laboratory interlaboratory comparison using two representative foam samples. The total extractable TDI results found by the three labs for the two foams were in good agreement (relative standard deviation of the mean of 30-40%). The method has utility as a basis for comparing FPFs, but the interpretation of extractable TDI results using any solvent as the true value for 'free' or 'unreacted' TDI in the foam is problematic, as demonstrated by the difference in the extracted TDI results from the different extraction systems studied. Further, a consideration of polyurethane foam chemistry raises the possibility that extractable TDI may result from decomposition of parts of the foam structure (e.g. dimers, biurets, and allophanates) by the extraction system.

  18. Image and Video based double watermark extraction spread spectrum watermarking in low variance region

    Directory of Open Access Journals (Sweden)

    Mriganka Gogoi

    2013-07-01

    Full Text Available Digital watermarking plays a very important role in copyright protection. It is one of the techniques used to safeguard the origin of images, audio and video by protecting them against piracy. This paper proposes a low-variance-based spread spectrum watermarking for image and video in which the watermark is obtained twice in the receiver. The watermark to be added is a binary image of comparatively smaller size than the cover image. The cover image is divided into a number of 8x8 blocks and transformed into the frequency domain using the Discrete Cosine Transform, and a Gold sequence is both added and subtracted in each block for each watermark bit. In most cases, researchers have used algorithms for extracting a single watermark, and finding the location of a watermark bit distorted by attacks is one of the most challenging tasks. In this paper, however, the same watermark is embedded as well as extracted twice with the Gold code without much distortion of the image, and comparing these two watermarks helps in finding the distorted bit. Another feature is that, because this algorithm embeds the watermark in low-variance regions, proper extraction of the watermark is obtained at a smaller modulating factor. The proposed algorithm is useful in applications like real-time broadcasting, image and video authentication and secure camera systems. The experimental results show that the watermarking technique is robust against various attacks.
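
    A reduced sketch of block-DCT spread-spectrum embedding with a pseudo-noise pattern is shown below. For brevity the detection here is informed (it subtracts the original image), whereas the paper describes blind double extraction in low-variance regions; the PN pattern and embedding strength are assumptions.

```python
# Minimal sketch of block-DCT spread-spectrum embedding with a PN
# pattern; extraction here is informed (uses the original image) to keep
# the demo short, unlike the blind double-extraction scheme in the paper.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(6)
cover = rng.integers(0, 256, size=(64, 64)).astype(float)
bits = rng.integers(0, 2, size=(64 // 8) * (64 // 8))
pn = rng.choice([-1.0, 1.0], size=(8, 8))         # PN (Gold-like) pattern
pn[0, 0] = 0.0                                    # leave the DC term alone
alpha = 5.0                                       # assumed embedding strength

marked = cover.copy()
for k, (i, j) in enumerate((i, j) for i in range(0, 64, 8)
                                   for j in range(0, 64, 8)):
    block = dctn(cover[i:i + 8, j:j + 8], norm="ortho")
    block += alpha * pn if bits[k] else -alpha * pn
    marked[i:i + 8, j:j + 8] = idctn(block, norm="ortho")

recovered = []
for i in range(0, 64, 8):
    for j in range(0, 64, 8):
        diff = dctn(marked[i:i + 8, j:j + 8] - cover[i:i + 8, j:j + 8],
                    norm="ortho")
        recovered.append(1 if (diff * pn).sum() > 0 else 0)

print("bit errors:", int(np.sum(np.array(recovered) != bits)))
```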

  19. SANB-SEB Clustering: A Hybrid Ontology Based Image and Webpage Retrieval for Knowledge Extraction

    Directory of Open Access Journals (Sweden)

    Anna Saro Vijendran

    2014-12-01

    Full Text Available The major goal of data mining is to extract information from a dataset and convert it into a readable format. Web mining is one of the applications of data mining which helps to extract web pages. In existing systems, personalized images were retrieved using tag-annotation-demand ranking for image retrieval (TAD), in which image uploading, query searching, and page refreshing steps take place. In the proposed work, both images and web pages are retrieved by several techniques. Two major steps are followed in this work. The primary step is the server database upload, in which the databases for both images and content are stored using block acquiring page segmentation (BAPS). The subsequent step is to extract the image and content from the respective server database; this database is further processed by semantic annotation-based clustering (SANB) for images and semantic-based clustering (SEB) for content. The experimental results show that the proposed approach accurately retrieves both the images and the relevant pages.

  20. THE YARLUNG ZANGBO RIVER EXTRACTION AND CHANGE DETECTION BASED ON LANDSAT SERIES

    Directory of Open Access Journals (Sweden)

    Y. J. Cao

    2017-09-01

    Full Text Available The Yarlung Zangbo River is one of the most important rivers in the southwest river source area. It is the longest plateau river in China, the cradle of the birth and development of Tibetan civilization, and also an international water system, so the study of its water resources utilization and water environment protection is of great significance. Based on Landsat images and combined with remote sensing and GIS technologies, this paper compares different river extraction methods, including the normalized difference water index, the multi-band spectral correlation threshold method, the maximum likelihood classification method and the object-oriented classification method. River changes have also been analysed for the upstream and midstream of the Yarlung Zangbo River from the extraction results of four images from different periods. The results show that the object-oriented classification method has the advantage of removing mountain shadow and achieves the highest river extraction accuracy, that the Yarlung Zangbo River area shows a decreasing trend from 2000 to 2016, and that there are some changes of the watercourse in the midstream as well.
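
    The NDWI-based extraction and two-date change detection can be sketched in a few lines; the tiny arrays stand in for Landsat green and near-infrared bands, and the zero threshold is an assumption.

```python
# Minimal sketch of NDWI-based water extraction and two-date change
# detection; the tiny arrays stand in for Landsat green/NIR bands.
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    ndwi = (green - nir) / (green + nir + 1e-6)   # normalized difference water index
    return ndwi > threshold

green_2000 = np.array([[0.30, 0.10], [0.28, 0.09]])
nir_2000   = np.array([[0.10, 0.40], [0.09, 0.42]])
green_2016 = np.array([[0.30, 0.11], [0.12, 0.10]])
nir_2016   = np.array([[0.11, 0.41], [0.40, 0.43]])

water_2000 = ndwi_water_mask(green_2000, nir_2000)
water_2016 = ndwi_water_mask(green_2016, nir_2016)

lost   = water_2000 & ~water_2016      # river pixels that disappeared
gained = ~water_2000 & water_2016
print("water area 2000:", water_2000.sum(), "2016:", water_2016.sum())
print("lost:", lost.sum(), "gained:", gained.sum())
```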

  1. Three-Dimensional Precession Feature Extraction of Ballistic Targets Based on Narrowband Radar Network

    Directory of Open Access Journals (Sweden)

    Zhao Shuang

    2017-02-01

    Full Text Available Micro-motion is a crucial feature used in ballistic target recognition. To address the problem that single-view observations cannot extract true micro-motion parameters, we propose a novel algorithm based on the narrowband radar network to extract three-dimensional precession features. First, we construct a precession model of the cone-shaped target, and as a precondition, we consider the invisible problem of scattering centers. We then analyze in detail the micro-Doppler modulation trait caused by the precession. Then, we match each scattering center in different perspectives based on the ratio of the top scattering center’s micro-Doppler frequency modulation coefficient and extract the 3D coning vector of the target by establishing associated multi-aspect equation systems. In addition, we estimate feature parameters by utilizing the correlation of the micro-Doppler frequency modulation coefficient of the three scattering centers combined with the frequency compensation method. We then calculate the coordinates of the conical point in each moment and reconstruct the 3D spatial portion. Finally, we provide simulation results to validate the proposed algorithm.

  2. Software Design of the Work-Study-Based Chunk Extraction System PhrasExt

    Institute of Scientific and Technical Information of China (English)

    熊秋平; 管新潮

    2011-01-01

    This paper applies the "work study" method of classical industrial engineering to the development of chunk extraction software for a bilingual parallel corpus. The application of the two basic work-study tools, 5W1H and ECRS, in developing the chunk extraction software is described. Experimental results show that the extraction performance of PhrasExt is clearly better than that of traditional chunk extraction software.

  3. An Investigation into Error Source Identification of Machine Tools Based on Time-Frequency Feature Extraction

    Directory of Open Access Journals (Sweden)

    Dongju Chen

    2016-01-01

    Full Text Available This paper presents a new method to identify the main errors of a machine tool in the time-frequency domain. The low- and high-frequency signals of the workpiece surface are separated using the Daubechies wavelet transform. With power spectral density analysis, the main features of the high-frequency signal, corresponding to the imbalance of the spindle system, are extracted from the surface topography of the workpiece in the frequency domain. With the cross-correlation analysis method, the relationship between the guideway error of the machine tool and the low-frequency signal of the surface topography is calculated in the time domain.
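
    The two analyses described above can be sketched on a synthetic profile: a wavelet split into low and high bands, a power spectral density of the high band (spindle imbalance), and a cross-correlation of the low band with an assumed guideway error signal. The signal frequencies and decomposition level are assumptions.

```python
# Minimal sketch of the two analyses on a synthetic surface profile:
# wavelet split, PSD of the high band, cross-correlation of the low band
# with an assumed guideway error signal.
import numpy as np
import pywt
from scipy.signal import welch, correlate

fs = 1000.0
x = np.arange(4096) / fs
guideway_error = 0.5 * np.sin(2 * np.pi * 0.8 * x)             # low frequency
imbalance = 0.1 * np.sin(2 * np.pi * 120.0 * x)                 # high frequency
profile = (guideway_error + imbalance
           + 0.02 * np.random.default_rng(7).standard_normal(x.size))

# Daubechies decomposition: approximation ~ low band, details ~ high band.
coeffs = pywt.wavedec(profile, "db4", level=5)
low = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]],
                   "db4")[:x.size]
high = profile - low

f, pxx = welch(high, fs=fs, nperseg=1024)
print("dominant high-band frequency: %.1f Hz" % f[np.argmax(pxx)])

xcorr = correlate(low - low.mean(),
                  guideway_error - guideway_error.mean(), mode="full")
lag = np.argmax(xcorr) - (x.size - 1)
print("best-matching lag between low band and guideway error:", lag)
```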

  4. Key Frame Extraction for Text Based Video Retrieval Using Maximally Stable Extremal Regions

    Directory of Open Access Journals (Sweden)

    Werachard Wattanarachothai

    2015-04-01

    Full Text Available This paper presents a new approach for a text-based video content retrieval system. The proposed scheme consists of three main processes: key frame extraction, text localization and keyword matching. For the key-frame extraction, we propose a Maximally Stable Extremal Region (MSER) based feature which is oriented to segmenting shots of the video with different text contents. In the text localization process, in order to form text lines, the MSERs in each key frame are clustered based on their similarity in position, size, color, and stroke width. Then, the Tesseract OCR engine is used for recognizing the text regions. In this work, to improve the recognition results, we input four images obtained from different pre-processing methods to the Tesseract engine. Finally, the target keyword for querying is matched against the OCR results based on an approximate string search scheme. The experiments show that, by using the MSER feature, the videos can be segmented with an efficient number of shots and provide better precision and recall in comparison with a sum-of-absolute-differences and edge-based method.
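
    Two of the steps above, MSER-based key-frame selection and approximate keyword matching, are sketched below; the OCR call itself (e.g. pytesseract.image_to_string) is only indicated in a comment, and the synthetic frames and signature threshold are assumptions.

```python
# Minimal sketch of MSER-based key-frame selection and approximate keyword
# matching; the OCR step is only indicated by a comment, and the frames
# and threshold are placeholders.
import difflib
import cv2
import numpy as np

def mser_signature(gray):
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(gray)
    return len(regions)

frames = [np.full((120, 320), 255, np.uint8) for _ in range(4)]
cv2.putText(frames[0], "INTRO", (10, 60), cv2.FONT_HERSHEY_SIMPLEX, 1.5, 0, 3)
cv2.putText(frames[1], "INTRO", (10, 60), cv2.FONT_HERSHEY_SIMPLEX, 1.5, 0, 3)
cv2.putText(frames[2], "RESULTS", (10, 60), cv2.FONT_HERSHEY_SIMPLEX, 1.5, 0, 3)
cv2.putText(frames[3], "RESULTS", (10, 60), cv2.FONT_HERSHEY_SIMPLEX, 1.5, 0, 3)

# A new key frame is declared when the MSER signature changes noticeably.
signatures = [mser_signature(f) for f in frames]
key_frames = [0] + [i for i in range(1, len(frames))
                    if abs(signatures[i] - signatures[i - 1]) > 2]
print("key frames:", key_frames)

# ocr_text = pytesseract.image_to_string(frames[key_frames[-1]])  # OCR step
ocr_text = "RESULTS"                                              # stand-in
query = "result"
score = difflib.SequenceMatcher(None, query.lower(), ocr_text.lower()).ratio()
print("approximate match score:", round(score, 2))
```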

  5. Carrier system for a plant extract or bioactive compound from a plant

    DEFF Research Database (Denmark)

    2016-01-01

    This invention relates to a carrier system for use in producing a beverage with a metered amount of plant extract or bioactive compound.

  6. Effects of petroleum ether extract of Amorphophallus paeoniifolius tuber on central nervous system in mice

    Directory of Open Access Journals (Sweden)

    Das S

    2009-01-01

    Full Text Available The central nervous system activity of the petroleum ether extract of Amorphophallus paeoniifolius tuber was examined in normally fed, healthy mice. The petroleum ether extract of Amorphophallus paeoniifolius tuber at doses of 100, 300 and 1000 mg/kg showed significant central nervous system activity in mice.

  7. Robust Speech Recognition Method Based on Discriminative Environment Feature Extraction

    Institute of Scientific and Technical Information of China (English)

    HAN Jiqing; GAO Wen

    2001-01-01

    It is an effective approach to learn the influence of environmental parameters, such as additive noise and channel distortions, from training data for robust speech recognition. Most of the previous methods are based on the maximum likelihood estimation criterion. However, these methods do not lead to a minimum error rate result. In this paper, a novel discriminative learning method for environmental parameters, based on the Minimum Classification Error (MCE) criterion, is proposed. In the method, a simple classifier and the Generalized Probabilistic Descent (GPD) algorithm are adopted to iteratively learn the environmental parameters. Consequently, the clean speech features are estimated from the noisy speech features with the estimated environmental parameters, and then the estimates of the clean speech features are utilized in the back-end HMM classifier. Experiments show that a best error rate reduction of 32.1% is obtained, tested on a task of 18 isolated confusable Korean words, relative to a conventional HMM system.

  8. Infrared moving small target detection based on saliency extraction and image sparse representation

    Science.gov (United States)

    Zhang, Xiaomin; Ren, Kan; Gao, Jin; Li, Chaowei; Gu, Guohua; Wan, Minjie

    2016-10-01

    Moving small target detection in infrared images is a crucial technique of infrared search and track systems. This paper presents a novel small target detection technique based on frequency-domain saliency extraction and image sparse representation. First, we exploit the features of the Fourier spectrum image and the magnitude spectrum of the Fourier transform to make a rough extraction of saliency regions, and use a threshold segmentation step to separate the regions which look salient from the background, which gives us a binary image as a result. Second, a new patch-image model and an over-complete dictionary are introduced into the detection system; the infrared small target detection is then converted into an optimization problem of patch-image information reconstruction based on sparse representation. More specifically, the test image and the binary image can be decomposed into image patches following certain rules. We select the potential target area according to the binary patch-image, which contains the salient region information, then exploit the over-complete infrared small target dictionary to reconstruct the test image blocks which may contain targets. The coefficients of a target image patch are sparse. Finally, for an image sequence, the Euclidean distance is used to reduce the false alarm ratio and increase the detection accuracy of moving small targets in infrared images, exploiting the target position correlation between frames.
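
    A small sketch of the first stage only (a frequency-domain saliency map followed by thresholding), written as a spectral-residual-style computation; this is a common stand-in that may differ from the authors' exact magnitude-spectrum scheme, and the function names are hypothetical.

        import cv2
        import numpy as np

        def frequency_saliency(gray):
            """Spectral-residual style saliency: flatten the log-magnitude spectrum, keep the phase."""
            img = np.float32(gray) / 255.0
            f = np.fft.fft2(img)
            log_mag = np.log1p(np.abs(f)).astype(np.float32)
            phase = np.angle(f)
            residual = log_mag - cv2.blur(log_mag, (3, 3))
            saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
            saliency = cv2.GaussianBlur(saliency.astype(np.float32), (9, 9), 2.5)
            rng = saliency.max() - saliency.min()
            return (saliency - saliency.min()) / (rng + 1e-12)

        def salient_mask(gray, k=3.0):
            """Binary map of candidate small-target regions obtained by thresholding the saliency map."""
            s = frequency_saliency(gray)
            return (s > s.mean() + k * s.std()).astype(np.uint8)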

  9. A silica gel based method for extracting insect surface hydrocarbons.

    Science.gov (United States)

    Choe, Dong-Hwan; Ramírez, Santiago R; Tsutsui, Neil D

    2012-02-01

    Here, we describe a novel method for the extraction of insect cuticular hydrocarbons using silica gel, herein referred to as "silica-rubbing". This method permits the selective sampling of external hydrocarbons from insect cuticle surfaces for subsequent analysis using gas chromatography-mass spectrometry (GC-MS). The cuticular hydrocarbons are first adsorbed to silica gel particles by rubbing the cuticle of insect specimens with the materials, and then are subsequently eluted using organic solvents. We compared the cuticular hydrocarbon profiles that resulted from extractions using silica-rubbing and solvent-soaking methods in four ant and one bee species: Linepithema humile, Azteca instabilis, Camponotus floridanus, Pogonomyrmex barbatus (Hymenoptera: Formicidae), and Euglossa dilemma (Hymenoptera: Apidae). We also compared the hydrocarbon profiles of Euglossa dilemma obtained via silica-rubbing and solid phase microextraction (SPME). Comparison of hydrocarbon profiles obtained by different extraction methods indicates that silica rubbing selectively extracts the hydrocarbons that are present on the surface of the cuticular wax layer, without extracting hydrocarbons from internal glands and tissues. Due to its surface specificity, efficiency, and low cost, this new method may be useful for studying the biology of insect cuticular hydrocarbons.

  10. Extraction of traditional COP-based features from COM sway in postural stability evaluation.

    Science.gov (United States)

    Romano, F; Colagiorgio, P; Buizza, A; Sardi, F; Ramat, S

    2015-08-01

    Postural control during quiet standing is evaluated by analyzing CoP sway, easily measured using a force platform. However, the recent proliferation of motion tracking systems has made an estimate of the CoM location easily available. Traditional CoP-based measures presented in the literature provide information about age-related changes in postural stability and fall risk. We investigated, in an age-matched group of subjects, the relationship between classical CoP-based measures computed on the sway path, together with statistical mechanics parameters computed on the diffusion plot, and those extracted from CoM time series. Our purpose is to understand which of these parameters, computed on CoM sway, can discriminate postural abnormalities, in order to use a video tracking system to evaluate balance in addition to motor capabilities.
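
    As a point of reference for the traditional measures mentioned above, here is a minimal sketch of two classical posturographic quantities (sway path length and RMS distance) computed from a 2-D CoP or CoM time series; the formulas are the standard textbook ones, not necessarily those used in the paper, and the function name is hypothetical.

        import numpy as np

        def sway_measures(xy, fs):
            """Classical sway measures from an (N, 2) CoP or CoM time series in metres."""
            xy = np.asarray(xy, dtype=float)
            centred = xy - xy.mean(axis=0)
            steps = np.diff(xy, axis=0)
            path_length = np.hypot(steps[:, 0], steps[:, 1]).sum()      # total sway path
            mean_velocity = path_length * fs / (len(xy) - 1)            # average sway velocity
            rms_distance = np.sqrt((centred ** 2).sum(axis=1).mean())   # RMS distance from the mean position
            return {"path_length": path_length,
                    "mean_velocity": mean_velocity,
                    "rms_distance": rms_distance}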

  11. Characteristic of synergistic extraction of oxalic acid with system from rare earth metallurgical wastewater

    Institute of Scientific and Technical Information of China (English)

    QIU

    2010-01-01

    A large amount of highly concentrated acidic wastewater is produced in the process of converting rare earth chlorides into rare earth oxides. It is a mixed solution of oxalic acid and hydrochloric acid, so its recycling is very difficult. A liquid-liquid extraction method is proposed in this paper to achieve wastewater treatment and reclamation. The mechanism of extraction of oxalic acid from the wastewater with the system of 50% TOB + 45% kerosene + 5% 2-ethylhexanol was investigated. The composition and structure of the extracted species were determined, and a mathematical model of the oxalic acid extraction was established, by use of the saturation method and the equimolar series method. The results showed that the extraction of oxalic acid by TOB is a neutral association extraction, that oxalic acid exists mainly in molecular form in the organic phase, and that the extraction combination ratio is 2:1. The binary extraction system composed of the extractants TOB and TOC had a synergistic extraction effect on oxalic acid and hydrochloric acid, and the extraction distribution ratio was greatly improved. The optimum volume fraction of TOB was 0.6-0.8.

  12. Social network extraction and analysis based on multimodal dyadic interaction.

    Science.gov (United States)

    Escalera, Sergio; Baró, Xavier; Vitrià, Jordi; Radeva, Petia; Raducanu, Bogdan

    2012-01-01

    Social interactions are a very important component in people's lives. Social network analysis has become a common technique used to model and quantify the properties of social interactions. In this paper, we propose an integrated framework to explore the characteristics of a social network extracted from multimodal dyadic interactions. For our study, we used a set of videos belonging to New York Times' Blogging Heads opinion blog. The Social Network is represented as an oriented graph, whose directed links are determined by the Influence Model. The links' weights are a measure of the "influence" a person has over the other. The states of the Influence Model encode automatically extracted audio/visual features from our videos using state-of-the-art algorithms. Our results are reported in terms of accuracy of audio/visual data fusion for speaker segmentation and centrality measures used to characterize the extracted social network.

  13. Social Network Extraction and Analysis Based on Multimodal Dyadic Interaction

    Directory of Open Access Journals (Sweden)

    Bogdan Raducanu

    2012-02-01

    Full Text Available Social interactions are a very important component in people’s lives. Social network analysis has become a common technique used to model and quantify the properties of social interactions. In this paper, we propose an integrated framework to explore the characteristics of a social network extracted from multimodal dyadic interactions. For our study, we used a set of videos belonging to New York Times’ Blogging Heads opinion blog. The Social Network is represented as an oriented graph, whose directed links are determined by the Influence Model. The links’ weights are a measure of the “influence” a person has over the other. The states of the Influence Model encode automatically extracted audio/visual features from our videos using state-of-the-art algorithms. Our results are reported in terms of accuracy of audio/visual data fusion for speaker segmentation and centrality measures used to characterize the extracted social network.

  14. A Karnaugh-Map based fingerprint minutiae extraction method

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Full Text Available Fingerprint is one of the most promising methods among all the biometric techniques and has been used for personal authentication for a long time because of its wide acceptance and reliability. Features (minutiae) are extracted from the fingerprint in question and are compared with the features already stored in the database for authentication. Crossing number (CN) is the most commonly used minutiae extraction method for fingerprints. In this paper, a new Karnaugh-Map based fingerprint minutiae extraction method is proposed and discussed. In the proposed algorithm, the 8 neighbors of a pixel in a 3×3 window are arranged as 8 bits of a byte and the corresponding hexadecimal (hex) value is calculated. These hex values are simplified using the standard Karnaugh-Map (K-map) technique to obtain the minimized logical expression. Experiments conducted on the FVC2002/Db1_a database reveal that the developed method is better than the crossing number (CN) method.
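
    To make the pixel-level step concrete, the sketch below packs the 8 neighbours of a skeleton pixel into one byte (the hex value mentioned in the abstract) and, for comparison, computes the classical crossing number; the K-map minimization itself is not reproduced here, and the helper names are hypothetical.

        import numpy as np

        # Clockwise order of the 8 neighbours around a pixel, starting at the top-left.
        NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

        def neighbour_byte(skeleton, r, c):
            """Pack the 8 neighbours of pixel (r, c) of a binary skeleton into one byte."""
            value = 0
            for bit, (dr, dc) in enumerate(NEIGHBOURS):
                if skeleton[r + dr, c + dc]:
                    value |= 1 << bit
            return value  # format(value, "02X") gives the hex code used in the K-map step

        def crossing_number(skeleton, r, c):
            """Classical crossing number: half the number of 0/1 transitions around the pixel."""
            ring = [int(skeleton[r + dr, c + dc]) for dr, dc in NEIGHBOURS]
            return sum(abs(ring[i] - ring[(i + 1) % 8]) for i in range(8)) // 2

        patch = np.array([[0, 1, 0],
                          [0, 1, 0],
                          [0, 1, 0]], dtype=np.uint8)
        print(hex(neighbour_byte(patch, 1, 1)), crossing_number(patch, 1, 1))  # ridge pixel: CN == 2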

  15. Arc detector system for extraction switches in LHC CERN

    CERN Document Server

    Dahlerup-Petersen, K; Kuper, E; Ovchar, V; Zverev, S

    2006-01-01

    The opening switches, which will be used in case of quenches or other failures in CERN's future LHC collider to extract the large amounts of energy stored in the magnetic field of the superconducting chains of main dipoles (8 chains with 1350 MJ each) and main quadrupoles (16 chains with about 24 MJ each), consist of an array of series/parallel connected, electro-mechanical D.C. breakers, specifically designed for this particular application. During the opening process the magnet excitation current is transferred from the cluster of breakers to extraction resistors for rapid de-excitation of the magnet chain. An arc detector has been developed in order to facilitate the determination of the need for maintenance interventions on the switches. The paper describes the arc detector and highlights results from operation of the detector with a LHC pilot extraction...

  16. SRBIR: Semantic Region Based Image Retrieval by Extracting the Dominant Region and Semantic Learning

    Directory of Open Access Journals (Sweden)

    I. F. Rajam

    2011-01-01

    Full Text Available Problem statement: The Semantic Region Based Image Retrieval (SRBIR) system, which automatically segments the dominant foreground region containing the semantic concept of the image (such as elephants or roses) and performs semantic learning, is proposed. Approach: The system segments an image into different regions and finds the dominant foreground region in it, which carries the semantic concept of that image. Then it extracts the low-level features of that dominant foreground region. The Support Vector Machine-Binary Decision Tree (SVM-BDT) is used for semantic learning and it finds the semantic category of an image. The low-level features of the dominant region of each category image are used to find the semantic template of that category. The SVM-BDT is constructed with the help of these semantic templates. The high-level concept of the query image is obtained using this SVM-BDT. Similarity matching is done between the query image and the set of images belonging to the semantic category of the query image, and the top images with the smallest distances are retrieved. Results: Experiments were conducted using the COREL dataset consisting of 10,000 images and its subset with 1,000 images of 10 different semantic categories. The obtained results demonstrate the effectiveness of the proposed framework compared to those of commonly used region-based image retrieval approaches. Conclusion: Efficient image searching, browsing and retrieval are required by users from various domains, such as medicine, fashion, architecture, training and teaching. The proposed SRBIR system aims at retrieving images based on their semantic content by extracting the dominant foreground region in the image and learning its semantic concept with the help of the SVM-BDT. The proposed SRBIR system provides an efficient image search based on semantics, with high accuracy and low access time.

  17. Application of ionic liquids based enzyme-assisted extraction of chlorogenic acid from Eucommia ulmoides leaves

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Tingting; Sui, Xiaoyu, E-mail: suixiaoyu@outlook.com; Li, Li; Zhang, Jie; Liang, Xin; Li, Wenjing; Zhang, Honglian; Fu, Shuang

    2016-01-15

    A new approach for ionic liquid based enzyme-assisted extraction (ILEAE) of chlorogenic acid (CGA) from Eucommia ulmoides is presented, in which enzyme pretreatment is used in ionic liquid aqueous media to enhance extraction yield. For this purpose, the solubility of CGA and the activity of cellulase were investigated in eight 1-alkyl-3-methylimidazolium ionic liquids. Cellulase in 0.5 M [C6mim]Br aqueous solution was found to provide better extraction performance. The factors of the ILEAE procedure, including extraction time, extraction phase pH, extraction temperature and enzyme concentration, were investigated. Moreover, the newly developed approach offers advantages in terms of yield and efficiency compared with other conventional extraction techniques. Scanning electron microscopy of plant samples indicated that the cellulase-treated cell wall in ionic liquid solution was easier to extract, which led to more efficient extraction by reducing the mass-transfer barrier. The proposed ILEAE method allows a continuous process for enzyme-assisted extraction in which enzyme incubation and solvent extraction proceed simultaneously. In this research, we propose a novel view of enzyme-assisted extraction of plant active components that, besides concentrating on enzyme-facilitated cell wall degradation, focuses on improving the poor permeability of ionic liquid solutions. - Highlights: • An ionic liquid based enzyme-assisted extraction method for natural products was explored. • ILEAE utilizes enzymatic treatment to improve the permeability of the ionic liquid solution. • Enzyme incubation and solvent extraction proceed simultaneously. • The ILEAE process simplifies the operating procedure and is suitable for more complete extraction.

  18. Vibration extraction based on fast NCC algorithm and high-speed camera.

    Science.gov (United States)

    Lei, Xiujun; Jin, Yi; Guo, Jie; Zhu, Chang'an

    2015-09-20

    In this study, a high-speed camera system is developed to complete vibration measurements in real time and to overcome the added mass introduced by conventional contact measurements. The proposed system consists of a notebook computer and a high-speed camera which can capture as many as 1000 frames per second. In order to process the captured images in the computer, the normalized cross-correlation (NCC) template tracking algorithm with subpixel accuracy is introduced. Additionally, a modified local search algorithm based on the NCC is proposed to reduce the computation time and to increase efficiency significantly. The modified algorithm can accomplish one displacement extraction 10 times faster than traditional template matching, without installing any target panel onto the structures. Two experiments were carried out under laboratory and outdoor conditions to validate the accuracy and efficiency of the system performance in practice. The results demonstrated the high accuracy and efficiency of the camera system in extracting vibration signals.
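
    A compact sketch of NCC template matching with a simple parabolic subpixel refinement, assuming the opencv-python package; the parabolic fit is a common substitute for the paper's modified local search and Taylor-style refinement, and the function name is hypothetical.

        import cv2

        def track_displacement(frame_gray, template_gray):
            """Locate a template in a frame with normalized cross-correlation, refined to subpixel accuracy."""
            res = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
            _, _, _, (x, y) = cv2.minMaxLoc(res)

            def parabolic(vm, v0, vp):
                # Fit a parabola through three neighbouring correlation values around the peak.
                denom = vm - 2.0 * v0 + vp
                return 0.0 if denom == 0 else 0.5 * (vm - vp) / denom

            dx = parabolic(res[y, x - 1], res[y, x], res[y, x + 1]) if 0 < x < res.shape[1] - 1 else 0.0
            dy = parabolic(res[y - 1, x], res[y, x], res[y + 1, x]) if 0 < y < res.shape[0] - 1 else 0.0
            return x + dx, y + dy  # top-left corner of the best match, in pixels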

  19. The BioExtract Server: a web-based bioinformatic workflow platform.

    Science.gov (United States)

    Lushbough, Carol M; Jennewein, Douglas M; Brendel, Volker P

    2011-07-01

    The BioExtract Server (bioextract.org) is an open, web-based system designed to aid researchers in the analysis of genomic data by providing a platform for the creation of bioinformatic workflows. Scientific workflows are created within the system by recording tasks performed by the user. These tasks may include querying multiple, distributed data sources, saving query results as searchable data extracts, and executing local and web-accessible analytic tools. The series of recorded tasks can then be saved as a reproducible, sharable workflow available for subsequent execution with the original or modified inputs and parameter settings. Integrated data resources include interfaces to the National Center for Biotechnology Information (NCBI) nucleotide and protein databases, the European Molecular Biology Laboratory (EMBL-Bank) non-redundant nucleotide database, the Universal Protein Resource (UniProt), and the UniProt Reference Clusters (UniRef) database. The system offers access to numerous preinstalled, curated analytic tools and also provides researchers with the option of selecting computational tools from a large list of web services including the European Molecular Biology Open Software Suite (EMBOSS), BioMoby, and the Kyoto Encyclopedia of Genes and Genomes (KEGG). The system further allows users to integrate local command line tools residing on their own computers through a client-side Java applet.

  20. Multiple Adaptive Neuro-Fuzzy Inference System with Automatic Features Extraction Algorithm for Cervical Cancer Recognition

    Directory of Open Access Journals (Sweden)

    Mohammad Subhi Al-batah

    2014-01-01

    Full Text Available To date, cancer of the uterine cervix is still a leading cause of cancer-related deaths in women worldwide. The current methods (i.e., Pap smear and liquid-based cytology (LBC)) used to screen for cervical cancer are time-consuming, dependent on the skill of the cytopathologist, and thus rather subjective. Therefore, this paper presents an intelligent computer vision system to assist pathologists in overcoming these problems and, consequently, produce more accurate results. The developed system consists of two stages. In the first stage, the automatic features extraction (AFE) algorithm is performed. In the second stage, a neuro-fuzzy model called the multiple adaptive neuro-fuzzy inference system (MANFIS) is proposed for the recognition process. The MANFIS contains a set of ANFIS models which are arranged in a parallel combination to produce a model with a multi-input-multi-output structure. The system is capable of classifying cervical cell images into three groups, namely, normal, low-grade squamous intraepithelial lesion (LSIL) and high-grade squamous intraepithelial lesion (HSIL). The experimental results prove the capability of the AFE algorithm to be as effective as manual extraction by human experts, while the proposed MANFIS produces a good classification performance with 94.2% accuracy.

  1. Comparison of signal processing techniques for micro-Doppler signature extraction with automotive radar systems

    Science.gov (United States)

    Rodriguez-Hervas, Berta; Maile, Michael; Flores, Benjamin C.

    2014-05-01

    In recent years, the automotive industry has experienced an evolution toward more powerful driver assistance systems that provide enhanced vehicle safety. These systems typically operate in the optical and microwave regions of the electromagnetic spectrum and have demonstrated high efficiency in collision and risk avoidance. Microwave radar systems are particularly relevant due to their operational robustness under adverse weather or illumination conditions. Our objective is to study different signal processing techniques suitable for extraction of accurate micro-Doppler signatures of slow moving objects in dense urban environments. Selection of the appropriate signal processing technique is crucial for the extraction of accurate micro-Doppler signatures that will lead to better results in a radar classifier system. For this purpose, we perform simulations of typical radar detection responses in common driving situations and conduct the analysis with several signal processing algorithms, including the short-time Fourier transform, the continuous wavelet transform, and kernel-based analysis methods. We take into account factors such as the relative movement between the host vehicle and the target, and the non-stationary nature of the target's movement. A comparison of results reveals that the short-time Fourier transform would be the best approach for detection and tracking purposes, while the continuous wavelet transform would be best suited for classification purposes.
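
    A minimal sketch of the short-time Fourier transform step using scipy; the toy return below assumes a sinusoidal micro-motion with made-up parameters purely for illustration, and the function name is hypothetical.

        import numpy as np
        from scipy.signal import stft

        def micro_doppler_spectrogram(iq, fs, nperseg=256, noverlap=192):
            """Magnitude of the STFT of a complex radar return; this map is the micro-Doppler signature."""
            f, t, z = stft(iq, fs=fs, nperseg=nperseg, noverlap=noverlap, return_onesided=False)
            return np.fft.fftshift(f), t, np.fft.fftshift(np.abs(z), axes=0)

        # Toy return whose Doppler oscillates sinusoidally, mimicking a slow periodic micro-motion.
        fs = 2000.0
        t = np.arange(0, 2, 1 / fs)
        iq = np.exp(1j * 2 * np.pi * 30 * np.sin(2 * np.pi * 1.5 * t))
        freqs, times, signature = micro_doppler_spectrogram(iq, fs)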

  2. Development of novel extractants for the recycle system of transuranium elements from nuclear fuel-3

    Energy Technology Data Exchange (ETDEWEB)

    Goto, Masahiro [Kyushu Univ., Fukuoka (Japan). Faculty of Engineering]

    1998-03-01

    Novel bi-functional extractants which have two organophosphorus moieties in the molecular structure were designed and synthesized for the recycle system of transuranium elements using liquid-liquid extraction. The separation efficiency and extraction ability of the newly synthesized extractants were investigated for rare earth metals. The new extractants have a high extractability for the rare earth metals compared with that of commercially available phosphorus extractants. The obtained results suggest that the extraction and separation abilities are highly sensitive to the molecular structure of the spacer connecting the two functional phosphorus groups. The results of a thermodynamic analysis of the extraction equilibrium indicate that the entropy effect on the extraction is one of the key factors enhancing the selectivity in the rare earth extractions. Furthermore, a computer analysis was carried out to evaluate the extraction properties for the extraction of rare earth metals by the bi-functional extractants. It is demonstrated that the new concept of connecting functional moieties with a spacer is very useful and is a promising method to develop new extractants for the treatment of nuclear fuel. We have also proposed a novel molecular imprinting technique for the treatment of waste nuclear solutions. A surface-imprinting resin was prepared by emulsion polymerization using a novel organophosphorus extractant as a host monomer for rare earth metals. The host monomer, which has an amphiphilic nature, forms a complex with a rare earth metal ion at the interface, and the complex remains as it is. After the matrix is polymerized, the coordination structure is 'imprinted' at the resin interface. The imprinted resins exhibited a high adsorption selectivity toward the target Dy ion. We believe that the novel imprinting technique will be useful for the treatment of nuclear waste water. (J.P.N.)

  3. Ionic liquids based microwave-assisted extraction of lichen compounds with quantitative spectrophotodensitometry analysis.

    Science.gov (United States)

    Bonny, Sarah; Paquin, Ludovic; Carrié, Daniel; Boustie, Joël; Tomasi, Sophie

    2011-11-30

    An ionic liquid based extraction method has been applied to the effective extraction of norstictic acid, a common depsidone isolated from Pertusaria pseudocorallina, a crustose lichen. Five 1-alkyl-3-methylimidazolium ionic liquids (ILs) differing in alkyl chain and anion composition were investigated for extraction efficiency. The extracted amount of norstictic acid was determined after recovery on HPTLC with a spectrophotodensitometer. The proposed approaches (IL-MAE and IL-heat extraction (IL-HE)) were evaluated in comparison with usual solvents such as tetrahydrofuran in heat-reflux extraction and microwave-assisted extraction (MAE). The results indicated that both the characteristics of the alkyl chain and the anion influenced the extraction of the polyphenolic compounds. The sulfate-based ILs [C(1)mim][MSO(4)] and [C(2)mim][ESO(4)] showed the best extraction efficiency for norstictic acid. The reduction of the extraction time from HE to MAE (2 h to 5 min) and a non-negligible proportion of norstictic acid in the total extract (28%) support the suitability of the proposed method. This approach was successfully applied to obtain additional compounds from other crustose lichens (Pertusaria amara and Ochrolechia parella).

  4. Design and Implementation of SMS Extraction and Analysis System

    Directory of Open Access Journals (Sweden)

    Wan Yu-Yang

    2016-01-01

    Full Text Available With more and more smartphone applications, mobile phones hold more and more text messages, and therefore more and more potentially useful information. How can useful information be dug out of SMS? This paper discusses techniques for SMS extraction and analysis. Taking bank SMS as an example, key information is extracted, formatted and stored in the app database, and then analysed, with the statistical results shown in charts. An app with these functions runs well on Android phones and has practical value. This technology helps to expand the application of SMS.

  5. Event extraction of bacteria biotopes: a knowledge-intensive NLP-based approach.

    Science.gov (United States)

    Ratkovic, Zorana; Golik, Wiktoria; Warnier, Pierre

    2012-06-26

    Bacteria biotopes cover a wide range of diverse habitats including animal and plant hosts, natural, medical and industrial environments. The high volume of publications in the microbiology domain provides a rich source of up-to-date information on bacteria biotopes. This information, as found in scientific articles, is expressed in natural language and is rarely available in a structured format, such as a database. This information is of great importance for fundamental research and microbiology applications (e.g., medicine, agronomy, food, bioenergy). The automatic extraction of this information from texts will provide a great benefit to the field. We present a new method for extracting relationships between bacteria and their locations using the Alvis framework. Recognition of bacteria and their locations was achieved using a pattern-based approach and domain lexical resources. For the detection of environment locations, we propose a new approach that combines lexical information and the syntactic-semantic analysis of corpus terms to overcome the incompleteness of lexical resources. Bacteria location relations extend over sentence borders, and we developed domain-specific rules for dealing with bacteria anaphors. We participated in the BioNLP 2011 Bacteria Biotope (BB) task with the Alvis system. Official evaluation results show that it achieves the best performance of participating systems. New developments since then have increased the F-score by 4.1 points. We have shown that the combination of semantic analysis and domain-adapted resources is both effective and efficient for event information extraction in the bacteria biotope domain. We plan to adapt the method to deal with a larger set of location types and a large-scale scientific article corpus to enable microbiologists to integrate and use the extracted knowledge in combination with experimental data.

  6. Application of ionic liquids based enzyme-assisted extraction of chlorogenic acid from Eucommia ulmoides leaves.

    Science.gov (United States)

    Liu, Tingting; Sui, Xiaoyu; Li, Li; Zhang, Jie; Liang, Xin; Li, Wenjing; Zhang, Honglian; Fu, Shuang

    2016-01-15

    A new approach for ionic liquid based enzyme-assisted extraction (ILEAE) of chlorogenic acid (CGA) from Eucommia ulmoides is presented, in which enzyme pretreatment is used in ionic liquid aqueous media to enhance extraction yield. For this purpose, the solubility of CGA and the activity of cellulase were investigated in eight 1-alkyl-3-methylimidazolium ionic liquids. Cellulase in 0.5 M [C6mim]Br aqueous solution was found to provide better extraction performance. The factors of the ILEAE procedure, including extraction time, extraction phase pH, extraction temperature and enzyme concentration, were investigated. Moreover, the newly developed approach offers advantages in terms of yield and efficiency compared with other conventional extraction techniques. Scanning electron microscopy of plant samples indicated that the cellulase-treated cell wall in ionic liquid solution was easier to extract, which led to more efficient extraction by reducing the mass-transfer barrier. The proposed ILEAE method allows a continuous process for enzyme-assisted extraction in which enzyme incubation and solvent extraction proceed simultaneously. In this research, we propose a novel view of enzyme-assisted extraction of plant active components that, besides concentrating on enzyme-facilitated cell wall degradation, focuses on improving the poor permeability of ionic liquid solutions.

  7. Liquid-Liquid Extraction in Systems Containing Butanol and Ionic Liquids – A Review

    Directory of Open Access Journals (Sweden)

    Kubiczek Artur

    2017-03-01

    Full Text Available Room-temperature ionic liquids (RTILs) are a moderately new class of liquid substances characterized by a great variety of possible anion-cation combinations, each giving different properties. For this reason, they have been termed designer solvents and, as such, they are particularly promising for liquid-liquid extraction, which has been quite intensely studied over the last decade. This paper concentrates on recent liquid-liquid extraction studies involving ionic liquids, focusing strictly on the separation of n-butanol from model aqueous solutions. Such research is undertaken mainly with the intention of facilitating biological butanol production, which is usually carried out through the ABE fermentation process. So far, various sorts of RTILs have been tested for this purpose, while mostly ternary liquid-liquid systems have been investigated. The industrial design of liquid-liquid extraction requires prior knowledge of the state of thermodynamic equilibrium and its relation to the process parameters. Such knowledge can be obtained by performing a series of extraction experiments and employing a mathematical model to approximate the equilibrium. There are at least a few models available, but this paper concentrates primarily on the NRTL equation, which has proven to be one of the most accurate tools for correlating experimental equilibrium data. Thus, all the presented studies have been selected based on the accepted modeling method. The reader is also shown how the NRTL equation can be used to model liquid-liquid systems containing more than three components, as this has been the authors' recent area of expertise.

  8. DiMeX: A Text Mining System for Mutation-Disease Association Extraction.

    Science.gov (United States)

    Mahmood, A S M Ashique; Wu, Tsung-Jung; Mazumder, Raja; Vijay-Shanker, K

    2016-01-01

    The number of published articles describing associations between mutations and diseases is increasing at a fast pace. There is a pressing need to gather such mutation-disease associations into public knowledge bases, but manual curation slows down the growth of such databases. We have addressed this problem by developing a text-mining system (DiMeX) to extract mutation to disease associations from publication abstracts. DiMeX consists of a series of natural language processing modules that preprocess input text and apply syntactic and semantic patterns to extract mutation-disease associations. DiMeX achieves high precision and recall with F-scores of 0.88, 0.91 and 0.89 when evaluated on three different datasets for mutation-disease associations. DiMeX includes a separate component that extracts mutation mentions in text and associates them with genes. This component has been also evaluated on different datasets and shown to achieve state-of-the-art performance. The results indicate that our system outperforms the existing mutation-disease association tools, addressing the low precision problems suffered by most approaches. DiMeX was applied on a large set of abstracts from Medline to extract mutation-disease associations, as well as other relevant information including patient/cohort size and population data. The results are stored in a database that can be queried and downloaded at http://biotm.cis.udel.edu/dimex/. We conclude that this high-throughput text-mining approach has the potential to significantly assist researchers and curators to enrich mutation databases.

  9. Fingerprint testing of contaminated ventilation extract filter systems at Sizewell B

    Energy Technology Data Exchange (ETDEWEB)

    Meddings, P.; Patel, S. [Nuclear Electric plc, Barnwood (United Kingdom)

    1996-06-01

    Sizewell B is Nuclear Electric's latest power station, and the Pressurised Water Reactor (PWR) design on which it is based represents a "first" for the UK. One of the integral components of the plant is the heating, ventilation and air-conditioning (HVAC) system, which performs a contamination control and gaseous waste management function for the site. During the commissioning of Sizewell B Power Station the extract systems of the HVAC plant underwent a procedure known as "fingerprinting". This entailed the characterisation of the facilities provided to test the filtration plant during its lifetime. The assessment of their adequacy was then used to identify necessary modifications and/or to propose the manner in which future in situ performance testing would be carried out. The paper outlines the basic principles and procedure that were used to "fingerprint" test systems during the commissioning of Sizewell B. A specific example is presented to demonstrate the process. (UK).

  10. Extraction of tetra-oxo anions into a hydrophobic, ionic liquid-based solvent without concomitant ion exchange.

    Energy Technology Data Exchange (ETDEWEB)

    Stepinski, D. C.; Vandegrift, G. F.; Shkrob, I. A.; Wishart, J. F.; Kerr, K.; Dietz, M. L.; Qadah, D. T. D.; Garvey, S. L.; BNL; Univ. of Wisconsin at Milwaukee

    2010-06-16

    Hydrophobic ionic liquids (IL) have the potential to simplify certain separations by serving as both an extraction solvent and an electrolyte for subsequent electrochemical reductions. While IL-based solvents are known to be efficient media for metal ion extraction, separations employing these solvents are frequently complicated by the loss of constituent IL ions to the aqueous phase, resulting in deteriorating performance. In this study, we have examined the extraction of pertechnetate and related tetra-oxo anions from aqueous solutions into IL-based solvents incorporating tetraalkylphosphonium bis[(trifluoromethyl)sulfonyl]imide and a crown ether. In contrast to various previously studied IL-based cation extraction systems, facile anion extraction without significant transfer of the IL ions to the aqueous phase has been observed. In addition, the solvents exhibit high distribution ratios (100-500 for pertechnetate), significant electrical conductivity (>100 μS/cm), and a wide (~4 V) electrochemical window. The results suggest that these solvents may provide the basis for improved approaches to the extraction and recovery of a variety of anions.

  11. Extraction of Tetra-oxo Anions into a Hydrophobic, Ionic Liquid-Based Solvent Without Concomitant Ion Exchange

    Energy Technology Data Exchange (ETDEWEB)

    Stepinski, D.C.; Wishart, J.; Vandegrift, III, G.F.; Shkrob, I.A.; Kerr, K.; Dietz, M.L.; Qadah, D.T.D.; Garvey, S.L.

    2010-06-10

    Hydrophobic ionic liquids (IL) have the potential to simplify certain separations by serving as both an extraction solvent and an electrolyte for subsequent electrochemical reductions. While IL-based solvents are known to be efficient media for metal ion extraction, separations employing these solvents are frequently complicated by the loss of constituent IL ions to the aqueous phase, resulting in deteriorating performance. In this study, we have examined the extraction of pertechnetate and related tetra-oxo anions from aqueous solutions into IL-based solvents incorporating tetraalkylphosphonium bis[(trifluoromethyl)sulfonyl]imide and a crown ether. In contrast to various previously studied IL-based cation extraction systems, facile anion extraction without significant transfer of the IL ions to the aqueous phase has been observed. In addition, the solvents exhibit high distribution ratios (100-500 for pertechnetate), significant electrical conductivity (>100 μS/cm), and a wide (~4 V) electrochemical window. The results suggest that these solvents may provide the basis for improved approaches to the extraction and recovery of a variety of anions.

  12. Determination of trace mercury in compost extract by inhibition based glucose oxidase biosensor

    Institute of Scientific and Technical Information of China (English)

    LIU Jian-xiao; XU Xiang-min; TANG Lin; ZENG Guang-ming

    2009-01-01

    A novel inhibition-based biosensor of glucose oxidase (GOx) for environmental mercury detection was developed. An electropolymerized aniline membrane was prepared on a platinum electrode containing ferrocene as an electron transfer mediator, on which GOx was cross-linked by glutaraldehyde. The response of the sensor was based on the current reduction in the electrochemical system caused by the inhibition of the GOx electrode by mercury. The detection limit of the inhibition-based sensor for mercury is 0.49 μg/L, and the linear response ranges are 0.49-783.21 μg/L and 783.21 μg/L-25.55 mg/L. The GOx membrane can be completely reactivated after inhibition, and retains 70% of its activity after more than one month. The sensor was used for mercury determination in compost extract with good results.

  13. Modeling and Extraction of Parameters Based on Physical Effects in Bipolar Transistors

    Directory of Open Access Journals (Sweden)

    Agnes Nagy

    2011-01-01

    Full Text Available The rising complexity of electronic systems, the reduction of component size, and the increase of working frequencies demand ever more accurate and stable integrated circuits, which require more precise simulation programs during the design process. PSPICE, widely used to simulate the general behavior of integrated circuits, does not consider many of the physical effects that can be found in real devices. Compact models, HICUM and MEXTRAM, have been developed over recent decades in order to eliminate this deficiency. This paper presents some of the physical aspects that have not been studied so far, such as the expression of the base-emitter voltage including the effect of the emitter emission coefficient (n), its physical explanation and simulation procedure, as well as a new extraction method for the diffusion potential VDE(T), based on the forward-biased base-emitter capacitance, showing excellent agreement between experimental and theoretical results.

  14. DEM extraction and its accuracy analysis with ground-based SAR interferometry

    Science.gov (United States)

    Dong, J.; Yue, J. P.; Li, L. H.

    2014-03-01

    Two altimetry models for extracting a DEM (Digital Elevation Model) with the GBSAR (Ground-Based Synthetic Aperture Radar) technology are studied and their accuracies are analyzed in detail. The approximate and improved altimetry models of GBSAR were derived from spaceborne radar altimetry based on the principles of the GBSAR technology. The error caused by the parallel-ray approximation in the approximate model was analyzed quantitatively, and the results show that the errors cannot be ignored for the ground-based radar system. For the improved altimetry model, the elevation error expression can be obtained by simulating and analyzing the error propagation coefficients of baseline length, wavelength, differential phase and range distance in the mathematical model. By analyzing the elevation error with respect to the baseline and range distance, the results show that the improved altimetry model is suitable for high-precision DEM extraction and that the accuracy can be improved by adjusting the baseline and shortening the slant distance.

  15. Technology based Education System

    DEFF Research Database (Denmark)

    Kant Hiran, Kamal; Doshi, Ruchi; Henten, Anders

    2016-01-01

    Abstract - Education plays a very important role in the development of the country. Education has multiple dimensions, from schooling to higher education and research. In all these domains, there is invariably a need for technology-based teaching and learning tools, which are highly demanded in the acad

  16. Transportation-related data bases extracted from the national index of energy and environmental data bases: Part I. Digest of detailed data base descriptions

    Energy Technology Data Exchange (ETDEWEB)

    Birss, E.W.; Yeh, J.W.

    1976-11-15

    Lawrence Livermore Laboratory (LLL) extracted a set of 135 transportation-related data bases from a computerized national index of energy and environmental data bases. LLL had produced the national index for the Division of Biomedical and Environmental Research of the Energy Research and Development Administration (ERDA). The Part I report, which contains a digest of the detailed transportation-related data base descriptions, is part of an ongoing LLL research contract with the Information Division of the Transportation Systems Center of the U.S. Department of Transportation (DOT/TSC).

  17. Coastline Extraction from Aerial Images Based on Edge Detection

    Science.gov (United States)

    Paravolidakis, V.; Moirogiorgou, K.; Ragia, L.; Zervakis, M.; Synolakis, C.

    2016-06-01

    Nowadays, coastline extraction and the tracking of its changes have become highly important because of climate change, global warming and the rapid growth of the human population. Coastal areas play a significant role in the economy of an entire region. In this paper we propose a new methodology for automatic extraction of the coastline using aerial images. A four-step algorithm is used to extract the coastline in a robust and generalizable way. First, noise distortion is reduced in order to ameliorate the input data for the next processing steps. Then, the image is segmented into two regions, land and sea, through the application of a local threshold to create a binary image. The result is further processed by morphological operators so that small objects are eliminated and only the objects of interest are preserved. Finally, we perform edge detection and active contour fitting in order to extract and model the coastline. These algorithmic steps are illustrated through examples, which demonstrate the efficacy of the proposed methodology.
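
    A rough sketch of the first four steps in order (denoising, local thresholding into land/sea, morphological clean-up, edge extraction), assuming the opencv-python package; the final active-contour fitting stage is omitted and the specific filters and kernel sizes are illustrative choices, not the authors' parameters.

        import cv2

        def extract_coastline(bgr):
            """Four-step sketch: denoise, threshold into land/sea, remove small objects, trace the boundary."""
            gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
            smooth = cv2.medianBlur(gray, 5)                                      # 1. noise reduction
            binary = cv2.adaptiveThreshold(smooth, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                           cv2.THRESH_BINARY, 51, 0)              # 2. local threshold
            kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
            clean = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)              # 3. drop small objects
            clean = cv2.morphologyEx(clean, cv2.MORPH_CLOSE, kernel)
            return cv2.Canny(clean, 50, 150)                                      # 4. land/sea edge map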

  18. Rule set transferability for object-based feature extraction

    NARCIS (Netherlands)

    Anders, N.S.; Seijmonsbergen, Arie C.; Bouten, Willem

    2015-01-01

    Cirques are complex landforms resulting from glacial erosion and can be used to estimate Equilibrium Line Altitudes and infer climate history. Automated extraction of cirques may help research on glacial geomorphology and climate change. Our objective was to test the transferability of an object-

  19. Aspect-Based Opinion Extraction From Customer Reviews

    Directory of Open Access Journals (Sweden)

    Amani K Samha

    2014-04-01

    Full Text Available Text is the main method of communicating information in the digital age. Messages, blogs, news articles, reviews, and opinionated information abound on the Internet. People commonly purchase products online and post their opinions about purchased items. This feedback is displayed publicly to assist others with their purchasing decisions, creating the need for a mechanism with which to extract and summarize useful information for enhancing the decision-making process. Our contribution is to improve the accuracy of extraction by combining different techniques from three major areas, namely Data Mining, Natural Language Processing techniques and Ontologies. The proposed framework sequentially mines a product's aspects and users' opinions, groups representative aspects by similarity, and generates an output summary. This paper focuses on the task of extracting product aspects and users' opinions by extracting all possible aspects and opinions from reviews using natural language, ontology, and frequent "tag" sets. The proposed framework, when compared with an existing baseline model, yielded promising results.

  20. Rule set transferability for object-based feature extraction

    NARCIS (Netherlands)

    Anders, N.S.; Seijmonsbergen, Arie C.; Bouten, Willem

    2015-01-01

    Cirques are complex landforms resulting from glacial erosion and can be used to estimate Equilibrium Line Altitudes and infer climate history. Automated extraction of cirques may help research on glacial geomorphology and climate change. Our objective was to test the transferability of an

  1. [Application of micro-power system in the surgery of tooth extraction].

    Science.gov (United States)

    Kaijin, Hu; Yongfeng, Li

    2015-02-01

    Tooth extraction is a common operation in oral surgery. Traditional extraction instruments, such as the bone chisel, elevator, and bone hammer, lead not only to severe trauma but also to unnecessary complications, and patients easily become nervous and apprehensive if tooth extraction is performed using these forceful instruments. In recent years, with the development of minimally invasive concepts and technology, various micro-power instruments have been used for tooth extraction. This innovative technology can reduce the iatrogenic trauma and complications of tooth extraction. Additionally, this technology can greatly decrease the patient's physical and mental stress. The new equipment compensates for the deficiencies of traditional tooth extraction equipment and facilitates the gradual replacement of the latter. Diverse micro-power systems have distinct strengths and weaknesses, so some auxiliary instruments are still needed during tooth extraction. This paper focuses on the various micro-power systems for tooth extraction and compares the advantages and disadvantages of these systems. Selection and usage of auxiliary equipment are also introduced. Thus, this paper provides a reference for the proper application of micro-power systems in tooth extraction.

  2. System of extraction of volatiles from soil using microwave processes

    Science.gov (United States)

    Ethridge, Edwin C. (Inventor); Kaukler, William F. (Inventor)

    2013-01-01

    A device for the extraction and collection of volatiles from soil or planetary regolith. The device utilizes core drilled holes to gain access to underlying volatiles below the surface. Microwave energy beamed into the holes penetrates through the soil or regolith to heat it, and thereby produces vapor by sublimation. The device confines and transports volatiles to a cold trap for collection.

  3. Weak Signals Extraction and Imaging Analysis in Bistatic ISAR Systems Based on Stochastic Resonance

    Institute of Scientific and Technical Information of China (English)

    邓冬虎; 朱小鹏; 张群; 罗迎; 李宏伟

    2012-01-01

    In bistatic inverse synthetic aperture radar (ISAR) imaging systems, in order to solve the problem of detecting and identifying weak target scattering centers in a strong noise background, a weak-scatterer detection method based on stochastic resonance (SR) theory is proposed, under the assumption that the strong scattering centers of the target can already be detected. First, the echo signals after de-chirping are transformed along the fast-time axis, and the transformed signals are resampled so that they satisfy the adiabatic approximation condition. Then, SR is used to detect the frequency of the weak periodic signal that represents the instantaneous range difference, and subsequent processing, such as cyclic detection and positive/negative frequency discrimination, is applied to improve the detection signal-to-noise ratio (SNR) and to determine the correct positions of the weak scattering centers. Finally, the SR performance and bistatic ISAR imaging are simulated. The results show that SR can greatly improve the output SNR of weak periodic signals; applied in a bistatic ISAR system, it can effectively improve the SNR gain of the system and enlarge the dynamic range of the radar receiver.
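
    For readers unfamiliar with the detector, the sketch below integrates the classical overdamped bistable system dx/dt = a*x - b*x^3 + s(t) that underlies most stochastic-resonance processing; the parameters, the Euler integration and the resampling needed for the adiabatic approximation are generic textbook ingredients, not the authors' exact processing chain, and the names are hypothetical.

        import numpy as np

        def bistable_sr(signal, fs, a=1.0, b=1.0):
            """Euler integration of the bistable stochastic-resonance system driven by the noisy input."""
            dt = 1.0 / fs
            x = np.zeros(len(signal))
            for k in range(1, len(signal)):
                x[k] = x[k - 1] + dt * (a * x[k - 1] - b * x[k - 1] ** 3 + signal[k - 1])
            return x

        # Toy input: a weak 5 Hz tone buried in noise; inspect the spectrum of `out` around 5 Hz.
        fs = 1000.0
        t = np.arange(0, 4, 1 / fs)
        weak = 0.1 * np.sin(2 * np.pi * 5 * t) + 0.8 * np.random.randn(len(t))
        out = bistable_sr(weak, fs)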

  4. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground with a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary layer and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (which is a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. As with all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, so upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds would not be observed).

  5. An Efficient Feature Extraction Method Based on Entropy for Power Quality Disturbance

    Directory of Open Access Journals (Sweden)

    P. Kailasapathi

    2014-09-01

    Full Text Available This study explores the applicability of entropy, the thermodynamic state variable introduced by the German physicist Rudolf Clausius, and presents the concepts and application of this state variable as a measure of system disorganization. An entropy-based feature analysis method for power quality disturbance analysis is then proposed. Feature extraction of a disturbed power signal provides information that helps to detect the fault responsible for the power quality disturbance. A precise and fast feature extraction tool helps power engineers to monitor and manage power disturbances more efficiently. Firstly, the decomposition coefficients are obtained by applying a 10-level wavelet multiresolution analysis to the signals (normal, sag, swell, outage, harmonic, sag with harmonic and swell with harmonic) generated by using the parametric equations. Secondly, a combined feature vector is obtained from the standard deviations of these features, after distinctive features for each signal are extracted by applying the energy, Shannon entropy and log-energy entropy methods to the decomposition coefficients. Finally, the entropy methods detect the different types of power quality disturbances.
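
    One plausible way to assemble the energy, Shannon-entropy and log-energy-entropy features named above from the wavelet decomposition coefficients, assuming the pywt package; the wavelet family, the small regularization constant and the per-level layout are assumptions, not the authors' exact recipe.

        import numpy as np
        import pywt

        def entropy_features(signal, wavelet="db4", level=10):
            """Energy, Shannon entropy and log-energy entropy of every wavelet decomposition level."""
            feats = []
            for c in pywt.wavedec(signal, wavelet, level=level):
                e = np.asarray(c, dtype=float) ** 2
                energy = e.sum()
                p = e / energy if energy > 0 else np.full_like(e, 1.0 / len(e))  # normalized energy distribution
                shannon = -np.sum(p * np.log(p + 1e-12))
                log_energy = np.sum(np.log(e + 1e-12))
                feats.extend([energy, shannon, log_energy])
            return np.array(feats)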

  6. Development of a Biochar-Plant-Extract-Based Nitrification Inhibitor and Its Application in Field Conditions

    Directory of Open Access Journals (Sweden)

    Jhónatan Reyes-Escobar

    2015-10-01

    Full Text Available The global use of nitrogen (N) fertilizer has increased 10-fold in the last fifty years, resulting in increased N losses via nitrate leaching to groundwater bodies or gaseous emissions to the atmosphere. One of the biggest problems farmers face in agricultural production systems is the loss of N. In this context, novel biological nitrification inhibitors (BNI) using biochar (BC) as a renewable matrix to increase N use efficiency, by reducing nitrification rates, have been evaluated. The chemical and morphological characteristics of BC were analyzed and BC-BNI complexes were formulated using plant extracts from pine (Pinus radiata), eucalyptus (Eucalyptus globulus) and peumo (Cryptocarya alba). In field experiments, fertilizer and treatments based on crude plant extracts and BC-BNI complexes were applied and the effect on nitrification was periodically monitored; at the laboratory level, a phytotoxicity assay was performed. The biochar-peumo (BCPe) complex showed the highest nitrification inhibition (66% on day 60 after application) compared with the crude plant extract, suggesting that the BCPe complex protects the BNI against biotic or abiotic factors, and therefore BC-BNI complexes could increase the persistence of biological nitrification inhibitors. None of the biochar complexes had a toxic effect on radish plants.

  7. AN EFFICIENT APPROACH TO IMPROVE ARABIC DOCUMENTS CLUSTERING BASED ON A NEW KEYPHRASES EXTRACTION ALGORITHM

    Directory of Open Access Journals (Sweden)

    Hanane FROUD

    2013-11-01

    Full Text Available Document clustering algorithms aim to create clusters that are coherent internally, but clearly different from each other. The useful expressions in documents are often accompanied by a large amount of noise caused by the use of unnecessary words, so it is indispensable to eliminate this noise while keeping just the useful information. Keyphrase extraction systems for Arabic are a new phenomenon, and a number of text mining applications can use them to improve their results. Keyphrases are defined as phrases that capture the main topics discussed in a document; they offer a brief and precise summary of the document content. Therefore, they can be a good solution for getting rid of the existing noise in documents. In this paper, we propose a new method to solve the problem cited above, especially for Arabic language documents, Arabic being one of the most complex languages, by using a new keyphrase extraction algorithm based on the suffix tree data structure (KpST). To evaluate our approach, we conduct an experimental study on Arabic document clustering using the most popular family of hierarchical algorithms: the agglomerative hierarchical algorithm with seven linkage techniques and a variety of distance functions and similarity measures to perform the Arabic document clustering task. The obtained results show that our approach for extracting keyphrases improves the clustering results.

  8. Web-based support systems

    CERN Document Server

    Yao, JingTao

    2010-01-01

    The emerging interdisciplinary study of Web-based support systems focuses on the theories, technologies and tools for the design and implementation of Web-based systems that support various human activities. This book presents the state-of-the-art in Web-based support systems (WSS). The research on WSS is multidisciplinary and focuses on supporting various human activities in different domains/fields based on computer science, information technology, and Web technology. The main goal is to take the opportunities of the Web, to meet the challenges of the Web, to extend the human physical limita

  9. A High-Speed Vision-Based Sensor for Dynamic Vibration Analysis Using Fast Motion Extraction Algorithms.

    Science.gov (United States)

    Zhang, Dashan; Guo, Jie; Lei, Xiujun; Zhu, Changan

    2016-04-22

    The development of image sensor and optics enables the application of vision-based techniques to the non-contact dynamic vibration analysis of large-scale structures. As an emerging technology, a vision-based approach allows for remote measuring and does not bring any additional mass to the measuring object compared with traditional contact measurements. In this study, a high-speed vision-based sensor system is developed to extract structure vibration signals in real time. A fast motion extraction algorithm is required for this system because the maximum sampling frequency of the charge-coupled device (CCD) sensor can reach up to 1000 Hz. Two efficient subpixel level motion extraction algorithms, namely the modified Taylor approximation refinement algorithm and the localization refinement algorithm, are integrated into the proposed vision sensor. Quantitative analysis shows that both of the two modified algorithms are at least five times faster than conventional upsampled cross-correlation approaches and achieve satisfactory error performance. The practicability of the developed sensor is evaluated by an experiment in a laboratory environment and a field test. Experimental results indicate that the developed high-speed vision-based sensor system can extract accurate dynamic structure vibration signals by tracking either artificial targets or natural features.
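
    The abstract names two proprietary refinement algorithms without giving their details; as a rough illustration of the conventional upsampled cross-correlation baseline they are compared against, the sketch below tracks a patch between frames with scikit-image's phase_cross_correlation. The frames, patch location and upsample factor are placeholder assumptions, not the authors' setup.

```python
# Sketch of the conventional subpixel displacement baseline the abstract compares
# against: upsampled cross-correlation between a reference patch and the current
# frame patch. The target pattern and frames are synthetic placeholders.
import numpy as np
from skimage.registration import phase_cross_correlation

def track_target(frames, ref_patch, top, left):
    """Return per-frame (dy, dx) subpixel displacements of a tracked patch."""
    h, w = ref_patch.shape
    displacements = []
    for frame in frames:
        patch = frame[top:top + h, left:left + w]
        # upsample_factor=100 -> roughly 1/100-pixel resolution
        shift, _error, _phasediff = phase_cross_correlation(
            ref_patch, patch, upsample_factor=100)
        displacements.append(shift)          # (dy, dx) in pixels
    return np.asarray(displacements)

# Example with synthetic data: 200 frames of 64x64 noise, a 32x32 reference patch.
rng = np.random.default_rng(0)
frames = rng.random((200, 64, 64))
disp = track_target(frames, frames[0][16:48, 16:48], 16, 16)
print(disp.shape)  # (200, 2); one sample of the vibration signal per frame
```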

  10. Research of information classification and strategy intelligence extract algorithm based on military strategy hall

    Science.gov (United States)

    Chen, Lei; Li, Dehua; Yang, Jie

    2007-12-01

    Constructing a virtual international strategy environment needs many kinds of information, such as economic, political, military, diplomatic, cultural and scientific information. It is therefore very important to build a highly efficient management system for automatic information extraction, classification, recombination and analysis as the foundation and a component of the military strategy hall. This paper first uses an improved Boost algorithm to classify the obtained initial information, and then uses a strategy intelligence extraction algorithm to extract strategy intelligence from the initial information to help strategists analyse it.

  11. Feature Extraction with Ordered Mean Values for Content Based Image Classification

    Directory of Open Access Journals (Sweden)

    Sudeep Thepade

    2014-01-01

    Full Text Available Categorization of images into meaningful classes by efficient extraction of feature vectors from image datasets has been dependent on feature selection techniques. Traditionally, feature vector extraction has been carried out using different methods of image binarization with selection of a global, local, or mean threshold. This paper proposes a novel technique for feature extraction based on ordered mean values. The proposed technique was combined with feature extraction using the discrete sine transform (DST) for better classification results through multitechnique fusion. The novel methodology was compared to the traditional techniques used for feature extraction for content based image classification. Three benchmark datasets, namely the Wang dataset, the Oliva and Torralba (OT-Scene) dataset, and the Caltech dataset, were used for evaluation. The performance measures after evaluation clearly revealed the superiority of the proposed fusion technique with ordered mean values and the discrete sine transform over the popular single-view feature extraction methodologies for classification.
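
    The paper's exact feature definitions are not spelled out in this abstract, so the sketch below is only one plausible reading: block-wise means of the sorted pixel intensities ("ordered mean values") concatenated with a few low-order discrete sine transform coefficients. The bin and coefficient counts are arbitrary assumptions.

```python
# Minimal sketch (one reading of the abstract, not the authors' exact definition):
# per-block means of sorted pixel intensities fused with a few low-order
# discrete sine transform (DST) coefficients of the flattened image.
import numpy as np
from scipy.fft import dst

def ordered_mean_features(img, n_bins=8):
    """Split the sorted pixel intensities into n_bins groups and take each mean."""
    px = np.sort(img.ravel())
    return np.array([chunk.mean() for chunk in np.array_split(px, n_bins)])

def dst_features(img, n_coeffs=16):
    """Keep the first n_coeffs DST-II coefficients of the flattened image."""
    return dst(img.ravel().astype(float), type=2)[:n_coeffs]

def fused_feature_vector(img):
    return np.concatenate([ordered_mean_features(img), dst_features(img)])

# Example on a random 32x32 "image"
img = np.random.default_rng(1).random((32, 32))
print(fused_feature_vector(img).shape)   # (24,)
```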

  12. Evaluation of antioxidant activity of medicinal plants containing polyphenol compounds. Comparison of two extraction systems.

    Science.gov (United States)

    Kratchanova, Maria; Denev, Petko; Ciz, Milan; Lojek, Antonin; Mihailov, Atanas

    2010-01-01

    This study investigates the influence of the extraction system on the extractability of polyphenol compounds and the antioxidant activity of various medicinal plants. The oxygen radical absorbance capacity (ORAC) and total polyphenol content of 25 Bulgarian medicinal plants subjected to water or 80% acetone extraction were investigated and compared. The type of extraction solvent significantly influenced the efficiency of the polyphenol extraction and the antioxidant activity. In all cases ORAC results and total polyphenol content were higher for acetone extraction than for water extraction. The acetone extract of peppermint had the highest ORAC value, 2917 micromol Trolox equivalent (TE)/g dry weight (DW), and polyphenol content, 20216 mg/100 g DW. For water extraction, thyme exhibited the highest ORAC antioxidant activity, 1434 micromol TE/g DW. There was a significant linear correlation between the concentration of total polyphenols and ORAC in the investigated medicinal plants. It can be concluded that the solvent used significantly affects the polyphenol content and the antioxidant activity of the extract, and it is therefore recommended to use more than one extraction system for better assessment of the antioxidant activity of natural products. Several of the investigated herbs contain substantial amounts of free radical scavengers and can serve as a potential source of natural antioxidants for medicinal and commercial uses.

  13. Interfacial chemistry in solvent extraction systems. Progress report, June 1, 1992--May 31, 1993

    Energy Technology Data Exchange (ETDEWEB)

    Neuman, R.D.

    1993-01-01

    Research this past year continued to emphasize characterization of the physicochemical nature of the microscopic interfaces, i.e., reversed micelles and other association microstructures, which form in both practical and simplified acidic organophosphorus extraction systems associated with Ni, Co, and Na in order to improve on the model for aggregation of metal-extractant complexes. Also, the macroscopic interfacial behavior of model extractant (surfactant) molecules was further investigated. 1 fig.

  14. MTA Computer Based Evaluation System.

    Science.gov (United States)

    Brenner, Lisa P.; And Others

    The MTA PLATO-based evaluation system, which has been implemented by a consortium of schools of medical technology, is designed to be general-purpose, modular, data-driven, and interactive, and to accommodate other national and local item banks. The system provides a comprehensive interactive item-banking system in conjunction with online student…

  15. Expert and Knowledge Based Systems.

    Science.gov (United States)

    Demaid, Adrian; Edwards, Lyndon

    1987-01-01

    Discusses the nature and current state of knowledge-based systems and expert systems. Describes an expert system from the viewpoints of a computer programmer and an applications expert. Addresses concerns related to materials selection and forecasts future developments in the teaching of materials engineering. (ML)

  17. Feature Extraction Based on Double Inverse Limit Space in a Hydraulic System

    Institute of Scientific and Technical Information of China (English)

    朱丹丹; 连利纳; 张玉存

    2015-01-01

    To detect leakage in the hydraulic system of a large forging hydraulic press, a feature analysis method based on a double inverse limit space is proposed. The leakage information of the hydraulic system of the large forging hydraulic press is taken as the original information space, and on this basis a double inverse limit space that is topologically isomorphic to it is established. In the double inverse limit space, the leakage of the hydraulic system of the large forging hydraulic press is reflected through topological invariance. Finally, the feasibility of this theoretical method is verified by simulation. The results show that the feature extraction method based on the double inverse limit space is better suited to extracting the coupled leakage features, and that the extracted feature information provides good leakage detection and localization capability.

  18. CHROMIUM EXTRACTION BY MICROEMULSIONS IN TWO- AND THREE-PHASE SYSTEMS

    Directory of Open Access Journals (Sweden)

    K. R. O. Melo

    2015-12-01

    Full Text Available Microemulsion systems were used to remove chromium from an aqueous solution obtained from acid digestion of tannery sludge. The systems were composed of coconut oil soap as surfactant, 1-butanol as cosurfactant, kerosene as the oil phase, and the chromium solution as the aqueous phase. Two- and three-phase microemulsion extraction methods were investigated in the experiments. Viscosity, effective droplet diameter, and extraction and re-extraction efficiencies were evaluated for each system. Two- and three-phase systems showed small variations in droplet diameter, which can be attributed to the formation of micellar structures. Chromium recovery efficiencies for the studied systems were over 96%. The re-extraction step showed that the stripping solution used can release more than 96% of the chromium from the microemulsion phase. Experimental results confirm that chromium can be recovered efficiently using microemulsion systems.

  19. Application of ionic liquid-based microwave-assisted extraction of flavonoids from Scutellaria baicalensis Georgi.

    Science.gov (United States)

    Zhang, Qin; Zhao, San-Hu; Chen, Jue; Zhang, Li-Wei

    2015-10-01

    In the present work, a rapid ionic liquid-based microwave-assisted extraction (ILMAE) method was successfully applied to the simultaneous extraction of baicalin, wogonoside, baicalein and wogonin from Scutellaria baicalensis Georgi. A series of 1-alkyl-3-methylimidazolium ionic liquids with different anions and cations were assessed for extraction efficiency, and 1-octyl-3-methylimidazolium bromide was selected as the optimal solvent. In addition, the parameters of the ILMAE procedure for the four flavonoids were optimized, and the optimal ILMAE method was validated for linearity, stability, precision and recovery. Meanwhile, the microstructures of S. baicalensis powders were observed before and after extraction with a scanning electron microscope (SEM) in order to explore the extraction mechanism, and the activity of the crude enzyme solution from S. baicalensis was determined through the hydrolysis of baicalin. Finally, the extraction yields and extraction times of WaterHRE, WaterMAE, ILHRE and Chp were 5.18% (30 min), 8.77% (90 s), 16.94% (30 min) and 18.58% (3 h), respectively. The results indicated that, compared with the conventional extraction approaches, ILMAE possessed great advantages in extracting flavonoids, such as the highest extraction yield (22.28%) and the shortest extraction time (90 s).

  20. Ionic Liquid-Based Microwave-Assisted Extraction of Flavonoids from Bauhinia championii (Benth.) Benth.

    Directory of Open Access Journals (Sweden)

    Wei Xu

    2012-12-01

    Full Text Available An ionic liquid (IL)-based microwave-assisted approach for the extraction and determination of flavonoids from Bauhinia championii (Benth.) Benth. was proposed for the first time. Several ILs with different cations and anions and the microwave-assisted extraction (MAE) conditions, including sample particle size, extraction time and liquid-solid ratio, were investigated. A 2 M 1-butyl-3-methylimidazolium bromide ([bmim]Br) solution with 0.80 M HCl was selected as the optimal solvent, and the optimized conditions included a liquid-to-material ratio of 30:1 and extraction for 10 min at 70 °C. Compared with conventional heat-reflux extraction (CHRE) and regular MAE, IL-MAE exhibited a higher extraction yield and a shorter extraction time (from 1.5 h to 10 min). The optimized extraction samples were analysed by LC-MS/MS. IL extracts of Bauhinia championii (Benth.) Benth. consisted mainly of flavonoids, among which myricetin, quercetin, kaempferol, β-sitosterol, triacontane and hexacontane were identified. The study indicated that IL-MAE is an efficient and rapid method with simple sample preparation. LC-MS/MS was also used to determine the chemical composition of the ethyl acetate/MAE extract of Bauhinia championii (Benth.) Benth., and it may become a rapid method to determine the composition of new plant extracts.

  1. Rotation-Invariant Neural Pattern Recognition System Using Extracted Descriptive Symmetrical Patterns

    Directory of Open Access Journals (Sweden)

    Rehab F. Abdel-Kader, Rabab M. Ramadan, Fayez W. Zaki, and Emad El-Sayed

    2012-05-01

    Full Text Available In this paper a novel rotation-invariant neural-based pattern recognition system is proposed. The system incorporates a new image preprocessing technique to extract rotation-invariant descriptive patterns from shapes. The proposed system applies a three-phase algorithm to the shape image to extract the rotation-invariant pattern. First, the orientation angle of the shape is calculated using a newly developed shape orientation technique. The technique is effective, computationally inexpensive and can be applied to shapes with several non-equally separated axes of symmetry. A simple method to calculate the average angle of the shape's axes of symmetry is defined; in this technique, only the first moment of inertia is considered, to reduce the computational cost. In the second phase, the image is rotated using a simple rotation technique to adapt its orientation angle to any specified reference angle. Finally, in the third phase, the image preprocessor creates a symmetrical pattern about the axis with the calculated orientation angle and the axis perpendicular to it. Performing this operation in both the neural network training and application phases ensures that the rotated test patterns enter the network in the same position as in training. Three different approaches were used to create the symmetrical patterns from the shapes. Experimental results indicate that the proposed approach is very effective and provides a recognition rate of up to 99.5%.
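
    As a rough illustration of the normalize-then-symmetrize idea (not the authors' first-moment orientation estimate or their three symmetrization variants), the sketch below estimates orientation from the common second-moment formula, rotates the shape to a reference angle with SciPy, and averages the result with its mirrors.

```python
# Sketch of the normalize-then-symmetrize idea: estimate a shape orientation,
# rotate the image to a reference angle, and build a symmetrical pattern by
# averaging with its mirror images. Orientation here uses the common
# second-moment formula, not the paper's first-moment-only variant.
import numpy as np
from scipy import ndimage

def orientation_deg(img):
    y, x = np.nonzero(img)
    w = img[y, x]
    xm, ym = np.average(x, weights=w), np.average(y, weights=w)
    mu20 = np.average((x - xm) ** 2, weights=w)
    mu02 = np.average((y - ym) ** 2, weights=w)
    mu11 = np.average((x - xm) * (y - ym), weights=w)
    return 0.5 * np.degrees(np.arctan2(2 * mu11, mu20 - mu02))

def rotation_invariant_pattern(img, reference_angle=0.0):
    rotated = ndimage.rotate(img, orientation_deg(img) - reference_angle,
                             reshape=False, order=1)
    # symmetrize about the vertical and horizontal axes through the centre
    return (rotated + rotated[:, ::-1] + rotated[::-1, :] + rotated[::-1, ::-1]) / 4.0

shape = np.zeros((64, 64))
shape[20:44, 28:36] = 1.0                 # a simple bar-like test shape
print(rotation_invariant_pattern(shape).shape)
```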

  2. Heterogeneous Web Data Extraction Algorithm Based On Modified Hidden Conditional Random Fields

    OpenAIRE

    Cui Cheng

    2014-01-01

    As it is of great importance to extract useful information from heterogeneous Web data, in this paper we propose a novel heterogeneous Web data extraction algorithm using a modified hidden conditional random fields model. Considering that the traditional linear-chain conditional random fields cannot effectively solve the problem of complex and heterogeneous Web data extraction, we modify the standard hidden conditional random fields in three aspects, which are 1) Using the hidden Markov mo...

  3. Fetal ECG extraction via Type-2 adaptive neuro-fuzzy inference systems.

    Science.gov (United States)

    Ahmadieh, Hajar; Asl, Babak Mohammadzadeh

    2017-04-01

    We proposed a noninvasive method for separating the fetal ECG (FECG) from maternal ECG (MECG) by using Type-2 adaptive neuro-fuzzy inference systems. The method can extract FECG components from abdominal signal by using one abdominal channel, including maternal and fetal cardiac signals and other environmental noise signals, and one chest channel. The proposed algorithm detects the nonlinear dynamics of the mother's body. So, the components of the MECG are estimated from the abdominal signal. By subtracting estimated mother cardiac signal from abdominal signal, fetal cardiac signal can be extracted. This algorithm was applied on synthetic ECG signals generated based on the models developed by McSharry et al. and Behar et al. and also on DaISy real database. In environments with high uncertainty, our method performs better than the Type-1 fuzzy method. Specifically, in evaluation of the algorithm with the synthetic data based on McSharry model, for input signals with SNR of -5dB, the SNR of the extracted FECG was improved by 38.38% in comparison with the Type-1 fuzzy method. Also, the results show that increasing the uncertainty or decreasing the input SNR leads to increasing the percentage of the improvement in SNR of the extracted FECG. For instance, when the SNR of the input signal decreases to -30dB, our proposed algorithm improves the SNR of the extracted FECG by 71.06% with respect to the Type-1 fuzzy method. The same results were obtained on synthetic data based on Behar model. Our results on real database reflect the success of the proposed method to separate the maternal and fetal heart signals even if their waves overlap in time. Moreover, the proposed algorithm was applied to the simulated fetal ECG with ectopic beats and achieved good results in separating FECG from MECG. The results show the superiority of the proposed Type-2 neuro-fuzzy inference method over the Type-1 neuro-fuzzy inference and the polynomial networks methods, which is due to its

  4. Generating Fuzzy Rule-based Systems from Examples Based on Robust Support Vector Machine

    Institute of Scientific and Technical Information of China (English)

    JIA Jiong; ZHANG Hao-ran

    2006-01-01

    This paper first proposes a new support vector machine regression (SVR) with a robust loss function and designs a gradient-based algorithm for its implementation, then uses the SVR to extract fuzzy rules and design a fuzzy rule-based system. Simulations show that the fuzzy rule-based system technique based on robust SVR achieves superior performance to the conventional fuzzy inference method; the proposed method provides satisfactory performance, with better approximation and generalization properties than the existing algorithm.

  5. HTML Extraction Algorithm Based on Property and Data Cell

    Science.gov (United States)

    Purnamasari, Detty; Wayan Simri Wicaksana, I.; Harmanto, Suryadi; Yuniar Banowosari, Lintang

    2013-06-01

    The data available on the Internet come in various models and formats. One form of data representation is the table. Table extraction is needed to process more than one table on the Internet from different sources; currently this is done by copy-and-paste, which is not an automatic process. This article presents an approach to prepare the area so that tables in HTML format can be extracted and converted into a database, which makes it easier to combine data from many resources. The approach was tested with algorithm 1, which determines the actual number of columns and rows of the table, and algorithm 2, which determines the boundary line of the property. Tests conducted on 100 tables in HTML format show that the accuracy of algorithm 1 is 99.9% and the accuracy of algorithm 2 is 84%.
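
    A minimal sketch of the row/column-counting step (algorithm 1 in the abstract) is shown below, under the assumption that the "actual" column count must honour colspan attributes; it uses BeautifulSoup and a hypothetical table fragment rather than the authors' test set.

```python
# Sketch of the row/column counting step (algorithm 1 in the abstract), assuming
# "actual" counts must honour colspan attributes. Uses BeautifulSoup.
from bs4 import BeautifulSoup

def table_dimensions(html):
    """Return (rows, columns) of the first <table> in an HTML fragment."""
    table = BeautifulSoup(html, "html.parser").find("table")
    rows = table.find_all("tr")
    n_cols = 0
    for tr in rows:
        # each row's width = sum of the colspans of its cells
        width = sum(int(cell.get("colspan", 1))
                    for cell in tr.find_all(["td", "th"]))
        n_cols = max(n_cols, width)
    return len(rows), n_cols

html = """
<table>
  <tr><th colspan="2">Property</th><th>Value</th></tr>
  <tr><td>a</td><td>b</td><td>c</td></tr>
</table>"""
print(table_dimensions(html))   # (2, 3)
```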

  6. Extracting Coherent Information from Noise Based Correlation Processing

    Science.gov (United States)

    2015-09-30

    Long-term goals: The goal of this research is to establish methodologies to utilize ambient noise in the ocean and to determine what scenarios... Publications: [1] "Monitoring deep-ocean temperatures using acoustic ambient noise," K. W. Woolfe, S. Lani, K. G. Sabra, W. A. Kuperman, Geophys. Res. Lett., 42, 2878–2884, doi:10.1002/2015GL063438 (2015). [2] "Optimized extraction of coherent arrivals from ambient noise correlations in

  7. Aspect-Based Opinion Extraction From Customer Reviews

    OpenAIRE

    Samha, Amani K; Yuefeng Li; Jinglan Zhang

    2014-01-01

    Text is the main method of communicating information in the digital age. Messages, blogs, news articles, reviews, and opinionated information abound on the Internet. People commonly purchase products online and post their opinions about purchased items. This feedback is displayed publicly to assist others with their purchasing decisions, creating the need for a mechanism with which to extract and summarize useful ...

  8. Design of triode extraction system for a dual hollow cathode ion source

    Institute of Scientific and Technical Information of China (English)

    WANG Jing-Hui; ZHU Kun; ZHAO Wei-Jiang; LIU Ke-Xin

    2011-01-01

    A triode extraction system is designed for a dual hollow cathode ion source being developed at the Institute of Heavy Ion Physics, Peking University. Basic parameters of the plasma are selected after examining the operating principle of the ion source; the triode extraction system is then designed and optimized using the software PBGUNS (for Particle Beam GUN Simulations). The physical design of the system is given in this paper.

  9. Extracting information masked by the chaotic signal of a time-delay system.

    Science.gov (United States)

    Ponomarenko, V I; Prokhorov, M D

    2002-08-01

    We further develop the method proposed by Bezruchko et al. [Phys. Rev. E 64, 056216 (2001)] for the estimation of the parameters of time-delay systems from time series. Using this method we demonstrate a possibility of message extraction for a communication system with nonlinear mixing of information signal and chaotic signal of the time-delay system. The message extraction procedure is illustrated using both numerical and experimental data and different kinds of information signals.

  10. Extraction of Coconut Oil (Cocos nucifera L.) through Fermentation System

    OpenAIRE

    RITA DWI RAHAYU; JOKO SULISTYO; RINI HANDAYANI

    2009-01-01

    Coconut oil (Cocos nucifera L.) has a unique role in the diet as an important physiologically functional food. The health and nutritional benefits that can be derived from consuming coconut oil have been recognized in many parts of the world for centuries. There are a few techniques for coconut oil extraction, such as physical, chemical, and fermentation or enzymatic processes using a microbial inoculum as an enzymatic starter. Starters with different concentrations (1.0, 2.5, 5.0, and 10%) of microbi...

  11. Robust breathing signal extraction from cone beam CT projections based on adaptive and global optimization techniques

    Science.gov (United States)

    Chao, Ming; Wei, Jie; Li, Tianfang; Yuan, Yading; Rosenzweig, Kenneth E.; Lo, Yeh-Chi

    2016-04-01

    We present a study of extracting respiratory signals from cone beam computed tomography (CBCT) projections within the framework of the Amsterdam Shroud (AS) technique. Acquired prior to the radiotherapy treatment, CBCT projections were preprocessed for contrast enhancement by converting the original intensity images to attenuation images, with which the AS image was created. An adaptive robust z-normalization filtering was applied to further augment the weak oscillating structures locally. From the enhanced AS image, the respiratory signal was extracted using a two-step optimization approach to effectively reveal the large-scale regularity of the breathing signals. CBCT projection images from five patients acquired with the Varian Onboard Imager on the Clinac iX System Linear Accelerator (Varian Medical Systems, Palo Alto, CA) were employed to assess the proposed technique. Stable breathing signals can be reliably extracted using the proposed algorithm. Reference waveforms obtained using an air bellows belt (Philips Medical Systems, Cleveland, OH) were exported and compared to the AS-based signals. The average error for the enrolled patients between the estimated breaths per minute (bpm) and the reference waveform bpm can be as low as -0.07, with a standard deviation of 1.58. The new algorithm outperformed the original AS technique for all patients by 8.5% to 30%. The impact of gantry rotation on the breathing signal was assessed with data acquired with a Quasar phantom (Modus Medical Devices Inc., London, Canada) and found to be minimal on the signal frequency. The new technique developed in this work will provide a practical solution to rendering markerless breathing signals using the CBCT projections for thoracic and abdominal patients.
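
    The basic Amsterdam Shroud construction can be pictured with the short sketch below: each projection is collapsed laterally into one column, the columns are stacked over time, and rows are z-normalized. The paper's adaptive robust filtering and two-step optimization are not reproduced, and the crude centroid trace is only a stand-in for the extracted signal.

```python
# Sketch of the Amsterdam Shroud (AS) construction described in the abstract,
# using synthetic data in place of real CBCT projections.
import numpy as np

def amsterdam_shroud(projections):
    """projections: array (n_projections, rows, cols) of attenuation images."""
    # sum over the lateral (column) axis -> one 1-D profile per projection
    shroud = projections.sum(axis=2).T          # shape (rows, n_projections)
    # row-wise z-normalization (a simple, non-adaptive stand-in)
    mean = shroud.mean(axis=1, keepdims=True)
    std = shroud.std(axis=1, keepdims=True) + 1e-9
    return (shroud - mean) / std

def crude_breathing_trace(shroud):
    """Very rough surrogate signal: intensity-weighted row position per column."""
    rows = np.arange(shroud.shape[0])[:, None]
    weights = shroud - shroud.min()
    return (rows * weights).sum(axis=0) / weights.sum(axis=0)

projections = np.random.default_rng(2).random((600, 128, 128))
trace = crude_breathing_trace(amsterdam_shroud(projections))
print(trace.shape)   # (600,) -- one sample per projection angle / time point
```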

  12. [Genetic effects of root extracts of Glycyrrhiza glabra L. on different test-systems].

    Science.gov (United States)

    Agabeĭli, R A

    2012-01-01

    The antimutagenic and geroprotective activities of root extracts of Glycyrrhiza glabra have been demonstrated both on plant test systems (Allium fistulosum L., Allium cepa L., Vicia faba L.) and on animals (Wistar rats). The possibilities of mobilizing Glycyrrhiza glabra root extracts as antimutagenic agents are discussed.

  13. Oil extraction from Scenedesmus obliquus using a continuous microwave system--design, optimization, and quality characterization.

    Science.gov (United States)

    Balasubramanian, Sundar; Allen, James D; Kanitkar, Akanksha; Boldor, Dorin

    2011-02-01

    A 1.2 kW, 2450 MHz resonant continuous microwave processing system was designed and optimized for oil extraction from green algae (Scenedesmus obliquus). An algae-water suspension (1:1 w/w) was heated to 80 and 95°C and subjected to extraction for up to 30 min. Maximum oil yield was achieved at 95°C and 30 min. The microwave system extracted 76-77% of the total recoverable oil at 20-30 min and 95°C, compared to only 43-47% for the water bath control. Extraction time and temperature had a significant influence on extraction yield. Oil analysis indicated that microwaves extracted oil containing higher percentages of unsaturated and essential fatty acids (indicating higher quality). This study validates for the first time the efficiency of a continuous microwave system for the extraction of lipids from algae. Higher oil yields, faster extraction rates and superior oil quality demonstrate this system's feasibility for oil extraction from a variety of feedstocks.

  14. A Temporal Abstraction-based Extract, Transform and Load Process for Creating Registry Databases for Research.

    Science.gov (United States)

    Post, Andrew; Kurc, Tahsin; Overcash, Marc; Cantrell, Dedra; Morris, Tim; Eckerson, Kristi; Tsui, Circe; Willey, Terry; Quyyumi, Arshed; Eapen, Danny; Umpierrez, Guillermo; Ziemer, David; Saltz, Joel

    2011-01-01

    In the CTSA era there is great interest in aggregating and comparing populations across institutions. These sites likely represent data differently in their clinical data warehouses and other databases. Clinical data warehouses frequently are structured in a generalized way that supports many constituencies. For research, there is a need to transform these heterogeneous data into a shared representation, and to perform categorization and interpretation to optimize the data representation for investigators. We are addressing this need by extending an existing temporal abstraction-based clinical database query system, PROTEMPA. The extended system allows specifying data types of interest in federated databases, extracting the data into a shared representation, transforming it through categorization and interpretation, and loading it into a registry database that can be refreshed. Such a registry's access control, data representation and query tools can be tailored to the needs of research while keeping local databases as the source of truth.

  15. KneeTex: an ontology-driven system for information extraction from MRI reports.

    Science.gov (United States)

    Spasić, Irena; Zhao, Bo; Jones, Christopher B; Button, Kate

    2015-01-01

    In the realm of knee pathology, magnetic resonance imaging (MRI) has the advantage of visualising all structures within the knee joint, which makes it a valuable tool for increasing diagnostic accuracy and planning surgical treatments. Therefore, clinical narratives found in MRI reports convey valuable diagnostic information. A range of studies have proven the feasibility of natural language processing for information extraction from clinical narratives. However, no study focused specifically on MRI reports in relation to knee pathology, possibly due to the complexity of knee anatomy and a wide range of conditions that may be associated with different anatomical entities. In this paper we describe KneeTex, an information extraction system that operates in this domain. As an ontology-driven information extraction system, KneeTex makes active use of an ontology to strongly guide and constrain text analysis. We used automatic term recognition to facilitate the development of a domain-specific ontology with sufficient detail and coverage for text mining applications. In combination with the ontology, high regularity of the sublanguage used in knee MRI reports allowed us to model its processing by a set of sophisticated lexico-semantic rules with minimal syntactic analysis. The main processing steps involve named entity recognition combined with coordination, enumeration, ambiguity and co-reference resolution, followed by text segmentation. Ontology-based semantic typing is then used to drive the template filling process. We adopted an existing ontology, TRAK (Taxonomy for RehAbilitation of Knee conditions), for use within KneeTex. The original TRAK ontology expanded from 1,292 concepts, 1,720 synonyms and 518 relationship instances to 1,621 concepts, 2,550 synonyms and 560 relationship instances. This provided KneeTex with a very fine-grained lexico-semantic knowledge base, which is highly attuned to the given sublanguage. Information extraction results were evaluated

  16. A Novel Feature Selection Strategy for Enhanced Biomedical Event Extraction Using the Turku System

    Directory of Open Access Journals (Sweden)

    Jingbo Xia

    2014-01-01

    Full Text Available Feature selection is of paramount importance for text-mining classifiers with high-dimensional features. The Turku Event Extraction System (TEES) is the best performing tool in the GENIA BioNLP 2009/2011 shared tasks, and it relies heavily on high-dimensional features. This paper describes research which, based on an implementation of an accumulated effect evaluation (AEE) algorithm applying a greedy search strategy, analyses the contribution of every single feature class in TEES with a view to identifying important features and modifying the feature set accordingly. With the updated feature set, a new system is obtained with enhanced performance, achieving an increased F-score of 53.27% up from 51.21% for Task 1 under the strict evaluation criteria and 57.24% according to the approximate span and recursive criterion.
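
    The accumulated-effect greedy search can be pictured with the short sketch below; the evaluate callback is a placeholder standing in for retraining and scoring the event extractor on a development set, not the real TEES pipeline.

```python
# Sketch of a greedy accumulated-effect selection loop over feature classes.
# `evaluate` is a placeholder for retraining/scoring the extractor (e.g. an
# F-score on a development set); it is not the real system.
def greedy_feature_class_selection(feature_classes, evaluate):
    selected, best_score = [], float("-inf")
    remaining = list(feature_classes)
    improved = True
    while remaining and improved:
        improved = False
        # try adding each remaining class and keep the single best addition
        scored = [(evaluate(selected + [fc]), fc) for fc in remaining]
        score, fc = max(scored, key=lambda t: t[0])
        if score > best_score:
            best_score, improved = score, True
            selected.append(fc)
            remaining.remove(fc)
    return selected, best_score

# Toy usage: pretend the score rewards "useful" classes and penalizes set size.
useful = {"token", "dependency-path", "trigger-context"}
classes = ["token", "dependency-path", "trigger-context", "noise-a", "noise-b"]
print(greedy_feature_class_selection(
    classes, lambda chosen: len(set(chosen) & useful) - 0.01 * len(chosen)))
```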

  17. Extraction of the human cerebral ventricular system from MRI: inclusion of anatomical knowledge and clinical perspective

    Science.gov (United States)

    Aziz, Aamer; Hu, Qingmao; Nowinski, Wieslaw L.

    2004-04-01

    The human cerebral ventricular system is a complex structure that is essential to well-being, and changes in it reflect disease. It is clinically imperative that the ventricular system be studied in detail, and for this reason computer-assisted algorithms are essential. We have developed a novel (patent pending) and robust anatomical knowledge-driven algorithm for automatic extraction of the cerebral ventricular system from MRI. The algorithm is not only unique in its image processing aspects but also incorporates knowledge of neuroanatomy, radiological properties, and the variability of the ventricular system. The ventricular system is divided into six 3D regions based on the anatomy and its variability. Within each ventricular region a 2D region of interest (ROI) is defined and then further subdivided into sub-regions. Strict conditions that detect and prevent leakage into the extra-ventricular space are specified for each sub-region based on anatomical knowledge. Each ROI is processed to calculate its local statistics and the local intensity ranges of cerebrospinal fluid and grey and white matter, set a seed point within the ROI, grow the region directionally in 3D, check anti-leakage conditions and correct the growing if leakage occurs, and connect all unconnected grown regions by relaxing the growing conditions. The algorithm was tested qualitatively and quantitatively on normal and pathological MRI cases and worked well. In this paper we discuss in more detail the inclusion of anatomical knowledge in the algorithm and the usefulness of our approach from a clinical perspective.

  18. Design of a Data Acquisition and Monitoring System for an Oil Production Plant Based on a Real-Time Database

    Institute of Scientific and Technical Information of China (English)

    施国俊

    2012-01-01

    In view of the actual situation of an oil production plant, a design for a data acquisition and monitoring system based on a real-time database is proposed. The overall architecture of the system and the design of the data acquisition front-end computers and the monitoring center are introduced. Wellhead data are collected by RTUs and transmitted to the monitoring center over a wireless network; data from the oil-gathering valve group rooms, water distribution rooms and injection rooms are collected by RTUs and transmitted to the monitoring center over the network. Station data acquisition front-end computers acquire data from each station's PLC through remote I/O and transmit the data to the monitoring center, where they are stored, processed, dumped and published; authorized personnel can remotely control field equipment through a Web interface. The system has been successfully applied in the No. 8 Oil Production Plant of the Daqing Oilfield; after its deployment it changed the original management mode, strengthened the management of production data, shortened the time needed for information exchange and improved work efficiency.

  19. Design of an IoT Monitoring System for Oilfield Extraction and Transportation Based on the ARM Cortex-M0

    Institute of Scientific and Technical Information of China (English)

    于玉珠; 殷春莉

    2014-01-01

    To raise the degree of automation in oilfield production, ensure safety, reduce operating costs, and extend automation coverage to harsh environments and remote areas, an Internet of Things (IoT) monitoring system for oilfield extraction and transportation is designed. The system uses the ARM Cortex-M0 based LPC11C14FBD48 as the main control chip and the M35 as the GPRS communication module; the kernel software and the software of the remote centralized monitoring center are developed in C++. Data from oilfield extraction and transportation sites can be acquired and the system controlled remotely, and data are exchanged with a cloud computing service center to provide a basis for big-data processing. The system enables unattended operation of oilfield extraction and transportation, covers remote areas, reduces operating costs, and improves the safety, reliability and efficiency of oilfield production, markedly increasing its overall benefits.

  20. HARMONIC COMPONENT EXTRACTION FROM A CHAOTIC SIGNAL BASED ON EMPIRICAL MODE DECOMPOSITION METHOD

    Institute of Scientific and Technical Information of China (English)

    LI Hong-guang; MENG Guang

    2006-01-01

    A novel approach to extracting a harmonic component from a chaotic signal generated by a Duffing oscillator is proposed. Based on empirical mode decomposition (EMD) and the concept that any signal is composed of a series of simple intrinsic modes, the harmonic components were extracted from the chaotic signals. Simulation results show that the approach is satisfactory.
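
    A minimal sketch of the idea, assuming the third-party PyEMD package (pip install EMD-signal) and a synthetic noisy signal in place of the Duffing oscillator output: the signal is decomposed into intrinsic mode functions and the IMF whose spectrum peaks nearest the known drive frequency is taken as the harmonic component.

```python
# Sketch of EMD-based harmonic recovery, assuming the third-party PyEMD package.
# This mirrors the idea in the abstract but is not the authors' exact procedure.
import numpy as np
from PyEMD import EMD

fs, f_drive = 1000.0, 7.0                      # sampling and drive frequencies
t = np.arange(0, 5, 1 / fs)
# stand-in "chaotic + harmonic" signal: a harmonic buried in broadband noise
signal = (np.sin(2 * np.pi * f_drive * t)
          + 2.0 * np.random.default_rng(3).normal(size=t.size))

imfs = EMD().emd(signal)                       # (n_imfs, n_samples)

freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak_freqs = [freqs[np.argmax(np.abs(np.fft.rfft(imf)))] for imf in imfs]
best = int(np.argmin(np.abs(np.array(peak_freqs) - f_drive)))
harmonic = imfs[best]
print(f"IMF {best} peaks at {peak_freqs[best]:.2f} Hz")
```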

  1. The Automatic Generation of Chinese Outline Font Based on Stroke Extraction

    Institute of Scientific and Technical Information of China (English)

    1995-01-01

    A new method to obtain a spline outline description of Chinese fonts based on stroke extraction is presented. It has two primary advantages: (1) the quality of the Chinese character output is greatly improved; (2) the memory requirement is reduced. The method for stroke extraction is discussed in detail and experimental results are presented.

  2. Edge-Based Feature Extraction Method and Its Application to Image Retrieval

    Directory of Open Access Journals (Sweden)

    G. Ohashi

    2003-10-01

    Full Text Available We propose a novel feature extraction method for content-based image retrieval using graphical rough sketches. The proposed method extracts features based on the shape and texture of objects. This edge-based feature extraction method works by representing the relative positional relationships between edge pixels, and has the advantage of being shift-, scale-, and rotation-invariant. In order to verify its effectiveness, we applied the proposed method to 1,650 images obtained from the Hamamatsu-city Museum of Musical Instruments and 5,500 images obtained from the Corel Photo Gallery. The results verified that the proposed method is an effective tool for achieving accurate retrieval.

  3. Spectrum based feature extraction using spectrum intensity ratio for SSVEP detection.

    Science.gov (United States)

    Itai, Akitoshi; Funase, Arao

    2012-01-01

    In recent years, the steady-state visual evoked potential (SSVEP) has been used as a basis for brain-computer interfaces (BCI) [1]. Various feature extraction and classification techniques have been proposed to achieve SSVEP-based BCI. Feature extraction for SSVEP is performed in the frequency domain, regardless of the limitation on the flickering frequency of the visual stimulus imposed by the hardware architecture. We introduce here a feature extraction method using a spectrum intensity ratio. Results show that the detection rate reaches 84% using the spectrum intensity ratio with unsupervised classification, and indicate that the SSVEP response is enhanced by the proposed feature extraction when the second harmonic is included.
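
    The exact ratio used in the paper is not given in this abstract; the sketch below shows one plausible spectrum intensity ratio (power at the stimulus frequency and its second harmonic relative to the neighbouring bins), computed with NumPy on a synthetic trace.

```python
# Sketch of a spectrum intensity ratio in the spirit of the abstract: power at
# the stimulus frequency (and its second harmonic) relative to the power of the
# surrounding bins. The ratio definition is an assumption, not the paper's.
import numpy as np

def spectrum_intensity_ratio(eeg, fs, f_stim, half_width=1.0):
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(eeg.size, 1.0 / fs)
    ratio = 0.0
    for f in (f_stim, 2 * f_stim):            # fundamental + second harmonic
        band = np.abs(freqs - f) <= half_width
        neighbours = (np.abs(freqs - f) <= 5 * half_width) & ~band
        ratio += spectrum[band].sum() / (spectrum[neighbours].mean() + 1e-12)
    return ratio

fs, f_stim = 256.0, 12.0
t = np.arange(0, 4, 1 / fs)
eeg = 0.5 * np.sin(2 * np.pi * f_stim * t) + np.random.default_rng(4).normal(size=t.size)
print(spectrum_intensity_ratio(eeg, fs, f_stim))   # larger when an SSVEP is present
```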

  4. Interfacial tension based on-chip extraction of microparticles confined in microfluidic Stokes flows

    Science.gov (United States)

    Huang, Haishui; He, Xiaoming

    2014-10-01

    Microfluidics involving two immiscible fluids (oil and water) has been increasingly used to produce hydrogel microparticles with wide applications. However, it is difficult to extract the microparticles out of the microfluidic Stokes flows of oil that have a Reynolds number (the ratio of inertia to viscous force) much less than one, where the dominant viscous force tends to drive the microparticles to move together with the surrounding oil. Here, we present a passive method for extracting hydrogel microparticles in microfluidic Stokes flow from oil into aqueous extracting solution on-chip by utilizing the intrinsic interfacial tension between oil and the microparticles. We further reveal that the thickness of an "extended confining layer" of oil next to the interface between oil and aqueous extracting solution must be smaller than the radius of microparticles for effective extraction. This method uses a simple planar merging microchannel design that can be readily fabricated and further integrated into a fluidic system to extract microparticles for wide applications.

  5. Improved data extraction procedures for IUE Low Resolution Spectra The INES System

    CERN Document Server

    Rodríguez-Pascual, P M; Schartel, N; Wamsteker, W

    1999-01-01

    We present the extraction and processing of the IUE Low Dispersion spectra within the framework of the ESA "IUE Newly Extracted Spectra" (INES) System. Weak points of SWET, the optimal extraction implementation used to produce the NEWSIPS output products (extracted spectra), are discussed, and the procedures implemented in INES to solve these problems are outlined. The most relevant modifications are: 1) the use of a new noise model, 2) a more accurate representation of the spatial profile of the spectrum and 3) a more reliable determination of the background. The INES extraction also includes a correction for the contamination by solar light in long wavelength spectra. Examples showing the improvements obtained in INES with respect to SWET are described. Finally, the linearity and repeatability characteristics of INES data are evaluated and the validity of the errors provided in the extraction is discussed.

  6. Development of Ingredients of the Feed-stuff for Improving Immune system using Centipede grass Extracts

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Hyoungwoo; Chung, Byungyeoup; Lee, Seungsik; Lee, Sungbeom

    2013-09-15

    The purpose of this project is to provide new application areas for naturally occurring flavonoids, namely centipedegrass extracts, for improving the immune system and for use as feed-stuff ingredients. In order to demonstrate the immune-improving effects of centipedegrass, cell and animal experiments were carried out. The research scope included determining the effect of centipedegrass extracts on immune functions using LPS-induced RAW cells, where it was found that the cytokines IL-6 and IL-10, which were induced by LPS, were reduced through inhibition of STAT-3 phosphorylation, and determining the immune-stimulating activity of centipedegrass in animals, for which centipedegrass extracts were administered once a day for 2 weeks. After treatment with LPS, an immune suppressor, cytokines were down-regulated; however, the cytokines in the group pretreated with centipedegrass extracts were not down-regulated as much as in the untreated group. The overall mechanism of the immune-stimulating effect of centipedegrass extracts was that STAT-3 phosphorylation was inhibited by the centipedegrass extracts.

  7. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human error and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO®150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch® Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  8. Fish Recognition Based on Robust Features Extraction from Size and Shape Measurements Using Neural Network

    Directory of Open Access Journals (Sweden)

    Mutasem K. Alsmadi

    2010-01-01

    Full Text Available Problem statement: Image recognition is a challenging problem that researchers have been investigating for a long time, especially in recent years, due to distortion, noise, segmentation errors, overlap and occlusion of objects in digital images. Many fields are concerned with pattern recognition, for example fingerprint verification, face recognition, iris discrimination, chromosome shape discrimination, optical character recognition, texture discrimination and speech recognition. A system for recognizing an isolated pattern of interest may be an approach for dealing with such applications. Scientists and engineers with interests in image processing and pattern recognition have developed various approaches to deal with digital image recognition problems, such as neural networks, contour matching and statistics. Approach: In this study, our aim was to recognize an isolated pattern of interest in the image based on a combination of robust extracted features that depend on size and shape measurements, obtained from distance and geometrical measurements. Results: We presented a system prototype for dealing with this problem. The system starts by acquiring an image containing a pattern of fish; image feature extraction is then performed based on size and shape measurements. Our system has been applied to 20 different fish families, each family with a different number of fish types, and our sample consists of 350 distinct fish images. These images were divided into two datasets: 257 training images and 93 testing images. An overall accuracy of 86% on the test dataset was obtained using a neural network with the back-propagation algorithm. Conclusion: We developed a classifier for fish image recognition. We efficiently chose a feature extraction method to fit our demands. Our classifier successfully design and implement a
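
    As a rough stand-in for the classification stage (not the authors' network or data), the sketch below feeds synthetic size/shape measurements for 350 samples and 20 families to a small back-propagation network with scikit-learn, using a 257/93 train/test split as in the abstract.

```python
# Sketch of the classification stage: size/shape measurements fed to a small
# back-propagation neural network. Features and labels are synthetic stand-ins
# for the 350 fish images and 20 families described in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
X = rng.random((350, 10))                # e.g. lengths, widths, area ratios...
y = rng.integers(0, 20, size=350)        # 20 fish families

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=93, random_state=0)

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(30,), max_iter=2000, random_state=0))
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```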

  9. Automated concept and relationship extraction for the semi-automated ontology management (SEAM) system.

    Science.gov (United States)

    Doing-Harris, Kristina; Livnat, Yarden; Meystre, Stephane

    2015-01-01

    We develop medical-specialty specific ontologies that contain the settled science and common term usage. We leverage current practices in information and relationship extraction to streamline the ontology development process. Our system combines different text types with information and relationship extraction techniques in a low overhead modifiable system. Our SEmi-Automated ontology Maintenance (SEAM) system features a natural language processing pipeline for information extraction. Synonym and hierarchical groups are identified using corpus-based semantics and lexico-syntactic patterns. The semantic vectors we use are term frequency by inverse document frequency and context vectors. Clinical documents contain the terms we want in an ontology. They also contain idiosyncratic usage and are unlikely to contain the linguistic constructs associated with synonym and hierarchy identification. By including both clinical and biomedical texts, SEAM can recommend terms from those appearing in both document types. The set of recommended terms is then used to filter the synonyms and hierarchical relationships extracted from the biomedical corpus. We demonstrate the generality of the system across three use cases: ontologies for acute changes in mental status, Medically Unexplained Syndromes, and echocardiogram summary statements. Across the three use cases, we held the number of recommended terms relatively constant by changing SEAM's parameters. Experts seem to find more than 300 recommended terms to be overwhelming. The approval rate of recommended terms increased as the number and specificity of clinical documents in the corpus increased. It was 60% when there were 199 clinical documents that were not specific to the ontology domain and 90% when there were 2879 documents very specific to the target domain. We found that fewer than 100 recommended synonym groups were also preferred. Approval rates for synonym recommendations remained low, varying from 43% to 25% as the
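
    A toy sketch of the corpus-based semantics step is shown below: candidate terms whose TF-IDF context vectors are similar are proposed as synonym candidates. The context documents, terms and threshold are placeholders, not SEAM's actual data or pipeline.

```python
# Toy sketch of the corpus-based semantics step: terms with similar TF-IDF
# context vectors are proposed as synonym candidates.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# one pseudo-document of contexts per candidate term (placeholder text)
term_contexts = {
    "myocardial infarction": "chest pain troponin ecg st elevation",
    "heart attack": "chest pain troponin ecg ambulance",
    "echocardiogram": "ultrasound ejection fraction ventricle",
}
terms = list(term_contexts)
vectors = TfidfVectorizer().fit_transform(term_contexts.values())
sim = cosine_similarity(vectors)

threshold = 0.3
for i in range(len(terms)):
    for j in range(i + 1, len(terms)):
        if sim[i, j] >= threshold:
            print(f"synonym candidates: {terms[i]!r} ~ {terms[j]!r} ({sim[i, j]:.2f})")
```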

  10. Subpixel accuracy for extracting groove center based on corner detection

    Institute of Scientific and Technical Information of China (English)

    Liu Suyi; Wang Guorong; Shi Yonghua

    2006-01-01

    Subpixel accuracy for locating the V-groove center in robot welding is researched and a software measure to increase the accuracy of laser-based seam tracking is presented. The LoG (Laplacian of Gaussian) operator is adopted to detect image edges. The V-groove center is extracted by corner detection at curvature extrema. The subpixel position is obtained by a Lagrange polynomial interpolation algorithm. Experimental results show that the method is simple and practical, and is sufficient for the real-time requirements of robot welding with laser sensors.
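
    A minimal sketch of the subpixel refinement, assuming a synthetic groove image: the Laplacian-of-Gaussian response is taken along one scan line and the discrete peak is refined with a three-point Lagrange (parabolic) fit. This illustrates the idea only, not the authors' full corner-detection pipeline.

```python
# Sketch of the subpixel step: LoG response along one scan line, then a
# three-point Lagrange (parabolic) fit around the discrete peak.
import numpy as np
from scipy.ndimage import gaussian_laplace

def subpixel_peak(profile):
    """Refine the argmax of a 1-D profile with a 3-point parabolic fit."""
    k = int(np.argmax(profile))
    if 0 < k < profile.size - 1:
        y0, y1, y2 = profile[k - 1], profile[k], profile[k + 1]
        denom = y0 - 2 * y1 + y2
        if denom != 0:
            return k + 0.5 * (y0 - y2) / denom
    return float(k)

# synthetic groove image: a dark vertical groove on a bright background
img = np.full((64, 64), 200.0)
img[:, 30:34] -= 120.0
response = gaussian_laplace(img, sigma=2.0)
row = response[32]                         # one scan line across the groove
print(f"groove centre near column {subpixel_peak(row):.2f}")
```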

  11. Hardware Prototyping of Neural Network based Fetal Electrocardiogram Extraction

    Science.gov (United States)

    Hasan, M. A.; Reaz, M. B. I.

    2012-01-01

    The aim of this paper is to model the algorithm for Fetal ECG (FECG) extraction from composite abdominal ECG (AECG) using VHDL (Very High Speed Integrated Circuit Hardware Description Language) for FPGA (Field Programmable Gate Array) implementation. Artificial Neural Network that provides efficient and effective ways of separating FECG signal from composite AECG signal has been designed. The proposed method gives an accuracy of 93.7% for R-peak detection in FHR monitoring. The designed VHDL model is synthesized and fitted into Altera's Stratix II EP2S15F484C3 using the Quartus II version 8.0 Web Edition for FPGA implementation.

  12. Sustainability Assessment of an Urban Economic System Based on the Extractive Petroleum Industry: A Case Study of Daqing City

    Institute of Scientific and Technical Information of China (English)

    苏飞; 张平宇

    2009-01-01

    The triangle model, as an intuitive platform for illustrating sustainability status and trends in economic development, seems to hold promise as an analytical management tool that can be used by researchers and policy-makers at different levels, owing to its simplicity and flexibility. Based on the interrelationships among economic development, resource-energy consumption and environmental pollution, in conjunction with ecological performance and the triangle method, a definition of economic system sustainability (ESS) was proposed, and a novel triangle method was designed to evaluate economic system sustainability. This article selected Daqing city, an important industrial base of the petroleum chemical industry in China, as the research region. As a case study, the triangle method was applied to assess the economic system sustainability of Daqing city. Daqing city is located in the western part of Heilongjiang province and covers a total area of 2.1219 million hm², with four counties and five districts under its administration; it had a total population of 2.693 million in 2006. The results show that the economic system sustainability of Daqing during 1991-2006 performed poorly overall. The ESS of Daqing between 1991 and 1995 reveals a common sustainable trend, while that from 1996 to 2000 demonstrates a relatively strong sustainable trend, and that during 2001-2006 again shows a common sustainable trend.

  13. Extracting long basic sequences from systems of dispersed vectors

    CERN Document Server

    Talponen, Jarno

    2010-01-01

    We study Banach spaces satisfying some geometric or structural properties involving tightness of transfinite sequences of nested linear subspaces. These properties are much weaker than WCG and closely related to Corson's property (C). Given a transfinite sequence of normalized vectors, which is dispersed or null in some sense, we extract a subsequence which is a biorthogonal sequence, or even a weakly null monotone basic sequence, depending on the setting. The Separable Complementation Property is established for spaces with an M-basis under rather weak geometric properties. We also consider an analogy of the Baire Category Theorem for the lattice of closed linear subspaces.

  14. Evaluation of DNA Extraction Methods Suitable for PCR-based Detection and Genotyping of Clostridium botulinum

    DEFF Research Database (Denmark)

    Auricchio, Bruna; Anniballi, Fabrizio; Fiore, Alfonsina

    2013-01-01

    Sufficient quality and quantity of extracted DNA is critical to detecting and genotyping Clostridium botulinum by means of PCR-based methods. An ideal extraction method has to optimize DNA yield, minimize DNA degradation, allow multiple samples to be extracted, and be efficient in terms of cost, time, labor, and supplies. Eleven botulinum toxin–producing clostridia strains and 25 samples (10 food, 13 clinical, and 2 environmental samples) naturally contaminated with botulinum toxin–producing clostridia were used to compare 4 DNA extraction procedures: the Chelex® 100 matrix, Phenol-Chloroform-Isoamyl alcohol, the NucliSENS® magnetic extraction kit, and the DNeasy® Blood & Tissue kit. Integrity, purity, and amount of amplifiable DNA were evaluated. The results show that the DNeasy® Blood & Tissue kit is the best extraction method evaluated because it provided the most pure, intact, and amplifiable DNA. However...

  15. Fusion of Pixel-based and Object-based Features for Road Centerline Extraction from High-resolution Satellite Imagery

    Directory of Open Access Journals (Sweden)

    CAO Yungang

    2016-10-01

    Full Text Available A novel approach for road centerline extraction from high spatial resolution satellite imagery is proposed by fusing both pixel-based and object-based features. Firstly, texture and shape features are extracted at the pixel level, and spectral features are extracted at the object level based on multi-scale image segmentation maps. Then, the extracted multiple features are utilized in the fusion framework of Dempster-Shafer evidence theory to roughly identify the road network regions. Finally, an automatic noise removing algorithm combined with a tensor voting strategy is presented to accurately extract the road centerline. Experimental results using high-resolution satellite imagery with different scenes and spatial resolutions showed that the proposed approach compared favorably with the traditional methods, particularly in eliminating salt noise and the conglutination phenomenon.
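
    The Dempster-Shafer fusion step can be illustrated with the short sketch below, which combines two mass functions (a pixel-level cue and an object-level cue) over the frame {road, nonroad} with Dempster's rule; the mass values are illustrative only, not taken from the paper.

```python
# Sketch of Dempster's combination rule for two sources of evidence over the
# frame {road, nonroad}. Mass values are illustrative placeholders.
from itertools import product

def combine(m1, m2):
    """Dempster's rule; focal elements are frozensets, masses sum to 1."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    # normalize by the non-conflicting mass
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

ROAD, NONROAD = frozenset({"road"}), frozenset({"nonroad"})
THETA = ROAD | NONROAD                     # full frame, i.e. ignorance

pixel_cue = {ROAD: 0.6, NONROAD: 0.1, THETA: 0.3}    # texture/shape evidence
object_cue = {ROAD: 0.5, NONROAD: 0.2, THETA: 0.3}   # spectral evidence
fused = combine(pixel_cue, object_cue)
print({tuple(sorted(k)): round(v, 3) for k, v in fused.items()})
```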

  16. Extracting DEM from airborne X-band data based on PolInSAR

    Science.gov (United States)

    Hou, X. X.; Huang, G. M.; Zhao, Z.

    2015-06-01

    Polarimetric Interferometric Synthetic Aperture Radar (PolInSAR) is a new trend in SAR remote sensing technology that combines polarimetric multichannel information with interferometric information. It is of great significance for extracting DEMs in regions where DEM precision is low, such as vegetation-covered areas and areas with concentrated buildings. In this paper we describe our experiments with high-resolution X-band fully polarimetric SAR data acquired by a dual-baseline interferometric airborne SAR system over an area of Danling in southern China. The Pauli algorithm is used to generate the dual-polarimetric interferometry data, and the Singular Value Decomposition (SVD), Numerical Radius (NR) and Phase Diversity (PD) methods are used to generate the fully polarimetric interferometry data. We then make use of the polarimetric interferometric information to extract the DEM, with processing consisting of pre-filtering, image registration, image resampling, coherence optimization, multilook processing, flat-earth removal, interferogram filtering, phase unwrapping, parameter calibration, height derivation and geo-coding. The processing system, named SARPlore, has been developed in VC++ under the lead of the Chinese Academy of Surveying and Mapping. Finally, comparing the optimization results with single-polarimetric interferometry, it has been observed that the optimization methods can reduce the interferometric noise and the phase unwrapping residuals, and improve the precision of the DEM. The result of fully polarimetric interferometry is better than that of dual-polarimetric interferometry. Meanwhile, over different terrain, the result of fully polarimetric interferometry shows different degrees of improvement.

  17. Domain XML semantic integration based on extraction rules and ontology mapping

    Directory of Open Access Journals (Sweden)

    Huayu LI

    2016-08-01

    Full Text Available A large number of XML documents exist in the petroleum engineering field, but traditional XML integration solutions cannot provide semantic query, which leads to low data-use efficiency. To meet the semantic integration and query requirements of WeXML (oil & gas well XML) data, this paper proposes a semantic integration method based on extraction rules and ontology mapping. The method first defines a series of extraction rules that map elements and attributes of the WeXML Schema to classes and properties of the WeOWL ontology, respectively; an algorithm then transforms WeXML documents into WeOWL instances. Because WeOWL provides only limited semantics, mappings between the two ontologies are then built to express the classes and properties of a global ontology in terms of WeOWL, so that semantic queries based on the global domain-concept model can be supported. A WeXML semantic-integration prototype system is constructed to test the proposed extraction rules, transformation algorithm, and ontology mappings.
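
    The element-to-instance transformation described above can be sketched with a small example. The namespace URI, element names, and mapping table below are hypothetical placeholders, not the paper's actual WeXML Schema or WeOWL ontology; the sketch only illustrates the general shape of an extraction-rule-driven XML-to-ontology transformation using rdflib.

      # Hedged sketch: hypothetical rules mapping XML elements to OWL terms.
      import xml.etree.ElementTree as ET
      from rdflib import Graph, Namespace, Literal, RDF, URIRef

      WE = Namespace("http://example.org/weowl#")  # placeholder namespace

      # Hypothetical extraction rules: element name -> (OWL class, datatype property)
      RULES = {
          "well":  (WE.Well, None),
          "depth": (None, WE.hasDepth),
          "field": (None, WE.locatedInField),
      }

      def xml_to_owl_instances(xml_text):
          """Transform a WeXML-like document into ontology instances."""
          graph = Graph()
          graph.bind("we", WE)
          root = ET.fromstring(xml_text)
          for well in root.iter("well"):
              subject = URIRef(WE + well.attrib.get("id", "unknown"))
              graph.add((subject, RDF.type, RULES["well"][0]))
              for child in well:
                  rule = RULES.get(child.tag)
                  if rule and rule[1] is not None:
                      graph.add((subject, rule[1], Literal(child.text)))
          return graph

      sample = "<wells><well id='W1'><depth>2500</depth><field>FieldA</field></well></wells>"
      print(xml_to_owl_instances(sample).serialize(format="turtle"))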

  18. TESVE model for design of soil vapor extraction systems with thermal enhancement

    Energy Technology Data Exchange (ETDEWEB)

    Ghuman, A. [Lowney Associates, Mountain View, CA (United States); Wong, K. [Air Force, McClellan AFB, CA (United States); Singh, S. [URS Consultants, Inc., Sacramento, CA (United States)

    1994-12-31

    Soil vapor extraction (SVE) is a popular and effective technology for the removal of volatile organic compounds (VOCs) from subsurface soils. The performance of SVE systems is characterized by three key parameters: the rate of mass removal, the time required to achieve cleanup goals, and the cost of cleanup. These performance parameters depend on physical and chemical factors such as the rate and pattern of air flow through the affected soils, the contaminant type, and the degree of partitioning among the vapor, liquid, dissolved, and adsorbed phases. The effectiveness of SVE can be enhanced by raising the soil temperature, using methods such as electrical heating and hot-air volatilization. TESVE (Thermally-Enhanced Soil Vapor Extraction), a multi-component, non-isothermal, three-dimensional software model, is a powerful tool for evaluating the feasibility of SVE, optimizing design, predicting performance, and, ultimately, reducing cleanup costs. The TESVE model was run for an SVE site at McClellan Air Force Base, California, where four SVE design scenarios were modeled for the removal of trichloroethylene (TCE) from the subsurface soil.
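
    The phase partitioning that drives SVE performance can be illustrated with standard linear partitioning relations (Henry's law for the vapor phase and a linear sorption isotherm for the adsorbed phase). The sketch below is not the TESVE model; all parameter values are hypothetical and only loosely representative of TCE in a sandy soil. It simply shows why raising soil temperature, which raises the Henry's constant, shifts mass into the extractable vapor phase.

      # Equilibrium phase partitioning in soil; parameter values are hypothetical.
      def total_soil_concentration(c_water, henry_kh, kd, theta_w, theta_a, rho_b):
          """Total contaminant mass per unit soil volume (mg/L of soil).

          c_water  : dissolved concentration in pore water (mg/L)
          henry_kh : dimensionless Henry's law constant (C_vapor = Kh * C_water)
          kd       : soil-water partition coefficient (L/kg)
          theta_w  : volumetric water content (-)
          theta_a  : air-filled porosity (-)
          rho_b    : soil bulk density (kg/L)
          """
          c_vapor = henry_kh * c_water   # vapor phase (mg/L of gas)
          c_sorbed = kd * c_water        # sorbed phase (mg/kg of soil)
          return theta_w * c_water + theta_a * c_vapor + rho_b * c_sorbed

      # Warming the soil raises Kh, increasing the vapor-phase (extractable) fraction.
      for kh in (0.4, 0.8):  # hypothetical Henry's constants at two temperatures
          total = total_soil_concentration(c_water=10.0, henry_kh=kh, kd=0.2,
                                           theta_w=0.2, theta_a=0.2, rho_b=1.6)
          vapor_fraction = (0.2 * kh * 10.0) / total
          print(f"Kh={kh}: vapor-phase fraction = {vapor_fraction:.2f}")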

  19. Neutralization of local and systemic toxicity of Daboia russelii venom by Morus alba plant leaf extract.

    Science.gov (United States)

    Chandrashekara, K T; Nagaraju, S; Nandini, S Usha; Kemparaju, K

    2009-08-01

    Antivenom therapy is currently the best available treatment for fatal snake envenomation. However, antivenom offers little or no protection against local effects such as extensive edema, hemorrhage, dermo- and myonecrosis, and inflammation at the envenomed region. Viperidae snakes are well known for their severe local effects, and in rural India such effects are commonly treated with plant extracts without any scientific validation. In this investigation, Morus alba leaf extract was studied against the local and systemic effects induced by Indian Vipera/Daboia russelii venom. The extract completely abolished the in vitro proteolytic and hyaluronolytic activities of the venom. Edema, hemorrhage, and myonecrotic activities were also neutralized efficiently. In addition, the extract partially inhibited the pro-coagulant activity and completely abolished the degradation of the Aα chain of human fibrinogen. Thus, the extract possesses potent antisnake venom properties, especially against the local and systemic effects of Daboia russelii venom.

  20. Simulation of H- ion source extraction systems for the Spallation Neutron Source with Ion Beam Simulator.

    Science.gov (United States)

    Kalvas, T; Welton, R F; Tarvainen, O; Han, B X; Stockli, M P

    2012-02-01

    IBSimu, a three-dimensional ion optics code being developed at the University of Jyväskylä, features positive- and negative-ion plasma extraction models and self-consistent space-charge calculation. The code has been used to model the existing extraction system of the H(-) ion source of the Spallation Neutron Source, and the simulation results are in good agreement with experimental data. A high-current extraction system with downstream electron dumping at intermediate energy has been designed; according to the simulations, it provides lower emittance than the baseline system at H(-) currents exceeding 40 mA. A magnetic low-energy beam transport section consisting of two solenoids has been designed to transport the beam from the alternative electrostatic extraction systems to the radio-frequency quadrupole.
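
    As a rough, generic illustration of the space-charge limit that shapes such extraction-system designs, the sketch below evaluates the planar Child-Langmuir law for a hypothetical extraction gap. It is not based on IBSimu or on the SNS source geometry; the voltage, gap, and aperture radius are invented for illustration, and the result is only an order-of-magnitude upper bound on the extractable current.

      # Planar Child-Langmuir estimate; geometry and voltage are hypothetical.
      import math

      EPS0 = 8.8541878128e-12          # vacuum permittivity (F/m)
      E_CHARGE = 1.602176634e-19       # elementary charge (C)
      M_HMINUS = 1.6744e-27            # approximate H- ion mass (kg)

      def child_langmuir_current_density(voltage, gap):
          """J = (4/9) * eps0 * sqrt(2q/m) * V^1.5 / d^2, in A/m^2."""
          return (4.0 / 9.0) * EPS0 * math.sqrt(2.0 * E_CHARGE / M_HMINUS) \
              * voltage ** 1.5 / gap ** 2

      # Hypothetical extraction gap: 25 kV across 8 mm, 4 mm aperture radius
      j = child_langmuir_current_density(voltage=25e3, gap=8e-3)
      aperture_area = math.pi * (4e-3) ** 2
      print(f"space-charge-limited current ~ {j * aperture_area * 1e3:.1f} mA")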