WorldWideScience

Sample records for rule extraction methods

  1. Portable Rule Extraction Method for Neural Network Decisions Reasoning

    Directory of Open Access Journals (Sweden)

    Darius PLIKYNAS

    2005-08-01

    Full Text Available Neural network (NN) methods are sometimes useless in practical applications because they are not properly tailored to the particular market's needs. We focus hereinafter specifically on financial market applications. NNs have not gained full acceptance here yet. One of the main reasons is the "Black Box" problem (the lack of explanatory power for NN decisions). There are, though, some NN decision rule extraction methods, such as decompositional, pedagogical or eclectic approaches, but they suffer from low portability of the rule extraction technique across various neural net architectures, a high level of granularity, the algorithmic sophistication of the rule extraction technique, etc. The authors propose to eliminate some known drawbacks using an innovative extension of the pedagogical approach. The idea is exposed through the use of a widespread MLP neural net (a common tool in the financial problems' domain) and a SOM (for input data space clusterization). The performance of both nets is related and targeted through the iteration cycle by achieving the best match between the decision space fragments and the input data space clusters. Three sets of rules are generated algorithmically or by fuzzy membership functions. Empirical validation on common financial benchmark problems is conducted with an appropriately prepared software solution.
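
    To make the pedagogical idea above concrete, the following Python sketch treats a trained MLP as a black-box oracle and fits a small decision tree to its predictions; the synthetic dataset, surrogate depth and use of scikit-learn are illustrative assumptions, not the authors' MLP/SOM setup.

      # Pedagogical rule extraction sketch: query a trained MLP as an oracle
      # and fit an interpretable surrogate (decision tree) to its answers.
      from sklearn.datasets import make_classification
      from sklearn.neural_network import MLPClassifier
      from sklearn.tree import DecisionTreeClassifier, export_text

      X, y = make_classification(n_samples=500, n_features=6, random_state=0)

      mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
      mlp.fit(X, y)                      # the "black box"

      oracle_labels = mlp.predict(X)     # ask the network, not the true labels

      surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
      surrogate.fit(X, oracle_labels)    # the tree's rules approximate the network

      print(export_text(surrogate, feature_names=[f"x{i}" for i in range(6)]))
      fidelity = (surrogate.predict(X) == oracle_labels).mean()
      print(f"fidelity to the network: {fidelity:.3f}")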

  2. Rules Extraction with an Immune Algorithm

    Directory of Open Access Journals (Sweden)

    Deqin Yan

    2007-12-01

    Full Text Available In this paper, a method for extracting rules from information systems with immune algorithms is proposed. The design of the immune algorithm is based on a sharing mechanism for rule extraction. The principle of sharing and competing for resources in the sharing mechanism is consistent with the relationship of sharing and rivalry among rules. In order to extract rules efficiently, a new concept of flexible confidence and rule measurement is introduced. Experiments demonstrate that the proposed method is effective.

  3. Getting Objects Methods and Interactions by Extracting Business Rules from Legacy Systems

    Directory of Open Access Journals (Sweden)

    Omar El Beggar

    2014-08-01

    Full Text Available Over the years, the maintenance of legacy systems becomes extremely complex and highly expensive due to the incessant changes in company activities and policies. In this case, a new or an improved system must replace the previous one. However, replacing those systems completely from scratch is also very expensive and represents a huge risk. The optimal scenario is evolving those systems by profiting from the valuable knowledge embedded in them. This paper aims to present an approach for knowledge acquisition from existing legacy systems by extracting business rules from source code. In fact, the business rules are extracted and attached to the domain entities in order to generate object methods and interactions in an object-oriented platform. Furthermore, a translation of the rules into natural language is given. The aim is to advance a solution for re-engineering legacy systems, minimize the cost of their modernization, and keep the gap between the company business and the renovated systems very small.

  4. Idioms-based Business Rule Extraction

    NARCIS (Netherlands)

    R Smit (Rob)

    2011-01-01

    This thesis studies the extraction of embedded business rules, using the idioms of the framework in use to identify them. Embedded business rules exist as source code in the software system, and knowledge about them may get lost. Extraction of those business rules could make them accessible

  5. Extraction of Static and Dynamic Reservoir Operation Rules by Genetic Programming

    Directory of Open Access Journals (Sweden)

    Habib Akbari Alashti

    2014-11-01

    Full Text Available Considering the necessity of desirable operation of limited water resources and the significant role of dams in controlling and consuming surface waters highlights the advantage of suitable operation rules for the optimal and sustainable operation of dams. This study investigates the hydroelectric supply of the one-reservoir Karoon3 system using nonlinear programming (NLP), a genetic algorithm (GA), genetic programming (GP) and fixed-length gene GP (FLGGP) in the real-time operation of the dam, considering two approaches of static and dynamic operation rules. In the static operation rule, only one rule curve is extracted for all months in a year, whereas in the dynamic operation rule, monthly rule curves (12 rules) are extracted, one for each month of the year. In addition, nonlinear decision rule (NLDR) curves are considered, and the total deficiency function is used as the target (objective) function for evaluating the performance of each method and approach. Results show the appropriate efficiency of the GP and FLGGP methods in extracting operation rules in both approaches. The superiority of these methods over the operation rules yielded by GA and NLP is about 5%. Moreover, according to the results, the FLGGP method is an alternative to the GP method where the GP method cannot be used due to its limitations. Comparison of the two approaches of static and dynamic operation rules demonstrated the superiority of the dynamic operation rule over the static operation rule (about 10%), and therefore this method has more capabilities in the real-time operation of reservoir systems.

  6. Measures of Ruleset Quality for General Rules Extraction Methods

    Czech Academy of Sciences Publication Activity Database

    Holeňa, Martin

    2009-01-01

    Roč. 50, č. 6 (2009), s. 867-879 ISSN 0888-613X R&D Projects: GA ČR GA201/08/0802 Institutional research plan: CEZ:AV0Z10300504 Keywords : rules extraction from data * quality measures * ruleset measures * ROC curves * observational logic * fuzzy logic Subject RIV: IN - Informatics, Computer Science Impact factor: 2.090, year: 2009

  7. Rule Extraction from Support Vector Machines: A Geometric Approach

    OpenAIRE

    Ren, L.

    2008-01-01

    Despite the success of connectionist systems in prediction and classification problems, critics argue that the lack of symbol processing and explanation capability makes them less competitive than symbolic systems. Rule extraction from neural networks makes the interpretation of the behaviour of connectionist networks possible by relating sub-symbolic and symbolic processing. However, most rule extraction methods focus only on specific neural network architectures and present limited generali...

  8. Use of a Recursive-Rule eXtraction algorithm with J48graft to achieve highly accurate and concise rule extraction from a large breast cancer dataset

    Directory of Open Access Journals (Sweden)

    Yoichi Hayashi

    Full Text Available To assist physicians in the diagnosis of breast cancer and thereby improve survival, a highly accurate computer-aided diagnostic system is necessary. Although various machine learning and data mining approaches have been devised to increase diagnostic accuracy, most current methods are inadequate. The recently developed Recursive-Rule eXtraction (Re-RX) algorithm provides a hierarchical, recursive consideration of discrete variables prior to analysis of continuous data, and can generate classification rules that have been trained on the basis of both discrete and continuous attributes. The objective of this study was to extract highly accurate, concise, and interpretable classification rules for diagnosis using the Re-RX algorithm with J48graft, a class for generating a grafted C4.5 decision tree. We used the Wisconsin Breast Cancer Dataset (WBCD). Nine research groups provided 10 kinds of highly accurate concrete classification rules for the WBCD. We compared the accuracy and characteristics of the rule set for the WBCD generated using the Re-RX algorithm with J48graft with five rule sets obtained using 10-fold cross validation (CV). We trained the WBCD using the Re-RX algorithm with J48graft and report the average classification accuracies of 10 runs of 10-fold CV for the training and test datasets, the number of extracted rules, and the average number of antecedents. Compared with other rule extraction algorithms, the Re-RX algorithm with J48graft resulted in a lower average number of rules for diagnosing breast cancer, which is a substantial advantage. It also provided the lowest average number of antecedents per rule. These features are expected to greatly aid physicians in making accurate and concise diagnoses for patients with breast cancer. Keywords: Breast cancer diagnosis, Rule extraction, Re-RX algorithm, J48graft, C4.5
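
    The following Python sketch illustrates, under stated assumptions, the two-stage recursive idea behind Re-RX: split on discrete attributes first, then refine only the unresolved subsets with continuous attributes. DecisionTreeClassifier stands in for J48graft, and the data, tree depths and purity threshold are invented placeholders rather than the published algorithm.

      # Simplified sketch of the Re-RX style recursion (not the published algorithm).
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      def recursive_rules(X_disc, X_cont, y, min_purity=0.95):
          # stop when one class already dominates the subset
          if max(np.mean(y == c) for c in np.unique(y)) >= min_purity:
              return {"leaf": int(np.bincount(y).argmax())}
          # stage 1: a shallow tree over the discrete attributes only
          disc_tree = DecisionTreeClassifier(max_depth=1).fit(X_disc, y)
          leaves = disc_tree.apply(X_disc)
          node = {"discrete_tree": disc_tree, "children": {}}
          for leaf in np.unique(leaves):
              idx = leaves == leaf
              if np.unique(y[idx]).size == 1:
                  node["children"][int(leaf)] = {"leaf": int(y[idx][0])}
              else:
                  # stage 2: refine the unresolved subset with continuous attributes
                  cont_tree = DecisionTreeClassifier(max_depth=2).fit(X_cont[idx], y[idx])
                  node["children"][int(leaf)] = {"continuous_tree": cont_tree}
          return node

      rng = np.random.default_rng(0)
      X_disc = rng.integers(0, 2, size=(200, 3))   # discrete attributes
      X_cont = rng.normal(size=(200, 4))           # continuous attributes
      y = ((X_disc[:, 0] == 1) & (X_cont[:, 0] > 0)).astype(int)
      print(recursive_rules(X_disc, X_cont, y))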

  9. Rule Extracting based on MCG with its Application in Helicopter Power Train Fault Diagnosis

    International Nuclear Information System (INIS)

    Wang, M; Hu, N Q; Qin, G J

    2011-01-01

    In order to extract decision rules for fault diagnosis from incomplete historical test records for knowledge-based damage assessment of helicopter power train structures, a method that can directly extract the optimal generalized decision rules from incomplete information based on granular computing (GrC) was proposed. Based on a semantic analysis of unknown attribute values, the granule was extended to handle incomplete information. The maximum characteristic granule (MCG) was defined based on the characteristic relation, and the MCG was used to construct the resolution function matrix. The optimal general decision rule was introduced, and with the basic equivalent forms of propositional logic, the rules were extracted and reduced from the incomplete information table. Combined with a fault diagnosis example of a power train, the application approach of the method was presented, and the validity of this method in knowledge acquisition was proved.

  10. Rule Extracting based on MCG with its Application in Helicopter Power Train Fault Diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Wang, M; Hu, N Q; Qin, G J, E-mail: hnq@nudt.edu.cn, E-mail: wm198063@yahoo.com.cn [School of Mechatronic Engineering and Automation, National University of Defense Technology, ChangSha, Hunan, 410073 (China)

    2011-07-19

    In order to extract decision rules for fault diagnosis from incomplete historical test records for knowledge-based damage assessment of helicopter power train structures, a method that can directly extract the optimal generalized decision rules from incomplete information based on granular computing (GrC) was proposed. Based on a semantic analysis of unknown attribute values, the granule was extended to handle incomplete information. The maximum characteristic granule (MCG) was defined based on the characteristic relation, and the MCG was used to construct the resolution function matrix. The optimal general decision rule was introduced, and with the basic equivalent forms of propositional logic, the rules were extracted and reduced from the incomplete information table. Combined with a fault diagnosis example of a power train, the application approach of the method was presented, and the validity of this method in knowledge acquisition was proved.

  11. Domain XML semantic integration based on extraction rules and ontology mapping

    Directory of Open Access Journals (Sweden)

    Huayu LI

    2016-08-01

    Full Text Available Plenty of XML documents exist in the petroleum engineering field, but traditional XML integration solutions cannot provide semantic queries, which leads to low data use efficiency. In light of the WeXML (oil & gas well XML) data semantic integration and query requirements, this paper proposes a semantic integration method based on extraction rules and ontology mapping. The method firstly defines a series of extraction rules with which elements and properties of the WeXML Schema are mapped to classes and properties in the WeOWL ontology, respectively; secondly, an algorithm is used to transform WeXML documents into WeOWL instances. Because WeOWL provides limited semantics, ontology mappings between the two ontologies are then built to explain the classes and properties of the global ontology with terms of WeOWL, and semantic query based on the global domain concept model is provided. By constructing a WeXML data semantic integration prototype system, the proposed transformation rules, transfer algorithm and mapping rules are tested.
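
    A minimal sketch of the rule-driven XML-to-ontology transformation step is given below; the element tags, class names and rule table are hypothetical placeholders, not the actual WeXML Schema or WeOWL ontology.

      # Sketch of rule-driven XML-to-ontology transformation: each extraction rule
      # maps an XML element tag to an ontology class and its attributes to properties.
      # The tag/class names below are invented placeholders.
      import xml.etree.ElementTree as ET

      EXTRACTION_RULES = {
          "well":    {"class": "onto:Well",    "props": {"name": "onto:hasName"}},
          "segment": {"class": "onto:Segment", "props": {"depth": "onto:hasDepth"}},
      }

      def xml_to_triples(xml_text):
          triples = []
          root = ET.fromstring(xml_text)
          for elem in root.iter():
              rule = EXTRACTION_RULES.get(elem.tag)
              if rule is None:
                  continue
              subject = f"inst:{elem.tag}_{id(elem)}"          # simple instance IRI
              triples.append((subject, "rdf:type", rule["class"]))
              for attr, prop in rule["props"].items():
                  if attr in elem.attrib:
                      triples.append((subject, prop, elem.attrib[attr]))
          return triples

      doc = '<field><well name="W-1"><segment depth="1250"/></well></field>'
      for t in xml_to_triples(doc):
          print(t)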

  12. The linear rule of metal extraction

    International Nuclear Information System (INIS)

    Han Li; Yushuang Wang; Zhichun Chen; Shulan Meng

    1988-01-01

    On the basis of experimental results, a linear rule for the solvent extraction of rare earths and yttrium over a definite range of acidity and metal ion concentration is found. Relations between the coefficients of the linear rule and the initial acidity are presented for the extraction systems HEH(EHP)-kerosene-HNO3-R(NO3)3, HEH(EHP)-kerosene-HCl-RCl3, D2EHPA-n-heptane-HCl-RCl3 and HEH(EHP)-n-heptane-HCl-RCl3, where R=La-Nd, Sm-Lu, Y; HEH(EHP)=mono(2-ethyl-hexyl) 2-ethyl-hexyl phosphonate; D2EHPA=di(2-ethyl-hexyl) phosphoric acid. 3 refs.; 1 tab

  13. Sleep promotes the extraction of grammatical rules.

    Directory of Open Access Journals (Sweden)

    Ingrid L C Nieuwenhuis

    Full Text Available Grammar acquisition is a high level cognitive function that requires the extraction of complex rules. While it has been proposed that offline time might benefit this type of rule extraction, this remains to be tested. Here, we addressed this question using an artificial grammar learning paradigm. During a short-term memory cover task, eighty-one human participants were exposed to letter sequences generated according to an unknown artificial grammar. Following a time delay of 15 min, 12 h (wake or sleep), or 24 h, participants classified novel test sequences as Grammatical or Non-Grammatical. Previous behavioral and functional neuroimaging work has shown that classification can be guided by two distinct underlying processes: (1) the holistic abstraction of the underlying grammar rules and (2) the detection of sequence chunks that appear at varying frequencies during exposure. Here, we show that classification performance improved after sleep. Moreover, this improvement was due to an enhancement of rule abstraction, while the effect of chunk frequency was unaltered by sleep. These findings suggest that sleep plays a critical role in extracting complex structure from separate but related items during integrative memory processing. Our findings stress the importance of alternating periods of learning with sleep in settings in which complex information must be acquired.

  14. TEMPTING system: a hybrid method of rule and machine learning for temporal relation extraction in patient discharge summaries.

    Science.gov (United States)

    Chang, Yung-Chun; Dai, Hong-Jie; Wu, Johnny Chi-Yang; Chen, Jian-Ming; Tsai, Richard Tzong-Han; Hsu, Wen-Lian

    2013-12-01

    Patient discharge summaries provide detailed medical information about individuals who have been hospitalized. To make a precise and legitimate assessment of the abundant data, a proper time layout of the sequence of relevant events should be compiled and used to drive a patient-specific timeline, which could further assist medical personnel in making clinical decisions. The process of identifying the chronological order of entities is called temporal relation extraction. In this paper, we propose a hybrid method to identify appropriate temporal links between a pair of entities. The method combines two approaches: one is rule-based and the other is based on the maximum entropy model. We develop an integration algorithm to fuse the results of the two approaches. All rules and the integration algorithm are formally stated so that one can easily reproduce the system and results. To optimize the system's configuration, we used the 2012 i2b2 challenge TLINK track dataset and applied threefold cross validation to the training set. Then, we evaluated its performance on the training and test datasets. The experiment results show that the proposed TEMPTING (TEMPoral relaTion extractING) system (ranked seventh) achieved an F-score of 0.563, which was at least 30% better than that of the baseline system, which randomly selects TLINK candidates from all pairs and assigns the TLINK types. The TEMPTING system using the hybrid method also outperformed the stage-based TEMPTING system. Its F-scores were 3.51% and 0.97% better than those of the stage-based system on the training set and test set, respectively. Copyright © 2013 Elsevier Inc. All rights reserved.
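
    As a generic illustration of such a hybrid design (not the TEMPTING integration algorithm itself), the sketch below lets a high-precision hand-written rule decide whenever it fires and falls back to a statistical classifier otherwise; scikit-learn's logistic regression stands in for the maximum entropy model, and the features and labels are toy values.

      # Generic hybrid rule + statistical temporal-relation classifier sketch.
      from sklearn.linear_model import LogisticRegression  # multinomial logistic ~ MaxEnt

      def rule_based(pair):
          # Placeholder rule: an event explicitly dated before the admission date.
          if pair.get("explicit_order") == "before_admission":
              return "BEFORE"
          return None  # rule does not apply

      def hybrid_predict(pair, features, maxent):
          rule_label = rule_based(pair)
          if rule_label is not None:
              return rule_label                     # trust the high-precision rule
          return maxent.predict([features])[0]      # fall back to the statistical model

      # toy training data: 2 features, labels BEFORE/OVERLAP/AFTER
      X = [[0.9, 0.1], [0.2, 0.8], [0.5, 0.5], [0.1, 0.9]]
      y = ["BEFORE", "AFTER", "OVERLAP", "AFTER"]
      maxent = LogisticRegression(max_iter=1000).fit(X, y)

      print(hybrid_predict({"explicit_order": "before_admission"}, [0.3, 0.7], maxent))
      print(hybrid_predict({}, [0.3, 0.7], maxent))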

  15. Horizontal and Vertical Rule Bases Method in Fuzzy Controllers

    Directory of Open Access Journals (Sweden)

    Sadegh Aminifar

    2013-01-01

    Full Text Available The concept of horizontal and vertical rule bases is introduced. Using this method enables designers to look for the main behaviors of a system and describe them with greater approximation. The rules which describe the system in the first stage are called the horizontal rule base. In the second stage, the designer modulates the obtained surface by describing the changes needed on the first surface for handling the real behaviors of the system. The rules used in the second stage are called the vertical rule base. The horizontal and vertical rule bases method plays a great role in easing the extraction of the optimum control surface, using far fewer rules than traditional fuzzy systems. This research involves the control of a system with high nonlinearity that is difficult to model with classical methods. As a case study for testing the proposed method in real conditions, the designed controller is applied to a steaming room with uncertain data and variable parameters. A comparison between PID and traditional fuzzy counterparts and our proposed system shows that our proposed system outperforms PID and traditional fuzzy systems in terms of the number of valve switchings and better surface following. The evaluations were done with both model simulation and a DSP implementation.

  16. Extracting pronunciation rules for phonemic variants

    CSIR Research Space (South Africa)

    Davel, M

    2006-04-01

    Full Text Available Various automated techniques can be used to generalise from phonemic lexicons through the extraction of grapheme-to-phoneme rule sets. These techniques are particularly useful when developing pronunciation models for previously unmodelled languages...

  17. Use of the recursive-rule extraction algorithm with continuous attributes to improve diagnostic accuracy in thyroid disease

    Directory of Open Access Journals (Sweden)

    Yoichi Hayashi

    Full Text Available Thyroid diseases, which often lead to thyroid dysfunction involving either hypo- or hyperthyroidism, affect hundreds of millions of people worldwide, many of whom remain undiagnosed; however, diagnosis is difficult because symptoms are similar to those seen in a number of other conditions. The objective of this study was to assess the effectiveness of the Recursive-Rule Extraction (Re-RX) algorithm with continuous attributes (Continuous Re-RX) in extracting highly accurate, concise, and interpretable classification rules for the diagnosis of thyroid disease. We used the 7200-sample Thyroid dataset from the University of California Irvine Machine Learning Repository, a large and highly imbalanced dataset that comprises both discrete and continuous attributes. We trained the dataset using Continuous Re-RX, and after obtaining the maximum training and test accuracies, the number of extracted rules, and the average number of antecedents, we compared the results with those of other extraction methods. Our results suggested that Continuous Re-RX not only achieved the highest accuracy for diagnosing thyroid disease compared with the other methods, but also provided simple, concise, and interpretable rules. Based on these results, we believe that the use of Continuous Re-RX in machine learning may assist healthcare professionals in the diagnosis of thyroid disease. Keywords: Thyroid disease diagnosis, Re-RX algorithm, Rule extraction, Decision tree

  18. Rule Extraction Based on Extreme Learning Machine and an Improved Ant-Miner Algorithm for Transient Stability Assessment.

    Directory of Open Access Journals (Sweden)

    Yang Li

    Full Text Available In order to overcome the problems of poor understandability of the pattern recognition-based transient stability assessment (PRTSA) methods, a new rule extraction method based on extreme learning machine (ELM) and an improved Ant-miner (IAM) algorithm is presented in this paper. First, the basic principles of ELM and Ant-miner algorithm are respectively introduced. Then, based on the selected optimal feature subset, an example sample set is generated by the trained ELM-based PRTSA model. And finally, a set of classification rules are obtained by IAM algorithm to replace the original ELM network. The novelty of this proposal is that transient stability rules are extracted from an example sample set generated by the trained ELM-based transient stability assessment model by using IAM algorithm. The effectiveness of the proposed method is shown by the application results on the New England 39-bus power system and a practical power system--the southern power system of Hebei province.

  19. Rule Extraction Based on Extreme Learning Machine and an Improved Ant-Miner Algorithm for Transient Stability Assessment.

    Science.gov (United States)

    Li, Yang; Li, Guoqing; Wang, Zhenhao

    2015-01-01

    In order to overcome the problems of poor understandability of the pattern recognition-based transient stability assessment (PRTSA) methods, a new rule extraction method based on extreme learning machine (ELM) and an improved Ant-miner (IAM) algorithm is presented in this paper. First, the basic principles of ELM and Ant-miner algorithm are respectively introduced. Then, based on the selected optimal feature subset, an example sample set is generated by the trained ELM-based PRTSA model. And finally, a set of classification rules are obtained by IAM algorithm to replace the original ELM network. The novelty of this proposal is that transient stability rules are extracted from an example sample set generated by the trained ELM-based transient stability assessment model by using IAM algorithm. The effectiveness of the proposed method is shown by the application results on the New England 39-bus power system and a practical power system--the southern power system of Hebei province.

  20. An unsupervised text mining method for relation extraction from biomedical literature.

    Directory of Open Access Journals (Sweden)

    Changqin Quan

    Full Text Available The wealth of interaction information provided in biomedical articles motivated the implementation of text mining approaches to automatically extract biomedical relations. This paper presents an unsupervised method based on pattern clustering and sentence parsing to deal with biomedical relation extraction. The pattern clustering algorithm is based on the polynomial kernel method, which identifies interaction words from unlabeled data; these interaction words are then used in relation extraction between entity pairs. Dependency parsing and phrase structure parsing are combined for relation extraction. Based on the semi-supervised KNN algorithm, we extend the proposed unsupervised approach to a semi-supervised approach by combining pattern clustering, dependency parsing and phrase structure parsing rules. We evaluated the approaches on two different tasks: (1) protein-protein interaction extraction, and (2) gene-suicide association extraction. The evaluation of task (1) on the benchmark dataset (AImed corpus) showed that our proposed unsupervised approach outperformed three supervised methods, which are rule-based, SVM-based, and kernel-based, respectively. The proposed semi-supervised approach is superior to the existing semi-supervised methods. The evaluation of gene-suicide association extraction on a smaller dataset from the Genetic Association Database and a larger dataset from publicly available PubMed showed that the proposed unsupervised and semi-supervised methods achieved much higher F-scores than the co-occurrence-based method.

  1. Automating the Extraction of Metadata from Archaeological Data Using iRods Rules

    Directory of Open Access Journals (Sweden)

    David Walling

    2011-10-01

    Full Text Available The Texas Advanced Computing Center and the Institute for Classical Archaeology at the University of Texas at Austin developed a method that uses iRods rules and a Jython script to automate the extraction of metadata from digital archaeological data. The first step was to create a record-keeping system to classify the data. The record-keeping system employs file and directory hierarchy naming conventions designed specifically to maintain the relationship between the data objects and map the archaeological documentation process. The metadata implicit in the record-keeping system is automatically extracted upon ingest, combined with additional sources of metadata, and stored alongside the data in the iRods preservation environment. This method enables a more organized workflow for the researchers, helps them archive their data close to the moment of data creation, and avoids error prone manual metadata input. We describe the types of metadata extracted and provide technical details of the extraction process and storage of the data and metadata.
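
    A minimal sketch of extracting metadata from such a naming convention is shown below; the path pattern and field names are hypothetical, and in the described system this logic would live in iRods rules and a Jython script rather than in the standalone Python shown here.

      # Sketch of extracting metadata implicit in a record-keeping naming convention.
      # The pattern project/season/trench/filename is an invented example.
      import re

      PATH_PATTERN = re.compile(
          r"(?P<project>[^/]+)/(?P<season>\d{4})/(?P<trench>[A-Z]\d+)/(?P<filename>[^/]+)$"
      )

      def extract_metadata(path):
          match = PATH_PATTERN.search(path)
          if match is None:
              return {}           # path does not follow the convention
          return match.groupdict()

      print(extract_metadata("projectx/2009/B12/context_0451_photo.tif"))
      # {'project': 'projectx', 'season': '2009', 'trench': 'B12',
      #  'filename': 'context_0451_photo.tif'}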

  2. Summary of water body extraction methods based on ZY-3 satellite

    Science.gov (United States)

    Zhu, Yu; Sun, Li Jian; Zhang, Chuan Yin

    2017-12-01

    Extracting water bodies from remote sensing images is one of the main means of water information extraction. Affected by spectral characteristics, many methods cannot be applied to ZY-3 satellite images. To solve this problem, we summarize the extraction methods for ZY-3 and analyze the extraction results of the existing methods. According to the characteristics of the extraction results, the method of water index (WI) & single-band threshold and the method of texture filtering based on probability statistics are explored. In addition, the advantages and disadvantages of all methods are compared, which provides some reference for research on water extraction from images. The conclusions obtained are as follows. 1) NIR has higher water sensitivity; consequently, when the surface reflectance in the study area is less similar to water, the single-band threshold method or multi-band operations can obtain the ideal effect. 2) Compared with the water index and HIS optimal index methods, the rule-based object extraction method, which takes into account not only the spectral information of the water but also space and texture feature constraints, can obtain a better extraction effect, yet the image segmentation process is time consuming and the definition of the rules requires certain knowledge. 3) The combination of the spectral relationship and the water index can eliminate the interference of shadow to a certain extent. When small water bodies are scarce or not considered in further study, texture filtering based on probability statistics can effectively reduce the noise in the result and, to a certain extent, avoid mixing shadows or paddy fields with water.
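
    As a small illustration of the water index and single-band threshold idea, the sketch below computes McFeeters' NDWI from green and NIR bands and applies a threshold; the band values and the threshold of 0 are placeholders, not values tuned for ZY-3 scenes.

      # Minimal water-index thresholding sketch: NDWI = (Green - NIR) / (Green + NIR).
      import numpy as np

      def ndwi_water_mask(green, nir, threshold=0.0):
          green = green.astype(np.float64)
          nir = nir.astype(np.float64)
          ndwi = (green - nir) / (green + nir + 1e-9)   # avoid division by zero
          return ndwi > threshold                        # True where water is likely

      green = np.array([[120, 80], [60, 200]], dtype=np.uint16)
      nir   = np.array([[ 40, 90], [70,  50]], dtype=np.uint16)
      print(ndwi_water_mask(green, nir))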

  3. Critical Evaluation of Validation Rules Automated Extraction from Data

    Directory of Open Access Journals (Sweden)

    David Pejcoch

    2014-10-01

    Full Text Available The goal of this article is to critically evaluate the possibility of automatically extracting the kind of rules which could later be used within a Data Quality Management process for the validation of records newly incoming to an information system. For practical demonstration, the 4FT-Miner procedure implemented in the LISpMiner System was chosen. A motivation for this task is the potential simplification of projects focused on Data Quality Management. Initially, this article critically evaluates the possibility of fully automated extraction with the aim of identifying the strengths and weaknesses of this approach in comparison to its alternative, when at least some a priori knowledge is available. As a result of the practical implementation, this article provides the design of a recommended process which could be used as a guideline for future projects. The question of how to store and maintain the extracted rules and how to integrate them with existing tools supporting Data Quality Management is also discussed.

  4. Sensory Intelligence for Extraction of an Abstract Auditory Rule: A Cross-Linguistic Study.

    Science.gov (United States)

    Guo, Xiao-Tao; Wang, Xiao-Dong; Liang, Xiu-Yuan; Wang, Ming; Chen, Lin

    2018-02-21

    In a complex linguistic environment, while speech sounds can greatly vary, some shared features are often invariant. These invariant features constitute so-called abstract auditory rules. Our previous study has shown that with auditory sensory intelligence, the human brain can automatically extract the abstract auditory rules in the speech sound stream, presumably serving as the neural basis for speech comprehension. However, whether the sensory intelligence for extraction of abstract auditory rules in speech is inherent or experience-dependent remains unclear. To address this issue, we constructed a complex speech sound stream using auditory materials in Mandarin Chinese, in which syllables had a flat lexical tone but differed in other acoustic features to form an abstract auditory rule. This rule was occasionally and randomly violated by the syllables with the rising, dipping or falling tone. We found that both Chinese and foreign speakers detected the violations of the abstract auditory rule in the speech sound stream at a pre-attentive stage, as revealed by the whole-head recordings of mismatch negativity (MMN) in a passive paradigm. However, MMNs peaked earlier in Chinese speakers than in foreign speakers. Furthermore, Chinese speakers showed different MMN peak latencies for the three deviant types, which paralleled recognition points. These findings indicate that the sensory intelligence for extraction of abstract auditory rules in speech sounds is innate but shaped by language experience. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.

  5. Ant-based extraction of rules in simple decision systems over ontological graphs

    Directory of Open Access Journals (Sweden)

    Pancerz Krzysztof

    2015-06-01

    Full Text Available In the paper, the problem of extraction of complex decision rules in simple decision systems over ontological graphs is considered. The extracted rules are consistent with the dominance principle similar to that applied in the dominance-based rough set approach (DRSA). In our study, we propose to use a heuristic algorithm, utilizing the ant-based clustering approach, searching the semantic spaces of concepts presented by means of ontological graphs. Concepts included in the semantic spaces are values of attributes describing objects in simple decision systems.

  6. Different neurophysiological mechanisms underlying word and rule extraction from speech.

    Directory of Open Access Journals (Sweden)

    Ruth De Diego Balaguer

    Full Text Available The initial process of identifying words from spoken language and the detection of more subtle regularities underlying their structure are mandatory processes for language acquisition. Little is known about the cognitive mechanisms that allow us to extract these two types of information and their specific time-course of acquisition following initial contact with a new language. We report time-related electrophysiological changes that occurred while participants learned an artificial language. These changes strongly correlated with the discovery of the structural rules embedded in the words. These changes were clearly different from those related to word learning and occurred during the first minutes of exposition. There is a functional distinction in the nature of the electrophysiological signals during acquisition: an increase in negativity (N400) in the central electrodes is related to word learning and the development of a frontal positivity (P2) is related to rule learning. In addition, the results of an online implicit and a post-learning test indicate that, once the rules of the language have been acquired, new words following the rule are processed as words of the language. By contrast, new words violating the rule induce syntax-related electrophysiological responses when inserted online in the stream (an early frontal negativity followed by a late posterior positivity) and clear lexical effects when presented in isolation (N400 modulation). The present study provides direct evidence suggesting that the mechanisms to extract words and structural dependencies from continuous speech are functionally segregated. When these mechanisms are engaged, the electrophysiological marker associated with rule learning appears very quickly, during the earliest phases of exposition to a new language.

  7. Rule-based Approach on Extraction of Malay Compound Nouns in Standard Malay Document

    Science.gov (United States)

    Abu Bakar, Zamri; Kamal Ismail, Normaly; Rawi, Mohd Izani Mohamed

    2017-08-01

    A Malay compound noun is defined as a form of words that exists when two or more words are combined into a single syntax and gives a specific meaning. A compound noun acts as one unit and is spelled separately, unless an established compound noun is written as one closed form of two words. The basic characteristics of compound nouns can be seen in Malay sentences through the frequency of the words in the text itself. Thus, the extraction of compound nouns is significant for subsequent research such as text summarization, grammar checking, sentiment analysis, machine translation and word categorization. Many research efforts have been proposed for extracting Malay compound nouns using linguistic approaches. Most of the existing methods were applied to the extraction of bi-gram noun+noun compounds. However, the results still leave some problems to be solved to give a better outcome. This paper explores a linguistic method for extracting compound nouns from a standard Malay corpus. A standard dataset is used to provide a common platform for evaluating research on the recognition of compound nouns in Malay sentences. Therefore, an improvement in the effectiveness of compound noun extraction is needed because the result can otherwise be compromised. Thus, this study proposes a modification of the linguistic approach in order to enhance the extraction of compound nouns. Several pre-processing steps are involved, including normalization, tokenization and tagging. The first step that uses the linguistic approach in this study is Part-of-Speech (POS) tagging. Finally, we describe several rules and modify them to get the most relevant relation between the first word and the second word in order to assist in solving the problems. The effectiveness of the relations used in our study can be measured using recall, precision and F1-score techniques. The comparison with the baseline values is essential because it can show whether there has been an improvement
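
    A minimal sketch of the bi-gram noun+noun candidate step is given below, assuming POS-tagged input; the toy tagged sentences and the simple frequency ranking are illustrative, not the paper's rule set.

      # Sketch of rule-based bi-gram compound-noun candidates: after POS tagging,
      # keep adjacent noun+noun pairs and rank them by frequency.
      from collections import Counter

      tagged_sentences = [
          [("kapal", "NN"), ("terbang", "NN"), ("itu", "DT"), ("besar", "JJ")],
          [("air", "NN"), ("terjun", "NN"), ("dan", "CC"), ("kapal", "NN"), ("terbang", "NN")],
      ]

      def noun_noun_candidates(sentences):
          counts = Counter()
          for sent in sentences:
              for (w1, t1), (w2, t2) in zip(sent, sent[1:]):
                  if t1 == "NN" and t2 == "NN":
                      counts[(w1, w2)] += 1
          return counts

      print(noun_noun_candidates(tagged_sentences).most_common())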

  8. Extracting Cross-Ontology Weighted Association Rules from Gene Ontology Annotations.

    Science.gov (United States)

    Agapito, Giuseppe; Milano, Marianna; Guzzi, Pietro Hiram; Cannataro, Mario

    2016-01-01

    Gene Ontology (GO) is a structured repository of concepts (GO Terms) that are associated with one or more gene products through a process referred to as annotation. The analysis of annotated data is an important opportunity for bioinformatics. There are different approaches to analysis; among those, the use of association rules (AR) provides useful knowledge, discovering biologically relevant associations between terms of GO not previously known. In a previous work, we introduced GO-WAR (Gene Ontology-based Weighted Association Rules), a methodology for extracting weighted association rules from ontology-based annotated datasets. We here adapt the GO-WAR algorithm to mine cross-ontology association rules, i.e., rules that involve GO terms present in the three sub-ontologies of GO. We conduct a deep performance evaluation of GO-WAR by mining publicly available GO annotated datasets, showing how GO-WAR outperforms current state-of-the-art approaches.
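
    The sketch below shows the general idea of weighting association rules by term weights such as information content; the weighting and support formulas are a generic illustration, not the exact GO-WAR definitions.

      # Generic weighted association-rule sketch over GO-style annotations.
      from itertools import combinations

      annotations = [          # each set = GO terms annotating one gene product
          {"GO:A", "GO:B", "GO:C"},
          {"GO:A", "GO:B"},
          {"GO:B", "GO:C"},
      ]
      weights = {"GO:A": 0.9, "GO:B": 0.4, "GO:C": 0.7}   # e.g. information content

      def weighted_support(itemset, transactions):
          # average term weight times plain relative frequency (illustrative only)
          w = sum(weights[t] for t in itemset) / len(itemset)
          freq = sum(itemset <= tx for tx in transactions) / len(transactions)
          return w * freq

      for a, b in combinations(sorted(weights), 2):
          sup = weighted_support({a, b}, annotations)
          conf = sup / max(weighted_support({a}, annotations), 1e-9)
          print(f"{a} => {b}: weighted support={sup:.2f}, confidence={conf:.2f}")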

  9. Extraction of design rules from multi-objective design exploration (MODE) using rough set theory

    International Nuclear Information System (INIS)

    Obayashi, Shigeru

    2011-01-01

    Multi-objective design exploration (MODE) and its application for design rule extraction are presented. MODE reveals the structure of design space from the trade-off information. The self-organizing map (SOM) is incorporated into MODE as a visual data-mining tool for design space. SOM divides the design space into clusters with specific design features. The sufficient conditions for belonging to a cluster of interest are extracted using rough set theory. The resulting MODE was applied to the multidisciplinary wing design problem, which revealed a cluster of good designs, and we extracted the design rules of such designs successfully.

  10. The research on business rules classification and specification methods

    OpenAIRE

    Baltrušaitis, Egidijus

    2005-01-01

    The work is based on research into business rules classification and specification methods. The basics of the business rules approach are discussed. The most common business rules classification and modeling methods are analyzed. Business rules modeling techniques and tools for supporting them in information systems are presented. Based on the analysis results, a business rules classification method is proposed. Templates for every business rule type are presented. Business rules structuring ...

  11. SAR Data Fusion Imaging Method Oriented to Target Feature Extraction

    Directory of Open Access Journals (Sweden)

    Yang Wei

    2015-02-01

    Full Text Available To deal with the difficulty of precisely extracting target outlines due to the neglect of target scattering characteristic variation during the processing of high-resolution space-borne SAR data, a novel fusion imaging method oriented to target feature extraction is proposed. Firstly, several important aspects that affect target feature extraction and SAR image quality are analyzed, including the curved orbit, the stop-and-go approximation, atmospheric delay, and high-order residual phase error, and the corresponding compensation methods are addressed as well. Based on this analysis, the mathematical model of the SAR echo combined with the target space-time spectrum is established to explain the space-time-frequency change rule of the target scattering characteristic. Moreover, a fusion imaging strategy and method under high-resolution and ultra-large observation angle range conditions are put forward to improve SAR quality by fusion processing in the range-Doppler and image domains. Finally, simulations based on typical military targets are used to verify the effectiveness of the fusion imaging method.

  12. Max-out-in pivot rule with Dantzig's safeguarding rule for the simplex method

    International Nuclear Information System (INIS)

    Tipawanna, Monsicha; Sinapiromsaran, Krung

    2014-01-01

    The simplex method is used to solve linear programming problems by improving the current basic feasible solution. It uses a pivot rule to guide the search in the feasible region; the pivot rule is used to select an entering index in the simplex method. Nowadays, many pivot rules have been presented, but no pivot rule shows performance superior to the others. Therefore, this is still an active research area in linear programming. In this research, we present the max-out-in pivot rule with Dantzig's safeguarding for the simplex method. This rule is based on the maximum improvement of the objective value at the current basic feasible point, similar to Dantzig's rule. We illustrate with the Klee and Minty problems that our rule outperforms Dantzig's rule in the number of iterations for solving linear programming problems
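
    For orientation, the sketch below contrasts Dantzig's entering-variable choice (most negative reduced cost) with a greatest-improvement choice that evaluates the ratio test per candidate column; it illustrates pivot selection only and is not the max-out-in rule of the paper.

      # Pivot-selection sketch for a minimization tableau in standard form.
      import numpy as np

      def dantzig_entering(reduced_costs):
          j = int(np.argmin(reduced_costs))
          return j if reduced_costs[j] < 0 else None      # None -> already optimal

      def greatest_improvement_entering(reduced_costs, A, b):
          best_j, best_gain = None, 0.0
          for j, c_j in enumerate(reduced_costs):
              if c_j >= 0:
                  continue
              pos = A[:, j] > 1e-12
              if not np.any(pos):
                  continue                      # unbounded direction, skipped here
              step = np.min(b[pos] / A[pos, j]) # ratio test
              gain = -c_j * step                # objective improvement for this column
              if gain > best_gain:
                  best_j, best_gain = j, gain
          return best_j

      reduced_costs = np.array([-2.0, -3.0, 0.5])
      A = np.array([[1.0, 4.0, 1.0], [2.0, 1.0, 0.0]])
      b = np.array([8.0, 6.0])
      print(dantzig_entering(reduced_costs))                 # most negative reduced cost
      print(greatest_improvement_entering(reduced_costs, A, b))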

  13. Process identification through modular neural networks and rule extraction (extended abstract)

    NARCIS (Netherlands)

    van der Zwaag, B.J.; Slump, Cornelis H.; Spaanenburg, L.; Blockeel, Hendrik; Denecker, Marc

    2002-01-01

    Monolithic neural networks may be trained from measured data to establish knowledge about the process. Unfortunately, this knowledge is not guaranteed to be found and – if at all – hard to extract. Modular neural networks are better suited for this purpose. Domain-ordered by topology, rule

  14. FUZZY MODELING BY SUCCESSIVE ESTIMATION OF RULES ...

    African Journals Online (AJOL)

    This paper presents an algorithm for automatically deriving fuzzy rules directly from a set of input-output data of a process for the purpose of modeling. The rules are extracted by a method termed successive estimation. This method is used to generate a model without truncating the number of fired rules, to within user ...

  15. Hydrogen extraction from liquid lithium-lead alloy by gas-liquid contact method

    International Nuclear Information System (INIS)

    Xie Bo; Weng Kuiping; Hou Jianping; Yang Guangling; Zeng Jun

    2013-01-01

    A hydrogen extraction experiment from liquid lithium-lead alloy by the gas-liquid contact method has been carried out in our own liquid lithium-lead bubbler (LLLB). The experimental results show that He is more suitable than Ar as the carrier gas in the filler tower. The higher the temperature of the tower, the greater the hydrogen content at the tower outlet. The influence of the carrier gas flow rate on the hydrogen content at the outlet is jagged, with no obvious rule. Despite the differences between the experimental results and literature data, hydrogen isotope extraction from liquid lithium-lead by the gas-liquid contact method is feasible, and the extraction efficiency increases with the residence time of the alloy in the tower. (authors)

  16. Connecting clinical and actuarial prediction with rule-based methods.

    Science.gov (United States)

    Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H

    2015-06-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main effect models usually employed in prediction studies, from a data and decision analytic as well as a practical perspective. In addition, decision rules derived with rule-based methods can be represented as fast and frugal trees, which, unlike main effects models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. Predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset. In addition, the 2-rule model required, on average, evaluation of only 3 cues. Therefore, the RuleFit algorithm appears to be a promising method for creating decision tools that are less time consuming and easier to apply in psychological practice, and with accuracy comparable to traditional actuarial methods. (c) 2015 APA, all rights reserved).

  17. A Comparison Study on Rule Extraction from Neural Network Ensembles, Boosted Shallow Trees, and SVMs

    OpenAIRE

    Bologna, Guido; Hayashi, Yoichi

    2018-01-01

    One way to make the knowledge stored in an artificial neural network more intelligible is to extract symbolic rules. However, producing rules from Multilayer Perceptrons (MLPs) is an NP-hard problem. Many techniques have been introduced to generate rules from single neural networks, but very few were proposed for ensembles. Moreover, experiments were rarely assessed by 10-fold cross-validation trials. In this work, based on the Discretized Interpretable Multilayer Perceptron (DIMLP), experime...

  18. Sum rules in the response function method

    International Nuclear Information System (INIS)

    Takayanagi, Kazuo

    1990-01-01

    Sum rules in the response function method are studied in detail. A sum rule can be obtained theoretically by integrating the imaginary part of the response function over the excitation energy with a corresponding energy weight. Generally, the response function is calculated perturbatively in terms of the residual interaction, and the expansion can be described by diagrammatic methods. In this paper, we present a classification of the diagrams so as to clarify which diagram has what contribution to which sum rule. This will allow us to get insight into the contributions to the sum rules of all the processes expressed by Goldstone diagrams. (orig.)

  19. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    Science.gov (United States)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.

  20. Extraction of spatial-temporal rules from mesoscale eddies in the South China Sea Based on rough set theory

    Science.gov (United States)

    Du, Y.; Fan, X.; He, Z.; Su, F.; Zhou, C.; Mao, H.; Wang, D.

    2011-06-01

    In this paper, rough set theory is introduced to represent spatial-temporal relationships and extract the corresponding rules from typical mesoscale-eddy states in the South China Sea (SCS). Three decision attributes are adopted in this study, which make the approach flexible in retrieving spatial-temporal rules with different features. Spatial-temporal rules of typical states in the SCS are extracted for the three decision attributes and are then confirmed by previous works. The results demonstrate that this approach is effective in extracting spatial-temporal rules from typical mesoscale-eddy states, and therefore provides a powerful approach to forecasts in the future. Spatial-temporal rules in the SCS indicate that warm eddies following the rules are generally in the southeastern and central SCS around the 2000 m isobaths in winter. Their intensity and vorticity are weaker than those of cold eddies, and they usually move a shorter distance. By contrast, cold eddies are in regions deeper than 2000 m in the southwestern and northeastern SCS in spring and fall. Their intensity and vorticity are strong, and they usually move a long distance. In winter, a few rules are followed by cold eddies in the northern tip of the basin and southwest of Taiwan Island rather than warm eddies, indicating cold eddies may be well-regulated in the region. Several warm-eddy rules are obtained west of Luzon Island, indicating warm eddies may be well-regulated in the region as well. Otherwise, warm and cold eddies are distributed not only in the jet flow off southern Vietnam induced by intraseasonal wind stress in summer-fall, but also in the northern shallow water, which should be a focus of future study.

  1. Automatic Extraction of Urban Built-Up Area Based on Object-Oriented Method and Remote Sensing Data

    Science.gov (United States)

    Li, L.; Zhou, H.; Wen, Q.; Chen, T.; Guan, F.; Ren, B.; Yu, H.; Wang, Z.

    2018-04-01

    Built-up area marks the use of city construction land in different periods of development, and its accurate extraction is key to studies of the changes of urban expansion. This paper studies the technology of automatic extraction of urban built-up areas based on an object-oriented method and remote sensing data, and realizes the automatic extraction of the main built-up area of the city, which greatly saves manpower cost. First, construction land is extracted based on the object-oriented method; the main technical steps include: (1) multi-resolution segmentation; (2) feature construction and selection; (3) information extraction of construction land based on a rule set. The characteristic parameters used in the rule set mainly include the mean of the red band (Mean R), the Normalized Difference Vegetation Index (NDVI), the Ratio of Residential Index (RRI) and the blue band mean (Mean B); through the combination of the above characteristic parameters, the construction land information can be extracted. Based on the degree of adaptability, distance and area of the object domain, the urban built-up area can then be quickly and accurately delineated from the construction land information without depending on other data or expert knowledge, achieving the automatic extraction of the urban built-up area. Beijing city is taken as the experimental area for the technical methods, and the results show that the built-up area is extracted automatically with a boundary accuracy of 2359.65 m, which meets the requirements. The automatic extraction of urban built-up areas has strong practicality and can be applied to the monitoring of changes in the main built-up area of a city.
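
    A minimal sketch of such a rule set over per-segment features is shown below; all thresholds are invented placeholders, since the paper's tuned values are not reproduced here.

      # Rule set sketch over per-segment features (Mean R, NDVI, RRI, Mean B).
      def is_construction_land(segment):
          return (
              segment["ndvi"] < 0.2            # little vegetation
              and segment["mean_r"] > 90       # bright in the red band
              and segment["rri"] > 1.0         # residential-like ratio index
              and segment["mean_b"] > 80       # bright in the blue band
          )

      segments = [
          {"id": 1, "mean_r": 120, "ndvi": 0.08, "rri": 1.3, "mean_b": 110},
          {"id": 2, "mean_r":  60, "ndvi": 0.55, "rri": 0.7, "mean_b":  40},
      ]
      built_up_candidates = [s["id"] for s in segments if is_construction_land(s)]
      print(built_up_candidates)   # -> [1]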

  2. A rule-based automatic sleep staging method.

    Science.gov (United States)

    Liang, Sheng-Fu; Kuo, Chin-En; Hu, Yu-Han; Cheng, Yu-Shian

    2012-03-30

    In this paper, a rule-based automatic sleep staging method was proposed. Twelve features including temporal and spectral analyses of the EEG, EOG, and EMG signals were utilized. Normalization was applied to each feature to eliminate individual differences. A hierarchical decision tree with fourteen rules was constructed for sleep stage classification. Finally, a smoothing process considering the temporal contextual information was applied for continuity. The overall agreement and kappa coefficient of the proposed method applied to the all-night polysomnography (PSG) of seventeen healthy subjects, compared with the manual scorings by R&K rules, can reach 86.68% and 0.79, respectively. This method can be integrated with a portable PSG system for at-home sleep evaluation in the near future. Copyright © 2012 Elsevier B.V. All rights reserved.
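
    The fragment below illustrates the flavour of a hierarchical rule-based stager on normalized features; the two rules and thresholds are invented examples, not the fourteen rules of the proposed method.

      # Toy fragment of a hierarchical rule-based sleep stager on normalized features.
      def stage_epoch(f):
          if f["emg_power"] < 0.2 and f["eog_movement"] > 0.6:
              return "REM"                      # low muscle tone + rapid eye movements
          if f["delta_ratio"] > 0.5:
              return "SWS"                      # dominant slow-wave (delta) activity
          return "LIGHT"                        # fallback for this toy example

      epoch_features = {"emg_power": 0.1, "eog_movement": 0.8, "delta_ratio": 0.2}
      print(stage_epoch(epoch_features))        # -> "REM"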

  3. A study on method to identify actual causes and conditions of safety rule deviations through analyzing events due to unsafe acts of workers

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Takeda, Daisuke

    2010-01-01

    The purpose of this study is to establish a method to understand the actual causes and conditions of intentional deviation from safety rules (including norms and written rules that have been developed to anticipate, prevent, detect and recover from human errors) in an organization by analyzing events due to unsafe acts of workers (human factor events), and to propose effective measures. Firstly, by reviewing the literature regarding safety violations, the following two advantages of investigating the actual conditions of safety rule deviation through human factor event analysis were identified: (a) being able to clarify relationships between deviations, human errors, and events, and (b) being able to identify specific causal factors that influenced the decision to deviate, including acts of the people concerned, problems with rules, task demands, environment and management. Next, through the analysis of human factor event data in accordance with an existing human error analysis method on the basis of the advantages above, the following three requirements for analyzing event data were extracted: (a) gathering information such as the rules concerning the work activities related to the human factor events, and whether there are intentional deviations from the rules, (b) gathering information on and identifying interrelations among the causal factors of the intentional deviations, and (c) gathering information on the general conditions of the deviations and the causal factors. (author)

  4. Orthogonal search-based rule extraction for modelling the decision to transfuse.

    Science.gov (United States)

    Etchells, T A; Harrison, M J

    2006-04-01

    Data from an audit relating to transfusion decisions during intermediate or major surgery were analysed to determine the strengths of certain factors in the decision making process. The analysis, using orthogonal search-based rule extraction (OSRE) from a trained neural network, demonstrated that the risk of tissue hypoxia (ROTH) assessed using a 100-mm visual analogue scale, the haemoglobin value (Hb) and the presence or absence of on-going haemorrhage (OGH) were able to reproduce the transfusion decisions with a joint specificity of 0.96 and sensitivity of 0.93 and a positive predictive value of 0.9. The rules indicating transfusion were: 1. ROTH > 32 mm and Hb 13 mm and Hb 38 mm, Hb < 102 g x l(-1) and OGH; 4. Hb < 78 g x l(-1).

  5. Extraction Methods, Variability Encountered in

    NARCIS (Netherlands)

    Bodelier, P.L.E.; Nelson, K.E.

    2014-01-01

    Synonyms Bias in DNA extractions methods; Variation in DNA extraction methods Definition The variability in extraction methods is defined as differences in quality and quantity of DNA observed using various extraction protocols, leading to differences in outcome of microbial community composition

  6. Comparison of mentha extracts obtained by different extraction methods

    Directory of Open Access Journals (Sweden)

    Milić Slavica

    2006-01-01

    Full Text Available The different methods of mentha extraction, such as steam distillation, extraction by methylene chloride (Soxhlet extraction) and supercritical fluid extraction (SFE) by carbon dioxide (CO2), were investigated. SFE by CO2 was performed at a pressure of 100 bar and a temperature of 40°C. The extraction yield, as well as the qualitative and quantitative composition of the obtained extracts, determined by the GC-MS method, were compared.

  7. Rules of thumb and simplified methods

    International Nuclear Information System (INIS)

    Lahti, G.P.

    1985-01-01

    The author points out the value of a thorough grounding in fundamental physics combined with experience of applied practice when using simplified methods and rules of thumb in shield engineering. Present-day quality assurance procedures and good engineering practices require careful documentation of all calculations. The aforementioned knowledge of rules of thumb and back-of-the-envelope calculations can assure both the preparer and the reviewer that the results in the quality assurance documentation are the physically correct ones

  8. Horizontal and Vertical Rule Bases Method in Fuzzy Controllers

    OpenAIRE

    Aminifar, Sadegh; bin Marzuki, Arjuna

    2013-01-01

    Concept of horizontal and vertical rule bases is introduced. Using this method enables the designers to look for main behaviors of system and describes them with greater approximations. The rules which describe the system in first stage are called horizontal rule base. In the second stage, the designer modulates the obtained surface by describing needed changes on first surface for handling real behaviors of system. The rules used in the second stage are called vertical rule base. Horizontal...

  9. A Bayesian analysis of QCD sum rules

    International Nuclear Information System (INIS)

    Gubler, Philipp; Oka, Makoto

    2011-01-01

    A new technique has recently been developed, in which the Maximum Entropy Method is used to analyze QCD sum rules. This approach has the virtue of being able to directly generate the spectral function of a given operator, without the need of making an assumption about its specific functional form. To investigate whether useful results can be extracted within this method, we have first studied the vector meson channel, where QCD sum rules are traditionally known to provide a valid description of the spectral function. Our results show a significant peak in the region of the experimentally observed ρ-meson mass, which is in agreement with earlier QCD sum rules studies and suggests that the Maximum Entropy Method is a strong tool for analyzing QCD sum rules.

  10. A Comparison Study on Rule Extraction from Neural Network Ensembles, Boosted Shallow Trees, and SVMs

    Directory of Open Access Journals (Sweden)

    Guido Bologna

    2018-01-01

    Full Text Available One way to make the knowledge stored in an artificial neural network more intelligible is to extract symbolic rules. However, producing rules from Multilayer Perceptrons (MLPs) is an NP-hard problem. Many techniques have been introduced to generate rules from single neural networks, but very few were proposed for ensembles. Moreover, experiments were rarely assessed by 10-fold cross-validation trials. In this work, based on the Discretized Interpretable Multilayer Perceptron (DIMLP), experiments were performed on 10 repetitions of stratified 10-fold cross-validation trials over 25 binary classification problems. The DIMLP architecture allowed us to produce rules from DIMLP ensembles, boosted shallow trees (BSTs), and Support Vector Machines (SVMs). The complexity of rulesets was measured by the average number of generated rules and the average number of antecedents per rule. Of the 25 classification problems used, the most complex rulesets were generated from BSTs trained by “gentle boosting” and “real boosting.” Moreover, we clearly observed that the less complex the rules were, the better their fidelity was. In fact, rules generated from decision stumps trained by modest boosting were, for almost all the 25 datasets, the simplest with the highest fidelity. Finally, in terms of average predictive accuracy and average ruleset complexity, the comparison of some of our results to those reported in the literature proved to be competitive.

  11. Interesting association rule mining with consistent and inconsistent rule detection from big sales data in distributed environment

    Directory of Open Access Journals (Sweden)

    Dinesh J. Prajapati

    2017-06-01

    Full Text Available Nowadays, there is an increasing demand for mining interesting patterns from big data. The process of analyzing such a huge amount of data is a computationally complex task when traditional methods are used. The purpose of this paper is twofold. First, it presents a novel approach to identify consistent and inconsistent association rules from sales data located in a distributed environment. Second, the paper overcomes the main-memory bottleneck and computing-time overhead of a single computing system by distributing the computations over a multi-node cluster. The proposed method initially extracts frequent itemsets for each zone using existing distributed frequent pattern mining algorithms. The paper also compares the time efficiency of a MapReduce-based frequent pattern mining algorithm with the Count Distribution Algorithm (CDA) and the Fast Distributed Mining (FDM) algorithm. The set of association rules generated from the frequent itemsets is so large that it becomes complex to analyze. Thus, a MapReduce-based consistent and inconsistent rule detection (MR-CIRD) algorithm is proposed to detect the consistent and inconsistent rules from big data and provide useful and actionable knowledge to domain experts. These pruned interesting rules also give useful knowledge for better marketing strategies. The extracted consistent and inconsistent rules are evaluated and compared based on different interestingness measures, presented together with experimental results that lead to the final conclusions.
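
    The record does not spell out MR-CIRD's exact criteria, so the sketch below only illustrates one plausible reading of "consistent" versus "inconsistent" rules in a zoned setting: a rule is consistent if it is mined in every zone and inconsistent if it appears in only some zones. The function name, rule encoding and toy zones are invented for the example.

    ```python
    # Hypothetical sketch (not the MR-CIRD algorithm itself): partition mined rules
    # by how many geographic zones they appear in.
    from collections import defaultdict

    def split_consistent_inconsistent(zone_rules, min_zones=None):
        """zone_rules: dict mapping zone name -> set of mined rules,
        where a rule is a hashable (antecedent, consequent) pair."""
        if min_zones is None:
            min_zones = len(zone_rules)          # require presence in all zones
        counts = defaultdict(int)
        for rules in zone_rules.values():
            for rule in rules:
                counts[rule] += 1
        consistent = {r for r, c in counts.items() if c >= min_zones}
        inconsistent = {r for r, c in counts.items() if 0 < c < min_zones}
        return consistent, inconsistent

    zones = {
        "zone_A": {(("milk",), ("bread",)), (("beer",), ("chips",))},
        "zone_B": {(("milk",), ("bread",))},
    }
    cons, incons = split_consistent_inconsistent(zones)
    print(cons)    # rules mined in every zone
    print(incons)  # rules mined in only some zones
    ```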

  12. Systematic generation of rules for nuclear power plant diagnostics

    International Nuclear Information System (INIS)

    Reifman, J.; Lee, J.C.

    1988-01-01

    The knowledge base of an expert system is generally represented by a set of heuristic rules derived from the expert's own experience and judgmental knowledge. These heuristic or production rules are cast as if (condition), then (consequence) statements and represent, for nuclear power plant diagnostic systems, information connecting symptoms to failures. In this paper, the authors apply an entropy minimax pattern recognition algorithm to automate the process of extracting and encoding knowledge into a set of rules. Knowledge is extracted by recognizing patterns in plant parameters or symptoms associated with failures or transient events, and is encoded by casting the discovered patterns as production rules. The paper discusses how the proposed method can systematically generate rules that characterize failures of pressurizer components, based on transient events analyzed with a pressurized water reactor simulator program

  13. A GIS-based multi-criteria seismic vulnerability assessment using the integration of granular computing rule extraction and artificial neural networks

    NARCIS (Netherlands)

    Sheikhian, Hossein; Delavar, Mahmoud Reza; Stein, Alfred

    2017-01-01

    This study proposes multi‐criteria group decision‐making to address seismic physical vulnerability assessment. Granular computing rule extraction is combined with a feed forward artificial neural network to form a classifier capable of training a neural network on the basis of the rules provided by

  14. A comprehensive benchmark of kernel methods to extract protein-protein interactions from literature.

    Directory of Open Access Journals (Sweden)

    Domonkos Tikk

    Full Text Available The most important way of conveying new findings in biomedical research is scientific publication. Extraction of protein-protein interactions (PPIs reported in scientific publications is one of the core topics of text mining in the life sciences. Recently, a new class of such methods has been proposed - convolution kernels that identify PPIs using deep parses of sentences. However, comparing published results of different PPI extraction methods is impossible due to the use of different evaluation corpora, different evaluation metrics, different tuning procedures, etc. In this paper, we study whether the reported performance metrics are robust across different corpora and learning settings and whether the use of deep parsing actually leads to an increase in extraction quality. Our ultimate goal is to identify the one method that performs best in real-life scenarios, where information extraction is performed on unseen text and not on specifically prepared evaluation data. We performed a comprehensive benchmarking of nine different methods for PPI extraction that use convolution kernels on rich linguistic information. Methods were evaluated on five different public corpora using cross-validation, cross-learning, and cross-corpus evaluation. Our study confirms that kernels using dependency trees generally outperform kernels based on syntax trees. However, our study also shows that only the best kernel methods can compete with a simple rule-based approach when the evaluation prevents information leakage between training and test corpora. Our results further reveal that the F-score of many approaches drops significantly if no corpus-specific parameter optimization is applied and that methods reaching a good AUC score often perform much worse in terms of F-score. We conclude that for most kernels no sensible estimation of PPI extraction performance on new text is possible, given the current heterogeneity in evaluation data. Nevertheless, our study

  15. Studying Operation Rules of Cascade Reservoirs Based on Multi-Dimensional Dynamics Programming

    Directory of Open Access Journals (Sweden)

    Zhiqiang Jiang

    2017-12-01

    Full Text Available Although many optimization models and methods are currently applied to the optimization of reservoir operation, the optimal operation decisions made through these models and methods amount to a retrospective review. Because of the limited accuracy of hydrological prediction, it is practical and feasible to obtain a suboptimal or satisfactory solution from established operation rules in actual reservoir operation, especially for mid- and long-term operation. In order to obtain optimized sample data with global optimality and to make the extracted operation rules more reasonable and reliable, this paper presents a multi-dimensional dynamic programming model for the optimal joint operation of cascade reservoirs and provides the corresponding recursive equation and the specific solving steps. Taking the Li Xianjiang cascade reservoirs as a case study, seven uncertain problems over the whole operation period of the cascade reservoirs are summarized after a detailed analysis of the obtained optimal sample data, and two sub-models are put forward to solve these uncertain problems. Finally, by dividing the whole operation period into four characteristic sections, this paper extracts the operation rules of each reservoir for each section. A comparison of the simulation results of the extracted operation rules with the conventional joint operation method indicates that power generation under the obtained rules shows a certain degree of improvement both in inspection years and in typical years (i.e., wet, normal and dry years). The rationality and effectiveness of the extracted operation rules are thus verified by the comparative analysis.

  16. QCD sum rules in a Bayesian approach

    International Nuclear Information System (INIS)

    Gubler, Philipp; Oka, Makoto

    2011-01-01

    A novel technique is developed, in which the Maximum Entropy Method is used to analyze QCD sum rules. The main advantage of this approach lies in its ability to directly generate the spectral function of a given operator. This is done without the need to make an assumption about the specific functional form of the spectral function, such as the 'pole + continuum' ansatz that is frequently used in QCD sum rule studies. Therefore, with this method it should in principle be possible to distinguish narrow pole structures from continuum states. To check whether meaningful results can be extracted within this approach, we have first investigated the vector meson channel, where QCD sum rules are traditionally known to provide a valid description of the spectral function. Our results exhibit a significant peak in the region of the experimentally observed ρ-meson mass, which agrees with earlier QCD sum rule studies and shows that the Maximum Entropy Method is a useful tool for analyzing QCD sum rules.

  17. Sanitizing sensitive association rules using fuzzy correlation scheme

    International Nuclear Information System (INIS)

    Hameed, S.; Shahzad, F.; Asghar, S.

    2013-01-01

    Data mining is used to extract useful information hidden in data. Sometimes this extraction of information leads to revealing sensitive information. Privacy preservation in data mining is a process of sanitizing sensitive information. This research focuses on sanitizing sensitive rules discovered in quantitative data. The proposed scheme, Privacy Preserving in Fuzzy Association Rules (PPFAR), is based on fuzzy correlation analysis. In this work, the fuzzy set concept is integrated with fuzzy correlation analysis and the Apriori algorithm to mark interesting fuzzy association rules. The identified rules are called sensitive. For sanitization, we use a modification technique in which we substitute zero for the maximum value of the fuzzy items, which occurs most frequently. Experiments demonstrate that the PPFAR method hides sensitive rules with minimum modifications. The technique also maintains the quality of the modified data. The PPFAR scheme has applications in various domains, e.g. temperature control, medical analysis, travel time prediction, genetic behavior prediction, etc. We have validated the results on a medical dataset. (author)

  18. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three criteria: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problem of same object, different spectra and that of same spectrum, different objects. With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing

  19. Defining collaborative business rules management solutions : framework and method

    NARCIS (Netherlands)

    dr. Martijn Zoet; Johan Versendaal

    2014-01-01

    From the publishers' website: The goal of this research is to define a method for configuring a collaborative business rules management solution from a value proposition perspective. In an earlier published study (Business rules management solutions: added value by means of business

  20. 26 CFR 1.446-1 - General rule for methods of accounting.

    Science.gov (United States)

    2010-04-01

    .... Although a method of accounting may exist under this definition without the necessity of a pattern of... 26 Internal Revenue 6 2010-04-01 2010-04-01 false General rule for methods of accounting. 1.446-1... TAX (CONTINUED) INCOME TAXES Methods of Accounting § 1.446-1 General rule for methods of accounting...

  1. Method for automatic control rod operation using rule-based control

    International Nuclear Information System (INIS)

    Kinoshita, Mitsuo; Yamada, Naoyuki; Kiguchi, Takashi

    1988-01-01

    An automatic control rod operation method using rule-based control is proposed. Its features are: (1) a production system that recognizes plant events, determines control actions and realizes fast inference (fast selection of a suitable production rule); (2) use of the fuzzy control technique to determine quantitative control variables. The method's performance was evaluated by simulation tests on automatic control rod operation at a BWR plant start-up. The results were as follows: (1) the performance, which relates to the stabilization of controlled variables and the time required for reactor start-up, was superior to that of other methods such as PID control and program control; (2) the processing time to select and interpret a suitable production rule, which was the same as that required for event recognition or determination of a control action, was short enough (below 1 s) for real-time control. The results showed that the method is effective for automatic control rod operation. (author)

  2. A robust approach to extract biomedical events from literature.

    Science.gov (United States)

    Bui, Quoc-Chinh; Sloot, Peter M A

    2012-10-15

    The abundance of biomedical literature has attracted significant interest in novel methods to automatically extract biomedical relations from the literature. Until recently, most research was focused on extracting binary relations such as protein-protein interactions and drug-disease relations. However, these binary relations cannot fully represent the original biomedical data. Therefore, there is a need for methods that can extract fine-grained and complex relations known as biomedical events. In this article we propose a novel method to extract biomedical events from text. Our method consists of two phases. In the first phase, training data are mapped into structured representations. Based on that, templates are used to extract rules automatically. In the second phase, extraction methods are developed to process the obtained rules. When evaluated against the Genia event extraction abstract and full-text test datasets (Task 1), we obtain results with F-scores of 52.34 and 53.34, respectively, which are comparable to the state-of-the-art systems. Furthermore, our system achieves superior performance in terms of computational efficiency. Our source code is available for academic use at http://dl.dropbox.com/u/10256952/BioEvent.zip.

  3. The Efficiency of Random Forest Method for Shoreline Extraction from LANDSAT-8 and GOKTURK-2 Imageries

    Science.gov (United States)

    Bayram, B.; Erdem, F.; Akpinar, B.; Ince, A. K.; Bozkurt, S.; Catal Reis, H.; Seker, D. Z.

    2017-11-01

    Coastal monitoring plays a vital role in environmental planning and hazard management related issues. Since shorelines are fundamental data for environment management, disaster management, coastal erosion studies, modelling of sediment transport and coastal morphodynamics, various techniques have been developed to extract shorelines. Random Forest is one of these techniques and is used in this study for shoreline extraction. This algorithm is a machine learning method based on decision trees: decision trees analyse classes of training data and create rules for classification. In this study, the Terkos region was chosen for the proposed method within the scope of the TUBITAK Project (Project No: 115Y718) titled "Integration of Unmanned Aerial Vehicles for Sustainable Coastal Zone Monitoring Model - Three-Dimensional Automatic Coastline Extraction and Analysis: Istanbul-Terkos Example". The Random Forest algorithm was implemented to extract the shoreline of the Black Sea near the lake from LANDSAT-8 and GOKTURK-2 satellite imageries taken in 2015. The MATLAB environment was used for classification. To obtain land and water-body classes, the Random Forest method was applied to the NIR bands of the LANDSAT-8 (5th band) and GOKTURK-2 (4th band) imageries. Each image was digitized manually and shorelines were obtained for accuracy assessment. According to the accuracy assessment results, the Random Forest method is efficient for shoreline extraction from both medium and high resolution images.
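
    As a rough illustration of the classification step described above, the sketch below trains a random forest on labelled NIR pixel values to separate land from water and then maps a small band; the array names, values and toy data are invented and the study itself used MATLAB rather than Python.

    ```python
    # Minimal sketch, not the study's MATLAB implementation: classify NIR pixel
    # values into land / water with a random forest, then map a whole band.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Toy training data: NIR reflectance samples (values are invented).
    nir_samples = np.array([[0.05], [0.08], [0.10], [0.42], [0.51], [0.60]])
    labels = np.array([0, 0, 0, 1, 1, 1])  # 0 = water, 1 = land

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(nir_samples, labels)

    # Classify a small synthetic NIR "band"; real use would reshape a full image.
    nir_band = np.array([[0.07, 0.09, 0.55],
                         [0.06, 0.48, 0.58]])
    classes = clf.predict(nir_band.reshape(-1, 1)).reshape(nir_band.shape)
    print(classes)  # 0/1 land-water mask; the shoreline is the class boundary
    ```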

  4. THE EFFICIENCY OF RANDOM FOREST METHOD FOR SHORELINE EXTRACTION FROM LANDSAT-8 AND GOKTURK-2 IMAGERIES

    Directory of Open Access Journals (Sweden)

    B. Bayram

    2017-11-01

    Full Text Available Coastal monitoring plays a vital role in environmental planning and hazard management related issues. Since shorelines are fundamental data for environment management, disaster management, coastal erosion studies, modelling of sediment transport and coastal morphodynamics, various techniques have been developed to extract shorelines. Random Forest is one of these techniques and is used in this study for shoreline extraction. This algorithm is a machine learning method based on decision trees: decision trees analyse classes of training data and create rules for classification. In this study, the Terkos region was chosen for the proposed method within the scope of the TUBITAK Project (Project No: 115Y718) titled "Integration of Unmanned Aerial Vehicles for Sustainable Coastal Zone Monitoring Model – Three-Dimensional Automatic Coastline Extraction and Analysis: Istanbul-Terkos Example". The Random Forest algorithm was implemented to extract the shoreline of the Black Sea near the lake from LANDSAT-8 and GOKTURK-2 satellite imageries taken in 2015. The MATLAB environment was used for classification. To obtain land and water-body classes, the Random Forest method was applied to the NIR bands of the LANDSAT-8 (5th band) and GOKTURK-2 (4th band) imageries. Each image was digitized manually and shorelines were obtained for accuracy assessment. According to the accuracy assessment results, the Random Forest method is efficient for shoreline extraction from both medium and high resolution images.

  5. 48 CFR 6302.30 - Alternative dispute resolution methods (Rule 30).

    Science.gov (United States)

    2010-10-01

    ... TRANSPORTATION BOARD OF CONTRACT APPEALS RULES OF PROCEDURE 6302.30 Alternative dispute resolution methods (Rule... Alternative Dispute Resolution (ADR): Settlement Judges and Mini-Trials. These procedures are designed to... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Alternative dispute...

  6. Applications of rule-induction in the derivation of quantitative structure-activity relationships

    Science.gov (United States)

    A-Razzak, Mohammed; Glen, Robert C.

    1992-08-01

    Recently, methods have been developed in the field of Artificial Intelligence (AI), specifically in the expert systems area using rule-induction, designed to extract rules from data. We have applied these methods to the analysis of molecular series with the objective of generating rules which are predictive and reliable. The input to rule-induction consists of a number of examples with known outcomes (a training set) and the output is a tree-structured series of rules. Unlike most other analysis methods, the results of the analysis are in the form of simple statements which can be easily interpreted. These are readily applied to new data giving both a classification and a probability of correctness. Rule-induction has been applied to in-house generated and published QSAR datasets and the methodology, application and results of these analyses are discussed. The results imply that in some cases it would be advantageous to use rule-induction as a complementary technique in addition to conventional statistical and pattern-recognition methods.
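
    As a generic illustration of tree-structured rule induction of the kind described above (not the specific expert-system tool used by the authors), the sketch below induces a small decision tree from labelled examples and prints it as readable if-then rules; the toy descriptor names and data are invented.

    ```python
    # Minimal sketch of rule induction via a decision tree (illustrative only).
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Invented training set: two molecular descriptors and an activity class.
    X = [[0.2, 1.1], [0.4, 0.9], [0.8, 2.3], [0.9, 2.5], [0.3, 2.0], [0.7, 1.0]]
    y = ["inactive", "inactive", "active", "active", "inactive", "active"]

    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

    # The induced tree printed as a series of simple, human-readable rules.
    print(export_text(tree, feature_names=["logP", "polar_surface_area"]))

    # New cases get both a class and a class probability ("probability of correctness").
    print(tree.predict([[0.85, 2.4]]), tree.predict_proba([[0.85, 2.4]]))
    ```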

  7. Using association rule mining to identify risk factors for early childhood caries.

    Science.gov (United States)

    Ivančević, Vladimir; Tušek, Ivan; Tušek, Jasmina; Knežević, Marko; Elheshk, Salaheddin; Luković, Ivan

    2015-11-01

    Early childhood caries (ECC) is a potentially severe disease affecting children all over the world. The available findings are mostly based on logistic regression models, but data mining, in particular association rule mining, could be used to extract more information from the same data set. ECC data were collected in a cross-sectional analytical study of a 10% sample of preschool children in the South Bačka area (Vojvodina, Serbia). Association rules were extracted from the data by association rule mining, and risk factors were extracted from the highly ranked association rules. The discovered dominant risk factors include male gender, frequent breastfeeding (in combination with other risk factors), high birth order, language, and low body weight at birth. Low health awareness of parents was significantly associated with ECC only in male children. The discovered risk factors are mostly confirmed by the literature, which corroborates the value of the methods. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
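
    For readers unfamiliar with the technique, the sketch below mines simple association rules pointing to a disease attribute from a tiny invented data set; the attribute names, thresholds and records are illustrative and unrelated to the actual ECC study.

    ```python
    # Tiny self-contained association-rule miner (illustrative sketch only).
    from itertools import combinations

    records = [                      # each record: attributes observed for one child
        {"male", "frequent_bf", "ecc"},
        {"male", "frequent_bf", "low_birth_weight", "ecc"},
        {"female"},
        {"male", "low_birth_weight", "ecc"},
        {"female", "frequent_bf"},
    ]
    items = sorted(set().union(*records) - {"ecc"})

    def support(itemset):
        return sum(itemset <= r for r in records) / len(records)

    min_support, min_confidence = 0.4, 0.8
    for size in (1, 2):
        for antecedent in combinations(items, size):
            a = set(antecedent)
            if support(a) >= min_support:
                conf = support(a | {"ecc"}) / support(a)
                if conf >= min_confidence:
                    print(f"{antecedent} -> ecc  support={support(a | {'ecc'}):.2f}  confidence={conf:.2f}")
    ```

    Antecedents of the surviving high-confidence rules play the role of candidate risk factors; in practice a library such as mlxtend or an Apriori implementation replaces this brute-force enumeration.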

  8. Research on Fault Diagnosis Method Based on Rule Base Neural Network

    Directory of Open Access Journals (Sweden)

    Zheng Ni

    2017-01-01

    Full Text Available The relationship between a fault phenomenon and its cause is usually nonlinear, which affects the accuracy of fault location, and neural networks are effective in dealing with nonlinear problems. In order to improve the efficiency of uncertain fault diagnosis based on neural networks, a neural network fault diagnosis method based on a rule base is put forward. First, the structure of a BP neural network is built and its learning rule is given. Then, the rule base is built using fuzzy theory. An improved fuzzy neural construction model is designed, in which the calculation methods for the node function and the membership function are also given. Simulation results confirm the effectiveness of this method.

  9. Looking for exceptions on knowledge rules induced from HIV cleavage data set

    Directory of Open Access Journals (Sweden)

    Ronaldo Cristiano Prati

    2004-01-01

    Full Text Available The aim of data mining is to find useful knowledge in databases. In order to extract such knowledge, several methods can be used, among them machine learning (ML) algorithms. In this work we focus on ML algorithms that express the extracted knowledge in a symbolic form, such as rules. This representation may allow us to "explain" the data. Rule learning algorithms are mainly designed to induce classification rules that can predict new cases with high accuracy. However, these sorts of rules generally express common-sense knowledge, resulting in many interesting and useful rules not being discovered. Furthermore, domain-independent biases, especially those related to the language used to express the induced knowledge, can induce rules that are difficult to understand. Exceptions might be used in order to overcome these drawbacks. Exceptions are defined as rules that contradict common beliefs. This kind of rule can play an important role in the process of understanding the underlying data as well as in making critical decisions. By contradicting the user's common beliefs, exceptions are bound to be interesting. This work proposes a method to find exceptions. In order to illustrate the potential of our approach, we apply the method to a real-world data set to discover rules and exceptions in the HIV virus protein cleavage process. A good understanding of the process that generates these data plays an important role in the research of cleavage inhibitors. We believe that the proposed approach may help the domain expert to further understand this process.

  10. A method for automatically extracting infectious disease-related primers and probes from the literature

    Directory of Open Access Journals (Sweden)

    Pérez-Rey David

    2010-08-01

    Full Text Available Background: Primer and probe sequences are the main components of nucleic acid-based detection systems. Biologists use primers and probes for different tasks, some related to the diagnosis and prescription of infectious diseases. The biological literature is the main information source for empirically validated primer and probe sequences. Therefore, it is becoming increasingly important for researchers to navigate this important information. In this paper, we present a four-phase method for extracting and annotating primer/probe sequences from the literature. These phases are: (1) convert each document into a tree of paper sections; (2) detect the candidate sequences using a set of finite state machine-based recognizers; (3) refine problem sequences using a rule-based expert system; and (4) annotate the extracted sequences with their related organism/gene information. Results: We tested our approach using a test set composed of 297 manuscripts. The extracted sequences and their organism/gene annotations were manually evaluated by a panel of molecular biologists. The results of the evaluation show that our approach is suitable for automatically extracting DNA sequences, achieving precision/recall rates of 97.98% and 95.77%, respectively. In addition, 76.66% of the detected sequences were correctly annotated with their organism name. The system also provided correct gene-related information for 46.18% of the sequences assigned a correct organism name. Conclusions: We believe that the proposed method can facilitate routine tasks for biomedical researchers using molecular methods to diagnose and prescribe different infectious diseases. In addition, the proposed method can be expanded to detect and extract other biological sequences from the literature. The extracted information can also be used to readily update available primer/probe databases or to create new databases from scratch.

  11. Impact of different extraction methods on the quality of Dipteryx alata extracts

    Directory of Open Access Journals (Sweden)

    Frederico S. Martins

    2013-05-01

    Full Text Available This study evaluated the impact of different extraction methods on the quality of extracts from the fruits of Dipteryx alata Vogel, Fabaceae. The major compounds found were lipids, 38.9% (w/w), and proteins, 26.20% (w/w). The residual moisture was 7.20% (w/w), total fiber 14.50% (w/w), minerals 4.10% (w/w) and carbohydrates 9.10% (w/w). The species studied has great potential for oil production, but the content and type of fatty acids obtained depend on the extraction method. The Bligh & Dyer method was more selective for unsaturated fatty acids and the Soxhlet method was more selective for saturated fatty acids. Tannin extraction by ultrasound (33.70% w/w) was 13.90% more efficient than extraction by decoction (29% w/w).

  12. Class Association Rule Pada Metode Associative Classification

    Directory of Open Access Journals (Sweden)

    Eka Karyawati

    2011-11-01

    Full Text Available Frequent pattern (itemset) discovery is an important problem in associative classification rule mining. Different approaches have been proposed, such as the Apriori-like, Frequent Pattern (FP)-growth, and Transaction Data Location (Tid-list) Intersection algorithms. This paper focuses on surveying and comparing state-of-the-art associative classification techniques with regard to the rule generation phase of associative classification algorithms. This phase includes frequent itemset discovery and rule mining/extraction methods used to generate the set of class association rules (CARs). Several techniques have been proposed to improve the rule generation method. A technique utilizing the concept of the discriminative power of itemsets can reduce the size of the frequent itemsets by pruning the useless ones. The closed frequent itemset concept can be utilized to compress the rules into compact rules, which may reduce the size of the generated rule set. Another technique concerns the support threshold of the itemsets: specifying not a single but multiple support threshold values with regard to the class label frequencies can give more appropriate thresholds and may generate more accurate rules. An alternative technique for rule generation utilizes a vertical layout to represent the dataset. This method is very effective because it needs only one scan over the dataset, compared with other techniques that need multiple scans. However, one problem with this approach is that the initial set of tid-lists may be too large to fit into main memory, so more sophisticated techniques are required to compress the tid-lists.

  13. Steroid hormones in environmental matrices: extraction method comparison.

    Science.gov (United States)

    Andaluri, Gangadhar; Suri, Rominder P S; Graham, Kendon

    2017-11-09

    The U.S. Environmental Protection Agency (EPA) has developed methods for the analysis of steroid hormones in water, soil, sediment, and municipal biosolids by HRGC/HRMS (EPA Method 1698). Following the guidelines provided in US-EPA Method 1698, the extraction methods were validated with reagent water and applied to municipal wastewater, surface water, and municipal biosolids using GC/MS/MS for the analysis of the nine most commonly detected steroid hormones. This is the first reported comparison of the separatory funnel extraction (SFE), continuous liquid-liquid extraction (CLLE), and Soxhlet extraction methods developed by the U.S. EPA. Furthermore, a solid phase extraction (SPE) method was also developed in-house for the extraction of steroid hormones from aquatic environmental samples. This study provides valuable information regarding the robustness of the different extraction methods. Statistical analysis of the data showed that SPE-based methods provided better recovery efficiencies and lower variability for the steroid hormones, followed by SFE. The analytical methods developed in-house for the extraction of biosolids showed a wide recovery range; however, the variability was low (≤ 7% RSD). Soxhlet extraction and CLLE are lengthy procedures and have been shown to provide highly variable recovery efficiencies. The results of this study provide guidance for better sample preparation strategies in analytical methods for steroid hormone analysis, and SPE adds to the choices in environmental sample analysis.

  14. Backtracking Method of Coloring Administrative Maps Considering Visual Perception Rules

    Directory of Open Access Journals (Sweden)

    WEI Zhiwei

    2018-03-01

    Full Text Available Color design for administrative maps should incorporate and balance area configuration, color harmony, and users' purposes. Based on visual perception rules, this paper quantifies color harmony, color contrast and perceptual balance in the coloring of administrative maps, and a model is suggested to evaluate coloring quality after a color template is selected. A backtracking method based on area balance is then proposed to compute the colored areas. Experiments show that this method satisfies visual perception rules well when coloring administrative maps, and it can be used in later map design.
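
    The paper's perceptual scoring model is not reproduced in this record, but the backtracking skeleton it builds on is standard; the sketch below colors a small adjacency graph with a fixed palette, backtracking whenever two neighbouring regions would share a color. The region names and palette are invented.

    ```python
    # Minimal backtracking sketch for assigning palette colors to map regions so
    # that no two adjacent regions share a color (illustrative, not the paper's model).
    def color_map(adjacency, palette):
        regions = list(adjacency)
        assignment = {}

        def backtrack(i):
            if i == len(regions):
                return True
            region = regions[i]
            for color in palette:
                if all(assignment.get(nb) != color for nb in adjacency[region]):
                    assignment[region] = color
                    if backtrack(i + 1):
                        return True
                    del assignment[region]  # undo and try the next color
            return False

        return assignment if backtrack(0) else None

    adjacency = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"}, "D": {"C"}}
    print(color_map(adjacency, ["red", "green", "blue"]))
    ```

    In a method like the paper's, the candidate colors for each region would presumably be ordered by the harmony, contrast and area-balance scores rather than tried in a fixed order.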

  15. A linguistic rule-based approach to extract drug-drug interactions from pharmacological documents.

    Science.gov (United States)

    Segura-Bedmar, Isabel; Martínez, Paloma; de Pablo-Sánchez, César

    2011-03-29

    A drug-drug interaction (DDI) occurs when one drug influences the level or activity of another drug. The increasing volume of the scientific literature overwhelms health care professionals trying to be kept up-to-date with all published studies on DDI. This paper describes a hybrid linguistic approach to DDI extraction that combines shallow parsing and syntactic simplification with pattern matching. Appositions and coordinate structures are interpreted based on shallow syntactic parsing provided by the UMLS MetaMap tool (MMTx). Subsequently, complex and compound sentences are broken down into clauses from which simple sentences are generated by a set of simplification rules. A pharmacist defined a set of domain-specific lexical patterns to capture the most common expressions of DDI in texts. These lexical patterns are matched with the generated sentences in order to extract DDIs. We have performed different experiments to analyze the performance of the different processes. The lexical patterns achieve a reasonable precision (67.30%), but very low recall (14.07%). The inclusion of appositions and coordinate structures helps to improve the recall (25.70%), however, precision is lower (48.69%). The detection of clauses does not improve the performance. Information Extraction (IE) techniques can provide an interesting way of reducing the time spent by health care professionals on reviewing the literature. Nevertheless, no approach has been carried out to extract DDI from texts. To the best of our knowledge, this work proposes the first integral solution for the automatic extraction of DDI from biomedical texts.

  16. Evaluation of urinary cortisol excretion by radioimmunoassay through two methods (extracted and non-extracted)

    International Nuclear Information System (INIS)

    Fonte Kohek, M.B. da; Mendonca, B.B. de; Nicolau, W.

    1993-01-01

    The objective of this paper is to compare the feasibility, sensitivity and specificity of both methods (extracted versus non-extracted) in the diagnosis of hypercortisolism. The Gamma Coat 125I-cortisol kit provided by Clinical Assays, Incstar, USA, was used for both methods, with methylene chloride extraction for the measurement of extracted cortisol. Thirty-two assays were performed, with sensitivities ranging from 0.1 to 0.47 µg/dl. The intra-run precision was 8.29 ± 3.38% and 8.19 ± 4.72% for high and low levels, respectively, for non-extracted cortisol, and 9.72 ± 1.94% and 9.54 ± 44% for high and low levels, respectively, for extracted cortisol. The inter-run precision was 15.98% and 16.15% for the high level of non-extracted and extracted cortisol, respectively; for the low level it was 17.25% and 18.59% for non-extracted and extracted cortisol, respectively. Basal 24-hour urine samples from 43 normal subjects, 53 obese patients (body mass index > 30) and 53 Cushing's syndrome patients were evaluated. The sensitivity of the methods was similar (100% and 98.1% for the non-extracted and extracted methods, respectively) and the specificity was the same for both methods (100%). A positive correlation between the two methods was noticed in all the groups studied (p < 0.05), and both methods proved efficient for the diagnosis of Cushing's syndrome. (author)

  17. Evaluation of urinary excretion of cortisol by radioimmunoassay through two methods (extracted and non-extracted)

    International Nuclear Information System (INIS)

    Fonte Kohek, M.B. da.

    1992-01-01

    The radioimmunoassay of urinary cortisol extracted with an organic solvent (free cortisol) has long been used in the diagnosis of hypercortisolism. With the development of more specific antisera it became possible to measure urinary cortisol without extraction. The objective of this paper is to compare the feasibility, sensitivity and specificity of both methods (extracted versus non-extracted) in the diagnosis of hypercortisolism. The Gamma Coat 125I-cortisol kit provided by Clinical Assay, Incstar, US, was used for both methods, with methylene chloride extraction for the measurement of extracted cortisol. The sensitivity of the methods was similar (100% and 98.1% for the non-extracted and extracted methods, respectively). A positive correlation between the two methods was noticed in all groups studied (p < 0.05). It was concluded that both methods are efficient for the investigation of hypercortisolism. However, it is suggested that non-extracted urinary cortisol measurement should be the method of choice, since it is an easy-to-perform and affordable method for diagnosing Cushing's syndrome. (author)

  18. Analysis of the iteratively regularized Gauss-Newton method under a heuristic rule

    Science.gov (United States)

    Jin, Qinian; Wang, Wei

    2018-03-01

    The iteratively regularized Gauss-Newton method is one of the most prominent regularization methods for solving nonlinear ill-posed inverse problems when the data is corrupted by noise. In order to produce a useful approximate solution, this iterative method should be terminated properly. The existing a priori and a posteriori stopping rules require accurate information on the noise level, which may not be available or reliable in practical applications. In this paper we propose a heuristic selection rule for this regularization method, which requires no information on the noise level. By imposing certain conditions on the noise, we derive a posteriori error estimates on the approximate solutions under various source conditions. Furthermore, we establish a convergence result without using any source condition. Numerical results are presented to illustrate the performance of our heuristic selection rule.
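
    For reference, the iteration referred to here is usually written in the following textbook form (not quoted from this paper), where F is the forward operator, y^δ the noisy data, x_0 the initial guess and α_k a decreasing sequence of regularization parameters:

    ```latex
    % Standard statement of the iteratively regularized Gauss-Newton method (IRGNM);
    % textbook form, not taken from the paper itself.
    x_{k+1} = x_k + \bigl(F'(x_k)^* F'(x_k) + \alpha_k I\bigr)^{-1}
              \bigl[ F'(x_k)^* \bigl(y^\delta - F(x_k)\bigr) + \alpha_k \bigl(x_0 - x_k\bigr) \bigr]
    ```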

  19. Extracting Date/Time Expressions in Super-Function Based Japanese-English Machine Translation

    Science.gov (United States)

    Sasayama, Manabu; Kuroiwa, Shingo; Ren, Fuji

    Super-Function Based Machine Translation (SFBMT), which is a type of Example-Based Machine Translation, has a feature which makes it possible to expand the coverage of examples by changing nouns into variables; however, there were problems extracting entire date/time expressions containing parts of speech other than nouns, because only nouns/numbers were changed into variables. We describe a method for extracting date/time expressions for SFBMT. SFBMT uses noun determination rules to extract nouns and a bilingual dictionary to obtain the correspondence of the extracted nouns between the source and the target languages. In this method, we add a rule to extract date/time expressions and then extract date/time expressions from a Japanese-English bilingual corpus. The evaluation results show that the precision of this method for Japanese sentences is 96.7%, with a recall of 98.2%, and the precision for English sentences is 94.7%, with a recall of 92.7%.
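
    As a rough illustration of rule-based date/time spotting on the English side (the paper's actual extraction rules for SFBMT are not reproduced here), a single regular expression can already cover several common surface forms:

    ```python
    # Illustrative date/time extraction rule; pattern and example are invented.
    import re

    DATE_TIME = re.compile(
        r"\b(?:\d{1,2}:\d{2}(?:\s?[ap]\.?m\.?)?"          # 10:30, 7:45 pm
        r"|(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)[a-z]*\.?\s\d{1,2}(?:,\s\d{4})?"
        r"|\d{4}-\d{2}-\d{2})\b",                          # ISO dates
        re.IGNORECASE)

    sentence = "The meeting moved from 2024-05-01 to May 3, 2024 at 10:30 am."
    print(DATE_TIME.findall(sentence))
    ```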

  20. Pochhammer symbol with negative indices. A new rule for the method of brackets

    Directory of Open Access Journals (Sweden)

    Gonzalez Ivan

    2016-01-01

    Full Text Available The method of brackets is a method of integration based upon a small number of heuristic rules. Some of these have been made rigorous. An example of an integral involving the Bessel function is used to motivate a new evaluation rule.

  1. Extracting natural dyes from wool--an evaluation of extraction methods.

    Science.gov (United States)

    Manhita, Ana; Ferreira, Teresa; Candeias, António; Dias, Cristina Barrocas

    2011-05-01

    The efficiency of eight different procedures used for the extraction of natural dyes was evaluated using contemporary wool samples dyed with cochineal, madder, woad, weld, brazilwood and logwood. Comparison was made based on the LC-DAD peak areas of the natural dye's main components which had been extracted from the wool samples. Among the tested methods, an extraction procedure with Na(2)EDTA in water/DMF (1:1, v/v) proved to be the most suitable for the extraction of the studied dyes, which presented a wide range of chemical structures. The identification of the natural dyes used in the making of an eighteenth century Arraiolos carpet was possible using the Na(2)EDTA/DMF extraction of the wool embroidery samples and an LC-DAD-MS methodology. The effectiveness of the Na(2)EDTA/DMF extraction method was particularly observed in the extraction of weld dye components. Nine flavone derivatives previously identified in weld extracts could be identified in a single historical sample, confirming the use of this natural dye in the making of Arraiolos carpets. Indigo and brazilwood were also identified in the samples, and despite the fact that these natural dyes were referred in the historical recipes of Arraiolos dyeing, it is the first time that the use of brazilwood is confirmed. Mordant analysis by ICP-MS identified the widespread use of alum in the dyeing process, but in some samples with darker hues, high amounts of iron were found instead.

  2. Using GO-WAR for mining cross-ontology weighted association rules.

    Science.gov (United States)

    Agapito, Giuseppe; Cannataro, Mario; Guzzi, Pietro Hiram; Milano, Marianna

    2015-07-01

    The Gene Ontology (GO) is a structured repository of concepts (GO terms) that are associated with one or more gene products. The process of association is referred to as annotation. The relevance and the specificity of both GO terms and annotations are evaluated by a measure defined as information content (IC). The analysis of annotated data is thus an important challenge for bioinformatics, and different approaches to such analysis exist. Among these, the use of association rules (AR) may provide useful knowledge, and it has been used in some applications, e.g. improving the quality of annotations. Nevertheless, classical association rule algorithms take into account neither the source of the annotation nor its importance, yielding candidate rules with low IC. This paper presents GO-WAR (Gene Ontology-based Weighted Association Rules), a methodology for extracting weighted association rules. GO-WAR can extract association rules with a high level of IC without loss of support and confidence from a dataset of annotated data. A case study on the use of GO-WAR on publicly available GO annotation datasets demonstrates that our method outperforms current state-of-the-art approaches. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  3. Comparative Analysis of Reduced-Rule Compressed Fuzzy Logic Control and Incremental Conductance MPPT Methods

    Science.gov (United States)

    Kandemir, Ekrem; Borekci, Selim; Cetin, Numan S.

    2018-04-01

    Photovoltaic (PV) power generation has been widely used in recent years, with techniques for increasing the power efficiency representing one of the most important issues. The available maximum power of a PV panel is dependent on environmental conditions such as solar irradiance and temperature. To extract the maximum available power from a PV panel, various maximum-power-point tracking (MPPT) methods are used. In this work, two different MPPT methods were implemented for a 150-W PV panel. The first method, known as incremental conductance (Inc. Cond.) MPPT, determines the maximum power by measuring the derivative of the PV voltage and current. The other method is based on reduced-rule compressed fuzzy logic control (RR-FLC), using which it is relatively easier to determine the maximum power because a single input variable is used to reduce computing loads. In this study, a 150-W PV panel system model was realized using these MPPT methods in MATLAB and the results compared. According to the simulation results, the proposed RR-FLC-based MPPT could increase the response rate and tracking accuracy by 4.66% under standard test conditions.
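
    As background for the first method, the sketch below shows the textbook incremental-conductance decision logic (dI/dV compared with -I/V) used to nudge the operating voltage toward the maximum power point; the function name, step size and sample values are illustrative and not taken from the paper.

    ```python
    # Textbook incremental-conductance MPPT step (illustrative sketch).
    def inc_cond_step(v, i, v_prev, i_prev, v_ref, dv_step=0.5):
        """Return an updated reference voltage for the next control cycle."""
        dv, di = v - v_prev, i - i_prev
        if dv == 0:                       # voltage unchanged: look at current only
            if di > 0:
                v_ref += dv_step
            elif di < 0:
                v_ref -= dv_step
        else:
            g_inc = di / dv               # incremental conductance dI/dV
            if g_inc > -i / v:            # left of the MPP: raise the voltage
                v_ref += dv_step
            elif g_inc < -i / v:          # right of the MPP: lower the voltage
                v_ref -= dv_step
            # g_inc == -i/v: at the MPP, keep v_ref unchanged
        return v_ref

    print(inc_cond_step(v=17.8, i=5.1, v_prev=17.5, i_prev=5.2, v_ref=17.8))
    ```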

  4. Shadow Analysis Technique for Extraction of Building Height using High Resolution Satellite Single Image and Accuracy Assessment

    Science.gov (United States)

    Raju, P. L. N.; Chaudhary, H.; Jha, A. K.

    2014-11-01

    High-resolution satellite data with metadata information are used to extract building heights from shadows. The proposed approach is divided into two phases: (1) rooftop and shadow extraction, and (2) height estimation. First, the rooftop and shadow regions were extracted by manual/automatic methods using Example-Based and Rule-Based approaches. After feature extraction, the next step is estimating the height of the building by relating the rooftop to its shadow using the Ratio Method and the relation between sun and satellite geometry. The performance analysis shows a total mean height error of 0.67 m for the Ratio Method, 1.51 m for the Example-Based approach and 0.96 m for the Rule-Based approach. The analysis concluded that the Ratio Method, i.e. the manual method, is the most accurate for height estimation but is time consuming, so the automatic Rule-Based approach is preferable to the Example-Based approach, because the latter requires more knowledge and more training samples and slows the processing rate of the method.
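
    The basic sun-geometry relation behind such shadow-based estimates, stated here in its simplest flat-terrain form rather than as the exact formulation of this paper, relates building height H to shadow length L and the sun elevation angle taken from the image metadata:

    ```latex
    % Simplest flat-terrain shadow-height relation (background, not the paper's exact model).
    H \;\approx\; L \cdot \tan\theta_{\mathrm{sun}}
    ```

    When the shadow is partly occluded or the satellite does not image the scene at nadir, additional terms involving the satellite elevation and the sun-satellite azimuth difference are normally introduced.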

  5. Development of {sup 99m}Tc extraction-recovery by solvent extraction method

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, Akihiro; Nishikata, Kaori; Izumo, Hironobu; Tsuchiya, Kunihiko; Ishihara, Masahiro [Japan Atomic Energy Agency, Oarai Research and Development Center, Oarai, Ibaraki (Japan); Tanase, Masakazu; Fujisaki, Saburo; Shiina, Takayuki; Ohta, Akio; Takeuchi, Nobuhiro [Chiyoda Technol Corp., Tokyo (Japan)

    2012-03-15

    {sup 99m}Tc is used as a radiopharmaceutical in the medical field for diagnosis and is manufactured from {sup 99}Mo, its parent nuclide. In this study, solvent extraction with MEK was selected, and preliminary experiments were carried out using Re instead of {sup 99m}Tc. Two tests were carried out: one was a Re extraction test with MEK from Re-Mo solution, the other a Re recovery test from the Re-MEK. In the Re extraction test, it was clear that the Re extraction yield was more than 90%. Two kinds of Re recovery tests, an evaporation method using an evaporator and an adsorption/elution method using an alumina column, were carried out. With the evaporation method, the Re concentration in the collected solution increased more than 150 times. With the adsorption/elution method, the Re concentration in the eluted solution increased more than 20 times. (author)

  6. Obtaining a minimal set of rewrite rules

    CSIR Research Space (South Africa)

    Davel, M

    2005-11-01

    Full Text Available In this paper the authors describe a new approach to rewrite rule extraction and analysis, using Minimal Representation Graphs. This approach provides a mechanism for obtaining the smallest possible rule set – within a context-dependent rewrite rule...

  7. Virgin almond oil: Extraction methods and composition

    Energy Technology Data Exchange (ETDEWEB)

    Roncero, J.M.; Alvarez-Orti, M.; Pardo-Gimenez, A.; Gomez, R.; Rabadan, A.; Pardo, J.E.

    2016-07-01

    In this paper the extraction methods for virgin almond oil and its chemical composition are reviewed. The most common methods for obtaining the oil are solvent extraction, extraction with supercritical fluids (CO2) and pressure systems (hydraulic and screw presses). The best industrial performance, but also the worst oil quality, is achieved by using solvents. Oils obtained by this method cannot be considered virgin oils, as they are obtained by chemical treatments. Supercritical fluid extraction results in higher quality oils but at a very high price. Extraction by pressing thus becomes the best option for achieving high quality oils at an affordable price. With regard to chemical composition, almond oil is characterized by its low content of saturated fatty acids and the predominance of monounsaturated fatty acids, especially oleic acid. Furthermore, almond oil contains antioxidants and fat-soluble bioactive compounds that make it an oil with interesting nutritional and cosmetic properties.

  8. Virgin almond oil: Extraction methods and composition

    International Nuclear Information System (INIS)

    Roncero, J.M.; Alvarez-Orti, M.; Pardo-Gimenez, A.; Gomez, R.; Rabadan, A.; Pardo, J.E.

    2016-01-01

    In this paper the extraction methods for virgin almond oil and its chemical composition are reviewed. The most common methods for obtaining the oil are solvent extraction, extraction with supercritical fluids (CO2) and pressure systems (hydraulic and screw presses). The best industrial performance, but also the worst oil quality, is achieved by using solvents. Oils obtained by this method cannot be considered virgin oils, as they are obtained by chemical treatments. Supercritical fluid extraction results in higher quality oils but at a very high price. Extraction by pressing thus becomes the best option for achieving high quality oils at an affordable price. With regard to chemical composition, almond oil is characterized by its low content of saturated fatty acids and the predominance of monounsaturated fatty acids, especially oleic acid. Furthermore, almond oil contains antioxidants and fat-soluble bioactive compounds that make it an oil with interesting nutritional and cosmetic properties.

  9. Influence of extraction methods on the hepatotoxicity of Azadirachta ...

    African Journals Online (AJOL)

    The influence of extraction methods, namely cold aqueous (CA), hot aqueous (HA) and alcoholic extraction (AE), on the hepatotoxic effect of Azadirachta indica bark extract (ABC) was investigated using albino rats. A total of forty-eight rats were divided equally into three groups of sixteen rats, one for each extraction method.

  10. Charmonium spectrum at finite temperature from a Bayesian analysis of QCD sum rules

    Directory of Open Access Journals (Sweden)

    Morita Kenji

    2012-02-01

    Full Text Available Making use of a recently developed method of analyzing QCD sum rules, we investigate charmonium spectral functions at finite temperature. This method employs the Maximum Entropy Method, which makes it possible to directly obtain the spectral function from the sum rules, without having to introduce any strong assumption about its functional form. Finite temperature effects are incorporated into the sum rules by the change of the various gluonic condensates that appear in the operator product expansion. These changes depend on the energy density and pressure at finite temperature, which are extracted from lattice QCD. As a result, J/ψ and ηc dissolve into the continuum already at temperatures around 1.0 ~ 1.1 Tc.

  11. Influence of fuzzy norms and other heuristics on “Mixed fuzzy rule formation”

    OpenAIRE

    Gabriel, Thomas R.; Berthold, Michael R.

    2004-01-01

    In Mixed Fuzzy Rule Formation [Int. J. Approx. Reason. 32 (2003) 67] a method to extract mixed fuzzy rules from data was introduced. The underlying algorithm's performance is influenced by the choice of fuzzy t-norm and t-conorm, and by a heuristic to avoid conflicts between patterns and rules of different classes throughout training. In the following addendum to [Int. J. Approx. Reason. 32 (2003) 67], we discuss in more depth how these parameters affect the generalization performance of the res...

  12. Extraction method

    International Nuclear Information System (INIS)

    Stary, J.; Kyrs, M.; Navratil, J.; Havelka, S.; Hala, J.

    1975-01-01

    Definitions of the basic terms and relations are given, and current knowledge of the possibilities for extracting elements, oxides, covalently bound halogenides and heteropolyacids is described. The greatest attention is devoted to a detailed analysis of the extraction of chelates and ion associates using diverse agents. For both types of compounds, detailed separation conditions are given and the effects of the individual factors are listed. Attention is also devoted to extraction using mixtures of organic agents and their synergic effects, and to extraction in non-aqueous solvents. The effects of radiation on extraction and the main types of apparatus used for laboratory extractions are described. (L.K.)

  13. DNA extraction on bio-chip: history and preeminence over conventional and solid-phase extraction methods.

    Science.gov (United States)

    Ayoib, Adilah; Hashim, Uda; Gopinath, Subash C B; Md Arshad, M K

    2017-11-01

    This review covers the developmental progression from early to modern taxonomy at the cellular level following the advent of electron microscopy, and the advancement of deoxyribonucleic acid (DNA) extraction for the elaboration of biological classification at the DNA level. Here, we discuss the fundamental values of conventional chemical methods of DNA extraction using liquid/liquid extraction (LLE), followed by the development of solid-phase extraction (SPE) methods, as well as recent advances in microfluidic device-based systems for DNA extraction on-chip. We also discuss the importance of DNA extraction, its advantages over conventional chemical methods, and how the Lab-on-a-Chip (LOC) system plays a crucial role in future achievements.

  14. Natural colorants: Pigment stability and extraction yield enhancement via utilization of appropriate pretreatment and extraction methods.

    Science.gov (United States)

    Ngamwonglumlert, Luxsika; Devahastin, Sakamon; Chiewchan, Naphaporn

    2017-10-13

    Natural colorants from plant-based materials have gained increasing popularity due to health consciousness of consumers. Among the many steps involved in the production of natural colorants, pigment extraction is one of the most important. Soxhlet extraction, maceration, and hydrodistillation are conventional methods that have been widely used in industry and laboratory for such a purpose. Recently, various non-conventional methods, such as supercritical fluid extraction, pressurized liquid extraction, microwave-assisted extraction, ultrasound-assisted extraction, pulsed-electric field extraction, and enzyme-assisted extraction have emerged as alternatives to conventional methods due to the advantages of the former in terms of smaller solvent consumption, shorter extraction time, and more environment-friendliness. Prior to the extraction step, pretreatment of plant materials to enhance the stability of natural pigments is another important step that must be carefully taken care of. In this paper, a comprehensive review of appropriate pretreatment and extraction methods for chlorophylls, carotenoids, betalains, and anthocyanins, which are major classes of plant pigments, is provided by using pigment stability and extraction yield as assessment criteria.

  15. Effective Diagnosis of Alzheimer's Disease by Means of Association Rules

    Science.gov (United States)

    Chaves, R.; Ramírez, J.; Górriz, J. M.; López, M.; Salas-Gonzalez, D.; Illán, I.; Segovia, F.; Padilla, P.

    In this paper we present a novel classification method of SPECT images for the early diagnosis of Alzheimer's disease (AD). The proposed method is based on Association Rules (ARs), aiming to discover interesting associations between attributes contained in the database. The system first uses voxel-as-features (VAF) and activation estimation (AE) to find three-dimensional activated brain regions of interest (ROIs) for each patient. These ROIs then serve as inputs for mining ARs between activated blocks for controls, with a specified minimum support and minimum confidence. ARs are mined in supervised mode, using information previously extracted from the most discriminant rules to focus on the relevant brain areas, reducing the computational requirements of the system. Finally, classification is performed depending on the number of previously mined rules verified by each subject, yielding up to 95.87% classification accuracy, thus outperforming recently developed methods for AD diagnosis.
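
    A minimal sketch of the final classification step described above: count how many previously mined rules a subject's set of activated blocks verifies and compare the count to a threshold. The rule representation, block identifiers and threshold are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical illustration of rule-count classification over activated brain blocks.

def verifies(rule, activated_blocks):
    """A rule (antecedent, consequent) is verified if all of its blocks are activated."""
    antecedent, consequent = rule
    return antecedent.issubset(activated_blocks) and consequent.issubset(activated_blocks)

def classify(activated_blocks, rules, threshold):
    """Label a subject as control if it verifies at least `threshold` of the mined rules."""
    n_verified = sum(verifies(r, activated_blocks) for r in rules)
    return "control" if n_verified >= threshold else "AD"

# Toy usage with made-up block identifiers
rules = [({"B1", "B2"}, {"B7"}), ({"B3"}, {"B4", "B5"})]
print(classify({"B1", "B2", "B7"}, rules, threshold=1))  # -> "control"
```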

  16. Extraction and identification of cyclobutanones from irradiated cheese employing a rapid direct solvent extraction method.

    Science.gov (United States)

    Tewfik, Ihab

    2008-01-01

    2-Alkylcyclobutanones (cyclobutanones) are accepted as chemical markers for irradiated foods containing lipid. However, current extraction procedures (Soxhlet-florisil chromatography) for the isolation of these markers involve a long and tedious clean-up regime prior to gas chromatography-mass spectrometry identification. This paper outlines an alternative isolation and clean-up method for the extraction of cyclobutanones in irradiated Camembert cheese. The newly developed direct solvent extraction method enables the efficient screening of large numbers of food samples and is not as resource intensive as the BS EN 1785:1997 method. Direct solvent extraction appears to be a simple, robust method and has the added advantage of a considerably shorter extraction time for the analysis of foods containing lipid.

  17. A comparison of five extraction methods for extracellular polymeric ...

    African Journals Online (AJOL)

    Two physical methods (centrifugation and ultrasonication) and 3 chemical methods (extraction with EDTA, extraction with formaldehyde, and extraction with formaldehyde plus NaOH) for extraction of EPS from alga-bacteria biofilm were assessed. Pretreatment with ultrasound at low intensity doubled the EPS yield without ...

  18. Effects of Different Extraction Methods and Conditions on the Phenolic Composition of Mate Tea Extracts

    Directory of Open Access Journals (Sweden)

    Jelena Vladic

    2012-03-01

    Full Text Available A simple and rapid HPLC method for determination of chlorogenic acid (5-O-caffeoylquinic acid) in mate tea extracts was developed and validated. The chromatography used isocratic elution with a mobile phase of aqueous 1.5% acetic acid-methanol (85:15, v/v). The flow rate was 0.8 mL/min and detection by UV at 325 nm. The method showed good selectivity, accuracy, repeatability and robustness, with detection limit of 0.26 mg/L and recovery of 97.76%. The developed method was applied for the determination of chlorogenic acid in mate tea extracts obtained by ethanol extraction and liquid carbon dioxide extraction with ethanol as co-solvent. Different ethanol concentrations were used (40, 50 and 60%, v/v) and liquid CO2 extraction was performed at different pressures (50 and 100 bar) and constant temperature (27 ± 1 °C). Significant influence of extraction methods, conditions and solvent polarity on chlorogenic acid content, antioxidant activity and total phenolic and flavonoid content of mate tea extracts was established. The most efficient extraction solvent was liquid CO2 with aqueous ethanol (40%) as co-solvent using an extraction pressure of 100 bar.

  19. Comparative study of adaptive controller using MIT rules and Lyapunov method for MPPT standalone PV systems

    Science.gov (United States)

    Tariba, N.; Bouknadel, A.; Haddou, A.; Ikken, N.; Omari, Hafsa El; Omari, Hamid El

    2017-01-01

    The photovoltaic generator (PVG) has a nonlinear characteristic relating current to voltage, I = f(U), which depends on the variation of solar irradiation and temperature; in addition, its operating point depends directly on the load that it supplies. To fix this drawback, and to extract the maximum power available at the terminals of the generator, an adaptation stage is introduced between the generator and the load to couple the two elements as well as possible. The adaptation stage is driven by a control scheme called MPPT (Maximum Power Point Tracking), which is used to force the PVG to operate at the MPP (Maximum Power Point) under variations of climatic conditions and load. This paper presents a comparative study between adaptive controllers for PV systems using the MIT rule and the Lyapunov method to regulate the PV voltage. The Incremental Conductance (IC) algorithm is used to extract the maximum power from the PVG by calculating the reference voltage Vref, and the adaptive controller is used to regulate and quickly track the PV voltage. The two adaptive control methods are compared to assess their performance using PSIM tools and experimental tests, and the mathematical model of the step-up converter with the PVG model is presented.
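
    For reference, a minimal sketch of the Incremental Conductance update mentioned above, which steers the reference voltage Vref toward the point where dP/dV = 0; the step size and sign conventions are assumptions for illustration, not values from the paper.

```python
def incremental_conductance(v, i, v_prev, i_prev, v_ref, step=0.5):
    """One IC iteration returning the updated reference voltage Vref.

    At the MPP, dP/dV = I + V*(dI/dV) = 0, i.e. dI/dV = -I/V.
    """
    dv, di = v - v_prev, i - i_prev
    if dv == 0:                        # voltage unchanged: decide from the current alone
        if di == 0:
            return v_ref               # operating point unchanged, assume MPP
        return v_ref + step if di > 0 else v_ref - step
    slope = di / dv
    if slope == -i / v:
        return v_ref                   # exactly at the MPP
    return v_ref + step if slope > -i / v else v_ref - step
```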

  20. Analysis of QCD sum rule based on the maximum entropy method

    International Nuclear Information System (INIS)

    Gubler, Philipp

    2012-01-01

    The QCD sum rule was developed about thirty years ago and has been used up to the present to calculate various physical quantities of hadrons. Conventional analyses, however, have had to assume a 'pole + continuum' form for the spectral function, so the method runs into difficulties when this assumption is not satisfied. In order to avoid this difficulty, an analysis making use of the maximum entropy method (MEM) has been developed by the present author. It is reported here how far this new method can be successfully applied. In the first section, the general features of the QCD sum rule are introduced. In section 2, it is discussed why the analysis of the QCD sum rule based on the MEM is so effective. In section 3, the MEM analysis process is described: in subsection 3.1 the likelihood function and prior probability are considered, and in subsection 3.2 numerical analyses are discussed. In section 4, some applications are described, starting with ρ mesons, then charmonia at finite temperature, and finally recent developments. Some figures of the spectral functions are shown. In section 5, a summary of the present analysis method and a future outlook are given. (S. Funahashi)
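
    For orientation, the Shannon-Jaynes entropy that MEM-based analyses of this kind typically maximize together with the likelihood; the default model m(ω) and the weighting parameter α are generic MEM ingredients assumed here, not values taken from this record.

```latex
% MEM selects the spectral function \rho(\omega) maximizing
%   Q[\rho] = \alpha S[\rho] - L[\rho],
% with L the chi-square-like likelihood and S the Shannon-Jaynes entropy
S[\rho] = \int_{0}^{\infty} d\omega
          \left[ \rho(\omega) - m(\omega)
                 - \rho(\omega)\,\ln\frac{\rho(\omega)}{m(\omega)} \right],
% where m(\omega) is the default model encoding prior knowledge.
```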

  1. Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules

    Science.gov (United States)

    Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.

    Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.

  2. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA) and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated using the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far to analyze diagnosis error probability are suggested as a part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules.

  3. System and method for free-boundary surface extraction

    KAUST Repository

    Algarni, Marei

    2017-10-26

    A method of extracting surfaces in three-dimensional data includes receiving as inputs three-dimensional data and a seed point p located on a surface to be extracted. The method further includes propagating a front outwardly from the seed point p and extracting a plurality of ridge curves based on the propagated front. A surface boundary is detected based on a comparison of distances between adjacent ridge curves and the desired surface is extracted based on the detected surface boundary.

  4. Cause Information Extraction from Financial Articles Concerning Business Performance

    Science.gov (United States)

    Sakai, Hiroyuki; Masuyama, Shigeru

    We propose a method of extracting cause information from Japanese financial articles concerning business performance. Our method acquires cause information, e.g. “zidousya no uriage ga koutyou” (sales of cars were good). Cause information is useful for investors in selecting companies to invest in. Our method automatically extracts cause information in the form of causal expressions by using statistical information and initial clue expressions. It can extract causal expressions without predetermined patterns or complex rules given by hand, and is expected to be applicable to other tasks for acquiring phrases that have a particular meaning, not limited to cause information. We compared our method with our previous one, originally proposed for extracting phrases concerning traffic accident causes, and experimental results showed that the new method outperforms the previous one.

  5. Impact of constraints and rules of user-involvement methods for IS concept creation and specification

    DEFF Research Database (Denmark)

    Jensen, Mika Yasuoka; Ohno, Takehiko; Nakatani, Momoko

    2015-01-01

    ideas. In this paper, by exemplifying our user-involvement method with game elements, ICT Service Design Game, in comparison with conventional brainstorming, we show the impact of constraints and rules in user-involvement methods when creating service concepts and specifications for information systems....... The analysis is based on a comparative experiment on two design methods and shows that the constraints and rules of our game approach fostered innovative idea generation in spite of participants’ limited knowledge of and experience with design processes. Although our analysis is still in a preliminary stage......, it indicates some positive impact of constraints and rules in design methods, especially when the methods are used by non-design professionals....

  6. Knowledge extraction from evolving spiking neural networks with rank order population coding.

    Science.gov (United States)

    Soltic, Snjezana; Kasabov, Nikola

    2010-12-01

    This paper demonstrates how knowledge can be extracted from evolving spiking neural networks with rank order population coding. Knowledge discovery is a very important feature of intelligent systems. Yet, a disproportionally small amount of research is centered on the issue of knowledge extraction from spiking neural networks which are considered to be the third generation of artificial neural networks. The lack of knowledge representation compatibility is becoming a major detriment to end users of these networks. We show that a high-level knowledge can be obtained from evolving spiking neural networks. More specifically, we propose a method for fuzzy rule extraction from an evolving spiking network with rank order population coding. The proposed method was used for knowledge discovery on two benchmark taste recognition problems where the knowledge learnt by an evolving spiking neural network was extracted in the form of zero-order Takagi-Sugeno fuzzy IF-THEN rules.

  7. Evaluation of in vitro antioxidant potential of different polarities stem crude extracts by different extraction methods of Adenium obesum

    Directory of Open Access Journals (Sweden)

    Mohammad Amzad Hossain

    2014-09-01

    Full Text Available Objective: To select the best extraction method for isolating antioxidant compounds from the stems of Adenium obesum. Methods: Two methods were used for the extraction: the Soxhlet and maceration methods. Methanol was used as the solvent for both extraction methods. The methanol crude extract was defatted with water and extracted successively with hexane, chloroform, ethyl acetate and butanol solvents. The antioxidant potential of all crude extracts was determined using the 1,1-diphenyl-2-picrylhydrazyl (DPPH) method. Results: The extraction yield by the Soxhlet method was higher than that of the maceration method. The antioxidant potential of methanol and its derived fractions obtained by the Soxhlet method was highest in the ethyl acetate and lowest in the hexane crude extracts, in the order ethyl acetate>butanol>water>chloroform>methanol>hexane. However, the antioxidant potential of methanol and its derived fractions obtained by the maceration method was highest in butanol and lowest in hexane, in the order butanol>methanol>chloroform>water>ethyl acetate>hexane. Conclusions: The results showed that the isolated antioxidant compounds were affected by the extraction method and the extraction conditions.

  8. Influence of Extraction Parameters on Hydroalcohol Extracts of the ...

    African Journals Online (AJOL)

    The parameter that had the greatest influence on the extraction process was alcohol concentration ... rules and processing steps [2]. As part .... Table 1: Extractive batch numbers with the respective factors and levels studied in the factorial design.

  9. Newer methods of extraction of teeth

    Directory of Open Access Journals (Sweden)

    MHendra Chandha

    2016-06-01

    Full Text Available Atraumatic extraction methods are deemed important to minimize alveolar bone loss after tooth extraction. With the advent of such techniques, exodontia is no longer a dreaded procedure for anxious patients. Newer systems and techniques for the extraction of teeth have evolved in the recent few decades. This article reviews and discusses new techniques to make simple and complex exodontias more efficient with improved patient outcomes. These include physics forceps, the powered periotome, piezosurgery, the Benex extractor, sonic instruments for bone surgery, and lasers.

  10. Automated concept-level information extraction to reduce the need for custom software and rules development.

    Science.gov (United States)

    D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments. Precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach for more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation is available for download.
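
    A hedged sketch of the 'learn by example' idea described above: try several candidate configurations on annotated snippets automatically and keep the one with the best F-measure. The features, classifier, labels and parameter grid are illustrative assumptions, not the authors' implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

# Toy annotated snippets (made up); labels mimic the i2b2 concept classes.
texts = ["complains of chest pain for two days",
         "started on metformin 500 mg",
         "CBC and chest x-ray ordered",
         "long history of severe migraines",
         "continue lisinopril daily",
         "MRI of the brain requested"]
labels = ["problem", "treatment", "test",
          "problem", "treatment", "test"]

pipeline = Pipeline([("features", TfidfVectorizer(ngram_range=(1, 2))),
                     ("clf", LogisticRegression(max_iter=1000))])

# Iterate over candidate configurations and keep the one with the best F-measure.
search = GridSearchCV(pipeline,
                      param_grid={"clf__C": [0.1, 1.0, 10.0]},
                      scoring="f1_macro", cv=2)
search.fit(texts, labels)
print(search.best_params_, round(search.best_score_, 2))
```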

  11. Applying cognitive developmental psychology to middle school physics learning: The rule assessment method

    Science.gov (United States)

    Hallinen, Nicole R.; Chi, Min; Chin, Doris B.; Prempeh, Joe; Blair, Kristen P.; Schwartz, Daniel L.

    2013-01-01

    Cognitive developmental psychology often describes children's growing qualitative understanding of the physical world. Physics educators may be able to use the relevant methods to advantage for characterizing changes in students' qualitative reasoning. Siegler developed the "rule assessment" method for characterizing levels of qualitative understanding in two-factor situations (e.g., volume and mass for density). The method assigns children to rule levels that correspond to the degree to which they notice and coordinate the two factors. Here, we provide a brief tutorial plus a demonstration of how we have used this method to evaluate instructional outcomes with middle-school students who learned about torque, projectile motion, and collisions using different instructional methods with simulations.

  12. Comparative exergy analyses of Jatropha curcas oil extraction methods: Solvent and mechanical extraction processes

    International Nuclear Information System (INIS)

    Ofori-Boateng, Cynthia; Keat Teong, Lee; JitKang, Lim

    2012-01-01

    Highlights: ► Exergy analysis detects locations of resource degradation within a process. ► Solvent extraction is six times more exergetically destructive than mechanical extraction. ► Mechanical extraction of jatropha oil is 95.93% exergetically efficient. ► Solvent extraction of jatropha oil is 79.35% exergetically efficient. ► Exergy analysis of oil extraction processes allows room for improvements. - Abstract: Vegetable oil extraction processes are found to be energy intensive. Thermodynamically, any energy intensive process is considered to degrade the most useful part of energy that is available to produce work. This study uses literature values to compare the efficiencies and the degradation of the useful energy within Jatropha curcas oil during oil extraction, taking into account solvent and mechanical extraction methods. According to this study, processing J. curcas seeds into J. curcas oil upgrades the resource with mechanical extraction but degrades it with solvent extraction. For mechanical extraction, the total internal exergy destroyed is 3006 MJ, which is about six times less than that for solvent extraction (18,072 MJ) per ton of J. curcas oil produced. The pretreatment processes of the J. curcas seeds recorded a total internal exergy destruction of 5768 MJ, accounting for 24% of the total internal exergy destroyed for solvent extraction and 66% for mechanical extraction. The exergetic efficiencies recorded are 79.35% and 95.93% for solvent and mechanical extraction of J. curcas oil, respectively. Hence, mechanical oil extraction processes are more exergetically efficient than solvent extraction processes. Possible improvement methods are also elaborated in this study.

  13. Solvent extraction in analytical chemistry of tungsten (Review)

    International Nuclear Information System (INIS)

    Ivanov, V.M.; Busev, A.I.; Sokolova, T.A.

    1975-01-01

    The use of extraction for isolating and concentrating tungsten, with subsequent determination by various methods, is considered. For tungsten, extractants of all types are employed: neutral, basic and acidic. Neutral extractants are used for isolating and concentrating tungsten; basic and acidic ones are employed, as a rule, for the isolation and subsequent determination of tungsten. The latter type of extractant is highly promising since, by selectively extracting tungsten, they allow its simultaneous determination. Neutral extractants are oxygen-containing solvents and TBP; basic extractants are aniline, pyridine, 1-naphthylamine and trialkylbenzylammonium nitrate. As acidic reagents, use is made of 8-oxyquinoline and its derivatives, oximes and hydroxamic acids, β-diketones, and carbaminates. In the extractions the radioactive isotope ¹⁸⁵W is employed.

  14. Association rule extraction from XML stream data for wireless sensor networks.

    Science.gov (United States)

    Paik, Juryon; Nam, Junghyun; Kim, Ung Mo; Won, Dongho

    2014-07-18

    With the advances of wireless sensor networks, they yield massive volumes of disparate, dynamic and geographically-distributed and heterogeneous data. The data mining community has attempted to extract knowledge from the huge amount of data that they generate. However, previous mining work in WSNs has focused on supporting simple relational data structures, like one table per network, while there is a need for more complex data structures. This deficiency motivates XML, which is the current de facto format for the data exchange and modeling of a wide variety of data sources over the web, to be used in WSNs in order to encourage the interchangeability of heterogeneous types of sensors and systems. However, mining XML data for WSNs has two challenging issues: one is the endless data flow; and the other is the complex tree structure. In this paper, we present several new definitions and techniques related to association rule mining over XML data streams in WSNs. To the best of our knowledge, this work provides the first approach to mining XML stream data that generates frequent tree items without any redundancy.

  15. A Numerical Comparison of Rule Ensemble Methods and Support Vector Machines

    Energy Technology Data Exchange (ETDEWEB)

    Meza, Juan C.; Woods, Mark

    2009-12-18

    Machine or statistical learning is a growing field that encompasses many scientific problems, including estimating parameters from data, identifying risk factors in health studies, image recognition, and finding clusters within datasets, to name just a few examples. Statistical learning can be described as 'learning from data', with the goal of making a prediction of some outcome of interest. This prediction is usually made on the basis of a computer model that is built using data where the outcomes and a set of features have been previously matched. The computer model is called a learner, hence the name machine learning. In this paper, we present two such algorithms, a support vector machine method and a rule ensemble method. We compared their predictive power on three Type Ia supernova data sets provided by the Nearby Supernova Factory and found that while both methods give accuracies of approximately 95%, the rule ensemble method gives much lower false negative rates.

  16. Isolation and Identification of Volatile Components in Tempe by Simultaneous Distillation-Extraction Method by Modified Extraction Method

    Directory of Open Access Journals (Sweden)

    Syahrial Syahrial

    2010-06-01

    Full Text Available The isolation and identification of volatile components in tempe after 2, 5 and 8 days of fermentation by the simultaneous distillation-extraction method was carried out. The simultaneous distillation-extraction apparatus was modified by Muchalal from the basic Likens-Nickerson design. Steam distillation with benzene as the extraction solvent was used in this system. The isolation was carried out continuously for 3 hours, during which the maximum water temperature in the Liebig condenser was 8 °C. The extract was concentrated by the freeze concentration method, and the volatile components were analyzed and identified by combined gas chromatography-mass spectrometry (GC-MS). Muchalal's simultaneous distillation-extraction apparatus has some disadvantages in the cold finger condenser, and its extractor did not have a condenser. At least 47, 13 and 5 volatile components were found after 2, 5 and 8 days of fermentation, respectively. The volatile components in the 2-day fermentation were nonalal, ɑ-pinene, 2,4-decadienal, 5-phenyldecane, 5-phenylundecane, 4-phenylundecane, 5-phenyldodecane, 4-phenyldodecane, 3-phenyldodecane, 2-phenyldodecane, 5-phenyltridecane, and caryophyllene; in the 5-day fermentation they were nonalal, caryophyllene, 4-phenylundecane, 5-phenyldodecane, 4-phenyldodecane, 3-phenyldodecane, and 2-phenyldodecane; and in the 8-day fermentation they were ethenyl butanoic, 2-methy1-3-(methylethenylciclohexyl etanoic and 3,7-dimethyl-5-octenyl etanoic.

  17. A Bayesian analysis of the nucleon QCD sum rules

    International Nuclear Information System (INIS)

    Ohtani, Keisuke; Gubler, Philipp; Oka, Makoto

    2011-01-01

    QCD sum rules of the nucleon channel are reanalyzed, using the maximum-entropy method (MEM). This new approach, based on the Bayesian probability theory, does not restrict the spectral function to the usual ''pole + continuum'' form, allowing a more flexible investigation of the nucleon spectral function. Making use of this flexibility, we are able to investigate the spectral functions of various interpolating fields, finding that the nucleon ground state mainly couples to an operator containing a scalar diquark. Moreover, we formulate the Gaussian sum rule for the nucleon channel and find that it is more suitable for the MEM analysis to extract the nucleon pole in the region of its experimental value, while the Borel sum rule does not contain enough information to clearly separate the nucleon pole from the continuum. (orig.)

  18. Methods for determination of extractable complex composition

    International Nuclear Information System (INIS)

    Sergievskij, V.V.

    1984-01-01

    The specific features and restrictions of the main methods for determining the extractable complex composition from distribution data (the methods of equilibrium shift, saturation, and mathematical models) are considered. Special attention is given to the solution of inverse problems taking into account the effect of hydration on the activity of the organic phase components. Using the systems lithium halides-isoamyl alcohol, thorium nitrate-n-hexyl alcohol, mineral acids-tri-n-butyl phosphate (TBP), and metal nitrates (uranium, lanthanides)-TBP as examples, the results on determining the stoichiometry of extraction equilibria obtained by the various methods are compared.

  19. Extraction Methods for the Isolation of Isoflavonoids from Plant Material

    Directory of Open Access Journals (Sweden)

    Blicharski Tomasz

    2017-03-01

    Full Text Available The purpose of this review is to describe and compare selected traditional and modern extraction methods employed in the isolation of isoflavonoids from plants. Conventional methods such as maceration, percolation, or Soxhlet extraction are still frequently used in phytochemical analysis. Despite their flexibility, traditional extraction techniques have significant drawbacks, including the need for a significant investment of time, energy, and starting material, and a requirement for large amounts of potentially toxic solvents. Moreover, these techniques are difficult to automate, produce a considerable amount of waste and pose a risk of degradation of thermolabile compounds. Modern extraction methods, such as ultrasound-assisted extraction, microwave-assisted extraction, accelerated solvent extraction, supercritical fluid extraction, and negative pressure cavitation extraction, can be regarded as remedies for the aforementioned problems. This manuscript discusses the use of the most relevant extraction techniques in the process of isolation of isoflavonoids, secondary metabolites that have been found to have a plethora of biological and pharmacological activities.

  20. Pressurised liquid extraction of flavonoids in onions. Method development and validation

    DEFF Research Database (Denmark)

    Søltoft, Malene; Christensen, J.H.; Nielsen, J.

    2009-01-01

    A rapid and reliable analytical method for quantification of flavonoids in onions was developed and validated. Five extraction methods were tested on freeze-dried onions and subsequently high performance liquid chromatography (HPLC) with UV detection was used for quantification of seven flavonoids ... extraction methods. However, PLE was the preferred extraction method because the method can be highly automated, use only small amounts of solvents, provide the cleanest extracts, and allow the extraction of light- and oxygen-sensitive flavonoids to be carried out in an inert atmosphere protected from light ...-step PLE method showed good selectivity, precision (RSDs = 3.1-11%) and recovery of the extractable flavonoids (98-99%). The method also appeared to be a multi-method, i.e. generally applicable to, e.g., phenolic acids in potatoes and carrots....

  1. Extraction of uranium from simulated ore by the supercritical carbon dioxide fluid extraction method with nitric acid-TBP complex

    International Nuclear Information System (INIS)

    Dung, Le Thi Kim; Imai, Tomoki; Tomioka, Osamu; Nakashima, Mikio; Takahashi, Kuniaki; Meguro, Yoshihiro

    2006-01-01

    The supercritical fluid extraction (SFE) method using CO₂ as a medium with an extractant of HNO₃-tri-n-butyl phosphate (TBP) complex was applied to extract uranium from several uranyl phosphate compounds and simulated uranium ores. An extraction method consisting of a static extraction process and a dynamic one was established, and the effects of the experimental conditions, such as pressure, temperature, and extraction time, on the extraction of uranium were ascertained. It was found that uranium could be efficiently extracted from both the uranyl phosphates and simulated ores by the SFE method using CO₂. It was thus demonstrated that the SFE method using CO₂ is useful as a pretreatment method for the analysis of uranium in ores. (author)

  2. Mixed arrays - problems with current methods and rules

    International Nuclear Information System (INIS)

    Mennerdahl, D.

    1987-01-01

    Simplified methods are used to control the criticality safety of mixed arrays (non-identical units) in storage or in transport. The basis for these methods is that analyses of arrays of identical units are sufficient for drawing proper conclusions about mixed arrays. In a recent study of the rules for transport, two general flaws in such methods were identified. One flaw is caused by an increased neutron return rate to the central part of the array. The other flaw is caused by increased neutron coupling between two or more fissile units in an array. In both cases, replacement of fissile units with other units which appear to be less reactive can lead to criticality. This paper shows that the two flaws are also common in current methods used for the storage of fissile materials. (author)

  3. The extraction of essential oil from patchouli leaves (Pogostemon cablin Benth) using microwave hydrodistillation and solvent-free microwave extraction methods

    Science.gov (United States)

    Putri, D. K. Y.; Kusuma, H. S.; Syahputra, M. E.; Parasandi, D.; Mahfud, M.

    2017-12-01

    The patchouli plant (Pogostemon cablin Benth) is one of the important essential oil-producing plants, contributing more than 50% of Indonesia's total essential oil exports. However, the extraction of patchouli oil in Indonesia still generally uses conventional methods that require an enormous amount of energy, high solvent usage, and long extraction times. Therefore, in this study, patchouli oil extraction was carried out using the microwave hydrodistillation and solvent-free microwave extraction methods. Based on this research, the extraction of patchouli oil using microwave hydrodistillation with a longer extraction time (240 min) produced a patchouli oil yield only 1.2 times greater than solvent-free microwave extraction, which requires a shorter extraction time (120 min). Moreover, analysis of the electricity consumption and the environmental impact showed smaller values for the solvent-free microwave extraction method than for microwave hydrodistillation. It is concluded that solvent-free microwave extraction is a suitable new green technique for patchouli oil extraction.

  4. A methodology for extracting knowledge rules from artificial neural networks applied to forecast demand for electric power; Uma metodologia para extracao de regras de conhecimento a partir de redes neurais artificiais aplicadas para previsao de demanda por energia eletrica

    Energy Technology Data Exchange (ETDEWEB)

    Steinmetz, Tarcisio; Souza, Glauber; Ferreira, Sandro; Santos, Jose V. Canto dos; Valiati, Joao [Universidade do Vale do Rio dos Sinos (PIPCA/UNISINOS), Sao Leopoldo, RS (Brazil). Programa de Pos-Graduacao em Computacao Aplicada], Emails: trsteinmetz@unisinos.br, gsouza@unisinos.br, sferreira, jvcanto@unisinos.br, jfvaliati@unisinos.br

    2009-07-01

    We present a methodology for the extraction of rules from Artificial Neural Networks (ANN) trained to forecast electric load demand. The rules have the ability to express the knowledge regarding the behavior of load demand acquired by the ANN during the training process. The rules are presented to the user in an easy-to-read format, such as IF premise THEN consequence, where the premise relates to the input data submitted to the ANN (mapped as fuzzy sets), and the consequence appears as a linear equation describing the output to be presented by the ANN should the premise hold true. Experimentation demonstrates the method's capacity for acquiring and presenting high quality rules from neural networks trained to forecast electric load demand for several time horizons in the future. (author)
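
    A minimal sketch of evaluating one rule of the IF premise THEN consequence form described above, with a fuzzy-set premise on the inputs and a linear equation as the consequence; the triangular membership function and min aggregation are assumptions for illustration, not necessarily those of the methodology.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fire_rule(inputs, premise_sets, coeffs, bias):
    """Return the firing strength of the fuzzy premise and the linear consequence value."""
    strength = min(triangular(x, *fs) for x, fs in zip(inputs, premise_sets))
    consequence = bias + sum(w * x for w, x in zip(coeffs, inputs))
    return strength, consequence

# Toy example: two inputs (e.g. temperature, previous-hour load) and one rule
strength, load = fire_rule(inputs=[22.0, 410.0],
                           premise_sets=[(15, 20, 25), (350, 400, 450)],
                           coeffs=[2.0, 0.9], bias=30.0)
print(strength, load)
```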

  5. Development of an extraction method for perchlorate in soils.

    Science.gov (United States)

    Cañas, Jaclyn E; Patel, Rashila; Tian, Kang; Anderson, Todd A

    2006-03-01

    Perchlorate originates as a contaminant in the environment from its use in solid rocket fuels and munitions. The current US EPA methods for perchlorate determination via ion chromatography using conductivity detection do not include recommendations for the extraction of perchlorate from soil. This study evaluated and identified appropriate conditions for the extraction of perchlorate from clay loam, loamy sand, and sandy soils. Based on the results of this evaluation, soils should be extracted in a dry, ground (mortar and pestle) state with Milli-Q water in a 1:1 soil:water ratio and diluted no more than 5-fold before analysis. When sandy soils were extracted in this manner, the calculated method detection limit was 3.5 µg kg⁻¹. The findings of this study have aided in the establishment of a standardized extraction method for perchlorate in soil.

  6. Association Rule Extraction from XML Stream Data for Wireless Sensor Networks

    Science.gov (United States)

    Paik, Juryon; Nam, Junghyun; Kim, Ung Mo; Won, Dongho

    2014-01-01

    With the advances of wireless sensor networks, they yield massive volumes of disparate, dynamic and geographically-distributed and heterogeneous data. The data mining community has attempted to extract knowledge from the huge amount of data that they generate. However, previous mining work in WSNs has focused on supporting simple relational data structures, like one table per network, while there is a need for more complex data structures. This deficiency motivates XML, which is the current de facto format for the data exchange and modeling of a wide variety of data sources over the web, to be used in WSNs in order to encourage the interchangeability of heterogeneous types of sensors and systems. However, mining XML data for WSNs has two challenging issues: one is the endless data flow; and the other is the complex tree structure. In this paper, we present several new definitions and techniques related to association rule mining over XML data streams in WSNs. To the best of our knowledge, this work provides the first approach to mining XML stream data that generates frequent tree items without any redundancy. PMID:25046017

  7. Method of purifying phosphoric acid after solvent extraction

    International Nuclear Information System (INIS)

    Kouloheris, A.P.; Lefever, J.A.

    1979-01-01

    A method of purifying phosphoric acid after solvent extraction is described. The phosphoric acid is contacted with a sorbent which sorbs or takes up the residual amount of organic carrier and the phosphoric acid separated from the organic carrier-laden sorbent. The method is especially suitable for removing residual organic carrier from phosphoric acid after solvent extraction uranium recovery. (author)

  8. Extraction methods of Amaranthus sp. grain oil isolation.

    Science.gov (United States)

    Krulj, Jelena; Brlek, Tea; Pezo, Lato; Brkljača, Jovana; Popović, Sanja; Zeković, Zoran; Bodroža Solarov, Marija

    2016-08-01

    Amaranthus sp. is a fast-growing crop with well-known beneficial nutritional values (rich in protein, fat, dietary fiber, ash, and minerals, especially calcium and sodium, and containing a higher amount of lysine than conventional cereals). Amaranthus sp. is an underexploited plant source of squalene, a compound of high importance in the food, cosmetic and pharmaceutical industries. This paper has examined the effects of different extraction methods (Soxhlet, supercritical fluid and accelerated solvent extraction) on the oil and squalene yield of three genotypes of Amaranthus sp. grain. The highest yields of extracted oil (78.1 g kg⁻¹) and squalene (4.7 g kg⁻¹) in grain were obtained by accelerated solvent extraction (ASE) in genotype 16. A post hoc Tukey's HSD test at the 95% confidence limit showed significant differences between the observed samples. Principal component analysis (PCA) and cluster analysis (CA) were used for assessing the effect of different genotypes and extraction methods on oil and squalene yield, as well as on the fatty acid composition profile. Using coupled PCA and CA of the observed samples, possible directions for improving the quality of the product can be identified. The results of this study indicate that it is very important to choose both the right genotype and the right method of extraction for optimal oil and squalene yield. © 2015 Society of Chemical Industry.

  9. A Karnaugh-Map based fingerprint minutiae extraction method

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Full Text Available Fingerprint recognition is one of the most promising methods among all the biometric techniques and has been used for personal authentication for a long time because of its wide acceptance and reliability. Features (minutiae) are extracted from the fingerprint in question and are compared with the features already stored in the database for authentication. The crossing number (CN) is the most commonly used minutiae extraction method for fingerprints. In this paper, a new Karnaugh-Map based fingerprint minutiae extraction method is proposed and discussed. In the proposed algorithm the 8 neighbors of a pixel in a 3×3 window are arranged as 8 bits of a byte and the corresponding hexadecimal (hex) value is calculated. These hex values are simplified using the standard Karnaugh-Map (K-map) technique to obtain the minimized logical expression. Experiments conducted on the FVC2002/Db1_a database reveal that the developed method is better than the crossing number (CN) method.
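
    A small sketch of the two quantities the abstract relies on: the byte formed from the 8 neighbours of a pixel in a 3×3 binary window (whose hex value would then be minimized with a K-map) and the classical crossing number CN; the clockwise neighbour ordering is an assumption for illustration.

```python
def neighbours(window):
    """The 8 neighbours of the centre pixel of a 3x3 binary window, clockwise from top-left."""
    return [window[0][0], window[0][1], window[0][2], window[1][2],
            window[2][2], window[2][1], window[2][0], window[1][0]]

def neighbour_byte(window):
    """Pack the 8 neighbours into one byte; hex(value) gives the hex code to simplify."""
    value = 0
    for bit in neighbours(window):
        value = (value << 1) | bit
    return value

def crossing_number(window):
    """Classical CN: half the number of 0/1 transitions around the centre pixel."""
    p = neighbours(window)
    return sum(abs(p[k] - p[(k + 1) % 8]) for k in range(8)) // 2

w = [[0, 1, 0],
     [0, 1, 1],
     [0, 0, 0]]
print(hex(neighbour_byte(w)), crossing_number(w))  # -> 0x50 2
```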

  10. Improved model reduction and tuning of fractional-order PI(λ)D(μ) controllers for analytical rule extraction with genetic programming.

    Science.gov (United States)

    Das, Saptarshi; Pan, Indranil; Das, Shantanu; Gupta, Amitava

    2012-03-01

    Genetic algorithm (GA) has been used in this study for a new approach of suboptimal model reduction in the Nyquist plane and optimal time domain tuning of proportional-integral-derivative (PID) and fractional-order (FO) PI(λ)D(μ) controllers. Simulation studies show that the new Nyquist-based model reduction technique outperforms the conventional H(2)-norm-based reduced parameter modeling technique. With the tuned controller parameters and reduced-order model parameter dataset, optimum tuning rules have been developed with a test-bench of higher-order processes via genetic programming (GP). The GP performs a symbolic regression on the reduced process parameters to evolve a tuning rule which provides the best analytical expression to map the data. The tuning rules are developed for a minimum time domain integral performance index described by a weighted sum of error index and controller effort. From the reported Pareto optimal front of the GP-based optimal rule extraction technique, a trade-off can be made between the complexity of the tuning formulae and the control performance. The efficacy of the single-gene and multi-gene GP-based tuning rules has been compared with the original GA-based control performance for the PID and PI(λ)D(μ) controllers, handling four different classes of representative higher-order processes. These rules are very useful for process control engineers, as they inherit the power of the GA-based tuning methodology, but can be easily calculated without the requirement for running the computationally intensive GA every time. Three-dimensional plots of the required variation in PID/fractional-order PID (FOPID) controller parameters with reduced process parameters have been shown as a guideline for the operator. Parametric robustness of the reported GP-based tuning rules has also been shown with credible simulation examples. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
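
    For context, one common form of the weighted time-domain performance index mentioned above (error index plus controller effort); the exact weighting and error index used in the paper are not restated here, so this is only an assumed ITAE-plus-effort variant.

```latex
% Assumed ITAE-plus-control-effort form of the weighted objective:
J = \int_{0}^{\infty} \left[ w_{1}\, t\, \lvert e(t) \rvert
      + w_{2}\, u^{2}(t) \right] \mathrm{d}t ,
% where e(t) is the loop error, u(t) the controller output,
% and w_1, w_2 the trade-off weights.
```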

  11. Inorganic arsenic in seafood: does the extraction method matter?

    Science.gov (United States)

    Pétursdóttir, Ásta H; Gunnlaugsdóttir, Helga; Krupp, Eva M; Feldmann, Jörg

    2014-05-01

    Nine different extraction methods were evaluated for three seafood samples to test whether the concentration of inorganic arsenic (iAs) determined in seafood depends on the extraction method. Certified reference materials (CRM) DOLT-4 (Dogfish Liver) and TORT-2 (Lobster Hepatopancreas), and a commercial herring fish meal were evaluated. All experimental work described here was carried out by the same operator using the same instrumentation, thus eliminating possible differences in results caused by laboratory-related factors. Low concentrations of iAs were found in CRM DOLT-4 (0.012 ± 0.003 mg kg⁻¹) and the herring fish meal sample (0.007 ± 0.002 mg kg⁻¹) for all extraction methods. When comparing the concentration of iAs in CRM TORT-2 found in this study and in the literature, the dilute acids HNO₃ and HCl showed the highest extracted iAs, whereas dilute NaOH (in 50% ethanol) showed significantly lower extracted iAs. However, most other extraction solvents were not statistically different from one another. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Methods for microbial DNA extraction from soil for PCR amplification

    Directory of Open Access Journals (Sweden)

    Yeates C

    1998-01-01

    Full Text Available Amplification of DNA from soil is often inhibited by co-purified contaminants. A rapid, inexpensive, large-scale DNA extraction method involving minimal purification has been developed that is applicable to various soil types (1). The DNA is also suitable for PCR amplification using various DNA targets. DNA was extracted from 100 g of soil using direct lysis with glass beads and SDS followed by potassium acetate precipitation, polyethylene glycol precipitation, phenol extraction and isopropanol precipitation. This method was compared to other DNA extraction methods with regard to DNA purity and size.

  13. A fast, simple and green method for the extraction of carbamate pesticides from rice by microwave assisted steam extraction coupled with solid phase extraction.

    Science.gov (United States)

    Song, Weitao; Zhang, Yiqun; Li, Guijie; Chen, Haiyan; Wang, Hui; Zhao, Qi; He, Dong; Zhao, Chun; Ding, Lan

    2014-01-15

    This paper presents a fast, simple and green sample pretreatment method for the extraction of 8 carbamate pesticides from rice. The carbamate pesticides were extracted by a microwave assisted water steam extraction method, and the extract obtained was immediately applied to a C18 solid phase extraction cartridge for clean-up and concentration. The eluate containing the target compounds was finally analysed by high performance liquid chromatography with mass spectrometry. The parameters affecting extraction efficiency were investigated and optimised. Limits of detection ranging from 1.1 to 4.2 ng g⁻¹ were obtained. The recoveries of the 8 carbamate pesticides ranged from 66% to 117% at three spiked levels, and the inter- and intra-day relative standard deviation values were less than 9.1%. Compared with traditional methods, the proposed method requires less extraction time and organic solvent. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. A semantic-based method for extracting concept definitions from scientific publications: evaluation in the autism phenotype domain.

    Science.gov (United States)

    Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K

    2013-08-12

    A variety of informatics approaches have been developed that use information retrieval, NLP and text-mining techniques to identify biomedical concepts and relations within scientific publications or their sentences. These approaches have not typically addressed the challenge of extracting more complex knowledge such as biomedical definitions. In our efforts to facilitate knowledge acquisition of rule-based definitions of autism phenotypes, we have developed a novel semantic-based text-mining approach that can automatically identify such definitions within text. Using an existing knowledge base of 156 autism phenotype definitions and an annotated corpus of 26 source articles containing such definitions, we evaluated and compared the average rank of correctly identified rule definition or corresponding rule template using both our semantic-based approach and a standard term-based approach. We examined three separate scenarios: (1) the snippet of text contained a definition already in the knowledge base; (2) the snippet contained an alternative definition for a concept in the knowledge base; and (3) the snippet contained a definition not in the knowledge base. Our semantic-based approach had a higher average rank than the term-based approach for each of the three scenarios (scenario 1: 3.8 vs. 5.0; scenario 2: 2.8 vs. 4.9; and scenario 3: 4.5 vs. 6.2), with each comparison significant at the p-value of 0.05 using the Wilcoxon signed-rank test. Our work shows that leveraging existing domain knowledge in the information extraction of biomedical definitions significantly improves the correct identification of such knowledge within sentences. Our method can thus help researchers rapidly acquire knowledge about biomedical definitions that are specified and evolving within an ever-growing corpus of scientific publications.
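
    A minimal sketch of the paired significance test used in the evaluation above, comparing the ranks assigned by the semantic-based and term-based approaches on the same snippets; the rank values are made up for illustration.

```python
from scipy.stats import wilcoxon

# Hypothetical per-snippet ranks of the correct definition for the two approaches.
semantic_ranks = [3, 4, 2, 5, 3, 4, 2, 6, 3, 4]
term_ranks     = [5, 6, 4, 7, 4, 5, 3, 8, 5, 6]

stat, p_value = wilcoxon(semantic_ranks, term_ranks)
print(f"W = {stat}, p = {p_value:.3f}")  # p < 0.05 would indicate a significant difference
```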

  15. Raft cultivation area extraction from high resolution remote sensing imagery by fusing multi-scale region-line primitive association features

    Science.gov (United States)

    Wang, Min; Cui, Qi; Wang, Jie; Ming, Dongping; Lv, Guonian

    2017-01-01

    In this paper, we first propose several novel concepts for object-based image analysis, which include line-based shape regularity, line density, and scale-based best feature value (SBV), based on the region-line primitive association framework (RLPAF). We then propose a raft cultivation area (RCA) extraction method for high spatial resolution (HSR) remote sensing imagery based on multi-scale feature fusion and spatial rule induction. The proposed method includes the following steps: (1) Multi-scale region primitives (segments) are obtained by the image segmentation method HBC-SEG, and line primitives (straight lines) are obtained by a phase-based line detection method. (2) Association relationships between regions and lines are built based on RLPAF, and then multi-scale RLPAF features are extracted and SBVs are selected. (3) Several spatial rules are designed to extract RCAs within sea waters after land and water separation. Experiments show that the proposed method can successfully extract RCAs of different shapes from HSR images with good performance.

  16. Calendar methods of fertility regulation: a rule of thumb.

    Science.gov (United States)

    Colombo, B; Scarpa, B

    1996-01-01

    "[Many] illiterate women, particularly in the third world, find [it] difficult to apply usual calendar methods for the regulation of fertility. Some of them are even unable to make simple subtractions. In this paper we are therefore trying to evaluate the applicability and the efficiency of an extremely simple rule which entails only [the ability to count] a number of days, and always the same way." (SUMMARY IN ITA) excerpt

  17. Fast Reduction Method in Dominance-Based Information Systems

    Science.gov (United States)

    Li, Yan; Zhou, Qinghua; Wen, Yongchuan

    2018-01-01

    In real-world applications, there are often data with continuous values or preference-ordered values. Rough sets based on dominance relations can effectively deal with these kinds of data. Attribute reduction can be done in the framework of the dominance-relation based approach to better extract decision rules. However, the computational cost of the dominance classes greatly affects the efficiency of attribute reduction and rule extraction. This paper presents an efficient method of computing dominance classes, and further compares it with the traditional method as the numbers of attributes and samples increase. Experiments on UCI data sets show that the proposed algorithm clearly improves the efficiency of the traditional method, especially for large-scale data.
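
    A hedged sketch of what computing a dominance class involves in dominance-based rough set approaches: for an object x, collect the objects whose evaluations are at least as good on every criterion. Gain-type (larger-is-better) criteria are assumed, and this is the straightforward computation, not the paper's accelerated method.

```python
def dominating_set(table, x, criteria):
    """Objects y that dominate x: at least as good as x on every criterion."""
    return {y for y in range(len(table))
            if all(table[y][c] >= table[x][c] for c in criteria)}

def dominated_set(table, x, criteria):
    """Objects y dominated by x: x is at least as good as y on every criterion."""
    return {y for y in range(len(table))
            if all(table[x][c] >= table[y][c] for c in criteria)}

# Toy preference-ordered data: rows are objects, columns are criteria (larger is better)
table = [[3, 2], [2, 2], [3, 3], [1, 1]]
print(dominating_set(table, 1, criteria=[0, 1]))  # -> {0, 1, 2}
print(dominated_set(table, 1, criteria=[0, 1]))   # -> {1, 3}
```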

  18. Exploiting graph kernels for high performance biomedical relation extraction.

    Science.gov (United States)

    Panyam, Nagesh C; Verspoor, Karin; Cohn, Trevor; Ramamohanarao, Kotagiri

    2018-01-30

    Relation extraction from biomedical publications is an important task in the area of semantic mining of text. Kernel methods for supervised relation extraction are often preferred over manual feature engineering methods, when classifying highly ordered structures such as trees and graphs obtained from syntactic parsing of a sentence. Tree kernels such as the Subset Tree Kernel and Partial Tree Kernel have been shown to be effective for classifying constituency parse trees and basic dependency parse graphs of a sentence. Graph kernels such as the All Path Graph kernel (APG) and Approximate Subgraph Matching (ASM) kernel have been shown to be suitable for classifying general graphs with cycles, such as the enhanced dependency parse graph of a sentence. In this work, we present a high performance Chemical-Induced Disease (CID) relation extraction system. We present a comparative study of kernel methods for the CID task and also extend our study to the Protein-Protein Interaction (PPI) extraction task, an important biomedical relation extraction task. We discuss novel modifications to the ASM kernel to boost its performance and a method to apply graph kernels for extracting relations expressed in multiple sentences. Our system for CID relation extraction attains an F-score of 60%, without using external knowledge sources or task specific heuristic or rules. In comparison, the state of the art Chemical-Disease Relation Extraction system achieves an F-score of 56% using an ensemble of multiple machine learning methods, which is then boosted to 61% with a rule based system employing task specific post processing rules. For the CID task, graph kernels outperform tree kernels substantially, and the best performance is obtained with APG kernel that attains an F-score of 60%, followed by the ASM kernel at 57%. The performance difference between the ASM and APG kernels for CID sentence level relation extraction is not significant. In our evaluation of ASM for the PPI task, ASM

  19. Effect of extraction method and orientin content on radio-protective effect of tulsi extracts

    Energy Technology Data Exchange (ETDEWEB)

    Tiwari, Mrinalini; Dwarakanath, B. S.; Agrawala, Paban K., E-mail: pkagrawal@gmail.com [Institute of Nuclear Medicine and Allied Sciences, Delhi (India); Murugan, R.; Parimelazhagan, T. [Department of Botany, Bharathiar University, Coimbatore (India); Uma Devi, P. [ARA-B-3SA, Plavilakonam,Trivandrum (India); Gota, V.; Sarin, R. K. [Advanced Centre for Treatment Research and Education in Cancer, Navi Mumbai (India)

    2014-07-01

    Extract of tulsi leaves (Ocimum sanctum) has been reported for its radioprotective efficacy. In our initial studies we observed significant variation in the survival of irradiated mice with different batches of tulsi extract, and we therefore employed different extraction methods on leaves collected during various seasons from different localities to study any variation in the radioprotective efficacy. Orientin, a component of tulsi extract, was considered a marker. A mouse whole body survival study (at 10 Gy lethal whole body irradiation) and a day 11 endo-CFU-s assay (at 5 Gy WBI) were performed employing 3 treatment schedules: 50 mg/kg or 25 mg/kg b.w. (single injection, 30 min before irradiation), and 10 mg/kg b.w. (one injection per day for 5 days, the last injection being 30 min before irradiation). A single dose of 25 mg/kg b.w. (both aqueous and alcoholic) did not provide any significant survival benefit. The orientin concentrations in the extracts tested varied from 3.3 to 9.91 mg/g extract as studied by an HPLC method. With a single administration (i.p.) of 50 mg/kg, the aqueous extract from leaves of the monsoon season had an orientin content of 9.91 mg/g extract and gave a survival of 60% with a CFU-s count of 37, while the extract of summer leaves had an orientin content of 4.15 mg/g extract and gave a survival of 50% with a CFU-s count of 11.6. At the same dose (50 mg/kg), the aqueous extract from the winter season had an orientin content of 3.30 mg/g extract and gave 25% survival with a CFU-s count of 19, while the ethanolic extract had an orientin content of 7.70 mg/g extract and gave a survival of 50% with a CFU-s count of 13. These observations suggest that climatic factors, orientin content and the doses of administration are important factors regulating the radioprotection afforded by different extracts of tulsi. (author)

  20. Optimization-based Method for Automated Road Network Extraction

    International Nuclear Information System (INIS)

    Xiong, D

    2001-01-01

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on the subject of road extraction and describes a study of an optimization-based method for automated road network extraction.

  1. Effect of mucin extraction method on some properties of ...

    African Journals Online (AJOL)

    Effect of mucin extraction method on some properties of metronidazole mucoadhesive loaded patches. MI Arhewoh, SO Eraga, PF Builders, MA Ibobiri. Abstract. To evaluate the effects of mucin extraction method and plasticizer concentration on the bioadhesive strength and metronidazole release profile from mucin-based ...

  2. BMAA extraction of cyanobacteria samples: which method to choose?

    Science.gov (United States)

    Lage, Sandra; Burian, Alfred; Rasmussen, Ulla; Costa, Pedro Reis; Annadotter, Heléne; Godhe, Anna; Rydberg, Sara

    2016-01-01

    β-N-Methylamino-L-alanine (BMAA), a neurotoxin reportedly produced by cyanobacteria, diatoms and dinoflagellates, is proposed to be linked to the development of neurological diseases. BMAA has been found in aquatic and terrestrial ecosystems worldwide, both in its phytoplankton producers and in several invertebrate and vertebrate organisms that bioaccumulate it. LC-MS/MS is the most frequently used analytical technique in BMAA research due to its high selectivity, though consensus is lacking as to the best extraction method to apply. This study accordingly surveys the efficiency of three extraction methods regularly used in BMAA research to extract BMAA from cyanobacteria samples. The results obtained provide insights into possible reasons for the BMAA concentration discrepancies in previous publications. In addition and according to the method validation guidelines for analysing cyanotoxins, the TCA protein precipitation method, followed by AQC derivatization and LC-MS/MS analysis, is now validated for extracting protein-bound (after protein hydrolysis) and free BMAA from cyanobacteria matrix. BMAA biological variability was also tested through the extraction of diatom and cyanobacteria species, revealing a high variance in BMAA levels (0.0080-2.5797 μg g(-1) DW).

  3. Social network extraction based on Web: 1. Related superficial methods

    Science.gov (United States)

    Khairuddin Matyuso Nasution, Mahyuddin

    2018-01-01

    The nature of an object often shapes the methods used to resolve issues related to it; the same holds for methods that extract social networks from the Web, which involve structured data types in different ways. This paper reviews several methods of social network extraction from the same source, the Web: the basic superficial method, the underlying superficial method, the description superficial method, and related superficial methods. We derive complexity inequalities between the methods and between their computations. We find that the same tools yield different results at different costs, from the more complex to the simpler: extracting a social network from co-occurrences is more complex than using occurrences alone.

  4. A Two-Stage Optimization Strategy for Fuzzy Object-Based Analysis Using Airborne LiDAR and High-Resolution Orthophotos for Urban Road Extraction

    Directory of Open Access Journals (Sweden)

    Maher Ibrahim Sameen

    2017-01-01

    Full Text Available In the last decade, object-based image analysis (OBIA) has been extensively recognized as an effective classification method for very high spatial resolution images or integrated data from different sources. In this study, a two-stage optimization strategy for fuzzy object-based analysis using airborne LiDAR was proposed for urban road extraction. The method optimizes the two basic steps of OBIA, namely, segmentation and classification, to realize accurate land cover mapping and urban road extraction. This objective was achieved by selecting the optimum scale parameter to maximize class separability and the optimum shape and compactness parameters to optimize the final image segments. Class separability was maximized using the Bhattacharyya distance algorithm, whereas image segmentation was optimized using the Taguchi method. The proposed fuzzy rules were created based on integrated data and expert knowledge. Spectral, spatial, and texture features were used under fuzzy rules by implementing the particle swarm optimization technique. The proposed fuzzy rules were easy to implement and were transferable to other areas. An overall accuracy of 82% and a kappa index of agreement (KIA) of 0.79 were achieved on the studied area when results were compared with reference objects created via manual digitization in a geographic information system. The accuracy of road extraction using the developed fuzzy rules was 0.76 (producer), 0.85 (user), and 0.72 (KIA). Meanwhile, overall accuracy was decreased by approximately 6% when the rules were applied on a test site. A KIA of 0.70 was achieved on the test site using the same rules without any changes. The accuracy of the extracted urban roads from the test site was 0.72 (KIA), which decreased to approximately 0.16. Spatial information (i.e., elongation) and intensity from LiDAR were the most interesting properties for urban road extraction. The proposed method can be applied to a wide range of real applications.
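
    As an illustration of the separability criterion mentioned above, the following sketch computes the Bhattacharyya distance between two classes of segment features under a Gaussian assumption. The feature arrays, class names and values are hypothetical; the paper's actual feature set and workflow are not reproduced.

```python
# Bhattacharyya distance between two classes of segment features under a
# Gaussian assumption -- a minimal separability score of the kind used to
# compare candidate segmentation scales. Feature data here are synthetic.
import numpy as np

def bhattacharyya_distance(x_a, x_b):
    mu_a, mu_b = x_a.mean(axis=0), x_b.mean(axis=0)
    cov_a = np.cov(x_a, rowvar=False)
    cov_b = np.cov(x_b, rowvar=False)
    cov = (cov_a + cov_b) / 2.0
    diff = mu_a - mu_b
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    term2 = 0.5 * np.log(np.linalg.det(cov) /
                         np.sqrt(np.linalg.det(cov_a) * np.linalg.det(cov_b)))
    return term1 + term2

rng = np.random.default_rng(0)
roads = rng.normal([0.2, 10.0], 0.5, size=(200, 2))      # e.g. intensity, elongation
buildings = rng.normal([0.8, 3.0], 0.5, size=(200, 2))
print(bhattacharyya_distance(roads, buildings))          # larger = more separable
```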

  5. Method of extraction under pressure of fossil material

    Energy Technology Data Exchange (ETDEWEB)

    Fredenmark, G L

    1942-02-24

    A method is described for the extraction under pressure of fossil material such as coal, brown coal (lignite), peat, and oil shale. It is characterized by carrying out the extraction with fractions of shale oils and/or peat tar having a boiling point above 170°C, under such a pressure that the extraction medium remains in a liquid state.

  6. Proof of Kochen–Specker Theorem: Conversion of Product Rule to Sum Rule

    International Nuclear Information System (INIS)

    Toh, S.P.; Zainuddin, Hishamuddin

    2009-01-01

    Valuation functions of observables in quantum mechanics are often expected to obey two constraints called the sum rule and the product rule. However, the Kochen–Specker (KS) theorem shows that, for a Hilbert space of quantum mechanics of dimension d ≥ 3, each of these constraints individually contradicts the assumption of value definiteness. The two rules are not unrelated, and Peres [Found. Phys. 26 (1996) 807] has conceived a method of converting the product rule into a sum rule for the case of two qubits. Here we apply this method to a proof provided by Mermin based on the product rule for a three-qubit system involving nine operators. We provide the conversion of this proof to one based on the sum rule involving ten operators. (general)
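
    For reference, the two constraints discussed above are usually stated as follows for a valuation v over commuting observables (standard textbook forms, not quoted from the paper):

```latex
% Standard forms of the sum rule and product rule for a valuation v
\begin{align*}
  \text{Sum rule:}     &\quad v(A + B) = v(A) + v(B), \qquad [A, B] = 0,\\
  \text{Product rule:} &\quad v(AB) = v(A)\, v(B),    \qquad\;\; [A, B] = 0.
\end{align*}
```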

  7. Influence of different extraction methods on the yield and linalool content of the extracts of Eugenia uniflora L.

    Science.gov (United States)

    Galhiane, Mário S; Rissato, Sandra R; Chierice, Gilberto O; Almeida, Marcos V; Silva, Letícia C

    2006-09-15

    This work was developed using a sylvestral fruit tree native to the Brazilian forest, Eugenia uniflora L., a member of the Myrtaceae family. The main goal of the analytical study was the extraction methods themselves. Method development pointed to Clevenger extraction as giving the best yield relative to SFE and Soxhlet. The SFE method presented a good yield but produced a large number of components in the final extract, demonstrating low selectivity. The extracted essential oil was analyzed by GC/FID, showing compounds over a wide range of polarity and boiling point, among which linalool, a widely used compound, was identified. Furthermore, an analytical solid-phase extraction method was used to clean up the extract and obtain separate classes of compounds that were fractionated and studied by GC/FID and GC/MS.

  8. Recommendation System Based On Association Rules For Distributed E-Learning Management Systems

    Science.gov (United States)

    Mihai, Gabroveanu

    2015-09-01

    Traditional Learning Management Systems are installed on a single server where learning materials and user data are kept. To increase performance, the Learning Management System can be installed on multiple servers; learning materials and user data can then be distributed across these servers, yielding a Distributed Learning Management System. This paper proposes a prototype of a recommendation system based on association rules for a Distributed Learning Management System. Information from LMS databases is analyzed using distributed data mining algorithms in order to extract association rules. The extracted rules are then used as inference rules to provide personalized recommendations. The quality of the recommendations is improved because the rules used to make the inferences are more accurate, since they aggregate knowledge from all e-Learning systems included in the Distributed Learning Management System.
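
    A minimal sketch of the idea, assuming toy session data: brute-force support/confidence mining of association rules followed by a recommendation step. The item names, thresholds and helper functions are illustrative and not the distributed algorithms used in the prototype.

```python
# Turn mined association rules into learning-material recommendations.
# Rule mining is reduced here to brute-force support/confidence counting.
from itertools import combinations

sessions = [                       # items accessed per learner session (toy data)
    {"intro", "quiz1", "video2"},
    {"intro", "video2"},
    {"intro", "quiz1"},
    {"quiz1", "video2"},
]

def mine_rules(transactions, min_support=0.5, min_confidence=0.6):
    n = len(transactions)
    items = sorted(set().union(*transactions))
    support = {}
    for size in (1, 2):
        for itemset in combinations(items, size):
            s = sum(set(itemset) <= t for t in transactions) / n
            if s >= min_support:
                support[frozenset(itemset)] = s
    rules = []
    for itemset, s in support.items():
        if len(itemset) < 2:
            continue
        for rhs in itemset:
            lhs = frozenset(itemset - {rhs})
            if lhs in support and s / support[lhs] >= min_confidence:
                rules.append((lhs, rhs, s / support[lhs]))
    return rules

def recommend(current_items, rules):
    return {rhs for lhs, rhs, conf in rules
            if lhs <= current_items and rhs not in current_items}

rules = mine_rules(sessions)
print(recommend({"intro"}, rules))   # e.g. {'quiz1', 'video2'}
```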

  9. A rapid and low-cost DNA extraction method for isolating ...

    African Journals Online (AJOL)

    The price of commercial DNA extraction methods makes the routine use of polymerase chain reaction (PCR) amplification-based methods rather costly for scientists in developing countries. A guanidinium thiocyanate-based DNA extraction method was investigated in this study for the isolation of Escherichia coli (E. coli) DNA ...

  10. Investigating Connectivity and Consistency Criteria for Phrase Pair Extraction in Statistical Machine Translation

    NARCIS (Netherlands)

    Martzoukos, S.; Costa Florêncio, C.; Monz, C.; Kornai, A.; Kuhlmann, M.

    2013-01-01

    The consistency method has been established as the standard strategy for extracting high quality translation rules in statistical machine translation (SMT). However, no attention has been drawn to why this method is successful, other than empirical evidence. Using concepts from graph theory, we

  11. Study of QCD medium by sum rules

    Energy Technology Data Exchange (ETDEWEB)

    Mallik, S [Saha Institute of Nuclear Physics, Calcutta (India)]

    1998-08-01

    Though it has no analogue in condensed matter physics, the thermal QCD sum rules can, nevertheless, answer questions of condensed matter type about the QCD medium. The ingredients needed to write such sum rules, viz. the operator product expansion and the spectral representation at finite temperature, are reviewed in detail. The sum rules are then actually written for the case of correlation function of two vector currents. Collecting information on the thermal average of the higher dimension operators from other sources, we evaluate these sum rules for the temperature dependent ρ-meson parameters. Possibility of extracting more information from the combined set of all sum rules from different correlation functions is also discussed. (author) 30 refs., 2 figs.

  12. Advanced analytical method of nereistoxin using mixed-mode cationic exchange solid-phase extraction and GC/MS.

    Science.gov (United States)

    Park, Yujin; Choe, Sanggil; Lee, Heesang; Jo, Jiyeong; Park, Yonghoon; Kim, Eunmi; Pyo, Jaesung; Jung, Jee H

    2015-07-01

    Nereistoxin (NTX) originates from the marine annelid worm Lumbriconereis heteropoda, and its analogue pesticides, including cartap, bensultap, thiocyclam and thiobensultap, have been commonly used in agriculture because of their low toxicity and high insecticidal activity. However, NTX has been reported to exert inhibitory neurotoxicity in humans and animals by blocking the nicotinic acetylcholine receptor, causing significant neuromuscular toxicity that can result in respiratory failure. We developed a new method to determine NTX in biological fluid. The method involves mixed-mode cationic exchange based solid phase extraction and gas chromatography/mass spectrometry for final identification and quantitative analysis. The limit of detection and recovery were substantially better than those of other methods using liquid-liquid extraction or headspace solid phase microextraction. Good recoveries (97±14%) were obtained in blood samples, and calibration curves over the range 0.05-20 mg/L have R2 values greater than 0.99. The developed method was applied to a fatal case of cartap intoxication of a 74-year-old woman who ingested cartap hydrochloride for suicide. Cartap and NTX were detected in postmortem specimens, and the cause of death was ruled to be nereistoxin intoxication. The concentrations of NTX were 2.58 mg/L, 3.36 mg/L and 1479.7 mg/L in heart blood, femoral blood and stomach liquid content, respectively. The heart blood/femoral blood ratio of NTX was 0.76. Copyright © 2015. Published by Elsevier Ireland Ltd.

  13. Accelerated H-LBP-based edge extraction method for digital radiography

    Energy Technology Data Exchange (ETDEWEB)

    Qiao, Shuang; Zhao, Chen-yi; Huang, Ji-peng [School of Physics, Northeast Normal University, Changchun 130024 (China); Sun, Jia-ning, E-mail: sunjn118@nenu.edu.cn [School of Mathematics and Statistics, Northeast Normal University, Changchun 130024 (China)

    2015-01-11

    With the goal of achieving real time and efficient edge extraction for digital radiography, an accelerated H-LBP-based edge extraction method (AH-LBP) is presented in this paper by improving the existing framework of local binary pattern with the H function (H-LBP). Since the proposed method avoids computationally expensive operations with no loss of quality, it possesses much lower computational complexity than H-LBP. Experimental results on real radiographies show desirable performance of our method. - Highlights: • An accelerated H-LBP method for edge extraction on digital radiography is proposed. • The novel AH-LBP relies on numerical analysis of the existing H-LBP method. • Aiming at accelerating, H-LBP is reformulated as a direct binary processing. • AH-LBP provides the same edge extraction result as H-LBP does. • AH-LBP has low computational complexity satisfying real time requirements.
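
    To illustrate the kind of per-pixel binary coding that H-LBP builds on, the sketch below computes plain 8-neighbour local binary pattern codes for an image with NumPy. The H function and the acceleration proposed in the paper are not reproduced, and the input image is random.

```python
# Plain 8-neighbour local binary pattern (LBP) codes for an image.
# Shown only as background for H-LBP/AH-LBP; not the paper's method.
import numpy as np

def lbp_codes(img):
    img = img.astype(np.int32)
    c = img[1:-1, 1:-1]                               # central pixels
    # 8 neighbours, clockwise from the top-left, each contributing one bit
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy: img.shape[0] - 1 + dy, 1 + dx: img.shape[1] - 1 + dx]
        codes |= ((neighbour >= c).astype(np.int32) << bit)
    return codes

radiograph = np.random.default_rng(0).integers(0, 256, size=(64, 64))
print(lbp_codes(radiograph)[:3, :3])
```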

  14. Comparison of the methods for tissue triiodothyronine T(3) extraction and subsequent radioimmunoassay

    International Nuclear Information System (INIS)

    Takaishi, M.; Miyachi, Y.; Aoki, M.; Shishiba, Y.; Asahi Life Foundation, Tokyo

    1978-01-01

    Although there have been various reports on tissue T3 concentration, the examination of the quality of radioimmunoassay has not been available. In the present study, we tried to determine whether the available methods for T3 extraction are adequate for the various methods of T3 radioimmunoassays used. T3 was extracted from liver by ethanol extraction or by acid butanol extraction (Flock's method) and the extract was applied to radioimmunoassay either by Seralute T3 column, ANS-double antibody or the ANS-charcoal method. The values of T3 were compared with those obtained by isotope-equilibration method. The dilution curve of ethanol extract was not parallel with that of the standard in ANS-charcoal or ANS-double antibody technique. When the extract was tested by Seralate method, the dilution curve was parallel to the standard, whereas the T3 value obtained with this method was two-fold higher than that with the isotope equilibration technique. The analysis of the ethanol extract suggested that the lipid extracted by ethanol interfered with the assay. The acid butanol extract when tested either by the ANS-double antibody or Seralate method, showed parallelism to the standard curve and gave T3 values almost identical with those by the isotope-equilibration method. When tested by ANS-charcoal method, the dilution curve of the acid butanol extract was not parallel to the standard. Thus, to obtain reliable results, tissue extraction by Flock's method and subsequent T3 radioimmunoassay by either ANS-double antibody or Seralate T3 method are recommended. (author)

  15. Assessment of proposed electromagnetic quantum vacuum energy extraction methods

    OpenAIRE

    Moddel, Garret

    2009-01-01

    In research articles and patents several methods have been proposed for the extraction of zero-point energy from the vacuum. None has been reliably demonstrated, but the proposals remain largely unchallenged. In this paper the feasibility of these methods is assessed in terms of underlying thermodynamics principles of equilibrium, detailed balance, and conservation laws. The methods are separated into three classes: nonlinear processing of the zero-point field, mechanical extraction using Cas...

  16. Analyzing Divisia Rules Extracted from a Feedforward Neural Network

    National Research Council Canada - National Science Library

    Schmidt, Vincent A; Binner, Jane M

    2006-01-01

    This paper introduces a mechanism for generating a series of rules that characterize the money-price relationship, defined as the relationship between the rate of growth of the money supply and inflation...

  17. Influence of Extraction Methods on the Yield of Steviol Glycosides and Antioxidants in Stevia rebaudiana Extracts.

    Science.gov (United States)

    Periche, Angela; Castelló, Maria Luisa; Heredia, Ana; Escriche, Isabel

    2015-06-01

    This study evaluated the application of ultrasound techniques and microwave energy, compared to conventional extraction methods (high temperatures at atmospheric pressure), for the solid-liquid extraction of steviol glycosides (sweeteners) and antioxidants (total phenols, flavonoids and antioxidant capacity) from dehydrated Stevia leaves. Different temperatures (from 50 to 100 °C), times (from 1 to 40 min) and microwave powers (1.98 and 3.30 W/g extract) were used. There was a great difference in the resulting yields according to the treatments applied. Steviol glycosides and antioxidants were negatively correlated; therefore, there is no single treatment suitable for obtaining the highest yield in both groups of compounds simultaneously. The greatest yield of steviol glycosides was obtained with microwave energy (3.30 W/g extract, 2 min), whereas, the conventional method (90 °C, 1 min) was the most suitable for antioxidant extraction. Consequently, the best process depends on the subsequent use (sweetener or antioxidant) of the aqueous extract of Stevia leaves.

  18. How far away is far enough for extracting numerical waveforms, and how much do they depend on the extraction method?

    International Nuclear Information System (INIS)

    Pazos, Enrique; Dorband, Ernst Nils; Nagar, Alessandro; Palenzuela, Carlos; Schnetter, Erik; Tiglio, Manuel

    2007-01-01

    We present a method for extracting gravitational waves from numerical spacetimes which generalizes and refines one of the standard methods based on the Regge-Wheeler-Zerilli perturbation formalism. At the analytical level, this generalization allows a much more general class of slicing conditions for the background geometry, and is thus not restricted to Schwarzschild-like coordinates. At the numerical level, our approach uses high-order multi-block methods, which improve both the accuracy of our simulations and of our extraction procedure. In particular, the latter is simplified since there is no need for interpolation, and we can afford to extract accurate waves at large radii with only little additional computational effort. We then present fully nonlinear three-dimensional numerical evolutions of a distorted Schwarzschild black hole in Kerr-Schild coordinates with an odd parity perturbation and analyse the improvement that we gain from our generalized wave extraction, comparing our new method to the standard one. In particular, we analyse in detail the quasinormal frequencies of the extracted waves, using both methods. We do so by comparing the extracted waves with one-dimensional high resolution solutions of the corresponding generalized Regge-Wheeler equation. We explicitly see that the errors in the waveforms extracted with the standard method at fixed, finite extraction radii do not converge to zero with increasing resolution. We find that even with observers as far out as R = 80M-which is larger than what is commonly used in state-of-the-art simulations-the assumption in the standard method that the background is close to having Schwarzschild-like coordinates increases the error in the extracted waves considerably. Furthermore, those errors are dominated by the extraction method itself and not by the accuracy of our simulations. For extraction radii between 20M and 80M and for the resolutions that we use in this paper, our new method decreases the errors

  19. How far away is far enough for extracting numerical waveforms, and how much do they depend on the extraction method?

    Energy Technology Data Exchange (ETDEWEB)

    Pazos, Enrique [Department of Physics and Astronomy, 202 Nicholson Hall, Louisiana State University, Baton Rouge, LA 70803 (United States); Dorband, Ernst Nils [Department of Physics and Astronomy, 202 Nicholson Hall, Louisiana State University, Baton Rouge, LA 70803 (United States); Nagar, Alessandro [Dipartimento di Fisica, Politecnico di Torino, Corso Duca Degli Abruzzi 24, 10129 Torino (Italy); Palenzuela, Carlos [Department of Physics and Astronomy, 202 Nicholson Hall, Louisiana State University, Baton Rouge, LA 70803 (United States); Schnetter, Erik [Center for Computation and Technology, 216 Johnston Hall, Louisiana State University, Baton Rouge, LA 70803 (United States); Tiglio, Manuel [Department of Physics and Astronomy, 202 Nicholson Hall, Louisiana State University, Baton Rouge, LA 70803 (United States)

    2007-06-21

    We present a method for extracting gravitational waves from numerical spacetimes which generalizes and refines one of the standard methods based on the Regge-Wheeler-Zerilli perturbation formalism. At the analytical level, this generalization allows a much more general class of slicing conditions for the background geometry, and is thus not restricted to Schwarzschild-like coordinates. At the numerical level, our approach uses high-order multi-block methods, which improve both the accuracy of our simulations and of our extraction procedure. In particular, the latter is simplified since there is no need for interpolation, and we can afford to extract accurate waves at large radii with only little additional computational effort. We then present fully nonlinear three-dimensional numerical evolutions of a distorted Schwarzschild black hole in Kerr-Schild coordinates with an odd parity perturbation and analyse the improvement that we gain from our generalized wave extraction, comparing our new method to the standard one. In particular, we analyse in detail the quasinormal frequencies of the extracted waves, using both methods. We do so by comparing the extracted waves with one-dimensional high resolution solutions of the corresponding generalized Regge-Wheeler equation. We explicitly see that the errors in the waveforms extracted with the standard method at fixed, finite extraction radii do not converge to zero with increasing resolution. We find that even with observers as far out as R = 80M-which is larger than what is commonly used in state-of-the-art simulations-the assumption in the standard method that the background is close to having Schwarzschild-like coordinates increases the error in the extracted waves considerably. Furthermore, those errors are dominated by the extraction method itself and not by the accuracy of our simulations. For extraction radii between 20M and 80M and for the resolutions that we use in this paper, our new method decreases the errors

  20. Critical assessment of extracellular polymeric substances extraction methods from mixed culture biomass

    DEFF Research Database (Denmark)

    Pellicer i Nàcher, Carles; Domingo Felez, Carlos; Mutlu, Ayten Gizem

    2013-01-01

    This study presents a rigorous and critical assessment of existing physical and chemical EPS extraction methods applied to mixed-culture biomass samples (nitrifying, nitritation-anammox, and activated sludge biomass). A novel fluorescence-based method was developed and calibrated to quantify the lysis potential of different EPS extraction protocols. We concluded that commonly used methods to assess cell lysis (DNA concentrations or G6PDH activities in EPS extracts) do not correlate with cell viability. Furthermore, we discovered that the presence of certain chemicals in EPS extracts results in severe underestimation of protein and carbohydrate concentrations by standard analytical methods. Keeping both maximum EPS extraction yields and minimal biomass lysis as criteria, a sonication-based extraction method was identified as the best to determine and compare tightly-bound EPS fractions in different...

  1. A SEMI-AUTOMATIC RULE SET BUILDING METHOD FOR URBAN LAND COVER CLASSIFICATION BASED ON MACHINE LEARNING AND HUMAN KNOWLEDGE

    Directory of Open Access Journals (Sweden)

    H. Y. Gu

    2017-09-01

    Full Text Available A classification rule set, comprising features and decision rules, is important for land cover classification. The selection of features and decisions is usually based on an iterative trial-and-error approach in GEOBIA; however, this is time-consuming and has poor versatility. This study puts forward a rule set building method for land cover classification based on human knowledge and machine learning. Machine learning is used to build rule sets effectively, overcoming the iterative trial-and-error approach. Human knowledge is used to address the shortcoming of existing machine learning methods, namely insufficient use of prior knowledge, and to improve the versatility of the rule sets. A two-step workflow is introduced: first, an initial rule is built based on Random Forest and a CART decision tree; second, the initial rule is analyzed and validated based on human knowledge, using a statistical confidence interval to determine its threshold. The test site is located in Potsdam City. We utilised the TOP, DSM and ground truth data. The results show that the method can determine a rule set for land cover classification semi-automatically, and that there are static features for different land cover classes.
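
    The first step of such a workflow can be sketched as follows: fit a CART-style tree on object features and print it as an initial rule set for an analyst to refine with domain knowledge. The feature names and the synthetic data are placeholders, not the Potsdam data or the exact procedure used in the study.

```python
# Learn an initial rule set from a CART-style tree and print it in a readable
# form for expert refinement. Features and labels below are synthetic stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
X = rng.random((300, 3))                       # columns: nDSM_height, NDVI, intensity
y = np.where(X[:, 0] > 0.5, "building",
             np.where(X[:, 1] > 0.6, "tree", "road"))

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20).fit(X, y)
print(export_text(tree, feature_names=["nDSM_height", "NDVI", "intensity"]))
```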

  2. Improved Cole parameter extraction based on the least absolute deviation method

    International Nuclear Information System (INIS)

    Yang, Yuxiang; Ni, Wenwen; Sun, Qiang; Wen, He; Teng, Zhaosheng

    2013-01-01

    The Cole function is widely used in bioimpedance spectroscopy (BIS) applications. Fitting the measured BIS data onto the model and then extracting the Cole parameters (R0, R∞, α and τ) is a common practice. Accurate extraction of the Cole parameters from the measured BIS data has great significance for evaluating the physiological or pathological status of biological tissue. The traditional least-squares (LS)-based curve fitting method for Cole parameter extraction is often sensitive to noise or outliers and becomes non-robust. This paper proposes an improved Cole parameter extraction based on the least absolute deviation (LAD) method. Comprehensive simulation experiments are carried out and the performances of the LAD method are compared with those of the LS method under the conditions of outliers, random noises and both disturbances. The proposed LAD method exhibits much better robustness under all circumstances, which demonstrates that the LAD method deserves consideration as an improved alternative to the LS method for Cole parameter extraction, given its robustness to outliers and noise. (paper)
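
    A minimal sketch of LAD-based Cole fitting, assuming the single-dispersion Cole impedance model and a generic derivative-free optimizer; the paper's exact numerical scheme, data and parameter scaling may differ.

```python
# Fit the single-dispersion Cole model Z(w) = Rinf + (R0 - Rinf)/(1 + (j*w*tau)**alpha)
# with a least-absolute-deviation objective. Synthetic data; parameters may need
# rescaling for a well-conditioned fit in practice.
import numpy as np
from scipy.optimize import minimize

def cole(params, w):
    r0, rinf, alpha, tau = params
    return rinf + (r0 - rinf) / (1.0 + (1j * w * tau) ** alpha)

def lad_cost(params, w, z_meas):
    resid = cole(params, w) - z_meas
    return np.sum(np.abs(resid.real)) + np.sum(np.abs(resid.imag))

w = 2 * np.pi * np.logspace(3, 6, 40)                 # 1 kHz .. 1 MHz
true = (800.0, 300.0, 0.75, 1e-5)
rng = np.random.default_rng(2)
z = cole(true, w) + rng.normal(0, 3, w.size) + 1j * rng.normal(0, 3, w.size)
z[::7] += 200                                         # gross outliers; LAD should stay robust

fit = minimize(lad_cost, x0=(1000, 100, 0.8, 1e-5), args=(w, z), method="Nelder-Mead")
print(fit.x)                                          # estimates of R0, Rinf, alpha, tau
```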

  3. Optimization of conventional rule curves coupled with hedging rules for reservoir operation

    DEFF Research Database (Denmark)

    Taghian, Mehrdad; Rosbjerg, Dan; Haghighi, Ali

    2014-01-01

    As a common approach to reservoir operating policies, water levels at the end of each time interval should be kept at or above the rule curve. In this study, the policy is captured using rationing of the target yield to reduce the intensity of severe water shortages. For this purpose, a hybrid model is developed to optimize simultaneously both the conventional rule curve and the hedging rule. In the compound model, a simple genetic algorithm is coupled with a simulation program, including an inner linear programming algorithm. In this way, operational policies are imposed by priority concepts to achieve the optimal water allocation and the target storage levels for reservoirs. As a case study, a multipurpose, multireservoir system in southern Iran is selected. The results show that the model has good performance in extracting the optimum policy for reservoir operation under both normal...

  4. The method for simultaneous extraction and back extraction in liquid three-phase system and equipment for simultaneous extraction and back extraction in liquid three-phase system

    International Nuclear Information System (INIS)

    Palyska, W.; Chmielewski, A.G.

    1992-01-01

    A method for simultaneous extraction and back extraction in a liquid three-phase system has been worked out. The equipment designed for the process is also the subject of the patent. The component of interest is first extracted into an intermediate phase consisting of a magnetic solvent that keeps the two extracting phases separate. The intermediate magnetic liquid is held in position by a stationary magnet mounted on the surface of the extraction vessel. The component then passes from the intermediate phase to the third phase as a result of back extraction. Mixing in the extraction and back extraction zones is provided by a rotating shaft running along the whole apparatus. The extraction and back extraction processes occur simultaneously as a result of the continuous flow of solvent in their respective zones. Single extraction/back extraction units can be joined into larger batteries. 3 figs

  5. An expert system design to diagnose cancer by using a new method reduced rule base.

    Science.gov (United States)

    Başçiftçi, Fatih; Avuçlu, Emre

    2018-04-01

    A Medical Expert System (MES) was developed which uses a Reduced Rule Base to diagnose cancer risk according to the symptoms in an individual. A total of 13 symptoms were used. With the new MES, the reduced rules are checked instead of all possibilities (2^13 = 8192 different possibilities occur). By checking reduced rules, results are found more quickly. The method of two-level simplification of Boolean functions was used to obtain the Reduced Rule Base. Thanks to the developed application, with a configurable number of inputs and outputs on different platforms, anyone can easily test their own cancer risk. More accurate results were obtained by considering all the possibilities related to cancer. Thirteen different risk factors were used to determine the type of cancer. The truth table produced in our study has 13 inputs and 4 outputs. The Boolean Function Minimization method is used to obtain fewer cases by simplifying the logical functions. Cancer can be diagnosed quickly thanks to the evaluation of the simplified 4 output functions. Diagnosis made with the 4 output values obtained using the Reduced Rule Base was found to be quicker than diagnosis made by screening all 2^13 = 8192 possibilities. With the improved MES, more probabilities were added to the process and more accurate diagnostic results were obtained. As a result of the simplification process, a 100% diagnosis speed gain was obtained for breast and renal cancer diagnosis, and a gain of 99% for cervical and lung cancer diagnosis. With Boolean function minimization, a smaller number of rules is evaluated instead of a large number of rules. Reducing the number of rules allows the designed system to work more efficiently, saves time, and makes it easier to transfer the rules to the designed expert systems. Interfaces were developed on different software platforms to enable users to test the accuracy of the application. Anyone is able to assess their own cancer risk using the determinative risk factors. Thereby
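
    The two-level simplification step can be illustrated with SymPy's Quine-McCluskey-based SOPform. Three placeholder symptoms stand in for the paper's 13, and the minterms are invented:

```python
# Two-level minimization of one diagnostic output function with SymPy's SOPform.
# Three placeholder symptoms replace the paper's 13; minterms are illustrative.
from sympy import symbols
from sympy.logic import SOPform

s1, s2, s3 = symbols("s1 s2 s3")             # e.g. three yes/no symptoms

# Rows of the truth table (s1, s2, s3) for which this cancer-risk output is 1.
minterms = [[1, 1, 0], [1, 1, 1], [1, 0, 1]]

reduced_rule = SOPform([s1, s2, s3], minterms)
print(reduced_rule)                          # e.g. (s1 & s2) | (s1 & s3)
```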

  6. Comparison of protein extraction methods suitable for proteomics ...

    African Journals Online (AJOL)

    An efficient protein extraction method is a prerequisite for successful implementation of proteomics. In this study, seedling roots of Jerusalem artichoke were treated with the concentration of 250 mM NaCl for 36 h. Subsequently, six different protocols of protein extraction were applied to seedling roots of Jerusalem artichoke ...

  7. Optimizing pressurized liquid extraction of microbial lipids using the response surface method.

    Science.gov (United States)

    Cescut, J; Severac, E; Molina-Jouve, C; Uribelarrea, J-L

    2011-01-21

    Response surface methodology (RSM) was used for the determination of optimum extraction parameters to reach maximum lipid extraction yield with yeast. Total lipids were extracted from oleaginous yeast (Rhodotorula glutinis) using pressurized liquid extraction (PLE). The effects of extraction parameters on lipid extraction yield were studied by employing a second-order central composite design. The optimal condition was obtained as three cycles of 15 min at 100°C with a ratio of 144 g of hydromatrix per 100 g of dry cell weight. Different analysis methods were used to compare the optimized PLE method with two conventional methods (Soxhlet and modification of Bligh and Dyer methods) under efficiency, selectivity and reproducibility criteria thanks to gravimetric analysis, GC with flame ionization detector, High Performance Liquid Chromatography linked to Evaporative Light Scattering Detector (HPLC-ELSD) and thin-layer chromatographic analysis. For each sample, the lipid extraction yield with optimized PLE was higher than those obtained with referenced methods (Soxhlet and Bligh and Dyer methods with, respectively, a recovery of 78% and 85% compared to PLE method). Moreover, the use of PLE led to major advantages such as an analysis time reduction by a factor of 10 and solvent quantity reduction by 70%, compared with traditional extraction methods. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. A simple and efficient total genomic DNA extraction method for individual zooplankton.

    Science.gov (United States)

    Fazhan, Hanafiah; Waiho, Khor; Shahreza, Md Sheriff

    2016-01-01

    Molecular approaches are widely applied in species identification and taxonomic studies of minute zooplankton. One of the most intensively studied zooplankton groups nowadays is the Subclass Copepoda. Accurate species identification of all life stages of the generally small-sized copepods through molecular analysis is important, especially in the taxonomic and systematic assessment of harpacticoid copepod populations and for understanding their dynamics within the marine community. However, total genomic DNA (TGDNA) extraction from individual harpacticoid copepods can be problematic due to their small size and epibenthic behavior. In this research, six TGDNA extraction methods applied to individual harpacticoid copepods were compared. A new simple, feasible, efficient and consistent TGDNA extraction method was designed and compared with a commercial kit and modified versions of available TGDNA extraction methods. The newly described TGDNA extraction method, the "Incubation in PCR buffer" method, yielded good and consistent results based on its high success rate of PCR amplification (82%) compared to the other methods. Given its consistency and low cost, the "Incubation in PCR buffer" method is highly recommended for TGDNA extraction from other minute zooplankton species as well.

  9. Application of a rule extraction algorithm family based on the Re-RX algorithm to financial credit risk assessment from a Pareto optimal perspective

    Directory of Open Access Journals (Sweden)

    Yoichi Hayashi

    2016-01-01

    Full Text Available Historically, the assessment of credit risk has proved to be both highly important and extremely difficult. Currently, financial institutions rely on the use of computer-generated credit scores for risk assessment. However, automated risk evaluations are currently imperfect, and the loss of vast amounts of capital could be prevented by improving the performance of computerized credit assessments. A number of approaches have been developed for the computation of credit scores over the last several decades, but these methods have been considered too complex without good interpretability and have therefore not been widely adopted. Therefore, in this study, we provide the first comprehensive comparison of results regarding the assessment of credit risk obtained using 10 runs of 10-fold cross validation of the Re-RX algorithm family, including the Re-RX algorithm, the Re-RX algorithm with both discrete and continuous attributes (Continuous Re-RX), the Re-RX algorithm with J48graft, the Re-RX algorithm with a trained neural network (Sampling Re-RX), NeuroLinear, NeuroLinear+GRG, and three unique rule extraction techniques involving support vector machines and Minerva from four real-life, two-class mixed credit-risk datasets. We also discuss the roles of various newly-extended types of the Re-RX algorithm and high performance classifiers from a Pareto optimal perspective. Our findings suggest that Continuous Re-RX, Re-RX with J48graft, and Sampling Re-RX comprise a powerful management tool that allows the creation of advanced, accurate, concise and interpretable decision support systems for credit risk evaluation. In addition, from a Pareto optimal perspective, the Re-RX algorithm family has superior features in relation to the comprehensibility of extracted rules and the potential for credit scoring with Big Data.

  10. Gain ratio based fuzzy weighted association rule mining classifier for ...

    Indian Academy of Sciences (India)

    association rule mining algorithm for extracting both association rules and member- .... The disadvantage of this work is in considering the generalization at each ... If the new attribute is entered, the generalization process does not consider the ...

  11. Comparison of extraction methods for quantifying vitamin E from animal tissues.

    Science.gov (United States)

    Xu, Zhimin

    2008-12-01

    Four extraction methods: (1) solvent (SOL), (2) ultrasound assisted solvent (UA), (3) saponification and solvent (SP), and (4) saponification and ultrasound assisted solvent (SP-UA), were used in sample preparation for quantifying vitamin E (tocopherols) in chicken liver and plasma samples. The extraction yields of SOL, UA, SP, and SP-UA methods obtained by adding delta-tocopherol as internal reference were 95%, 104%, 65%, and 62% for liver and 98%, 103%, 97%, and 94% for plasma, respectively. The methods with saponification significantly affected the stabilities of tocopherols in liver samples. The measured values of alpha- and gamma-tocopherols using the solvent only extraction (SOL) method were much lower than that using any of the other extraction methods. This indicated that less of the tocopherols in those samples were in a form that could be extracted directly by solvent. The measured value of alpha-tocopherol in the liver sample using the ultrasound assisted solvent (UA) method was 1.5-2.5 times of that obtained from the saponification and solvent (SP) method. The differences in measured values of tocopherols in the plasma samples by using the two methods were not significant. However, the measured value of the saponification and ultrasound assisted solvent (SP-UA) method was lower than either the saponification and solvent (SP) or the ultrasound assisted solvent (UA) method. Also, the reproducibility of the ultrasound assisted solvent (UA) method was greater than any of the saponification methods. Compared with the traditional saponification method, the ultrasound assisted solvent method could effectively extract tocopherols from sample matrix without any chemical degradation reactions, especially for complex animal tissue such as liver.

  12. Using Machine Learning Methods Jointly to Find Better Set of Rules in Data Mining

    Directory of Open Access Journals (Sweden)

    SUG Hyontai

    2017-01-01

    Full Text Available Rough set-based data mining algorithms are among the widely accepted machine learning technologies because of their strong mathematical background and their capability of finding optimal rules based on the given data sets alone, leaving no room for prejudiced views to be imposed on the data. But because the algorithms find rules very precisely, we may confront the overfitting problem. On the other hand, association rule algorithms find rules of association between sets of items in a database. These algorithms find itemsets that occur more often than a given minimum support, so that they can find the itemsets in reasonable time even for very large databases when the minimum support is supplied appropriately. In order to overcome the overfitting problem in rough set-based algorithms, we first find large itemsets and then select the attributes that cover those large itemsets. By using the selected attributes only, we may find a better set of rules based on rough set theory. Results from experiments support our suggested method.
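
    A minimal sketch of the first half of the suggested pipeline, assuming toy records: find attribute=value itemsets above a minimum support and keep only the attributes they mention before running a rough-set rule learner. The data and the 0.4 threshold are invented.

```python
# Find frequent attribute=value itemsets and keep only the attributes that
# appear in them; the retained attributes would then feed a rough-set learner.
from itertools import combinations

records = [
    {"outlook": "sunny", "wind": "weak", "humidity": "high", "play": "no"},
    {"outlook": "sunny", "wind": "strong", "humidity": "high", "play": "no"},
    {"outlook": "rain", "wind": "weak", "humidity": "normal", "play": "yes"},
    {"outlook": "rain", "wind": "weak", "humidity": "high", "play": "yes"},
    {"outlook": "sunny", "wind": "weak", "humidity": "normal", "play": "yes"},
]

def frequent_itemsets(rows, min_support=0.4, max_size=2):
    items = {(a, v) for row in rows for a, v in row.items() if a != "play"}
    frequent = []
    for size in range(1, max_size + 1):
        for itemset in combinations(sorted(items), size):
            count = sum(all(row.get(a) == v for a, v in itemset) for row in rows)
            if count / len(rows) >= min_support:
                frequent.append(itemset)
    return frequent

itemsets = frequent_itemsets(records)
selected_attributes = {a for itemset in itemsets for a, _ in itemset}
print(selected_attributes)   # attributes to feed into the rough-set rule learner
```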

  13. Clinical study of Atopic Dermatitis patient treated with Poison Extraction Method

    Directory of Open Access Journals (Sweden)

    Park Chi-young

    2007-06-01

    Full Text Available Objectives : This study is designed to evaluate the Poison Extraction Method for atopic dermatitis. Methods : The authors observed two cases of atopic dermatitis patients who had previously used steroid-based ointment and were treated with the Poison Extraction Method. Conclusions : 1. In case 1, a patient with mild atopic dermatitis of the face was treated with the Poison Extraction Method. Rash symptoms intensified for the first few days. As sweating appeared in the local area from the seventh day, all the symptoms disappeared rapidly. No recurrence was found. 2. In case 2, a patient with severe atopic dermatitis over the whole body was treated with the Poison Extraction Method. The symptoms intensified after two months, and all the symptoms of itchiness, rash and scaling disappeared in the third and fourth months. No recurrence was found. 3. In both the mild and severe cases of atopic dermatitis, all the symptoms disappeared and no recurrence was found with the Poison Extraction Method.

  14. Excavation-drier method of energy-peat extraction reduces long-term climatic impact

    Energy Technology Data Exchange (ETDEWEB)

    Silvan, N.; Silvan, K.; Laine, J. [Finnish Forest Research Inst., Parkano (Finland)], e-mail: niko.silvan@metla.fi; Vaisanen, S.; Soukka, R. [Lappeenranta Univ. of Technology (Finland)

    2012-11-01

    Climatic impacts of energy-peat extraction are of increasing concern due to EU emissions trading requirements. A new excavation-drier peat extraction method has been developed to reduce the climatic impact and increase the efficiency of peat extraction. To quantify and compare the soil GHG fluxes of the excavation drier and the traditional milling methods, as well as the areas from which the energy peat is planned to be extracted in the future (extraction reserve area types), soil CO₂, CH₄ and N₂O fluxes were measured during 2006-2007 at three sites in Finland. Within each site, fluxes were measured from drained extraction reserve areas, extraction fields and stockpiles of both methods and additionally from the biomass driers of the excavation-drier method. The Life Cycle Assessment (LCA), described at a principal level in ISO Standards 14040:2006 and 14044:2006, was used to assess the long-term (100 years) climatic impact from peatland utilisation with respect to land use and energy production chains where utilisation of coal was replaced with peat. Coal was used as a reference since in many cases peat and coal can replace each other in same power plants. According to this study, the peat extraction method used was of lesser significance than the extraction reserve area type in regards to the climatic impact. However, the excavation-drier method seems to cause a slightly reduced climatic impact as compared with the prevailing milling method. (orig.)

  15. A Novel Method of Interestingness Measures for Association Rules Mining Based on Profit

    Directory of Open Access Journals (Sweden)

    Chunhua Ju

    2015-01-01

    Full Text Available Association rule mining is an important topic in the domain of data mining and knowledge discovery. Several interestingness measures have been presented in the literature; the most typical are Support, Confidence, Lift, Improve, and so forth. But their limitations are obvious: no objective criterion, lack of statistical basis, inability to define negative relationships, and so forth. This paper proposes three new measures, Bi-lift, Bi-improve, and Bi-confidence, as replacements for Lift, Improve, and Confidence, respectively. Then, on the basis of a utility function and the execution cost of rules, we propose an interestingness function based on profit (IFBP) that considers subjective preferences and the characteristics of the specific application object. Finally, a novel measure framework is proposed to improve the traditional one through experimental analysis. In conclusion, the new measures and the measure framework are superior to the traditional ones in the aspects of objective criterion, comprehensive definition, and practical application.
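
    For reference, the classical measures that the paper starts from are commonly defined as below (standard definitions; the proposed Bi-lift, Bi-improve, Bi-confidence and IFBP measures are not reproduced here):

```latex
% Classical interestingness measures for a rule X => Y over a transaction set T
\begin{align*}
  \operatorname{supp}(X \Rightarrow Y) &= P(X \cup Y)
      = \frac{|\{t \in T : X \cup Y \subseteq t\}|}{|T|},\\[2pt]
  \operatorname{conf}(X \Rightarrow Y) &= \frac{\operatorname{supp}(X \cup Y)}{\operatorname{supp}(X)},\qquad
  \operatorname{lift}(X \Rightarrow Y) = \frac{\operatorname{conf}(X \Rightarrow Y)}{\operatorname{supp}(Y)}.
\end{align*}
```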

  16. Effects of Extraction Method on the Physicochemical and ...

    African Journals Online (AJOL)

    The effects of an improved extraction method on the physicochemical and mycological properties and the stability of crude Canarium schweinfurthii fruit oil were studied. The extracted oils were then stored at 25 ± 5 °C for 24 months, with samples analyzed at 6-month intervals for pH, saponification value, acid value, peroxide value and iodine ...

  17. Automated Extraction of Cranial Landmarks from Computed Tomography Data using a Combined Method of Knowledge and Pattern Based Approaches

    Directory of Open Access Journals (Sweden)

    Roshan N. RAJAPAKSE

    2016-03-01

    Full Text Available Accurate identification of anatomical structures from medical imaging data is a significant and critical function in the medical domain. Past studies in this context have mainly utilized two approaches: knowledge-based and learning-based methods. Further, most previously reported studies have focused on the identification of landmarks from lateral X-ray Computed Tomography (CT) data, particularly in the field of orthodontics. This study, however, focused on extracting cranial landmarks from large sets of cross-sectional CT slices using a combination of the two aforementioned approaches. The proposed method is centered mainly on template data sets, which were created using the actual contour patterns extracted from CT cases for each of the landmarks under consideration. Firstly, these templates were used to devise rules, which are characteristic of the knowledge-based method. Secondly, the same template sets were employed to perform template matching, related to the learning methodologies approach. The proposed method was tested on two landmarks, the Dorsum sellae and the Pterygoid plate, using CT cases of 5 subjects. The results indicate that, out of the 10 tests, the output images were within the expected range (desired accuracy) in 7 instances and the acceptable range (near accuracy) in 2 instances, thus verifying the effectiveness of the combined, template-set-centric approach proposed in this study.
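
    The pattern-based half of such an approach can be sketched with OpenCV's normalized cross-correlation template matching. The file names, threshold and landmark template are placeholders; the knowledge-based rules that pre-select the candidate region are not reproduced.

```python
# Normalized cross-correlation template matching of a landmark template against
# a CT slice. Paths and the score threshold are placeholders for illustration.
import cv2

ct_slice = cv2.imread("ct_slice.png", cv2.IMREAD_GRAYSCALE)          # hypothetical paths
template = cv2.imread("dorsum_sellae_template.png", cv2.IMREAD_GRAYSCALE)
assert ct_slice is not None and template is not None, "provide real image files"

scores = cv2.matchTemplate(ct_slice, template, cv2.TM_CCOEFF_NORMED)
_, max_score, _, max_loc = cv2.minMaxLoc(scores)

h, w = template.shape
if max_score > 0.7:                                                   # illustrative threshold
    x, y = max_loc
    print("landmark region:", (x, y, x + w, y + h), "score:", max_score)
else:
    print("no confident match on this slice")
```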

  18. Comparative study of methods for extraction and purification of ...

    African Journals Online (AJOL)


    2010-08-02

    Aug 2, 2010 ... and/or enzymatic lysis for direct or indirect extraction of DNA followed by ... strength wastewater sludge in order to determine the best DNA extraction protocol ... Ammonium acetate purification method was used to remove the ...

  19. METHOD OF RARE TERM CONTRASTIVE EXTRACTION FROM NATURAL LANGUAGE TEXTS

    Directory of Open Access Journals (Sweden)

    I. A. Bessmertny

    2017-01-01

    Full Text Available The paper considers the problem of automatic domain term extraction from a document corpus by means of a contrast collection. Existing contrastive methods successfully extract frequently used terms but mishandle rare terms, which can impoverish the resulting thesaurus. Assessment of point-wise mutual information is one of the known statistical methods of term extraction, and it finds rare terms successfully; however, it also extracts many false terms. The proposed approach applies point-wise mutual information to extract rare terms and then filters the candidates by the criterion of joint occurrence with the other candidates. We build a “documents-by-terms” matrix that is subjected to singular value decomposition to eliminate noise and reveal strong interconnections. We then pass to the resulting “terms-by-terms” matrix, which reflects the strength of interconnections between words. This approach was tested on a document collection from the “Geology” domain, using contrast documents from topics such as “Politics”, “Culture”, “Economics” and “Accidents” taken from several Internet resources. The experimental results demonstrate the viability of this method for rare term extraction.
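
    The two ingredients described above can be sketched with NumPy as follows: a PMI-style contrast score for candidate terms and an SVD-smoothed terms-by-terms matrix for filtering. The toy corpora, smoothing and rank are invented for illustration.

```python
# (1) PMI-style contrast score of a term between domain and contrast collections.
# (2) SVD smoothing of a documents-by-terms matrix into a terms-by-terms matrix.
import numpy as np

def pmi_contrast(counts_domain, counts_contrast, term):
    n_d = sum(counts_domain.values())
    n_c = sum(counts_contrast.values())
    p_d = counts_domain.get(term, 0) / n_d
    p_c = (counts_contrast.get(term, 0) + 1) / (n_c + len(counts_contrast))  # add-one smoothing
    return np.log(p_d / p_c) if p_d > 0 else float("-inf")

domain = {"sandstone": 5, "porosity": 3, "the": 120}
contrast = {"the": 150, "market": 40, "porosity": 0}
print(pmi_contrast(domain, contrast, "porosity"))    # rare but domain-specific -> high score

# documents-by-terms matrix (rows: documents, columns: candidate terms)
A = np.array([[2, 1, 0, 0],
              [1, 2, 0, 1],
              [0, 0, 3, 2],
              [0, 1, 2, 2]], dtype=float)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                                # keep strongest components
A_k = U[:, :k] * s[:k] @ Vt[:k, :]
term_by_term = A_k.T @ A_k                           # strength of term-term association
print(np.round(term_by_term, 2))
```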

  20. Improving KPCA Online Extraction by Orthonormalization in the Feature Space.

    Science.gov (United States)

    Souza Filho, Joao B O; Diniz, Paulo S R

    2018-04-01

    Recently, some online kernel principal component analysis (KPCA) techniques based on the generalized Hebbian algorithm (GHA) were proposed for use in large data sets, defining kernel components using concise dictionaries automatically extracted from data. This brief proposes two new online KPCA extraction algorithms, exploiting orthogonalized versions of the GHA rule. In both the cases, the orthogonalization of kernel components is achieved by the inclusion of some low complexity additional steps to the kernel Hebbian algorithm, thus not substantially affecting the computational cost of the algorithm. Results show improved convergence speed and accuracy of components extracted by the proposed methods, as compared with the state-of-the-art online KPCA extraction algorithms.
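
    For context, the plain (linear) GHA update that these methods build on can be sketched as below; the kernelized, dictionary-based variants and the orthonormalization steps proposed in the brief are not reproduced, and the learning rate and data are illustrative.

```python
# Plain (linear) generalized Hebbian algorithm for online extraction of the
# first few principal components. Kernelization and the proposed
# orthonormalization steps are not included in this sketch.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 10)) @ rng.normal(size=(10, 10))   # correlated data stream
X -= X.mean(axis=0)

n_components, eta = 3, 1e-3
W = rng.normal(scale=0.1, size=(n_components, X.shape[1]))    # rows ~ components

for x in X:                                                   # one sample at a time
    y = W @ x
    # GHA rule: dW = eta * (y x^T - lower_triangular(y y^T) W)
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

# Alignment of the learned rows with the batch PCA directions (up to sign)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
print(np.abs(np.diag(W @ Vt[:n_components].T) / np.linalg.norm(W, axis=1)))
```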

  1. Decoding rule search domain in the left inferior frontal gyrus

    Science.gov (United States)

    Babcock, Laura; Vallesi, Antonino

    2018-01-01

    Traditionally, the left hemisphere has been thought to extract mainly verbal patterns of information, but recent evidence has shown that the left Inferior Frontal Gyrus (IFG) is active during inductive reasoning in both the verbal and spatial domains. We aimed to understand whether the left IFG supports inductive reasoning in a domain-specific or domain-general fashion. To do this we used Multi-Voxel Pattern Analysis to decode the representation of domain during a rule search task. Thirteen participants were asked to extract the rule underlying streams of letters presented in different spatial locations. Each rule was either verbal (letters forming words) or spatial (positions forming geometric figures). Our results show that domain was decodable in the left prefrontal cortex, suggesting that this region represents domain-specific information, rather than processes common to the two domains. A replication study with the same participants tested two years later confirmed these findings, though the individual representations changed, providing evidence for the flexible nature of representations. This study extends our knowledge on the neural basis of goal-directed behaviors and on how information relevant for rule extraction is flexibly mapped in the prefrontal cortex. PMID:29547623

  2. DNA extraction methods for detecting genetically modified foods: A comparative study.

    Science.gov (United States)

    Elsanhoty, Rafaat M; Ramadan, Mohamed Fawzy; Jany, Klaus Dieter

    2011-06-15

    The work presented in this manuscript was carried out to compare six different methods for extracting DNA from raw maize and its derived products. The methods that gave the highest yield and quality of DNA were chosen to detect genetic modification in samples collected from the Egyptian market. The different methods were evaluated for extracting DNA from maize kernels (without treatment), maize flour (mechanical treatment), canned maize (sweet corn), frozen maize (sweet corn), maize starch, extruded maize, popcorn, corn flakes, maize snacks, and bread made from corn flour (mechanical and thermal treatments). The quality and quantity of the DNA extracted from the standards containing known percentages of GMO material and from the different food products were evaluated. For qualitative detection of GMO varieties in foods, the GMOScreen 35S/NOS test kit was used to screen for genetic modification in the samples. Samples positive for the 35S promoter and/or the NOS terminator were identified by the standard methods adopted by the EU. All of the methods used yielded DNA of good quality. However, we noted that the purest DNA extracts were obtained using the DNA extraction kit (Roche), and this was generally the best method for extracting DNA from most of the maize-derived foods. We also noted that the yield of DNA extracted from maize-derived foods was generally lower in the processed products. The results indicated that 17 samples were positive for the presence of the 35S promoter, while 34% of the samples were positive for the genetically modified maize line Bt-176. Copyright © 2010 Elsevier Ltd. All rights reserved.

  3. DNA extraction method for PCR in mycorrhizal fungi.

    Science.gov (United States)

    Manian, S; Sreenivasaprasad, S; Mills, P R

    2001-10-01

    To develop a simple and rapid DNA extraction protocol for PCR in mycorrhizal fungi. The protocol combines the application of rapid freezing and boiling cycles and passage of the extracts through DNA purification columns. PCR amplifiable DNA was obtained from a number of endo- and ecto-mycorrhizal fungi using minute quantities of spores and mycelium, respectively. DNA extracted following the method, was used to successfully amplify regions of interest from high as well as low copy number genes. The amplicons were suitable for further downstream applications such as sequencing and PCR-RFLPs. The protocol described is simple, short and facilitates rapid isolation of PCR amplifiable genomic DNA from a large number of fungal isolates in a single day. The method requires only minute quantities of starting material and is suitable for mycorrhizal fungi as well as a range of other fungi.

  4. Evaluation of sample extraction methods for proteomics analysis of green algae Chlorella vulgaris.

    Science.gov (United States)

    Gao, Yan; Lim, Teck Kwang; Lin, Qingsong; Li, Sam Fong Yau

    2016-05-01

    Many protein extraction methods have been developed for plant proteome analysis but information is limited on the optimal protein extraction method from algae species. This study evaluated four protein extraction methods, i.e. direct lysis buffer method, TCA-acetone method, phenol method, and phenol/TCA-acetone method, using green algae Chlorella vulgaris for proteome analysis. The data presented showed that phenol/TCA-acetone method was superior to the other three tested methods with regards to shotgun proteomics. Proteins identified using shotgun proteomics were validated using sequential window acquisition of all theoretical fragment-ion spectra (SWATH) technique. Additionally, SWATH provides protein quantitation information from different methods and protein abundance using different protein extraction methods was evaluated. These results highlight the importance of green algae protein extraction method for subsequent MS analysis and identification. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Rule extraction from minimal neural networks for credit card screening.

    Science.gov (United States)

    Setiono, Rudy; Baesens, Bart; Mues, Christophe

    2011-08-01

    While feedforward neural networks have been widely accepted as effective tools for solving classification problems, the issue of finding the best network architecture remains unresolved, particularly so in real-world problem settings. We address this issue in the context of credit card screening, where it is important to not only find a neural network with good predictive performance but also one that facilitates a clear explanation of how it produces its predictions. We show that minimal neural networks with as few as one hidden unit provide good predictive accuracy, while having the added advantage of making it easier to generate concise and comprehensible classification rules for the user. To further reduce model size, a novel approach is suggested in which network connections from the input units to this hidden unit are removed by a very straightforward pruning procedure. In terms of predictive accuracy, both the minimized neural networks and the rule sets generated from them are shown to compare favorably with other neural network based classifiers. The rules generated from the minimized neural networks are concise and thus easier to validate in a real-life setting.

  6. [Comparison of two nucleic acid extraction methods for norovirus in oysters].

    Science.gov (United States)

    Yuan, Qiao; Li, Hui; Deng, Xiaoling; Mo, Yanling; Fang, Ling; Ke, Changwen

    2013-04-01

    To explore a convenient and effective method for norovirus nucleic acid extraction from oysters suitable for long-term viral surveillance. Two methods, namely method A (glycine washing and polyethylene glycol precipitation of the virus followed by silica gel centrifugal column) and method B (protease K digestion followed by application of paramagnetic silicon) were compared for their performance in norovirus nucleic acid extraction from oysters. Real-time RT-PCR was used to detect norovirus in naturally infected oysters and in oysters with induced infection. The two methods yielded comparable positive detection rates for the samples, but the recovery rate of the virus was higher with method B than with method A. Method B is a more convenient and rapid method for norovirus nucleic acid extraction from oysters and suitable for long-term surveillance of norovirus.

  7. Extraction of events and rules of land use/cover change from the policy text

    Science.gov (United States)

    Lin, Guangfa; Xia, Beicheng; Huang, Wangli; Jiang, Huixian; Chen, Youfei

    2007-06-01

    A database recording snapshots of land-parcel history is the foundation for most models that simulate the land use/cover change (LUCC) process, but sequences of temporal snapshots are not sufficient to deduce and describe the mechanism of that process. The temporal relationships between recorded LUCC scenarios cannot simply be translated into causal relationships, which are regarded as a key factor in spatio-temporal reasoning. The proprietors of land parcels adapt to government policies and to changes in the production market, and then make decisions in this or that way. When investigated at the local scale with a high-resolution background scene, each change of a land parcel in an urban area is often related to one or more decision texts. These decision texts may come from different sections of a hierarchical government system at different levels, such as villages or communities, towns or counties, cities, provinces or even the central government. All of these texts are the balanced outcome of the advantages and disadvantages perceived by different interest groups; they are the essential human-dimension forces of LUCC. Up to now, a methodology is still needed for expressing these forces in a simulation system using GIS as a language. The present paper is part of our initial research on this topic. The term "Event" is an important concept in the framework of object-oriented theory in computer science, while in the domain of temporal GIS the concept of event developed along a different line. The definitions of events and their transformation relationships are discussed in this paper on three modeling levels: the real-world level, the conceptual level and the programming level. In this context, with a case study of LUCC over the past 30 years in Xiamen city, Fujian province, P. R. China, the paper focuses on how to extract information about events and rules from the collected policy files and integrate

  8. Highly scalable and robust rule learner: performance evaluation and comparison.

    Science.gov (United States)

    Kurgan, Lukasz A; Cios, Krzysztof J; Dick, Scott

    2006-02-01

    Business intelligence and bioinformatics applications increasingly require the mining of datasets consisting of millions of data points, or crafting real-time enterprise-level decision support systems for large corporations and drug companies. In all cases, there needs to be an underlying data mining system, and this mining system must be highly scalable. To this end, we describe a new rule learner called DataSqueezer. The learner belongs to the family of inductive supervised rule extraction algorithms. DataSqueezer is a simple, greedy, rule builder that generates a set of production rules from labeled input data. In spite of its relative simplicity, DataSqueezer is a very effective learner. The rules generated by the algorithm are compact, comprehensible, and have accuracy comparable to rules generated by other state-of-the-art rule extraction algorithms. The main advantages of DataSqueezer are very high efficiency, and missing data resistance. DataSqueezer exhibits log-linear asymptotic complexity with the number of training examples, and it is faster than other state-of-the-art rule learners. The learner is also robust to large quantities of missing data, as verified by extensive experimental comparison with the other learners. DataSqueezer is thus well suited to modern data mining and business intelligence tasks, which commonly involve huge datasets with a large fraction of missing data.
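
    The following is a hedged, generic sketch of the kind of sequential-covering rule induction the abstract describes, not the DataSqueezer algorithm itself; the toy dataset and the purity score are invented for illustration.

```python
# Generic sequential-covering sketch (illustrative only; not DataSqueezer itself):
# grow one rule at a time that covers many positives and few negatives,
# remove the covered positives, and repeat until none remain.
def greedy_rules(rows, target):
    """rows: list of dicts mapping attribute -> value, with the class label under `target`."""
    positives = [r for r in rows if r[target] == 1]
    negatives = [r for r in rows if r[target] == 0]
    rules = []
    while positives:
        rule, covered_pos, covered_neg = {}, positives, negatives
        while covered_neg:                      # add conditions until no negatives are covered
            best = None
            for attr in rows[0]:
                if attr == target or attr in rule:
                    continue
                for val in {r[attr] for r in covered_pos}:
                    p = [r for r in covered_pos if r[attr] == val]
                    n = [r for r in covered_neg if r[attr] == val]
                    score = len(p) / (len(p) + len(n) + 1e-9)   # purity of the candidate condition
                    if p and (best is None or score > best[0]):
                        best = (score, attr, val, p, n)
            if best is None:
                break
            _, attr, val, covered_pos, covered_neg = best
            rule[attr] = val
        if not rule:                            # remaining positives cannot be separated
            break
        rules.append(rule)
        positives = [r for r in positives if any(r[a] != v for a, v in rule.items())]
    return rules

data = [{"outlook": "sunny", "windy": 0, "play": 1},
        {"outlook": "sunny", "windy": 1, "play": 1},
        {"outlook": "rain",  "windy": 0, "play": 0},
        {"outlook": "rain",  "windy": 1, "play": 0}]
print(greedy_rules(data, "play"))               # -> [{'outlook': 'sunny'}]
```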

  9. The continual reassessment method: comparison of Bayesian stopping rules for dose-ranging studies.

    Science.gov (United States)

    Zohar, S; Chevret, S

    2001-10-15

    The continual reassessment method (CRM) provides a Bayesian estimation of the maximum tolerated dose (MTD) in phase I clinical trials and is also used to estimate the minimal efficacy dose (MED) in phase II clinical trials. In this paper we propose Bayesian stopping rules for the CRM, based on either posterior or predictive probability distributions that can be applied sequentially during the trial. These rules aim at early detection of either the mis-choice of dose range or a prefixed gain in the point estimate or accuracy of estimated probability of response associated with the MTD (or MED). They were compared through a simulation study under six situations that could represent the underlying unknown dose-response (either toxicity or failure) relationship, in terms of sample size, probability of correct selection and bias of the response probability associated to the MTD (or MED). Our results show that the stopping rules act correctly, with early stopping by using the two first rules based on the posterior distribution when the actual underlying dose-response relationship is far from that initially supposed, while the rules based on predictive gain functions provide a discontinuation of inclusions whatever the actual dose-response curve after 20 patients on average, that is, depending mostly on the accumulated data. The stopping rules were then applied to a data set from a dose-ranging phase II clinical trial aiming at estimating the MED dose of midazolam in the sedation of infants during cardiac catheterization. All these findings suggest the early use of the two first rules to detect a mis-choice of dose range, while they confirm the requirement of including at least 20 patients at the same dose to reach an accurate estimate of MTD (MED). A two-stage design is under study. Copyright 2001 John Wiley & Sons, Ltd.
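
    A minimal sketch of the Bayesian machinery underlying the CRM is given below (a one-parameter power model with a grid-evaluated posterior); the skeleton probabilities, prior variance and toy data are assumptions, and the posterior/predictive stopping rules proposed in the paper would be layered on top of a computation like this.

```python
# Toy grid-posterior for a one-parameter power-model CRM (illustrative assumptions):
# p_i(a) = skeleton_i ** exp(a), with a normal prior on a.
import numpy as np

skeleton = np.array([0.05, 0.10, 0.20, 0.30, 0.45])   # assumed prior toxicity guesses per dose
target = 0.25                                          # target toxicity probability
a_grid = np.linspace(-3, 3, 601)
prior = np.exp(-0.5 * a_grid**2 / 1.34) / np.sqrt(2 * np.pi * 1.34)   # N(0, 1.34) prior

# observed data: (dose index, number treated, number of toxicities) -- made up
data = [(0, 3, 0), (1, 3, 1), (2, 3, 2)]

log_lik = np.zeros_like(a_grid)
for dose, n, tox in data:
    p = skeleton[dose] ** np.exp(a_grid)
    log_lik += tox * np.log(p) + (n - tox) * np.log(1 - p)

post = prior * np.exp(log_lik - log_lik.max())
post /= np.trapz(post, a_grid)                         # normalize the posterior on the grid

# posterior mean toxicity at each dose; recommend the dose closest to the target
p_hat = np.array([np.trapz(skeleton[d] ** np.exp(a_grid) * post, a_grid)
                  for d in range(len(skeleton))])
print("posterior toxicity estimates:", p_hat.round(3))
print("recommended dose index:", int(np.argmin(np.abs(p_hat - target))))
# A stopping rule of the kind compared in the paper would, for example, halt the
# trial when a posterior or predictive criterion on these quantities is met.
```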

  10. Text-in-context: a method for extracting findings in mixed-methods mixed research synthesis studies.

    Science.gov (United States)

    Sandelowski, Margarete; Leeman, Jennifer; Knafl, Kathleen; Crandell, Jamie L

    2013-06-01

    Our purpose in this paper is to propose a new method for extracting findings from research reports included in mixed-methods mixed research synthesis studies. International initiatives in the domains of systematic review and evidence synthesis have been focused on broadening the conceptualization of evidence, increased methodological inclusiveness and the production of evidence syntheses that will be accessible to and usable by a wider range of consumers. Initiatives in the general mixed-methods research field have been focused on developing truly integrative approaches to data analysis and interpretation. The data extraction challenges described here were encountered, and the method proposed for addressing these challenges was developed, in the first year of the ongoing (2011-2016) study: Mixed-Methods Synthesis of Research on Childhood Chronic Conditions and Family. To preserve the text-in-context of findings in research reports, we describe a method whereby findings are transformed into portable statements that anchor results to relevant information about sample, source of information, time, comparative reference point, magnitude and significance and study-specific conceptions of phenomena. The data extraction method featured here was developed specifically to accommodate mixed-methods mixed research synthesis studies conducted in nursing and other health sciences, but reviewers might find it useful in other kinds of research synthesis studies. This data extraction method itself constitutes a type of integration to preserve the methodological context of findings when statements are read individually and in comparison to each other. © 2012 Blackwell Publishing Ltd.

  11. Extraction method for high free radical scavenging activity of Siamese neem tree flowers

    Directory of Open Access Journals (Sweden)

    Worarat Chaisawangwong

    2009-10-01

    Full Text Available Siamese neem tree (Azadirachta indica A. Juss. var. siamensis Valeton) is a medicinal plant found in Thailand. Young leaves and young flowers of this plant are commonly consumed as a bitter tonic vegetable. The flowers are also used for treatment of fever. The flower extract has been reported to exhibit in vitro free radical scavenging activity and can inhibit lipid peroxidation of a bronchogenic cancer cell line. Active compounds in the flowers are flavonoids such as rutin and quercetin. The content of these compounds in the crude extract depends on the method of extraction. Therefore, the appropriate extraction method promoting a high yield of total flavonoids and high free radical scavenging activity was investigated in this study. Six different extraction methods, i.e. maceration, percolation, decoction, Soxhlet extraction, ultrasonic extraction (UE), and microwave assisted extraction (MA), were carried out for extracting dried powder of Siamese neem tree young flowers. The solvent used for maceration, percolation, and Soxhlet extraction was 50% ethanol, while distilled water was used for decoction and MA, and both solvents were used for UE. The content of crude extract, free radical scavenging activity, and total flavonoid content of each extract were investigated and compared. Comparing the various extraction methods, decoction provided an extract containing a high amount of total flavonoids (17.54 mg RE/g extract) and promoting the highest scavenging activity at EC50 11.36 g/ml. Decoction is also simple, cheap, and convenient and could be used in developing countries. Thus, it should be the recommended extraction method for the flowers of Siamese neem tree for further development of antioxidant pharmaceutical preparations.

  12. The BUME method: a new rapid and simple chloroform-free method for total lipid extraction of animal tissue.

    Science.gov (United States)

    Löfgren, Lars; Forsberg, Gun-Britt; Ståhlman, Marcus

    2016-06-10

    In this study we present a simple and rapid method for tissue lipid extraction. Snap-frozen tissue (15-150 mg) is collected in 2 ml homogenization tubes. 500 μl BUME mixture (butanol:methanol [3:1]) is added and automated homogenization of up to 24 frozen samples at a time in less than 60 seconds is performed, followed by a 5-minute single-phase extraction. After the addition of 500 μl heptane:ethyl acetate (3:1) and 500 μl 1% acetic acid a 5-minute two-phase extraction is performed. Lipids are recovered from the upper phase by automated liquid handling using a standard 96-tip robot. A second two-phase extraction is performed using 500 μl heptane:ethyl acetate (3:1). Validation of the method showed that the extraction recoveries for the investigated lipids, which included sterols, glycerolipids, glycerophospholipids and sphingolipids were similar or better than for the Folch method. We also applied the method for lipid extraction of liver and heart and compared the lipid species profiles with profiles generated after Folch and MTBE extraction. We conclude that the BUME method is superior to the Folch method in terms of simplicity, through-put, automation, solvent consumption, economy, health and environment yet delivering lipid recoveries fully comparable to or better than the Folch method.

  13. The BUME method: a new rapid and simple chloroform-free method for total lipid extraction of animal tissue

    Science.gov (United States)

    Löfgren, Lars; Forsberg, Gun-Britt; Ståhlman, Marcus

    2016-06-01

    In this study we present a simple and rapid method for tissue lipid extraction. Snap-frozen tissue (15-150 mg) is collected in 2 ml homogenization tubes. 500 μl BUME mixture (butanol:methanol [3:1]) is added and automated homogenization of up to 24 frozen samples at a time in less than 60 seconds is performed, followed by a 5-minute single-phase extraction. After the addition of 500 μl heptane:ethyl acetate (3:1) and 500 μl 1% acetic acid a 5-minute two-phase extraction is performed. Lipids are recovered from the upper phase by automated liquid handling using a standard 96-tip robot. A second two-phase extraction is performed using 500 μl heptane:ethyl acetate (3:1). Validation of the method showed that the extraction recoveries for the investigated lipids, which included sterols, glycerolipids, glycerophospholipids and sphingolipids were similar or better than for the Folch method. We also applied the method for lipid extraction of liver and heart and compared the lipid species profiles with profiles generated after Folch and MTBE extraction. We conclude that the BUME method is superior to the Folch method in terms of simplicity, through-put, automation, solvent consumption, economy, health and environment yet delivering lipid recoveries fully comparable to or better than the Folch method.

  14. New developments in FeynRules

    CERN Document Server

    Alloul, Adam; Degrande, Céline; Duhr, Claude; Fuks, Benjamin

    2014-01-01

    The program FeynRules is a Mathematica package developed to facilitate the implementation of new physics theories into high-energy physics tools. Starting from a minimal set of information such as the model gauge symmetries, its particle content, parameters and Lagrangian, FeynRules provides all necessary routines to extract automatically from the Lagrangian (that can also be computed semi-automatically for supersymmetric theories) the associated Feynman rules. These can be further exported to several Monte Carlo event generators through dedicated interfaces, as well as translated into a Python library, under the so-called UFO model format, agnostic of the model complexity, especially in terms of Lorentz and/or color structures appearing in the vertices or of number of external legs. In this work, we briefly report on the most recent new features that have been added to FeynRules, including full support for spin-3/2 fermions, a new module allowing for the automated diagonalization of the particle spectrum and...

  15. THE FACE EXTRACTION METHOD FOR MOBILE DEVICES

    Directory of Open Access Journals (Sweden)

    Viktor Borodin

    2013-10-01

    Full Text Available The problem of automatic face recognition on images is considered. The method of face ellipse extraction from photo and methods for face special points extraction are proposed.

  16. A cascade of classifiers for extracting medication information from discharge summaries

    Directory of Open Access Journals (Sweden)

    Halgrim Scott

    2011-07-01

    Full Text Available Abstract Background Extracting medication information from clinical records has many potential applications, and recently published research, systems, and competitions reflect an interest therein. Much of the early extraction work involved rules and lexicons, but more recently machine learning has been applied to the task. Methods We present a hybrid system consisting of two parts. The first part, field detection, uses a cascade of statistical classifiers to identify medication-related named entities. The second part uses simple heuristics to link those entities into medication events. Results The system achieved performance that is comparable to other approaches to the same task. This performance is further improved by adding features that reference external medication name lists. Conclusions This study demonstrates that our hybrid approach outperforms purely statistical or rule-based systems. The study also shows that a cascade of classifiers works better than a single classifier in extracting medication information. The system is available as is upon request from the first author.
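
    Below is a hedged sketch of the second, heuristic stage only: the first-stage statistical classifiers are stubbed out with pre-tagged entities, and the linking rule ("attach each dose/frequency to the most recent drug within a character window") is an invented simplification of the kind of heuristic described, not the system's actual rules.

```python
# Sketch of the heuristic linking stage only; the first-stage statistical NER is
# stubbed with pre-tagged entities and character offsets (all invented).
entities = [
    {"type": "drug", "text": "metformin",   "start": 10},
    {"type": "dose", "text": "500 mg",      "start": 21},
    {"type": "freq", "text": "twice daily", "start": 28},
    {"type": "drug", "text": "lisinopril",  "start": 60},
    {"type": "dose", "text": "10 mg",       "start": 71},
]

def link_medication_events(entities, max_gap=40):
    """Attach each dose/frequency entity to the most recent drug within max_gap characters."""
    events, last_drug_start = [], None
    for ent in entities:
        if ent["type"] == "drug":
            events.append({"drug": ent["text"]})
            last_drug_start = ent["start"]
        elif events and last_drug_start is not None and ent["start"] - last_drug_start <= max_gap:
            events[-1].setdefault(ent["type"], ent["text"])
    return events

print(link_medication_events(entities))
# -> [{'drug': 'metformin', 'dose': '500 mg', 'freq': 'twice daily'},
#     {'drug': 'lisinopril', 'dose': '10 mg'}]
```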

  17. Comparison of water extraction methods in Tibet based on GF-1 data

    Science.gov (United States)

    Jia, Lingjun; Shang, Kun; Liu, Jing; Sun, Zhongqing

    2018-03-01

    In this study, we compared four different water extraction methods with GF-1 data according to different water types in Tibet, including Support Vector Machine (SVM), Principal Component Analysis (PCA), Decision Tree Classifier based on False Normalized Difference Water Index (FNDWI-DTC), and PCA-SVM. The results show that all four methods can extract large water bodies, but only SVM and PCA-SVM can obtain satisfying extraction results for small water bodies. The methods were evaluated by both overall accuracy (OAA) and Kappa coefficient (KC). The OAA values of PCA-SVM, SVM, FNDWI-DTC and PCA are 96.68%, 94.23%, 93.99% and 93.01%, and the KCs are 0.9308, 0.8995, 0.8962 and 0.8842, respectively, consistent with visual inspection. In summary, SVM is better for narrow river extraction and PCA-SVM is suitable for water extraction of various types. As for dark blue lakes, the methods using PCA can extract them more quickly and accurately.
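
    A minimal sketch of a PCA-plus-SVM pixel classifier of the kind reported to perform best here is shown below, assuming scikit-learn; the four-band reflectance values and water/land labels are synthetic placeholders rather than GF-1 imagery.

```python
# PCA + SVM pixel classifier sketch; the 4-band reflectance values and labels are
# synthetic placeholders, not GF-1 data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
water = rng.normal([0.06, 0.08, 0.05, 0.03], 0.02, size=(500, 4))   # water: low NIR reflectance
land = rng.normal([0.10, 0.12, 0.15, 0.30], 0.04, size=(500, 4))
X = np.vstack([water, land])
y = np.array([1] * 500 + [0] * 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), PCA(n_components=3), SVC(kernel="rbf"))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("overall accuracy:", round(accuracy_score(y_te, pred), 4))
print("kappa coefficient:", round(cohen_kappa_score(y_te, pred), 4))
```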

  18. A simple and fast method for extraction and quantification of cryptophyte phycoerythrin

    DEFF Research Database (Denmark)

    Thoisen, Christina Vinum; Hansen, Benni Winding; Nielsen, Søren Laurentius

    2017-01-01

    The microalgal pigment phycoerythrin (PE) is of commercial interest as a natural colorant in food and cosmetics, as well as a fluoroprobe for laboratory analysis. Several methods for extraction and quantification of PE are available, but they typically comprise various extraction buffers, repetitive freeze-thaw cycles and liquid nitrogen, making extraction procedures more complicated. A simple method for extraction of PE from cryptophytes is described using standard laboratory materials and equipment. Filters with the cryptophyte were frozen (−80 °C) and phosphate buffer was added for extraction at 4 °C, followed by absorbance measurement. The cryptophyte Rhodomonas salina was used as a model organism. •Simple method for extraction and quantification of phycoerythrin from cryptophytes. •Minimal usage of equipment and chemicals, and low labor costs. •Applicable for industrial and biological purposes.

  19. Decision support system for triage management: A hybrid approach using rule-based reasoning and fuzzy logic.

    Science.gov (United States)

    Dehghani Soufi, Mahsa; Samad-Soltani, Taha; Shams Vahdati, Samad; Rezaei-Hachesu, Peyman

    2018-06-01

    Fast and accurate patient triage for the response process is a critical first step in emergency situations. This process is often performed using a paper-based mode, which intensifies workload and difficulty, wastes time, and is at risk of human errors. This study aims to design and evaluate a decision support system (DSS) to determine the triage level. A combination of the Rule-Based Reasoning (RBR) and Fuzzy Logic Classifier (FLC) approaches was used to predict the triage level of patients according to the triage specialist's opinions and Emergency Severity Index (ESI) guidelines. RBR was applied for modeling the first to fourth decision points of the ESI algorithm. The data relating to vital signs were used as input variables and modeled using fuzzy logic. Narrative knowledge was converted to If-Then rules using XML. The extracted rules were then used to create the rule-based engine and predict the triage levels. Fourteen RBR and 27 fuzzy rules were extracted and used in the rule-based engine. The performance of the system was evaluated using three methods with real triage data. The accuracy of the clinical decision support system (CDSS) on the test data was 99.44%. The evaluation of the error rate revealed that, when using the traditional method, 13.4% of the patients were mis-triaged, which is statistically significant. The completeness of the documentation also improved from 76.72% to 98.5%. The designed system was effective in determining the triage level of patients, and it proved helpful for nurses as they made decisions and generated nursing diagnoses based on triage guidelines. The hybrid approach can reduce triage misdiagnosis in a highly accurate manner and improve the triage outcomes. Copyright © 2018 Elsevier B.V. All rights reserved.
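
    A hedged sketch of how crisp if-then rules can be combined with a fuzzy membership function on a vital sign is given below; the thresholds, membership shape and rules are illustrative assumptions, not the ESI decision points or the rule base used in the study.

```python
# Illustrative combination of crisp rules with a fuzzy membership function; the
# thresholds and rules below are invented, not the ESI decision points or the
# rules used in the study.
def fuzzy_tachycardia(heart_rate):
    """Membership in 'tachycardic', rising linearly between 100 and 130 bpm."""
    if heart_rate <= 100:
        return 0.0
    if heart_rate >= 130:
        return 1.0
    return (heart_rate - 100) / 30.0

def triage_level(patient):
    # crisp rule-based part (simplified decision points)
    if patient["unresponsive"]:
        return 1
    if patient["high_risk_condition"]:
        return 2
    # fuzzy part on vital signs: escalate when membership is strong enough
    if fuzzy_tachycardia(patient["heart_rate"]) >= 0.7:
        return 2
    return 3 if patient["resources_needed"] >= 2 else 4

print(triage_level({"unresponsive": False, "high_risk_condition": False,
                    "heart_rate": 125, "resources_needed": 1}))   # -> 2
```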

  20. Simple, miniaturized blood plasma extraction method.

    Science.gov (United States)

    Kim, Jin-Hee; Woenker, Timothy; Adamec, Jiri; Regnier, Fred E

    2013-12-03

    A rapid plasma extraction technology that collects a 2.5 μL aliquot of plasma within three minutes from a finger-stick derived drop of blood was evaluated. The utility of the plasma extraction cards used was that a paper collection disc bearing plasma was produced that could be air-dried in fifteen minutes and placed in a mailing envelope for transport to an analytical laboratory. This circumvents the need for venipuncture and blood collection in specialized vials by a phlebotomist along with centrifugation and refrigerated storage. Plasma extraction was achieved by applying a blood drop to a membrane stack through which plasma was drawn by capillary action. During the course of plasma migration to a collection disc at the bottom of the membrane stack blood cells were removed by a combination of adsorption and filtration. After the collection disc filled with an aliquot of plasma the upper membranes were stripped from the collection card and the collection disc was air-dried. Intercard differences in the volume of plasma collected varied by approximately 1% while volume variations of less than 2% were seen with hematocrit levels ranging from 20% to 71%. Dried samples bearing metabolites and proteins were then extracted from the disc and analyzed. 25-Hydroxy vitamin D was quantified by LC-MS/MS analysis following derivatization with a secosteroid signal enhancing tag that imparted a permanent positive charge to the vitamin and reduced the limit of quantification (LOQ) to 1 pg of collected vitamin on the disc; comparable to values observed with liquid-liquid extraction (LLE) of a venipuncture sample. A similar study using conventional proteomics methods and spectral counting for quantification was conducted with yeast enolase added to serum as an internal standard. The LOQ with extracted serum samples for enolase was 1 μM, linear from 1 to 40 μM, the highest concentration examined. In all respects protein quantification with extracted serum samples was comparable to

  1. Microbial diversity in fecal samples depends on DNA extraction method

    DEFF Research Database (Denmark)

    Mirsepasi, Hengameh; Persson, Søren; Struve, Carsten

    2014-01-01

    BACKGROUND: There are challenges, when extracting bacterial DNA from specimens for molecular diagnostics, since fecal samples also contain DNA from human cells and many different substances derived from food, cell residues and medication that can inhibit downstream PCR. The purpose of the study was to evaluate two different DNA extraction methods in order to choose the most efficient method for studying intestinal bacterial diversity using Denaturing Gradient Gel Electrophoresis (DGGE). FINDINGS: In this study, a semi-automatic DNA extraction system (easyMag®, BioMérieux, Marcy l'Etoile, France) ... by easyMag® from the same fecal samples. Furthermore, DNA extracts obtained using easyMag® seemed to contain inhibitory compounds, since in order to perform a successful PCR-analysis, the sample should be diluted at least 10 times. DGGE performed on PCR from DNA extracted by QIAamp DNA Stool Mini Kit DNA ...

  2. A simple and fast method for extraction and quantification of cryptophyte phycoerythrin

    OpenAIRE

    Thoisen, Christina; Hansen, Benni Winding; Nielsen, Søren Laurentius

    2017-01-01

    The microalgal pigment phycoerythrin (PE) is of commercial interest as a natural colorant in food and cosmetics, as well as a fluoroprobe for laboratory analysis. Several methods for extraction and quantification of PE are available, but they typically comprise various extraction buffers, repetitive freeze-thaw cycles and liquid nitrogen, making extraction procedures more complicated. A simple method for extraction of PE from cryptophytes is described using standard laboratory materials and equipment ...

  3. New method to enhance the extraction yield of rutin from Sophora japonica using a novel ultrasonic extraction system by determining optimum ultrasonic frequency.

    Science.gov (United States)

    Liao, Jianqing; Qu, Baida; Liu, Da; Zheng, Naiqin

    2015-11-01

    A new method has been proposed for enhancing extraction yield of rutin from Sophora japonica, in which a novel ultrasonic extraction system has been developed to perform the determination of optimum ultrasonic frequency by a two-step procedure. This study has systematically investigated the influence of a continuous frequency range of 20-92 kHz on rutin yields. The effects of different operating conditions on rutin yields have also been studied in detail such as solvent concentration, solvent to solid ratio, ultrasound power, temperature and particle size. A higher extraction yield was obtained at the ultrasonic frequency of 60-62 kHz which was little affected under other extraction conditions. Comparative studies between existing methods and the present method were done to verify the effectiveness of this method. Results indicated that the new extraction method gave a higher extraction yield compared with existing ultrasound-assisted extraction (UAE) and soxhlet extraction (SE). Thus, the potential use of this method may be promising for extraction of natural materials on an industrial scale in the future. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Methods for extracting social network data from chatroom logs

    Science.gov (United States)

    Osesina, O. Isaac; McIntire, John P.; Havig, Paul R.; Geiselman, Eric E.; Bartley, Cecilia; Tudoreanu, M. Eduard

    2012-06-01

    Identifying social network (SN) links within computer-mediated communication platforms without explicit relations among users poses challenges to researchers. Our research aims to extract SN links in internet chat with multiple users engaging in synchronous overlapping conversations all displayed in a single stream. We approached this problem using three methods which build on previous research. Response-time analysis builds on the temporal proximity of chat messages; word context usage builds on keyword analysis; and direct addressing infers links by identifying the intended message recipient from the screen name (nickname) referenced in the message [1]. Our analysis of word usage within the chat stream also provides contexts for the extracted SN links. To test the capability of our methods, we used publicly available data from Internet Relay Chat (IRC), a real-time computer-mediated communication (CMC) tool used by millions of people around the world. The extraction performances of individual methods and their hybrids were assessed relative to a ground truth (determined a priori via manual scoring).
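
    A hedged sketch of two of the three cues (direct addressing by nickname and response-time proximity) follows; the messages, evidence weights and the five-second threshold are invented for illustration.

```python
# Hedged sketch of two of the cues described (direct addressing and response-time
# proximity); messages, weights and the 5-second threshold are invented.
from collections import Counter

messages = [
    {"t": 0, "user": "alice", "text": "anyone tried the new build?"},
    {"t": 4, "user": "bob",   "text": "alice: yes, works fine"},
    {"t": 6, "user": "carol", "text": "compiling now"},
    {"t": 9, "user": "alice", "text": "bob thanks!"},
]
users = {m["user"] for m in messages}
links = Counter()

for i, m in enumerate(messages):
    # cue 1: direct addressing -- the message mentions another user's nickname
    for u in users:
        if u != m["user"] and u in m["text"].lower():
            links[(m["user"], u)] += 2                      # stronger evidence
    # cue 2: temporal proximity -- a quick reply to the previous speaker
    if i > 0 and m["t"] - messages[i - 1]["t"] <= 5 and messages[i - 1]["user"] != m["user"]:
        links[(m["user"], messages[i - 1]["user"])] += 1    # weaker evidence

print(links.most_common())
# -> [(('bob', 'alice'), 3), (('alice', 'bob'), 2), (('carol', 'bob'), 1), (('alice', 'carol'), 1)]
```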

  5. A simple and fast method for extraction and quantification of cryptophyte phycoerythrin.

    Science.gov (United States)

    Thoisen, Christina; Hansen, Benni Winding; Nielsen, Søren Laurentius

    2017-01-01

    The microalgal pigment phycoerythrin (PE) is of commercial interest as a natural colorant in food and cosmetics, as well as a fluoroprobe for laboratory analysis. Several methods for extraction and quantification of PE are available, but they typically comprise various extraction buffers, repetitive freeze-thaw cycles and liquid nitrogen, making extraction procedures more complicated. A simple method for extraction of PE from cryptophytes is described using standard laboratory materials and equipment. The cryptophyte cells on the filters were disrupted at -80 °C and phosphate buffer was added for extraction at 4 °C, followed by absorbance measurement. The cryptophyte Rhodomonas salina was used as a model organism. •Simple method for extraction and quantification of phycoerythrin from cryptophytes. •Minimal usage of equipment and chemicals, and low labor costs. •Applicable for industrial and biological purposes.

  6. Evaluation of five methods for total DNA extraction from western corn rootworm beetles.

    Directory of Open Access Journals (Sweden)

    Hong Chen

    Full Text Available BACKGROUND: DNA extraction is a routine step in many insect molecular studies. A variety of methods have been used to isolate DNA molecules from insects, and many commercial kits are available. Extraction methods need to be evaluated for their efficiency, cost, and side effects such as DNA degradation during extraction. METHODOLOGY/PRINCIPAL FINDINGS: From individual western corn rootworm beetles, Diabrotica virgifera virgifera, DNA extractions by the SDS method, CTAB method, DNAzol reagent, Puregene solutions and DNeasy column were compared in terms of DNA quantity and quality, cost of materials, and time consumed. Although all five methods resulted in acceptable DNA concentrations and absorbance ratios, the SDS and CTAB methods resulted in higher DNA yield (ng DNA vs. mg tissue) at much lower cost and with less degradation, as revealed on agarose gels. The DNeasy kit was most time-efficient but was the costliest among the methods tested. The effects of ethanol volume, temperature and incubation time on precipitation of DNA were also investigated. The DNA samples obtained by the five methods were tested in PCR for six microsatellites located in various positions of the beetle's genome, and all samples showed successful amplifications. CONCLUSION/SIGNIFICANCE: These evaluations provide a guide for choosing methods of DNA extraction from western corn rootworm beetles based on expected DNA yield and quality, extraction time, cost, and waste control. The extraction conditions for this mid-size insect were optimized. The DNA extracted by the five methods was suitable for further molecular applications such as PCR and sequencing by synthesis.

  7. A Novel Lipid Extraction Method from Wet Microalga Picochlorum sp. at Room Temperature

    Directory of Open Access Journals (Sweden)

    Fangfang Yang

    2014-03-01

    Full Text Available A novel method using ethanol was proposed for extracting lipids from wet microalga Picochlorum sp. at room temperature and pressure. In this study, Central Composite Design (CCD) was applied to investigate the optimum conditions of lipid extraction. The results revealed that the solvent to biomass ratio had the largest effect on lipid extraction efficiency, followed by extraction time and temperature. A high lipid extraction yield (33.04% of the dry weight) was obtained under the following extraction conditions: 5 mL of solvent per gram of wet biomass for 37 min with gentle stirring at room temperature. The extraction yield was comparable to that obtained by the widely used Bligh-Dyer method. Furthermore, no significant differences in the distribution of lipid classes and fatty acid composition were observed according to different extraction methods. In conclusion, these results indicated that the proposed procedure using ethanol could extract lipids from wet biomass efficiently and had great potential for lipid extraction at large scale.

  8. Determination of the antioxidant capacity of two seagrass species according to the extraction method

    Directory of Open Access Journals (Sweden)

    Kethia L. González

    2016-10-01

    Full Text Available Context: There is a wide variety of methods for obtaining plant extracts that provide a good yield of bioactive metabolites. For several years, extraction techniques have been refined to obtain natural extracts with powerful pharmacological properties. Aims: To determine the influence of various extraction methods (infusion, decoction, microwave, maceration with heat and agitation, and constant heat and agitation) on the content of solids and phenolic compounds and the antioxidant capacity of the marine angiosperms Thalassia testudinum Banks ex König (Hydrocharitaceae) and Syringodium filiforme Kützing (Cymodoceaceae). Methods: The soluble solids content was determined by the gravimetric method, the total phenolic content using the Folin-Ciocalteu method, and the antioxidant capacity by the 2,2-diphenyl-1-picrylhydrazyl (DPPH) method. Results: The results showed the effectiveness of extraction by decoction for T. testudinum and by microwave for S. filiforme, among the methods that use water as the extraction solvent. Among those that use the hydroalcoholic mixture as extraction solvent, maceration with agitation and heat showed the higher yields of soluble solids and total polyphenols, as well as a higher antioxidant activity for both species. Conclusions: The results showed the effectiveness of extraction by decoction for T. testudinum and by microwave for S. filiforme, among the methods that use water as extraction solvent. Among those that use the hydroalcoholic mixture as extraction solvent, maceration with agitation and heat showed the higher yields of soluble solids and total polyphenols, as well as a higher antioxidant activity for both species.

  9. Extracting classification rules from an informatic security incidents repository by genetic programming

    Directory of Open Access Journals (Sweden)

    Carlos Javier Carvajal Montealegre

    2015-04-01

    Full Text Available This paper describes the data mining process used to obtain classification rules over an information security incident data collection, explaining in detail the use of genetic programming as a means to model the incidents' behavior and to represent such rules as decision trees. The described mining process includes several tasks, such as evaluation of the GP (Genetic Programming) approach, the representation of individuals, and the tuning of algorithm parameters to improve performance. The paper concludes with the analysis of the results and the description of the rules obtained, suggesting measures to avoid the occurrence of new informatics attacks. This paper is part of the thesis work: Information Security Incident Analytics by Data Mining for Behavioral Modeling and Pattern Recognition (Carvajal, 2012).
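
    A drastically simplified sketch of evolving classification rules is given below: candidate rules are re-sampled at random and selected by classification accuracy, which stands in for, but does not reproduce, the genetic programming setup (tree-shaped individuals, crossover, mutation) described above; the toy incident records are invented.

```python
# Simplified evolutionary search for a two-condition rule (a crude stand-in for GP):
# candidate rules are re-sampled at random and selected by classification accuracy.
import random
random.seed(1)

# toy incident records: (port_scanned, failed_logins, is_attack) -- invented
data = [(1, 8, 1), (1, 2, 1), (0, 9, 1), (0, 1, 0), (0, 2, 0), (1, 0, 0)]

def accuracy(rule):
    thr_scan, thr_logins = rule
    return sum(int(ps >= thr_scan or fl >= thr_logins) == y for ps, fl, y in data) / len(data)

def random_rule():
    return (random.randint(0, 2), random.randint(0, 10))   # thresholds for the two attributes

population = [random_rule() for _ in range(20)]
for _ in range(30):                                        # generations
    population.sort(key=accuracy, reverse=True)
    survivors = population[:5]                             # selection
    population = survivors + [random_rule() for _ in range(15)]

best = max(population, key=accuracy)
print(f"attack if port_scanned >= {best[0]} or failed_logins >= {best[1]}",
      "| accuracy:", accuracy(best))
```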

  10. Evaluation of a rule-based method for epidemiological document classification towards the automation of systematic reviews.

    Science.gov (United States)

    Karystianis, George; Thayer, Kristina; Wolfe, Mary; Tsafnat, Guy

    2017-06-01

    Most data extraction efforts in epidemiology are focused on obtaining targeted information from clinical trials. In contrast, limited research has been conducted on the identification of information from observational studies, a major source for human evidence in many fields, including environmental health. The recognition of key epidemiological information (e.g., exposures) through text mining techniques can assist in the automation of systematic reviews and other evidence summaries. We designed and applied a knowledge-driven, rule-based approach to identify targeted information (study design, participant population, exposure, outcome, confounding factors, and the country where the study was conducted) from abstracts of epidemiological studies included in several systematic reviews of environmental health exposures. The rules were based on common syntactical patterns observed in text and are thus not specific to any systematic review. To validate the general applicability of our approach, we compared the data extracted using our approach versus hand curation for 35 epidemiological study abstracts manually selected for inclusion in two systematic reviews. The returned F-score, precision, and recall ranged from 70% to 98%, 81% to 100%, and 54% to 97%, respectively. The highest precision was observed for exposure, outcome and population (100%) while recall was best for exposure and study design with 97% and 89%, respectively. The lowest recall was observed for the population (54%), which also had the lowest F-score (70%). The performance of our text-mining approach demonstrated encouraging results for the identification of targeted information from observational epidemiological study abstracts related to environmental exposures. We have demonstrated that rules based on generic syntactic patterns in one corpus can be applied to other observational study designs by simply interchanging the dictionaries aimed at identifying certain characteristics (i.e., outcomes
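
    A hedged sketch of dictionary-plus-pattern extraction from an abstract follows; the term lists and the single regular-expression pattern are toy assumptions, not the rules used in the study.

```python
# Toy dictionary-plus-pattern extractor; term lists and the pattern are invented,
# not the rules used in the study.
import re

EXPOSURES = ["arsenic", "lead", "particulate matter", "bisphenol a"]
OUTCOMES = ["asthma", "lung cancer", "hypertension"]
DESIGNS = ["cohort", "case-control", "cross-sectional"]

def extract_fields(abstract):
    text = abstract.lower()
    fields = {
        "study_design": [d for d in DESIGNS if d in text],
        "exposure": [e for e in EXPOSURES if e in text],
        "outcome": [o for o in OUTCOMES if o in text],
    }
    # simple syntactic pattern for the study country: "conducted in <Capitalized word>"
    match = re.search(r"conducted in ([A-Z][a-z]+)", abstract)
    fields["country"] = match.group(1) if match else None
    return fields

example = ("We performed a cohort study conducted in Canada examining arsenic in "
           "drinking water and the risk of lung cancer.")
print(extract_fields(example))
# -> {'study_design': ['cohort'], 'exposure': ['arsenic'], 'outcome': ['lung cancer'], 'country': 'Canada'}
```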

  11. Comparison of DNA extraction methods for human gut microbial community profiling.

    Science.gov (United States)

    Lim, Mi Young; Song, Eun-Ji; Kim, Sang Ho; Lee, Jangwon; Nam, Young-Do

    2018-03-01

    The human gut harbors a vast range of microbes that have significant impact on health and disease. Therefore, gut microbiome profiling holds promise for use in early diagnosis and precision medicine development. Accurate profiling of the highly complex gut microbiome requires DNA extraction methods that provide sufficient coverage of the original community as well as adequate quality and quantity. We tested nine different DNA extraction methods using three commercial kits (TianLong Stool DNA/RNA Extraction Kit (TS), QIAamp DNA Stool Mini Kit (QS), and QIAamp PowerFecal DNA Kit (QP)) with or without additional bead-beating step using manual or automated methods and compared them in terms of DNA extraction ability from human fecal sample. All methods produced DNA in sufficient concentration and quality for use in sequencing, and the samples were clustered according to the DNA extraction method. Inclusion of bead-beating step especially resulted in higher degrees of microbial diversity and had the greatest effect on gut microbiome composition. Among the samples subjected to bead-beating method, TS kit samples were more similar to QP kit samples than QS kit samples. Our results emphasize the importance of mechanical disruption step for a more comprehensive profiling of the human gut microbiome. Copyright © 2017 The Authors. Published by Elsevier GmbH.. All rights reserved.

  12. Coconut oil extraction by the Java method: An investigation of its potential application in aqueous Jatropha oil extraction

    NARCIS (Netherlands)

    Marasabessy, A.; Moeis, M.R.; Sanders, J.P.M.; Weusthuis, R.A.

    2010-01-01

    A traditional Java method of coconut oil extraction assisted by paddy crabs was investigated to find out if crabs or crab-derived components can be used to extract oil from Jatropha curcas seed kernels. Using the traditional Java method the addition of crab paste liberated 54% w w-1 oil from grated

  13. Direct DNA extraction method of an obligate parasitic fungus from infected plant tissue.

    Science.gov (United States)

    Liu, L; Wang, C L; Peng, W Y; Yang, J; Lan, M Q; Zhang, B; Li, J B; Zhu, Y Y; Li, C Y

    2015-12-28

    Powdery mildew and rust fungi are obligate parasites that cannot live without host organisms. They are difficult to culture in synthetic medium in the laboratory. Genomic DNA extraction is one of the basic molecular techniques used to study the genetic structure of populations. In this study, 2 different DNA extraction methods, Chelex-100 and cetyltrimethylammonium bromide (CTAB), were used to extract DNA from euonymus powdery mildew and Puccinia striiformis f. sp Tritici. Polymerase chain reaction was carried out with a race-specific-marker rDNA-internal transcribed spacer sequence. Both DNA extraction methods were compared and analyzed. The results showed that both Chelex-100 and CTAB were effective for extracting genomic DNA from infected plant tissue. However, less DNA was required for the Chelex-100 method than for the CTAB method, and the Chelex-100 method involved fewer steps, was simpler and safer, and did not require organic solvents compared to the CTAB method. DNA quality was evaluated by polymerase chain reaction, and the results showed that genomic DNA extracted using the Chelex-100 method was better than that using CTAB method, and was sufficient for studying the genetic structure of population.

  14. An efficient method for DNA extraction from Cladosporioid fungi.

    Science.gov (United States)

    Moslem, M A; Bahkali, A H; Abd-Elsalam, K A; Wit, P J G M

    2010-11-23

    We developed an efficient method for DNA extraction from Cladosporioid fungi, which are important fungal plant pathogens. The cell wall of Cladosporioid fungi is often melanized, which makes it difficult to extract DNA from their cells. In order to overcome this we grew these fungi for three days on agar plates and extracted DNA from mycelium mats after manual or electric homogenization. High-quality DNA was isolated, with an A(260)/A(280) ratio ranging between 1.6 and 2.0. Isolated genomic DNA was efficiently digested with restriction enzymes and produced distinct banding patterns on agarose gels for the different Cladosporium species. Clear DNA fragments from the isolated DNA were amplified by PCR using small and large subunit rDNA primers, demonstrating that this method provides DNA of sufficiently high quality for molecular analyses.

  15. Evaluation of the essential oil of Foeniculum vulgare Mill (fennel) fruits extracted by three different extraction methods by GC/MS.

    Science.gov (United States)

    Hammouda, Faiza M; Saleh, Mahmoud A; Abdel-Azim, Nahla S; Shams, Khaled A; Ismail, Shams I; Shahat, Abdelaaty A; Saleh, Ibrahim A

    2014-01-01

    Hydrodistillation (HD), steam distillation and solvent extraction methods for essential oils have some disadvantages, such as thermal decomposition of the extracts, their contamination with solvent or solvent residues, and the pollution of the residual vegetal material with solvent, which can also be an environmental problem. Thus, new green techniques, such as supercritical fluid extraction and microwave-assisted techniques, are potential solutions to overcome these disadvantages. The aim of this study was to evaluate the essential oil of Foeniculum vulgare subsp. piperitum fruits extracted by three different extraction methods, viz. supercritical fluid extraction (SFE) using CO2, microwave-assisted extraction (MAE) and hydrodistillation (HD), using gas chromatography-mass spectrometry (GC/MS). The results revealed that both MAE and SFE enhanced the extraction efficiency of the components of interest. MAE gave the highest yield of oil as well as a higher percentage of fenchone (28%), whereas SFE gave the highest percentage of anethole (72%). Microwave-assisted extraction and supercritical fluid extraction not only enhanced the essential oil extraction but also saved time, reduced solvent use and represent ecologically green technologies.

  16. N-nitrosodimethylamine in drinking water using a rapid, solid-phase extraction method

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, S W.D. [Ministry of Environment and Energy, Etobicoke, ON (Canada). Lab. Services Branch; Koester, C J [Ministry of Environment and Energy, Etobicoke, ON (Canada). Lab. Services Branch; Taguchi, V Y [Ministry of Environment and Energy, Etobicoke, ON (Canada). Lab. Services Branch; Wang, D T [Ministry of Environment and Energy, Etobicoke, ON (Canada). Lab. Services Branch; Palmentier, J P.F.P. [Ministry of Environment and Energy, Etobicoke, ON (Canada). Lab. Services Branch; Hong, K P [Ministry of Environment and Energy, Etobicoke, ON (Canada). Lab. Services Branch

    1995-12-01

    A simple, rapid method for the extraction of N-nitrosodimethylamine (NDMA) from drinking and surface waters was developed using Ambersorb 572. Development of an alternative method to classical liquid-liquid extraction techniques was necessary to handle the workload presented by implementation of a provincial guideline of 9 ppt for drinking water and a regulatory level of 200 ppt for effluents. A granular absorbent, Ambersorb 572, was used to extract the NDMA from the water in the sample bottle. The NDMA was extracted from the Ambersorb 572 with dichloromethane in the autosampler vial. Method characteristics include a precision of 4% for replicate analyses, and accuracy of 6% at 10 ppt and a detection limit of 1.0 ppt NDMA in water. Comparative data between the Ambersorb 572 method and liquid-liquid extraction showed excellent agreement (average difference of 12%). With the Ambersorb 572 method, dichloromethane use has been reduced by a factor of 1,000 and productivity has been increased by a factor of 3-4. Monitoring of a drinking water supply showed rapidly changing concentrations of NDMA from day to day. (orig.)

  17. An efficient method for DNA extraction from Cladosporioid fungi

    OpenAIRE

    Moslem, M.A.; Bahkali, A.H.; Abd-Elsalam, K.A.; de Wit, P.J.G.M.

    2010-01-01

    We developed an efficient method for DNA extraction from Cladosporioid fungi, which are important fungal plant pathogens. The cell wall of Cladosporioid fungi is often melanized, which makes it difficult to extract DNA from their cells. In order to overcome this we grew these fungi for three days on agar plates and extracted DNA from mycelium mats after manual or electric homogenization. High-quality DNA was isolated, with an A260/A280 ratio ranging between 1.6 and 2.0. Isolated genomic DNA w...

  18. A NOVEL WRAPPING CURVELET TRANSFORMATION BASED ANGULAR TEXTURE PATTERN (WCTATP EXTRACTION METHOD FOR WEED IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    D. Ashok Kumar

    2016-02-01

    Full Text Available Weed is a major menace in crop production, as it competes with the crop for nutrients, moisture, space and light, resulting in poor growth and development of the crop and, ultimately, reduced yield. Yield loss can exceed 70% when crops are grown under unweeded conditions with severe weed infestation. Weed management is a critical process in agricultural applications to improve crop productivity and reduce herbicide application costs. Existing weed detection techniques do not perform well due to the complex background, illumination variation, and crop and weed overlapping in agricultural field images. Hence, there is a need to develop an effective weed identification technique. To overcome this drawback, this paper proposes a novel Wrapping Curvelet Transformation Based Angular Texture Pattern Extraction Method (WCTATP) for weed identification. In our proposed work, Global Histogram Equalization (GHE) is used to improve the quality of the image and an Adaptive Median Filter (AMF) is used for filtering impulse noise from the image. Plant image identification is performed using green pixel extraction and k-means clustering. The Wrapping Curvelet transform is applied to the plant image. Feature extraction is performed to extract the angular texture pattern of the plant image. A Particle Swarm Optimization (PSO) based Differential Evolution Feature Selection (DEFS) approach is applied to select the optimal features. Then, the selected features are learned and passed through an RVM-based classifier to detect the weed. Edge detection and contouring are performed to identify the weed in the plant image. A fuzzy rule-based approach is applied to detect low, medium and high levels of weed patchiness. From the experimental results, it is clearly observed that the accuracy of the proposed approach is higher than that of existing Support Vector Machine (SVM) based approaches. The proposed approach
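
    A hedged sketch of the early segmentation stage only (an excess-green index followed by k-means clustering) is shown below; the placeholder image is random data, and the curvelet texture features, PSO-based feature selection and RVM classifier described above are not reproduced.

```python
# Sketch of the early segmentation stage only (excess-green index + k-means);
# the RGB image here is random placeholder data, and the curvelet, PSO-DEFS and
# RVM stages described above are not reproduced.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
image = rng.random((64, 64, 3))                 # placeholder RGB field image in [0, 1]

r, g, b = image[..., 0], image[..., 1], image[..., 2]
excess_green = 2 * g - r - b                    # vegetation appears bright in this index

pixels = excess_green.reshape(-1, 1)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
# call the cluster with the higher mean excess-green value "plant", the other "soil"
plant_cluster = int(np.argmax([pixels[labels == k].mean() for k in (0, 1)]))
plant_mask = (labels == plant_cluster).reshape(64, 64)
print("fraction of pixels classified as vegetation:", round(float(plant_mask.mean()), 3))
```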

  19. An automatic rat brain extraction method based on a deformable surface model.

    Science.gov (United States)

    Li, Jiehua; Liu, Xiaofeng; Zhuo, Jiachen; Gullapalli, Rao P; Zara, Jason M

    2013-08-15

    The extraction of the brain from the skull in medical images is a necessary first step before image registration or segmentation. While pre-clinical MR imaging studies on small animals, such as rats, are increasing, fully automatic imaging processing techniques specific to small animal studies remain lacking. In this paper, we present an automatic rat brain extraction method, the Rat Brain Deformable model method (RBD), which adapts the popular human brain extraction tool (BET) through the incorporation of information on the brain geometry and MR image characteristics of the rat brain. The robustness of the method was demonstrated on T2-weighted MR images of 64 rats and compared with other brain extraction methods (BET, PCNN, PCNN-3D). The results demonstrate that RBD reliably extracts the rat brain with high accuracy (>92% volume overlap) and is robust against signal inhomogeneity in the images. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Determination of α_s from Gross-Llewellyn Smith sum rule by accounting for infrared renormalon

    Energy Technology Data Exchange (ETDEWEB)

    Contreras, C [Department of Physics, Universidad Tecn. Federico Santa Maria, Valparaiso (Chile); Cvetic, G [Department of Physics, Universidad Tecn. Federico Santa Maria, Valparaiso (Chile); Jeong, K S [Department of Physics, Korea Advanced Institute of Science and Technology, Daejon (Korea, Republic of); Lee, Taekoon [Department of Physics, Korea Advanced Institute of Science and Technology, Daejon (Korea, Republic of)

    2003-08-01

    We recapitulate the method which resums the truncated perturbation series of a physical observable in a way which takes into account the structure of the leading infrared renormalon. We apply the method to the Gross-Llewellyn Smith (GLS) sum rule. By confronting the obtained result with the experimentally extracted GLS value, we determine the value of the QCD coupling parameter, which turns out to agree with the present world average.

  1. Comparison of two silica-based extraction methods for DNA isolation from bones.

    Science.gov (United States)

    Rothe, Jessica; Nagy, Marion

    2016-09-01

    One of the most demanding DNA extractions is from bones and teeth due to the robustness of the material and the relatively low DNA content. The greatest challenge is due to the manifold nature of the material, which is defined by various factors, including age, storage, environmental conditions, and contamination with inhibitors. However, most published protocols do not distinguish between different types or qualities of bone material, but are described as being generally applicable. Our laboratory works with two different extraction methods based on silica membranes or the use of silica beads. We compared the amplification success of the two methods from bone samples with different qualities and in the presence of inhibitors. We found that DNA extraction using the silica membrane method results in a higher DNA yield but also a higher risk of co-extracting impurities, which can act as inhibitors. In contrast, the silica beads method shows decreased co-extraction of inhibitors but also a lower DNA yield. Based on our own experience, each bone material should be evaluated independently regarding the analysis and extraction method. Therefore, the most ambitious task is determining the quality of the bone material, which requires substantial experience. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. Fusion rule estimation using vector space methods

    International Nuclear Information System (INIS)

    Rao, N.S.V.

    1997-01-01

    In a system of N sensors, the sensor S_j, j = 1, 2, ..., N, outputs Y^(j) ∈ ℝ according to an unknown probability distribution P(Y^(j) | X), corresponding to input X ∈ [0, 1]. A training n-sample (X_1, Y_1), (X_2, Y_2), ..., (X_n, Y_n) is given, where Y_i = (Y_i^(1), Y_i^(2), ..., Y_i^(N)) such that Y_i^(j) is the output of S_j in response to input X_i. The problem is to estimate a fusion rule f : ℝ^N → [0, 1], based on the sample, such that the expected square error is minimized over a family of functions F that constitute a vector space. The function f* that minimizes the expected error cannot be computed since the underlying densities are unknown, and only an approximation f to f* is feasible. We estimate the sample size sufficient to ensure that f provides a close approximation to f* with high probability. The advantages of vector space methods are two-fold: (a) the sample size estimate is a simple function of the dimensionality of F, and (b) the estimate f can be easily computed by well-known least squares methods in polynomial time. The results are applicable to the classical potential function methods and also to a recently proposed special class of sigmoidal feedforward neural networks.
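
    A minimal sketch of the least-squares computation the abstract refers to is shown below: fitting a fusion rule over a small basis of functions (a constant plus one linear term per sensor); the sensor model, basis and sample are illustrative assumptions.

```python
# Least-squares estimate of a fusion rule over a simple basis (constant plus one
# linear term per sensor); the sensor model and sample are invented.
import numpy as np

rng = np.random.default_rng(0)
n, N = 200, 3                                   # sample size, number of sensors
X = rng.uniform(0, 1, n)                        # input X in [0, 1]
Y = X[:, None] + rng.normal(0, 0.1, (n, N))     # noisy sensor outputs Y^(j)

Phi = np.column_stack([np.ones(n), Y])          # basis functions spanning the vector space F
coef, *_ = np.linalg.lstsq(Phi, X, rcond=None)  # minimizes the empirical squared error

fused = Phi @ coef
print("coefficients:", coef.round(3))
print("mean squared error:", round(float(np.mean((fused - X) ** 2)), 4))
```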

  3. Methods for producing extracted and digested products from pretreated lignocellulosic biomass

    Science.gov (United States)

    Chundawat, Shishir; Sousa, Leonardo Da Costa; Cheh, Albert M.; Balan, Venkatesh; Dale, Bruce

    2017-05-16

    Methods for producing extracted and digested products from pretreated lignocellulosic biomass are provided. The methods include converting native cellulose I.sub..beta. to cellulose III.sub.I by pretreating the lignocellulosic biomass with liquid ammonia under certain conditions, and performing extracting or digesting steps on the pretreated/converted lignocellulosic biomass.

  4. Improvement of extraction method of coagulation active components from Moringa oleifera seed

    OpenAIRE

    Okuda, Tetsuji; Baes, Aloysius U.; Nishijima, Wataru; Okada, Mitsumasa

    1999-01-01

    A new method for the extraction of the active coagulation component from Moringa oleifera seeds was developed and compared with the ordinary water extraction method (MOC–DW). In the new method, 1.0 mol l-1 solution of sodium chloride (MOC–SC) and other salts were used for extraction of the active coagulation component. Batch coagulation experiments were conducted using 500 ml of low turbid water (50 NTU). Coagulation efficiencies were evaluated based on the dosage required to remove kaolinite...

  5. Comparison of Heuristics for Inhibitory Rule Optimization

    KAUST Repository

    Alsolami, Fawaz

    2014-09-13

    Knowledge representation and extraction are very important tasks in data mining. In this work, we propose a variety of rule-based greedy algorithms that are able to express the knowledge contained in a given dataset as a series of inhibitory rules containing an expression “attribute ≠ value” on the right-hand side. The main goal of this paper is to determine, based on the rule characteristics of length and coverage, whether the proposed rule heuristics are statistically significantly different; if so, we aim to identify the best performing rule heuristics for minimization of rule length and maximization of rule coverage. The Friedman test with Nemenyi post-hoc analysis is used to compare the greedy algorithms statistically against each other for length and coverage. The experiments are carried out on real datasets from the UCI Machine Learning Repository. For the leading heuristics, the constructed rules are compared with optimal ones obtained by a dynamic programming approach. The results seem promising for the best heuristics: the average relative difference between the length (coverage) of constructed and optimal rules is at most 2.27% (7%, respectively). Furthermore, the quality of classifiers based on sets of inhibitory rules constructed by the considered heuristics is compared, and the results show that the three best heuristics in terms of classification accuracy coincide with the three best-performing heuristics in terms of rule length minimization.
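
    A minimal sketch of how such a comparison can be run with SciPy's Friedman test follows; the rule-length values are made up for illustration, and the Nemenyi post-hoc step would follow, for example via the scikit-posthocs package.

```python
# Friedman test over made-up average rule lengths of three heuristics on five
# datasets; a Nemenyi post-hoc comparison would follow (e.g. via scikit-posthocs).
from scipy.stats import friedmanchisquare

heuristic_a = [3.1, 2.8, 4.0, 3.5, 2.9]   # rows correspond to datasets
heuristic_b = [2.7, 2.5, 3.6, 3.1, 2.6]
heuristic_c = [3.4, 3.0, 4.2, 3.8, 3.2]

stat, p = friedmanchisquare(heuristic_a, heuristic_b, heuristic_c)
print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")
if p < 0.05:
    print("at least one heuristic differs; a Nemenyi post-hoc test identifies which pairs")
```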

  6. Comparison of three mycobacterial DNA extraction methods from extrapulmonary samples for PCR assay

    Directory of Open Access Journals (Sweden)

    Khandaker Shadia

    2012-01-01

    Full Text Available The sensitivity of molecular diagnostic tests for extrapulmonary tuberculosis largely depends upon the efficiency of the DNA extraction method. The objective of our study was to compare three methods of extracting DNA of Mycobacterium tuberculosis for testing by polymerase chain reaction. All three methods (heating, heating with sonication, and addition of lysis buffer with heating and sonication) were applied to 20 extrapulmonary samples. PCR positivity was 2 (10%), 4 (20%) and 7 (35%) in the samples extracted by the heating, heat+sonication and heat+sonication+lysis buffer methods, respectively. Of the extraction methods evaluated, the maximum number of PCR-positive results was achieved by the combined heat, sonication and lysis buffer method, which can be applied in routine clinical practice. Ibrahim Med. Coll. J. 2012; 6(1): 9-11

  7. Insect lipid profile: aqueous versus organic solvent-based extraction methods

    NARCIS (Netherlands)

    Tzompa Sosa, D.A.; Yi, L.; Valenberg, van H.J.F.; Boekel, van M.A.J.S.; Lakemond, C.M.M.

    2014-01-01

    In view of future expected industrial bio-fractionation of insects, we investigated the influence of extraction methods on chemical characteristics of insect lipids. Lipids from Tenebrio molitor, Alphitobius diaperinus, Acheta domesticus and Blaptica dubia, reared in the Netherlands, were extracted

  8. Comparison of DNA extraction methods for detection of citrus huanglongbing in Colombia

    Directory of Open Access Journals (Sweden)

    Jorge Evelio Ángel

    2014-04-01

    Full Text Available Four DNA extraction protocols for citrus plant tissue and three methods of DNA extraction from the vector psyllid Diaphorina citri Kuwayama (Hemiptera: Psyllidae) were compared as part of the validation and standardization process for detection of huanglongbing (HLB). The comparison was done using several criteria, such as integrity, purity and concentration. The best quality parameters for extraction of DNA from citrus plant midrib tissue were obtained with the protocols of Murray and Thompson (1980) and Rodríguez et al. (2010), while for DNA extraction from the psyllid vector of HLB, the best extraction method was that suggested by Manjunath et al. (2008).

  9. A new method for microwave assisted ethanolic extraction of Mentha rotundifolia bioactive terpenoids.

    Science.gov (United States)

    García-Sarrió, María Jesús; Sanz, María Luz; Sanz, Jesús; González-Coloma, Azucena; Cristina Soria, Ana

    2018-04-14

    A new microwave-assisted extraction (MAE) method using ethanol as solvent has been optimized by means of a Box-Behnken experimental design for the enhanced extraction of bioactive terpenoids from Mentha rotundifolia leaves; 100°C, 5 min, 1.125 g dry sample: 10 mL solvent and a single extraction cycle were selected as optimal conditions. Improved performance of MAE method in terms of extraction yield and/or reproducibility over conventional solid-liquid extraction and ultrasound assisted extraction was also previously assessed. A comprehensive characterization of MAE extracts was carried out by GC-MS. A total of 46 compounds, mostly terpenoids, were identified; piperitenone oxide and piperitenone were the major compounds determined. Several neophytadiene isomers were also detected for the first time in MAE extracts. Different procedures (solid-phase extraction and activated charcoal (AC) treatment) were also evaluated for clean-up of MAE extracts, with AC providing the highest enrichment in bioactive terpenoids. Finally, the MAE method here developed is shown as a green, fast, efficient and reproducible liquid extraction methodology to obtain M. rotundifolia bioactive extracts for further application, among others, as food preservatives. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Exact extraction method for road rutting laser lines

    Science.gov (United States)

    Hong, Zhiming

    2018-02-01

    This paper analyzes the importance of asphalt pavement rutting detection for pavement maintenance and administration in today's society. The shortcomings of existing rutting detection methods are presented, and a new rutting line-laser extraction method based on the peak intensity characteristic and peak continuity is proposed. The peak intensity characteristic is enhanced by a designed transverse mean filter, and an intensity map of the peak characteristic, computed over the whole road image, is obtained to determine the seed point of the rutting laser line. Taking the seed point as the starting point, the light points of the rutting line-laser are extracted based on peak continuity, providing exact basic data for the subsequent calculation of pavement rutting depths.
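    A hedged numpy sketch of the two ingredients described above, a transverse mean filter to enhance the line's peak intensity and seed-point plus continuity tracking, using a synthetic image and illustrative window sizes (not the paper's implementation):

```python
import numpy as np

# Synthetic road image with one bright laser-line pixel per column plus noise.
h, w = 120, 160
img = np.random.rand(h, w) * 0.2
true_row = (60 + 10 * np.sin(np.linspace(0, 3, w))).astype(int)
img[true_row, np.arange(w)] += 1.0

# Transverse mean filter: average each pixel with its vertical neighbours.
k = 3
kernel = np.ones(2 * k + 1) / (2 * k + 1)
enhanced = np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="same"), 0, img)

# Seed point: global maximum of the enhanced peak-intensity map.
seed_r, seed_c = np.unravel_index(np.argmax(enhanced), enhanced.shape)

def track(start_r, cols, window=5):
    """Continuity tracking: in each neighbouring column keep the strongest peak
    within a small vertical window around the previous row."""
    rows, r = [], start_r
    for c in cols:
        lo, hi = max(0, r - window), min(h, r + window + 1)
        r = lo + int(np.argmax(enhanced[lo:hi, c]))
        rows.append((r, c))
    return rows

line = (track(seed_r, range(seed_c - 1, -1, -1))[::-1]
        + [(seed_r, seed_c)]
        + track(seed_r, range(seed_c + 1, w)))
print("extracted", len(line), "laser-line points; first:", line[0])
```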

  11. Analysis of the essential oils of Alpiniae Officinarum Hance in different extraction methods

    Science.gov (United States)

    Yuan, Y.; Lin, L. J.; Huang, X. B.; Li, J. H.

    2017-09-01

    A method was developed for the analysis of the essential oils of Alpiniae Officinarum Hance extracted by steam distillation (SD), ultrasonic assisted solvent extraction (UAE) and supercritical fluid extraction (SFE), using gas chromatography mass spectrometry (GC-MS) combined with the retention index (RI) method. Multiple volatile components were identified in the oils extracted by each of the three above-mentioned methods, and each component was quantified by the area normalization method. The results indicated that the content of 1,8-cineole, the index constituent, obtained by SD was similar to that by SFE, and higher than that by UAE. Although UAE was less time consuming and consumed less energy, the oil quality was poorer because the organic solvents used are hard to degrade. In addition, some constituents could be obtained by SFE but not by SD. In conclusion, the essential oils obtained by different extraction methods from the same batch of material proved broadly similar; however, there were some differences in composition and component ratios. Therefore, the extraction method must be selected according to the functional requirements of the products.
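    Area normalization, as used above for quantification, simply reports each peak's area as a percentage of the total integrated area; the peak areas below are made-up numbers for illustration:

```python
# Area normalization: each component's percentage is its peak area divided by the total
# integrated peak area. The areas are made-up (1,8-cineole plus two hypothetical
# companion peaks), only to show the calculation.
peak_areas = {"1,8-cineole": 152_300, "peak A": 48_900, "peak B": 12_400}
total = sum(peak_areas.values())
for name, area in peak_areas.items():
    print(f"{name}: {100 * area / total:.2f}% of total peak area")
```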

  12. Action Rules Mining

    CERN Document Server

    Dardzinska, Agnieszka

    2013-01-01

    We are surrounded by data, numerical, categorical and otherwise, which must be analyzed and processed to convert it into information that instructs, answers or aids understanding and decision making. Data analysts in many disciplines, such as business, education or medicine, are frequently asked to analyze new data sets which are often composed of numerous tables possessing different properties. They try to find completely new correlations between attributes and show new possibilities for users.   Action rules mining discusses some data mining and knowledge discovery principles and then describes representative concepts, methods and algorithms connected with action rules. The author introduces the formal definition of an action rule, the notions of a simple association action rule and a representative action rule, the cost of an association action rule, and gives a strategy for constructing simple association action rules of lowest cost. A new approach for generating action rules from datasets with numerical attributes...

  13. The Effect of Homogenization Pressures on Extraction of Avocado Oil by Wet Method

    OpenAIRE

    Basuni Hamzah

    2013-01-01

    Avocado trees planted by people in rural Indonesia are usually small in scale. In modern, large-scale industry, especially in companies with large avocado farms, avocado oil is extracted through vacuum drying at low temperature. However, in rural areas avocado trees are spread out in small numbers, so an alternative method of avocado oil extraction is needed. In this experiment, a wet method of avocado extraction was applied, similar to the traditional extraction of coconut o...

  14. The Effect of Sericin from Various Extraction Methods on Cell Viability and Collagen Production

    Directory of Open Access Journals (Sweden)

    Pornanong Aramwit

    2010-05-01

    Full Text Available Silk sericin (SS can accelerate cell proliferation and attachment; however, SS can be extracted by various methods, which result in SS exhibiting different physical and biological properties. We found that SS produced from various extraction methods has different molecular weights, zeta potential, particle size and amino acid content. The MTT assay indicated that SS from all extraction methods had no toxicity to mouse fibroblast cells at concentrations up to 40 μg/mL after 24 h incubation, but SS obtained from some extraction methods can be toxic at higher concentrations. Heat-degraded SS was the least toxic to cells and activated the highest collagen production, while urea-extracted SS showed the lowest cell viability and collagen production. SS from urea extraction was severely harmful to cells at concentrations higher than 100 μg/mL. SS from all extraction methods could still promote collagen production in a concentration-dependent manner, even at high concentrations that are toxic to cells.

  15. Coconut oil extraction by the traditional Java method : An investigation of its potential application in aqueous Jatropha oil extraction

    NARCIS (Netherlands)

    Marasabessy, Ahmad; Moeis, Maelita R.; Sanders, Johan P. M.; Weusthuis, Ruud A.

    A traditional Java method of coconut oil extraction assisted by paddy crabs was investigated to find out if crabs or crab-derived components can be used to extract oil from Jatropha curcas seed kernels. Using the traditional Java method the addition of crab paste liberated 54% w w(-1) oil from

  16. A rule based method for context sensitive threshold segmentation in SPECT using simulation

    International Nuclear Information System (INIS)

    Fleming, John S.; Alaamer, Abdulaziz S.

    1998-01-01

    Robust techniques for automatic or semi-automatic segmentation of objects in single photon emission computed tomography (SPECT) are still the subject of development. This paper describes a threshold based method which uses empirical rules derived from analysis of computer simulated images of a large number of objects. The use of simulation allowed the factors affecting the threshold which correctly segmented objects to be investigated systematically. Rules could then be derived from these data to define the threshold in any particular context. The technique operated iteratively and calculated local context sensitive thresholds along radial profiles from the centre of gravity of the object. It was evaluated in a further series of simulated objects and in human studies, and compared to the use of a global fixed threshold. The method was capable of improving accuracy of segmentation and volume assessment compared to the global threshold technique. The improvements were greater for small volumes, shapes with large surface area to volume ratio, variable surrounding activity and non-uniform distributions. The method was applied successfully to simulated objects and human studies and is considered to be a significant advance on global fixed threshold techniques. (author)
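    A minimal 2-D sketch of thresholding along radial profiles from an object's centre of gravity, assuming a placeholder local-threshold rule (a fixed fraction of each profile's peak) instead of the empirically derived, context-sensitive rules of the paper:

```python
import numpy as np

# Segment a blurred synthetic "object" by thresholding along radial profiles from its
# centre of gravity. The 0.5*peak threshold is a placeholder rule, not the paper's
# simulation-derived, context-sensitive rules.

size = 64
yy, xx = np.mgrid[0:size, 0:size]
img = np.exp(-(((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 8.0 ** 2)))

# Centre of gravity of the intensity distribution.
cy = (yy * img).sum() / img.sum()
cx = (xx * img).sum() / img.sum()

mask = np.zeros_like(img, dtype=bool)
for theta in np.linspace(0, 2 * np.pi, 180, endpoint=False):
    radii = np.arange(0, size // 2, 0.5)
    ys = np.clip(np.round(cy + radii * np.sin(theta)).astype(int), 0, size - 1)
    xs = np.clip(np.round(cx + radii * np.cos(theta)).astype(int), 0, size - 1)
    profile = img[ys, xs]
    thr = 0.5 * profile.max()                 # local threshold for this radial profile
    inside = profile >= thr
    stop = np.argmin(inside) if not inside.all() else len(radii)  # first sub-threshold point
    mask[ys[:stop], xs[:stop]] = True

print("segmented pixels:", int(mask.sum()), "of", size * size)
```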

  17. An Effective Fault Feature Extraction Method for Gas Turbine Generator System Diagnosis

    Directory of Open Access Journals (Sweden)

    Jian-Hua Zhong

    2016-01-01

    Full Text Available Fault diagnosis is very important to maintain the operation of a gas turbine generator system (GTGS) in power plants, where any abnormal situation will interrupt the electricity supply. The fault diagnosis of the GTGS faces the main challenge that the acquired data, vibration or sound signals, contain a great deal of redundant information, which extends the fault identification time and degrades the diagnostic accuracy. To improve the diagnostic performance of the GTGS, an effective fault feature extraction framework is proposed to solve the problem of signal disorder and redundant information in the acquired signal. The proposed framework combines feature extraction with a general machine learning method, the support vector machine (SVM), to implement an intelligent fault diagnosis. The feature extraction method adopts the wavelet packet transform and time-domain statistical features to extract the features of faults from the vibration signal. To further reduce the redundant information in the extracted features, kernel principal component analysis is applied in this study. Experimental results indicate that the proposed feature extraction technique is an effective method to extract the useful features of faults, resulting in improved fault diagnosis performance for the GTGS.
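    A hedged sketch of such a pipeline, combining wavelet-packet energies and simple time-domain statistics with kernel PCA and an SVM; the synthetic vibration signals, wavelet and decomposition level are illustrative assumptions:

```python
import numpy as np
import pywt
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def make_signal(fault):
    # Synthetic stand-in for a vibration signal; a fault adds a higher-frequency tone.
    t = np.linspace(0, 1, 1024)
    base = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.normal(size=t.size)
    return base + (0.8 * np.sin(2 * np.pi * 200 * t) if fault else 0.0)

def features(sig, wavelet="db4", level=3):
    # Wavelet-packet sub-band energies plus a few time-domain statistics.
    wp = pywt.WaveletPacket(sig, wavelet, maxlevel=level)
    energies = [np.sum(node.data ** 2) for node in wp.get_level(level, order="natural")]
    stats = [sig.mean(), sig.std(), np.abs(sig).max(), np.mean(sig ** 4) / (sig.std() ** 4)]
    return np.array(energies + stats)

X = np.array([features(make_signal(fault=i % 2 == 1)) for i in range(120)])
y = np.array([i % 2 for i in range(120)])

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
model = make_pipeline(StandardScaler(), KernelPCA(n_components=5, kernel="rbf"), SVC(kernel="rbf"))
model.fit(Xtr, ytr)
print("test accuracy:", model.score(Xte, yte))
```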

  18. Improving ELM-Based Service Quality Prediction by Concise Feature Extraction

    Directory of Open Access Journals (Sweden)

    Yuhai Zhao

    2015-01-01

    Full Text Available Web services often run in highly dynamic and changing environments, which generate huge volumes of data. Thus, it is impractical to monitor the change of every QoS parameter for timely triggering of precautions, due to the high computational costs associated with the process. To address the problem, this paper proposes an active service quality prediction method based on the extreme learning machine (ELM). First, we extract web service trace logs and QoS information from the service log and convert them into feature vectors. Second, the proposed EC rules enable us to trigger QoS precautions as early as possible with high confidence. An efficient prefix-tree-based mining algorithm, together with some effective pruning rules, is developed to mine such rules. Finally, we study how to extract a set of diversified features as the representative of all mined results. The problem is proved to be NP-hard, and a greedy algorithm is presented to approximate the optimal solution. Experimental results show that an ELM trained on the selected feature subsets can efficiently improve the reliability and the earliness of service quality prediction.
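    The paper's exact NP-hard objective and approximation are not reproduced here; the sketch below only illustrates one generic greedy diversification step (max-min distance selection) over hypothetical feature descriptors:

```python
import numpy as np

# Generic greedy diversification: pick k representative features so that each newly
# added feature is as dissimilar as possible (max-min distance) to those already chosen.
# The descriptors and the distance measure are illustrative assumptions.

rng = np.random.default_rng(2)
candidates = rng.normal(size=(30, 16))      # 30 candidate features, 16-d descriptors each

def greedy_diverse_subset(vectors, k):
    chosen = [int(np.argmax(np.linalg.norm(vectors, axis=1)))]   # start from the "largest" feature
    while len(chosen) < k:
        dists = np.min(
            np.linalg.norm(vectors[:, None, :] - vectors[chosen][None, :, :], axis=2), axis=1
        )
        dists[chosen] = -np.inf                                  # never re-pick a chosen feature
        chosen.append(int(np.argmax(dists)))
    return chosen

print("selected feature indices:", greedy_diverse_subset(candidates, k=5))
```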

  19. Accelerated solvent extraction method with one-step clean-up for hydrocarbons in soil

    International Nuclear Information System (INIS)

    Nurul Huda Mamat Ghani; Norashikin Sain; Rozita Osman; Zuraidah Abdullah Munir

    2007-01-01

    The application of accelerated solvent extraction (ASE) using hexane combined with neutral silica gel and sulfuric acid/ silica gel (SA/ SG) to remove impurities prior to analysis by gas chromatograph with flame ionization detector (GC-FID) was studied. The efficiency of extraction was evaluated based on the three hydrocarbons; dodecane, tetradecane and pentadecane spiked to soil sample. The effect of ASE operating conditions (extraction temperature, extraction pressure, static time) was evaluated and the optimized condition obtained from the study was extraction temperature of 160 degree Celsius, extraction pressure of 2000 psi with 5 minutes static extraction time. The developed ASE with one-step clean-up method was applied in the extraction of hydrocarbons from spiked soil and the amount extracted was comparable to ASE extraction without clean-up step with the advantage of obtaining cleaner extract with reduced interferences. Therefore in the developed method, extraction and clean-up for hydrocarbons in soil can be achieved rapidly and efficiently with reduced solvent usage. (author)

  20. Advanced Extraction Methods for Actinide/Lanthanide Separations

    International Nuclear Information System (INIS)

    Scott, M.J.

    2005-01-01

    The separation of An(III) ions from chemically similar Ln(III) ions is perhaps one of the most difficult problems encountered during the processing of nuclear waste. In the 3+ oxidation states, the metal ions have an identical charge and roughly the same ionic radius. They differ strictly in the relative energies of their f- and d-orbitals, and to separate these metal ions, ligands will need to be developed that take advantage of this small but important distinction. The extraction of uranium and plutonium from nitric acid solution can be performed quantitatively by the extraction with the TBP (tributyl phosphate). Commercially, this process has found wide use in the PUREX (plutonium uranium extraction) reprocessing method. The TRUEX (transuranium extraction) process is further used to coextract the trivalent lanthanides and actinides ions from HLLW generated during PUREX extraction. This method uses CMPO [(N, N-diisobutylcarbamoylmethyl) octylphenylphosphineoxide] intermixed with TBP as a synergistic agent. However, the final separation of trivalent actinides from trivalent lanthanides still remains a challenging task. In TRUEX nitric acid solution, the Am(III) ion is coordinated by three CMPO molecules and three nitrate anions. Taking inspiration from this data and previous work with calix[4]arene systems, researchers on this project have developed a C3-symmetric tris-CMPO ligand system using a triphenoxymethane platform as a base. The triphenoxymethane ligand systems have many advantages for the preparation of complex ligand systems. The compounds are very easy to prepare. The steric and solubility properties can be tuned through an extreme range by the inclusion of different alkoxy and alkyl groups such as methyoxy, ethoxy, t-butoxy, methyl, octyl, t-pentyl, or even t-pentyl at the ortho- and para-positions of the aryl rings. The triphenoxymethane ligand system shows promise as an improved extractant for both tetravalent and trivalent actinide recoveries form

  1. Advanced Extraction Methods for Actinide/Lanthanide Separations

    Energy Technology Data Exchange (ETDEWEB)

    Scott, M.J.

    2005-12-01

    The separation of An(III) ions from chemically similar Ln(III) ions is perhaps one of the most difficult problems encountered during the processing of nuclear waste. In the 3+ oxidation states, the metal ions have an identical charge and roughly the same ionic radius. They differ strictly in the relative energies of their f- and d-orbitals, and to separate these metal ions, ligands will need to be developed that take advantage of this small but important distinction. The extraction of uranium and plutonium from nitric acid solution can be performed quantitatively by the extraction with the TBP (tributyl phosphate). Commercially, this process has found wide use in the PUREX (plutonium uranium extraction) reprocessing method. The TRUEX (transuranium extraction) process is further used to coextract the trivalent lanthanides and actinides ions from HLLW generated during PUREX extraction. This method uses CMPO [(N, N-diisobutylcarbamoylmethyl) octylphenylphosphineoxide] intermixed with TBP as a synergistic agent. However, the final separation of trivalent actinides from trivalent lanthanides still remains a challenging task. In TRUEX nitric acid solution, the Am(III) ion is coordinated by three CMPO molecules and three nitrate anions. Taking inspiration from this data and previous work with calix[4]arene systems, researchers on this project have developed a C3-symmetric tris-CMPO ligand system using a triphenoxymethane platform as a base. The triphenoxymethane ligand systems have many advantages for the preparation of complex ligand systems. The compounds are very easy to prepare. The steric and solubility properties can be tuned through an extreme range by the inclusion of different alkoxy and alkyl groups such as methyoxy, ethoxy, t-butoxy, methyl, octyl, t-pentyl, or even t-pentyl at the ortho- and para-positions of the aryl rings. The triphenoxymethane ligand system shows promise as an improved extractant for both tetravalent and trivalent actinide recoveries form

  2. Paper Improving Rule Based Stemmers to Solve Some Special Cases of Arabic Language

    Directory of Open Access Journals (Sweden)

    Soufiane Farrah

    2017-04-01

    Full Text Available Analysis of the Arabic language has become a necessity because of its rapid evolution; in this paper we propose a rule-based root extraction method for Arabic text to address some weaknesses found in previous research works. Our approach is divided into a preprocessing phase, in which we tokenize the text and format it by removing any punctuation, diacritics and non-letter characters; a treatment phase, based on the elimination of several sets of affixes (diacritics, prefixes and suffixes) and on the application of several patterns; and a check phase that verifies whether the extracted root is correct by searching for the result in root dictionaries.
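    A toy sketch of the three phases (normalization, affix stripping plus one pattern, and a dictionary check); the affix lists, the single pattern and the root dictionary are tiny illustrative samples, not the paper's rule set:

```python
import re

# Toy rule-based stemmer: (1) normalization, (2) affix stripping plus one sample
# pattern, (3) dictionary check of the candidate root. All lists here are illustrative.

DIACRITICS = re.compile(r"[\u064B-\u0652\u0670]")      # tanween, short vowels, shadda, sukun
PREFIXES = ["وال", "بال", "ال", "و", "ب", "ف"]
SUFFIXES = ["ات", "ون", "ين", "ها", "ة"]
ROOTS = {"كتب", "درس", "علم"}

def normalize(text):
    text = DIACRITICS.sub("", text)
    return re.sub(r"[^\w\s]", "", text)

def strip_affixes(word):
    for p in PREFIXES:
        if word.startswith(p) and len(word) - len(p) >= 3:
            word = word[len(p):]
            break
    for s in SUFFIXES:
        if word.endswith(s) and len(word) - len(s) >= 3:
            word = word[:-len(s)]
            break
    return word

def apply_pattern(stem):
    # One sample pattern: a 4-letter stem with a long vowel 'ا' in third position
    # (e.g. كتاب) maps to the 3-letter root obtained by dropping that vowel (كتب).
    if len(stem) == 4 and stem[2] == "ا":
        return stem[:2] + stem[3]
    return stem

def stem(word):
    candidate = apply_pattern(strip_affixes(normalize(word)))
    return candidate, candidate in ROOTS               # check phase: known root?

for w in ["الكتاب", "والمدرسة", "كتابات"]:
    root, ok = stem(w)
    print(w, "->", root, "(in dictionary)" if ok else "(not verified)")
```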

  3. Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development

    Science.gov (United States)

    Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei

    2014-01-01

    Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements: compatibility with LC/MS (free of detergents, etc.); high protein integrity (minimal level of protein degradation and non-biological PTMs); compatibility with common sample preparation methods such as proteolysis, PTM enrichment and mass-tag labeling; and lot-to-lot reproducibility. Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed: intact protein extracts, with primary use for sample preparation method development and optimization, and pre-digested extracts (peptides), with primary use for instrument validation and performance monitoring.

  4. Comparison of protein extraction methods suitable for proteomics ...

    African Journals Online (AJOL)

    Jane

    2011-07-27

    Jul 27, 2011 ... An efficient protein extraction method is a prerequisite for successful implementation of proteomics. ... research, it is noteworthy to discover a proteome ..... Proteomic analysis of rice (Oryza sativa) seeds during germination.

  5. The influence of extraction methods on composition and antioxidant properties of rice bran oil

    Directory of Open Access Journals (Sweden)

    Noppawat Pengkumsri

    2015-09-01

    Full Text Available Abstract: The current study assessed the influence of different extraction methods on the total tocols, γ-oryzanol content, and antioxidant properties of Chiang Mai Black rice, Mali Red rice, and Suphanburi-1 Brown rice bran oil. Rice bran oil (RBO) was extracted by hexane, hot-pressed, cold-pressed, and supercritical fluid extraction (SFe) methods. High yields of RBO were obtained by the hexane and SFe methods. Total and subgroup tocols and γ-oryzanol content were determined by HPLC. The hexane-extracted sample showed high contents of γ-oryzanol and tocols, and all RBO extracts contained a significantly high amount of γ-tocotrienol. In vitro antioxidant assays indicated that oil of superior quality, in terms of phytochemical contents and antioxidant properties, was recovered by hexane extraction compared to the other tested extraction methods. The temperature in the extraction process also affects the value of the oil. Further, a thorough study of the factors compromising the quality and quantity of RBO recovery is required for the development of enhanced functional foods and other related products.

  6. A comparative study: the impact of different lipid extraction methods on current microalgal lipid research

    Science.gov (United States)

    2014-01-01

    Microalgae cells have the potential to rapidly accumulate lipids, such as triacylglycerides that contain fatty acids important for high value fatty acids (e.g., EPA and DHA) and/or biodiesel production. However, lipid extraction methods for microalgae cells are not well established, and there is currently no standard extraction method for the determination of the fatty acid content of microalgae. This has caused a few problems in microalgal biofuel research due to the bias derived from different extraction methods. Therefore, this study used several extraction methods for fatty acid analysis on the marine microalga Tetraselmis sp. M8, aiming to assess the potential impact of different extractions on current microalgal lipid research. These methods included classical Bligh & Dyer lipid extraction, two other chemical extractions using different solvents and sonication, direct saponification and supercritical CO2 extraction. Soxhlet-based extraction was used to weigh up the importance of solvent polarity in the algal oil extraction. Coupled with GC/MS, a Thermogravimetric Analyser was used to improve the quantification of microalgal lipid extractions. Among these extractions, significant differences were observed in both extract yield and fatty acid composition. The supercritical extraction technique stood out most for effective extraction of microalgal lipids, especially for long chain unsaturated fatty acids. The results highlight the necessity for comparative analyses of microalgae fatty acids and careful choice and validation of analytical methodology in microalgal lipid research. PMID:24456581

  7. Lipid Extraction Methods from Microalgae: A Comprehensive Review

    Energy Technology Data Exchange (ETDEWEB)

    Ranjith Kumar, Ramanathan [Department of Plant Biology and Plant Biotechnology, Shree Chandraprabhu Jain College, Chennai (India); Hanumantha Rao, Polur [Department of Microbiology, Madras Christian College, Chennai (India); Arumugam, Muthu, E-mail: arumugam@niist.res.in [Division of Biotechnology, CSIR – National Institute for Interdisciplinary Science and Technology (NIIST), Trivandrum (India)

    2015-01-08

    Energy security has become a serious global issue and a lot of research is being carried out to look for economically viable and environment-friendly alternatives. The only solution that appears to meet futuristic needs is the use of renewable energy. Although various forms of renewable energy are being currently used, the prospects of producing carbon-neutral biofuels from microalgae appear bright because of their unique features such as suitability of growing in open ponds required for production of a commodity product, high CO2-sequestering capability, and ability to grow in wastewater/seawater/brackish water and high-lipid productivity. The major process constraint in microalgal biofuel technology is the cost-effective and efficient extraction of lipids. The objective of this article is to provide a comprehensive review on various methods of lipid extraction from microalgae available, to date, as well as to discuss their advantages and disadvantages. The article covers all areas of lipid extraction procedures including solvent extraction procedures, mechanical approaches, and solvent-free procedures apart from some of the latest extraction technologies. Further research is required in this area for successful implementation of this technology at the production scale.

  8. Lipid Extraction Methods from Microalgae: A Comprehensive Review

    International Nuclear Information System (INIS)

    Ranjith Kumar, Ramanathan; Hanumantha Rao, Polur; Arumugam, Muthu

    2015-01-01

    Energy security has become a serious global issue and a lot of research is being carried out to look for economically viable and environment-friendly alternatives. The only solution that appears to meet futuristic needs is the use of renewable energy. Although various forms of renewable energy are being currently used, the prospects of producing carbon-neutral biofuels from microalgae appear bright because of their unique features such as suitability of growing in open ponds required for production of a commodity product, high CO 2 -sequestering capability, and ability to grow in wastewater/seawater/brackish water and high-lipid productivity. The major process constraint in microalgal biofuel technology is the cost-effective and efficient extraction of lipids. The objective of this article is to provide a comprehensive review on various methods of lipid extraction from microalgae available, to date, as well as to discuss their advantages and disadvantages. The article covers all areas of lipid extraction procedures including solvent extraction procedures, mechanical approaches, and solvent-free procedures apart from some of the latest extraction technologies. Further research is required in this area for successful implementation of this technology at the production scale.

  9. Application of enzymatic methods for chia (Salvia hispanica L oil extraction

    Directory of Open Access Journals (Sweden)

    Norma Ciau-Solís

    2016-07-01

    Full Text Available Aim. The aim was to evaluate the use of different enzymatic treatments on the oil extraction yield from chia (Salvia hispanica L.) seeds. Methods. Enzymatic extraction was performed by treating whole and degummed chia flours under different conditions of enzyme concentration, pH and temperature. Commercial enzymes were employed: Viscozyme L™ (endo-1,3(4)-beta-glucanase derived from Aspergillus aculeatus, with an activity of 100 FBG (Fungal Beta-Glucanase units)/g) and Neutrase 0.8L™ (neutral protease with an activity of 0.8 AU-NH/g, derived from Bacillus amyloliquefaciens). Results. All enzymatic oil extraction treatments were different (P < 0.05) and the maximum oil yield obtained was 9.35%. Conclusion. Oil extraction using enzymatic methods is not viable for chia seed.

  10. A Robust Gradient Based Method for Building Extraction from LiDAR and Photogrammetric Imagery

    Directory of Open Access Journals (Sweden)

    Fasahat Ullah Siddiqui

    2016-07-01

    Full Text Available Existing automatic building extraction methods are not effective in extracting buildings which are small in size and have transparent roofs. The application of large area threshold prohibits detection of small buildings and the use of ground points in generating the building mask prevents detection of transparent buildings. In addition, the existing methods use numerous parameters to extract buildings in complex environments, e.g., hilly area and high vegetation. However, the empirical tuning of large number of parameters reduces the robustness of building extraction methods. This paper proposes a novel Gradient-based Building Extraction (GBE method to address these limitations. The proposed method transforms the Light Detection And Ranging (LiDAR height information into intensity image without interpolation of point heights and then analyses the gradient information in the image. Generally, building roof planes have a constant height change along the slope of a roof plane whereas trees have a random height change. With such an analysis, buildings of a greater range of sizes with a transparent or opaque roof can be extracted. In addition, a local colour matching approach is introduced as a post-processing stage to eliminate trees. This stage of our proposed method does not require any manual setting and all parameters are set automatically from the data. The other post processing stages including variance, point density and shadow elimination are also applied to verify the extracted buildings, where comparatively fewer empirically set parameters are used. The performance of the proposed GBE method is evaluated on two benchmark data sets by using the object and pixel based metrics (completeness, correctness and quality. Our experimental results show the effectiveness of the proposed method in eliminating trees, extracting buildings of all sizes, and extracting buildings with and without transparent roof. When compared with current state
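    A hedged sketch of the core gradient idea, that a roof plane has a near-constant height gradient while vegetation produces erratic gradients, using a synthetic height grid and a local gradient-variance test; the thresholds are illustrative and the full GBE pipeline is not reproduced:

```python
import numpy as np

# Roof planes have a near-constant height gradient along their slope; trees do not.
# A local gradient-variance test separates the two on a synthetic height grid. The
# window size and thresholds are illustrative; the full GBE pipeline (masking, local
# colour matching, point-density and shadow checks) is omitted.

size = 60
rng = np.random.default_rng(3)
height = np.zeros((size, size))
height[10:30, 10:40] = 5.0 + 0.2 * np.arange(10, 40)           # sloped roof: constant gradient
height[35:55, 15:45] = 4.0 + rng.normal(0.0, 1.0, (20, 30))    # noisy tree canopy

gy, gx = np.gradient(height)
grad_mag = np.hypot(gx, gy)

def local_std(a, k=2):
    """Standard deviation of the gradient magnitude in a (2k+1)x(2k+1) neighbourhood."""
    out = np.zeros_like(a)
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            win = a[max(0, i - k):i + k + 1, max(0, j - k):j + k + 1]
            out[i, j] = win.std()
    return out

smoothness = local_std(grad_mag)
building_like = (height > 2.0) & (smoothness < 0.3)   # elevated with locally consistent gradient
tree_like = (height > 2.0) & ~building_like

print("building-like pixels:", int(building_like.sum()), "tree-like pixels:", int(tree_like.sum()))
```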

  11. Integrated Phoneme Subspace Method for Speech Feature Extraction

    Directory of Open Access Journals (Sweden)

    Park Hyunsin

    2009-01-01

    Full Text Available Speech feature extraction has been a key focus in robust speech recognition research. In this work, we discuss data-driven linear feature transformations applied to feature vectors in the logarithmic mel-frequency filter bank domain. Transformations are based on principal component analysis (PCA, independent component analysis (ICA, and linear discriminant analysis (LDA. Furthermore, this paper introduces a new feature extraction technique that collects the correlation information among phoneme subspaces and reconstructs feature space for representing phonemic information efficiently. The proposed speech feature vector is generated by projecting an observed vector onto an integrated phoneme subspace (IPS based on PCA or ICA. The performance of the new feature was evaluated for isolated word speech recognition. The proposed method provided higher recognition accuracy than conventional methods in clean and reverberant environments.
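    A hedged sketch of the projection step: per-phoneme PCA bases (learned here from random stand-in frames) are stacked into an integrated projection matrix onto which new log-mel frames are projected; the actual IPS construction and the ICA variant are not reproduced:

```python
import numpy as np
from sklearn.decomposition import PCA

# Build a small per-phoneme PCA basis from (random stand-in) log-mel frames, stack the
# bases into an "integrated" projection matrix, and project new frames onto it.

rng = np.random.default_rng(4)
n_mels, frames_per_phoneme, k = 24, 300, 4
phoneme_frames = {ph: rng.normal(loc=i, scale=1.0, size=(frames_per_phoneme, n_mels))
                  for i, ph in enumerate(["a", "i", "u"])}

bases = []
for ph, frames in phoneme_frames.items():
    pca = PCA(n_components=k).fit(frames)     # subspace capturing this phoneme's variation
    bases.append(pca.components_)             # shape (k, n_mels)
projection = np.vstack(bases)                 # integrated subspace, shape (3*k, n_mels)

new_frames = rng.normal(size=(10, n_mels))    # incoming log-mel feature frames
ips_features = new_frames @ projection.T      # projected feature vectors
print("IPS feature shape:", ips_features.shape)
```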

  12. Automatic sentence extraction for the detection of scientific paper relations

    Science.gov (United States)

    Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.

    2018-03-01

    Relations between scientific papers are very useful for researchers to see the interconnections between papers quickly. By observing inter-article relationships, researchers can identify, among other things, the weaknesses of existing research, the performance improvements achieved to date, and the tools or data typically used in research in specific fields. So far, the methods developed to detect paper relations include machine learning and rule-based methods. However, a problem still arises in the process of sentence extraction from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slower and inefficient. To overcome this problem, this study performs automatic sentence extraction, while the paper relations are identified based on the citation sentence. The performance of the built system is then compared with that of the manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.

  13. Extractive text summarization system to aid data extraction from full text in systematic review development.

    Science.gov (United States)

    Bui, Duy Duc An; Del Fiol, Guilherme; Hurdle, John F; Jonnalagadda, Siddhartha

    2016-12-01

    Extracting data from publication reports is a standard process in systematic review (SR) development. However, the data extraction process still relies too much on manual effort which is slow, costly, and subject to human error. In this study, we developed a text summarization system aimed at enhancing productivity and reducing errors in the traditional data extraction process. We developed a computer system that used machine learning and natural language processing approaches to automatically generate summaries of full-text scientific publications. The summaries at the sentence and fragment levels were evaluated in finding common clinical SR data elements such as sample size, group size, and PICO values. We compared the computer-generated summaries with human written summaries (title and abstract) in terms of the presence of necessary information for the data extraction as presented in the Cochrane review's study characteristics tables. At the sentence level, the computer-generated summaries covered more information than humans do for systematic reviews (recall 91.2% vs. 83.8%, p<0.001). They also had a better density of relevant sentences (precision 59% vs. 39%, p<0.001). At the fragment level, the ensemble approach combining rule-based, concept mapping, and dictionary-based methods performed better than individual methods alone, achieving an 84.7% F-measure. Computer-generated summaries are potential alternative information sources for data extraction in systematic review development. Machine learning and natural language processing are promising approaches to the development of such an extractive summarization system. Copyright © 2016 Elsevier Inc. All rights reserved.
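    The study's ensemble of rule-based, concept-mapping and dictionary methods is not reproduced below; the sketch only shows a generic extractive scorer that ranks sentences by TF-IDF similarity to a query of data-extraction keywords, all of which are illustrative:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Generic extractive scorer: rank a paper's sentences by TF-IDF similarity to a query of
# data-extraction keywords (sample size, groups, PICO-like terms). The sentences and the
# keyword query are illustrative.

sentences = [
    "A total of 120 patients were randomized to intervention and control groups.",
    "The intervention group received 40 mg of the study drug daily for 12 weeks.",
    "Previous work has explored related compounds in animal models.",
    "The primary outcome was change in systolic blood pressure at 12 weeks.",
]
query = ["sample size randomized patients groups intervention control outcome dose"]

vec = TfidfVectorizer().fit(sentences + query)
scores = cosine_similarity(vec.transform(sentences), vec.transform(query)).ravel()

top_k = 2
for idx in scores.argsort()[::-1][:top_k]:
    print(f"{scores[idx]:.2f}  {sentences[idx]}")
```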

  14. Text extraction method for historical Tibetan document images based on block projections

    Science.gov (United States)

    Duan, Li-juan; Zhang, Xi-qun; Ma, Long-long; Wu, Jian

    2017-11-01

    Text extraction is an important initial step in digitizing the historical documents. In this paper, we present a text extraction method for historical Tibetan document images based on block projections. The task of text extraction is considered as text area detection and location problem. The images are divided equally into blocks and the blocks are filtered by the information of the categories of connected components and corner point density. By analyzing the filtered blocks' projections, the approximate text areas can be located, and the text regions are extracted. Experiments on the dataset of historical Tibetan documents demonstrate the effectiveness of the proposed method.
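    A hedged sketch of block projections on a synthetic binary page: split the page into equal blocks, keep blocks whose ink density exceeds a threshold, and bound the text area from the row and column projections of the kept blocks (the connected-component and corner-density filtering are omitted):

```python
import numpy as np

# Synthetic binary page with an inked "text" region; block size and density threshold
# are illustrative assumptions.
rng = np.random.default_rng(5)
page = np.zeros((240, 320), dtype=np.uint8)
page[60:180, 40:280] = (rng.random((120, 240)) < 0.25).astype(np.uint8)

B = 20                                              # block size in pixels
blocks_r, blocks_c = page.shape[0] // B, page.shape[1] // B
density = page[:blocks_r * B, :blocks_c * B].reshape(blocks_r, B, blocks_c, B).mean(axis=(1, 3))
text_blocks = density > 0.1                         # keep sufficiently inked blocks

row_proj = text_blocks.sum(axis=1)                  # block-level projections
col_proj = text_blocks.sum(axis=0)
rows = np.flatnonzero(row_proj)
cols = np.flatnonzero(col_proj)
top, bottom = rows[0] * B, (rows[-1] + 1) * B
left, right = cols[0] * B, (cols[-1] + 1) * B
print("estimated text area (top, bottom, left, right):", (top, bottom, left, right))
```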

  15. Compton scattering from nuclei and photo-absorption sum rules

    International Nuclear Information System (INIS)

    Gorchtein, Mikhail; Hobbs, Timothy; Londergan, J. Timothy; Szczepaniak, Adam P.

    2011-01-01

    We revisit the photo-absorption sum rule for real Compton scattering from the proton and from nuclear targets. In analogy with the Thomas-Reiche-Kuhn sum rule appropriate at low energies, we propose a new 'constituent quark model' sum rule that relates the integrated strength of hadronic resonances to the scattering amplitude on constituent quarks. We study the constituent quark model sum rule for several nuclear targets. In addition, we extract the α=0 pole contribution for both proton and nuclei. Using the modern high-energy proton data, we find that the α=0 pole contribution differs significantly from the Thomson term, in contrast with the original findings by Damashek and Gilman.

  16. Evaluation of extraction methods for hexavalent chromium determination in dusts, ashes, and soils

    Science.gov (United States)

    Wolf, Ruth E.; Wilson, Stephen A.

    2010-01-01

    One of the difficulties in performing speciation analyses on solid samples is finding a suitable extraction method. Traditional methods for extraction of hexavalent chromium, Cr(VI), in soils, such as SW846 Method 3060A, can be tedious and are not always compatible with some determination methods. For example, the phosphate and high levels of carbonate and magnesium present in the U.S. Environmental Protection Agency (USEPA) Method 3060A digestion for Cr(VI) were found to be incompatible with the High Performance Liquid Chromatography-Inductively Coupled Plasma-Mass Spectrometry (HPLC-ICP-MS) detection method used by our laboratory. Modification of Method 3060A by eliminating the use of the phosphate buffer provided improved performance with the detection method, however dilutions are still necessary to achieve good chromatographic separation and detection of Cr(VI). An ultrasonic extraction method using a 1 mM Na2CO3 - 9 mM NaHCO3 buffer solution, adapted from Occupational Safety and Health Administration (OSHA) Method ID215, has been used with good results for the determination of Cr(VI) in air filters. The average recovery obtained for BCR-545 - Welding Dust Loaded on Filter (IRMM, Belgium) using this method was 99 percent (1.2 percent relative standard deviation) with no conversion of Cr(VI) to Cr(III) during the extraction process. This ultrasonic method has the potential for use with other sample matrices, such as ashes and soils. Preliminary investigations using NIST 2701 (Hexavalent Chromium in Contaminated Soil) loaded onto quartz filters showed promising results with approximately 90 percent recovery of the certified Cr(VI) value. Additional testing has been done using NIST 2701 and NIST 2700 using different presentation methods. Extraction efficiency of bulk presentation, where small portions of the sample are added to the bottom of the extraction vessel, will be compared with supported presentation, where small portions of the sample are loaded onto a

  17. Improvement of seawater salt quality by hydro-extraction and re-crystallization methods

    Science.gov (United States)

    Sumada, K.; Dewati, R.; Suprihatin

    2018-01-01

    Indonesia is one of the salt-producing countries that use sea water as a source of raw material, and the quality of the salt produced is influenced by the quality of the sea water. The salt produced contains on average 85-90% NaCl. The Indonesian National Standard (SNI) requires a sodium chloride content of 94.7% (dry basis) for salt for human consumption and 98.5% for industrial salt. In this study, a re-crystallization method without chemicals and a hydro-extraction method were developed. The objective of this research was to choose the best method based on efficiency. The results showed that the re-crystallization method can produce salt with a NaCl content of 99.21%, while the hydro-extraction method produced salt with 99.34% NaCl. The salt produced through both methods can be used as consumption and industrial salt. The hydro-extraction method is more efficient than the re-crystallization method because the re-crystallization method requires heat energy.

  18. A new method for the extraction of Au(III) with ethyl thioacetoacetate into chloroform

    International Nuclear Information System (INIS)

    Khan, S.Z.; Turel, Z.R.

    1985-01-01

    A method was developed for the rapid and selective extraction of Au(III) with ethyl thioacetoacetate (HETAcAc) into chloroform at pH 4. The effect of various parameters on the extraction coefficient values was studied. A 1:3 stoichiometry of the extracted species was obtained by both the slope-ratio and the substoichiometric extraction methods. (author)

  19. Analysis of medicinal plant extracts by neutron activation method

    International Nuclear Information System (INIS)

    Vaz, Sandra Muntz

    1995-01-01

    This dissertation has presented the results from analysis of medicinal plant extracts using neutron activation method. Instrumental neutron activation analysis was applied to the determination of the elements Al, Br, Ca, Ce, Cl, Cr, Cs, Fe, K, La, Mg, Mn, Na, Rb, Sb, Sc and Zn in medicinal extracts obtained from Achyrolcline satureoides DC, Casearia sylvestris, Centella asiatica, Citrus aurantium L., Solano lycocarpum, Solidago microglossa, Stryphnondedron barbatiman and Zingiber officinale R. plants. The elements Hg and Se were determined using radiochemical separation by means of retention of Se in HMD inorganic exchanger and solvent extraction of Hg by bismuth diethyl-dithiocarbamate solution. Precision and accuracy of the results have been evaluated by analysing reference materials. The therapeutic action of some elements found in plant extracts analyzed was briefly discussed

  20. A validated solid-liquid extraction method for the HPLC determination of polyphenols in apple tissues. Comparison with pressurised liquid extraction.

    Science.gov (United States)

    Alonso-Salces, Rosa M; Barranco, Alejandro; Corta, Edurne; Berrueta, Luis A; Gallo, Blanca; Vicente, Francisca

    2005-02-15

    A solid-liquid extraction procedure followed by reversed-phase high-performance liquid chromatography (RP-HPLC) coupled with a photodiode array detector (DAD) for the determination of polyphenols in freeze-dried apple peel and pulp is reported. The extraction step consists in sonicating 0.5g of freeze-dried apple tissue with 30mL of methanol-water-acetic acid (30:69:1, v/v/v) containing 2g of ascorbic acid/L, for 10min in an ultrasonic bath. The whole method was validated, concluding that it is a robust method that presents high extraction efficiencies (peel: >91%, pulp: >95%) and appropriate precisions (within day: R.S.D. (n = 5) <5%, and between days: R.S.D. (n = 5) <7%) at the different concentration levels of polyphenols that can be found in apple samples. The method was compared with one previously published, consisting in a pressurized liquid extraction (PLE) followed by RP-HPLC-DAD determination. The advantages and disadvantages of both methods are discussed.

  1. Change-Point Detection Method for Clinical Decision Support System Rule Monitoring.

    Science.gov (United States)

    Liu, Siqi; Wright, Adam; Hauskrecht, Milos

    2017-06-01

    A clinical decision support system (CDSS) and its components can malfunction for various reasons. Monitoring the system and detecting its malfunctions can help one avoid potential mistakes and their associated costs. In this paper, we investigate the problem of detecting changes in CDSS operation, in particular in its monitoring and alerting subsystem, by monitoring its rule firing counts. The detection should be performed online; that is, whenever a new datum arrives, we want a score indicating how likely it is that there is a change in the system. We develop a new method based on Seasonal-Trend decomposition and likelihood ratio statistics to detect the changes. Experiments on real and simulated data show that our method has a lower detection delay than existing change-point detection methods.
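    A hedged sketch of this idea using statsmodels' STL to remove a weekly pattern from simulated daily rule-firing counts and a simple Gaussian likelihood-ratio score on the residuals; the data, window length and exact statistic are illustrative assumptions:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Simulated daily rule-firing counts with a weekly pattern and a drop on day 150
# (the rule partly stops firing). Window length and the Gaussian assumption are
# illustrative, not the paper's exact statistic.

rng = np.random.default_rng(6)
days = pd.date_range("2016-01-01", periods=200, freq="D")
weekly = 20 + 10 * (days.dayofweek < 5)                    # fewer firings on weekends
counts = weekly + rng.poisson(5, size=len(days)).astype(float)
counts[150:] -= 15

# Remove the seasonal/trend components, then score each day with a likelihood-ratio
# statistic comparing the recent residual mean against the historical residual mean.
resid = STL(pd.Series(counts, index=days), period=7).fit().resid.to_numpy()

window = 14
scores = np.zeros(len(resid))
for t in range(2 * window, len(resid)):
    hist, recent = resid[:t - window], resid[t - window:t]
    mu0, var0 = hist.mean(), hist.var() + 1e-9
    scores[t] = window * (recent.mean() - mu0) ** 2 / (2 * var0)   # known-variance Gaussian LR

print("most likely change day:", days[int(np.argmax(scores))].date())
```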

  2. Evolving rule-based systems in two medical domains using genetic programming.

    Science.gov (United States)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan; Axer, Hubertus; Bjerregaard, Beth; von Keyserlingk, Diedrich Graf

    2004-11-01

    To demonstrate and compare the application of different genetic programming (GP) based intelligent methodologies for the construction of rule-based systems in two medical domains: the diagnosis of aphasia's subtypes and the classification of pap-smear examinations. Past data represented (a) successful diagnoses of aphasia's subtypes obtained from collaborating medical experts through a free interview per patient, and (b) smears (images of cells) correctly classified by cyto-technologists, previously stained using the Papanicolaou method. Initially a hybrid approach is proposed, which combines standard genetic programming and heuristic hierarchical crisp rule-base construction. Then, genetic programming for the production of crisp rule-based systems is attempted. Finally, another hybrid intelligent model is composed of a grammar-driven genetic programming system for the generation of fuzzy rule-based systems. Results denote the effectiveness of the proposed systems, which are also compared, in terms of efficiency, accuracy and comprehensibility, to an inductive machine learning approach as well as to a standard genetic programming symbolic expression approach. The proposed GP-based intelligent methodologies are able to produce accurate and comprehensible results for medical experts, performing competitively with other intelligent approaches. The aim of the authors was the production of accurate but also sensible decision rules that could potentially help medical doctors to extract conclusions, even at the expense of achieving a higher classification score.

  3. Antifouling booster biocide extraction from marine sediments: a fast and simple method based on vortex-assisted matrix solid-phase extraction.

    Science.gov (United States)

    Caldas, Sergiane Souza; Soares, Bruno Meira; Abreu, Fiamma; Castro, Ítalo Braga; Fillmann, Gilberto; Primel, Ednei Gilberto

    2018-03-01

    This paper reports the development of an analytical method employing vortex-assisted matrix solid-phase dispersion (MSPD) for the extraction of diuron, Irgarol 1051, TCMTB (2-thiocyanomethylthiobenzothiazole), DCOIT (4,5-dichloro-2-n-octyl-3-(2H)-isothiazolin-3-one), and dichlofluanid from sediment samples. Separation and determination were performed by liquid chromatography-tandem mass spectrometry. Important MSPD parameters, such as sample mass, mass of C18, and type and volume of extraction solvent, were investigated by response surface methodology. Quantitative recoveries were obtained with 2.0 g of sediment sample, 0.25 g of C18 as the solid support, and 10 mL of methanol as the extraction solvent. The MSPD method was suitable for the extraction and determination of antifouling biocides in sediment samples, with recoveries between 61 and 103% and a relative standard deviation lower than 19%. Limits of quantification between 0.5 and 5 ng g⁻¹ were obtained. Vortex-assisted MSPD was shown to be fast and easy to use, with the advantages of low cost and reduced solvent consumption compared to the commonly employed techniques for the extraction of booster biocides from sediment samples. Finally, the developed method was applied to real samples. Results revealed that the developed extraction method is effective and simple, thus allowing the determination of biocides in sediment samples.

  4. Establishment of a New Drug Code for Marihuana Extract. Final rule.

    Science.gov (United States)

    2016-12-14

    The Drug Enforcement Administration is creating a new Administration Controlled Substances Code Number for "Marihuana Extract." This code number will allow DEA and DEA-registered entities to track quantities of this material separately from quantities of marihuana. This, in turn, will aid in complying with relevant treaty provisions. Under international drug control treaties administered by the United Nations, some differences exist between the regulatory controls pertaining to marihuana extract versus those for marihuana and tetrahydrocannabinols. The DEA has previously established separate code numbers for marihuana and for tetrahydrocannabinols, but not for marihuana extract. To better track these materials and comply with treaty provisions, DEA is creating a separate code number for marihuana extract with the following definition: "Meaning an extract containing one or more cannabinoids that has been derived from any plant of the genus Cannabis, other than the separated resin (whether crude or purified) obtained from the plant." Extracts of marihuana will continue to be treated as Schedule I controlled substances.

  5. RULE-BASE METHOD FOR ANALYSIS OF QUALITY E-LEARNING IN HIGHER EDUCATION

    Directory of Open Access Journals (Sweden)

    darsih darsih darsih

    2016-04-01

    Full Text Available ABSTRACT Assessing the quality of e-learning courses to measure the success of e-learning systems in online learning is essential, and the assessment can be used to improve education. This study analyzes the quality of e-learning courses on the web site www.kulon.undip.ac.id using a questionnaire with questions based on the variables of ISO 9126. A Likert-scale assessment was used, implemented as a web app. A rule-based reasoning method is used to assess the quality of the e-learning courses. A case study was conducted on four e-learning courses with 133 samples/respondents as users of the e-learning courses. The results obtained show good quality values for each e-learning course tested. In addition, each e-learning course has different advantages depending on certain variables. Keywords: E-Learning, Rule-Base, Questionnaire, Likert, Measuring.
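    A minimal sketch of what such a rule base might look like, mapping averaged Likert scores per ISO 9126 characteristic to a quality label; the characteristics, score bands and labels are illustrative assumptions, not the study's actual rules:

```python
# Map averaged Likert scores (1-5) per ISO 9126 characteristic to a quality label.
# Characteristics, bands and labels are illustrative assumptions.

responses = {                      # mean Likert score per characteristic for one course
    "functionality": 4.2,
    "reliability": 3.6,
    "usability": 4.5,
    "efficiency": 3.9,
}

RULES = [                          # (condition on the course's scores, resulting label)
    (lambda s: min(s.values()) >= 4.0, "excellent"),
    (lambda s: sum(s.values()) / len(s) >= 3.5, "good"),
    (lambda s: sum(s.values()) / len(s) >= 2.5, "fair"),
    (lambda s: True, "poor"),
]

def assess(scores):
    for condition, label in RULES:
        if condition(scores):
            return label

print("course quality:", assess(responses))   # -> "good" (reliability drags it below "excellent")
```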

  6. Deep Learning Methods for Underwater Target Feature Extraction and Recognition

    Directory of Open Access Journals (Sweden)

    Gang Hu

    2018-01-01

    Full Text Available The classification and recognition of underwater acoustic signals have always been an important research topic in the field of underwater acoustic signal processing. Currently, the wavelet transform, the Hilbert-Huang transform, and Mel frequency cepstral coefficients are used as methods of underwater acoustic signal feature extraction. In this paper, a method for feature extraction and identification of underwater noise data based on a CNN and an ELM is proposed: an automatic feature extraction method for underwater acoustic signals using a deep convolutional network, and an underwater target recognition classifier based on an extreme learning machine. Although convolutional neural networks can perform both feature extraction and classification, that function mainly relies on a fully connected layer trained by gradient descent, whose generalization ability is limited and suboptimal, so an extreme learning machine (ELM) was used in the classification stage. Firstly, the CNN learns deep and robust features, after which the fully connected layers are removed. Then an ELM fed with the CNN features is used as the classifier to perform the classification. Experiments on an actual data set of civil ships obtained a 93.04% recognition rate; compared to traditional Mel frequency cepstral coefficients and Hilbert-Huang features, the recognition rate is greatly improved.
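    A minimal sketch of the ELM classification stage, a random hidden layer followed by output weights solved by regularized least squares, applied to random stand-in vectors in place of real CNN features:

```python
import numpy as np

# ELM sketch: fixed random hidden layer, output weights solved in closed form by
# (regularized) least squares. The feature matrix is a random stand-in for the deep
# features a trained CNN would produce; the CNN itself is omitted.

rng = np.random.default_rng(7)
n, d, classes, hidden = 300, 128, 4, 256
X = rng.normal(size=(n, d))                        # stand-in CNN feature vectors
y = rng.integers(0, classes, size=n)
T = np.eye(classes)[y]                             # one-hot targets

W_in = rng.normal(size=(d, hidden))                # random, fixed input weights
b = rng.normal(size=hidden)
H = np.tanh(X @ W_in + b)                          # hidden-layer activations

lam = 1e-2                                         # ridge term for numerical stability
beta = np.linalg.solve(H.T @ H + lam * np.eye(hidden), H.T @ T)   # output weights

pred = np.argmax(np.tanh(X @ W_in + b) @ beta, axis=1)
print("training accuracy on random stand-in features:", (pred == y).mean())
```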

  7. Current lipid extraction methods are significantly enhanced adding a water treatment step in Chlorella protothecoides.

    Science.gov (United States)

    Ren, Xiaojie; Zhao, Xinhe; Turcotte, François; Deschênes, Jean-Sébastien; Tremblay, Réjean; Jolicoeur, Mario

    2017-02-11

    Microalgae have the potential to rapidly accumulate lipids of high interest for the food, cosmetics, pharmaceutical and energy (e.g. biodiesel) industries. However, current lipid extraction methods show efficiency limitation and until now, extraction protocols have not been fully optimized for specific lipid compounds. The present study thus presents a novel lipid extraction method, consisting in the addition of a water treatment of biomass between the two-stage solvent extraction steps of current extraction methods. The resulting modified method not only enhances lipid extraction efficiency, but also yields a higher triacylglycerols (TAG) ratio, which is highly desirable for biodiesel production. Modification of four existing methods using acetone, chloroform/methanol (Chl/Met), chloroform/methanol/H 2 O (Chl/Met/H 2 O) and dichloromethane/methanol (Dic/Met) showed respective lipid extraction yield enhancement of 72.3, 35.8, 60.3 and 60.9%. The modified acetone method resulted in the highest extraction yield, with 68.9 ± 0.2% DW total lipids. Extraction of TAG was particularly improved with the water treatment, especially for the Chl/Met/H 2 O and Dic/Met methods. The acetone method with the water treatment led to the highest extraction level of TAG with 73.7 ± 7.3 µg/mg DW, which is 130.8 ± 10.6% higher than the maximum value obtained for the four classical methods (31.9 ± 4.6 µg/mg DW). Interestingly, the water treatment preferentially improved the extraction of intracellular fractions, i.e. TAG, sterols, and free fatty acids, compared to the lipid fractions of the cell membranes, which are constituted of phospholipids (PL), acetone mobile polar lipids and hydrocarbons. Finally, from the 32 fatty acids analyzed for both neutral lipids (NL) and polar lipids (PL) fractions, it is clear that the water treatment greatly improves NL-to-PL ratio for the four standard methods assessed. Water treatment of biomass after the first solvent extraction step

  8. Effects of Extraction Methods on Phytochemicals of Rice Bran Oils Produced from Colored Rice.

    Science.gov (United States)

    Mingyai, Sukanya; Srikaeo, Khongsak; Kettawan, Aikkarach; Singanusong, Riantong; Nakagawa, Kiyotaka; Kimura, Fumiko; Ito, Junya

    2018-02-01

    Rice bran oil (RBO), especially from colored rice, is rich in phytochemicals and has become popular in food, cosmetic, nutraceutical and pharmaceutical applications owing to the health benefits it offers. This study determined the contents of phytochemicals, including oryzanols, phytosterols, tocopherols (Toc) and tocotrienols (T3), in RBOs extracted using different methods, namely cold-press extraction (CPE), solvent extraction (SE) and supercritical CO2 extraction (SC-CO2). Two colored rices, Red Jasmine rice (RJM, red rice) and Hom-nin rice (HN, black rice), were studied in comparison with the popular Thai fragrant rice Khao Dawk Mali 105 (KDML 105, white rice). RBOs were found to be a rich source of oryzanols, phytosterols, Toc and T3. Rice variety had a greater effect on the phytochemical concentrations than the extraction method. HN rice showed the significantly highest concentration of all phytochemicals, followed by RJM and KDML 105 rice, indicating that colored rice contained higher concentrations of phytochemicals in the oil than non-colored rice. The RBO samples extracted by the CPE method had a greater concentration of phytochemicals than those extracted by the SC-CO2 and SE methods, respectively. In terms of phytochemical contents, HN rice extracted using the CPE method was found to be the best.

  9. Effect of extraction method on the yield of furanocoumarins from fruits of Archangelica officinalis Hoffm.

    Science.gov (United States)

    Waksmundzka-Hajnos, M; Petruczynik, A; Dragan, A; Wianowska, D; Dawidowicz, A L

    2004-01-01

    Optimal conditions for the extraction and analysis of furanocoumarins from fruits of Archangelica officinalis Hoffm. have been determined. The following extraction methods were used: exhaustive extraction in a Soxhlet apparatus, ultrasonication at 25 and 60 degrees C, microwave-assisted solvent extraction in open and closed systems, and accelerated solvent extraction (ASE). In most cases the yields of furanocoumarins were highest using the ASE method. The effects of extracting solvent, temperature and time of extraction using this method were investigated. The highest yield of furanocoumarins by ASE was obtained with methanol at 100-130 degrees C for 10 min. The extraction yields of furanocoumarins from plant material by ultrasonication at 60 degrees C and microwave-assisted solvent extraction in an open system were comparable to the extraction yields obtained in the time- and solvent-consuming exhaustive process involving the Soxhlet apparatus.

  10. Comparison of two methods for extraction of volatiles from marine PL emulsions

    DEFF Research Database (Denmark)

    Lu, Henna Fung Sieng; Nielsen, Nina Skall; Jacobsen, Charlotte

    2013-01-01

    The dynamic headspace (DHS) thermal desorption principle using a Tenax GR tube, as well as the solid phase micro-extraction (SPME) tool with a carboxen/polydimethylsiloxane 50/30 µm CAR/PDMS SPME fiber, both coupled to GC/MS, were implemented for the isolation and identification of both lipid...... and Strecker derived volatiles in marine phospholipid (PL) emulsions. The volatile extraction efficiencies of the two methods were compared. For marine PL emulsions with a highly complex headspace composition of volatiles, a fiber saturation problem was encountered when using CAR/PDMS-SPME for volatiles...... analysis. However, the CAR/PDMS-SPME technique was efficient for lipid oxidation analysis in emulsions with a less complex headspace. The SPME method extracted volatiles of lower molecular weight more efficiently than the DHS method. On the other hand, DHS Tenax GR appeared to be more efficient in extracting...

  11. A Financial Data Mining Model for Extracting Customer Behavior

    Directory of Open Access Journals (Sweden)

    Mark K.Y. Mak

    2011-08-01

    Full Text Available Facing the problem of the variable and chaotic behavior of customers, the lack of sufficient information is a challenge to many business organizations. Human analysts who lack an understanding of the hidden patterns in business data can thus miss corporate business opportunities. In order to embrace all business opportunities and enhance competitiveness, the discovery of hidden knowledge, unexpected patterns and useful rules from large databases has provided a feasible solution for several decades. While there is a wide range of financial analysis products in the financial market, how to customize the investment portfolio for the customer is still a challenge to many financial institutions. This paper aims at developing an intelligent Financial Data Mining Model (FDMM) for extracting customer behavior in the financial industry, so as to increase the availability of decision support data and hence increase customer satisfaction. The proposed financial model first clusters the customers into several sectors and then finds the correlation among these sectors. It is noted that better customer segmentation can increase the ability to identify targeted customers, so extracting useful rules for specific clusters can provide an insight into customers' buying behavior and marketing implications. To validate the feasibility of the proposed model, a simple dataset was collected from a financial company in Hong Kong. The simulation experiments show that the proposed method not only can improve the workflow of a financial company, but also deepen understanding of investment behavior. Thus, a corporation is able to customize the most suitable products and services for its customers on the basis of the extracted rules.
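
    The record above describes a two-stage pipeline (cluster customers into sectors, then analyse the correlation among those sectors before extracting rules). A minimal sketch of that idea is given below; it uses scikit-learn's KMeans on a purely synthetic feature matrix, and the feature names, cluster count and correlation step are illustrative assumptions rather than the paper's actual model.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Hypothetical customer features: [average balance, trades per month, risk score]
    X = rng.normal(size=(300, 3))

    # Stage 1: cluster the customers into sectors (the cluster count is arbitrary here).
    kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
    labels = kmeans.labels_

    # Stage 2: correlate the sectors' mean feature profiles to see which customer
    # segments behave alike, as a starting point for extracting per-sector rules.
    profiles = np.vstack([X[labels == k].mean(axis=0) for k in range(4)])
    print(np.round(np.corrcoef(profiles), 2))
    ```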

  12. Hierarchical extraction of urban objects from mobile laser scanning data

    Science.gov (United States)

    Yang, Bisheng; Dong, Zhen; Zhao, Gang; Dai, Wenxia

    2015-01-01

    Point clouds collected in urban scenes contain a huge number of points (e.g., billions), numerous objects with significant size variability, complex and incomplete structures, and variable point densities, raising great challenges for the automated extraction of urban objects in the field of photogrammetry, computer vision, and robotics. This paper addresses these challenges by proposing an automated method to extract urban objects robustly and efficiently. The proposed method generates multi-scale supervoxels from 3D point clouds using the point attributes (e.g., colors, intensities) and spatial distances between points, and then segments the supervoxels rather than individual points by combining graph based segmentation with multiple cues (e.g., principal direction, colors) of the supervoxels. The proposed method defines a set of rules for merging segments into meaningful units according to types of urban objects and forms the semantic knowledge of urban objects for the classification of objects. Finally, the proposed method extracts and classifies urban objects in a hierarchical order ranked by the saliency of the segments. Experiments show that the proposed method is efficient and robust for extracting buildings, streetlamps, trees, telegraph poles, traffic signs, cars, and enclosures from mobile laser scanning (MLS) point clouds, with an overall accuracy of 92.3%.

  13. Comparison of RNA Extraction Methods for the Identification of Grapevine fanleaf virus

    Directory of Open Access Journals (Sweden)

    Z. Gholampour

    2016-06-01

    Full Text Available Introduction: To date, more than 70 viral diseases have been reported from grapevine. Serological methods are the regular diagnostic tools for grapevine viruses; however, their sensitivity is affected by seasonal fluctuations of the virus. Reverse transcription polymerase chain reaction provides a significant improvement in the detection of grapevine viruses. Extraction of high-quality RNA is essential for the successful application of many molecular techniques, such as RT-PCR. Extraction of high-quality RNA from the leaves of woody plants, such as grapevine, is particularly challenging because of high concentrations of polysaccharides, polyphenols, and other secondary metabolites. Some RNA extraction methods yield pellets that are poorly soluble, indicating the presence of unknown contaminants, whereas others are gelatinous, indicating the presence of polysaccharides. RNA can form complexes with polysaccharides and phenolic compounds that render it unusable for applications such as reverse transcription. Grapevine fanleaf virus is a member of the genus Nepovirus in the family Secoviridae. The GFLV genome consists of two positive-sense single-stranded RNAs. The genome has a poly(A) tail at the 3´ terminus and a covalently linked VPg protein at the 5´ terminus. Several extraction methods have been reported for the identification of GFLV in grapevine. Some require harmful chemicals; the main disadvantage of others is high cost. Immunocapture RT-PCR requires the preparation of a specific antibody, and direct binding RT-PCR (DB-RT-PCR) has a high contamination risk. In this study, four RNA extraction protocols were compared with a commercial isolation kit to explore the most efficient RNA isolation method for grapevines. Material and Methods: 40 leaf samples were randomly collected during the growing season of 2011-2012. GFLV was detected in leaf samples by enzyme-linked immunosorbent assay (ELISA) using specific antibodies raised against Iranian

  14. Rule-guided human classification of Volunteered Geographic Information

    Science.gov (United States)

    Ali, Ahmed Loai; Falomir, Zoe; Schmid, Falko; Freksa, Christian

    2017-05-01

    During the last decade, web technologies and location-sensing devices have evolved, generating a form of crowdsourcing known as Volunteered Geographic Information (VGI). VGI acts as a platform for spatial data collection, in particular when a group of public participants is involved in collaborative mapping activities: they work together to collect, share, and use information about geographic features. VGI exploits participants' local knowledge to produce rich data sources. However, the resulting data inherit problematic classification. In VGI projects, the challenges of data classification are due to the following: (i) the data are prone to subjective classification, (ii) most projects rely on remote contributions and flexible contribution mechanisms, and (iii) spatial data are uncertain and geographic features have non-strict definitions. These factors lead to various forms of problematic classification: inconsistent, incomplete, and imprecise data classification. This research addresses classification appropriateness. Whether the classification of an entity is appropriate or inappropriate is related to quantitative and/or qualitative observations. Small differences between observations may not be recognizable, particularly for non-expert participants. Hence, in this paper, the problem is tackled by developing a rule-guided classification approach. This approach exploits the data mining technique of Association Classification (AC) to extract descriptive (qualitative) rules for specific geographic features. The rules are extracted based on the investigation of qualitative topological relations between target features and their context. Afterwards, the extracted rules are used to develop a recommendation system able to guide participants to the most appropriate classification. The approach proposes two scenarios to guide participants towards enhancing the quality of data classification. An empirical study is conducted to investigate the classification of grass
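
    As a rough illustration of the association-classification step described above, the sketch below counts how often a qualitative topological relation co-occurs with an assigned feature class and keeps relation-to-class rules that pass minimum support and confidence thresholds. The tiny contribution list and the thresholds are invented for illustration and are not taken from the paper.

    ```python
    from collections import Counter

    # Hypothetical VGI contributions: (topological relation to context, assigned class)
    contributions = [
        ("inside:park", "grass"), ("inside:park", "grass"), ("inside:park", "forest"),
        ("touches:road", "parking"), ("touches:road", "parking"), ("inside:park", "grass"),
    ]

    MIN_SUPPORT, MIN_CONF = 2, 0.6
    pair_counts = Counter(contributions)
    relation_counts = Counter(rel for rel, _ in contributions)

    rules = []
    for (rel, cls), n in pair_counts.items():
        confidence = n / relation_counts[rel]
        if n >= MIN_SUPPORT and confidence >= MIN_CONF:
            rules.append((rel, cls, n, round(confidence, 2)))

    # Each retained rule can then be used to recommend the most appropriate class
    # to a contributor whose new feature exhibits the same topological relation.
    print(rules)
    ```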

  15. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  16. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    The inconsistency of firewall/VPN (Virtual Private Network) rules imposes a huge maintenance cost. With the development of multinational companies, SOHO offices and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether stand-alone or networked, will grow in geometric progression accordingly. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation, based on set theory, is proposed, and a rule validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
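
    A minimal sketch of the kind of set-based consistency check the abstract alludes to: each filtering rule is modelled as a set of protocols plus a port range and an action, and a later rule is flagged as shadowed when an earlier rule with a different action already covers all of its traffic. The rule table and the shadowing criterion are simplified assumptions, not the paper's formalization.

    ```python
    # Each rule: (name, action, protocols, port_range)
    rules = [
        ("r1", "deny",  {"tcp"},        range(1, 1024)),
        ("r2", "allow", {"tcp"},        range(80, 81)),     # shadowed by r1
        ("r3", "allow", {"udp", "tcp"}, range(5000, 5010)),
    ]

    def covers(a, b):
        """True if rule a matches every packet that rule b matches."""
        return set(b[2]) <= set(a[2]) and set(b[3]) <= set(a[3])

    for i, later in enumerate(rules):
        for earlier in rules[:i]:
            if earlier[1] != later[1] and covers(earlier, later):
                print(f"{later[0]} is shadowed by {earlier[0]}: it can never take effect")
    ```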

  17. Discovering H-bonding rules in crystals with inductive logic programming.

    Science.gov (United States)

    Ando, Howard Y; Dehaspe, Luc; Luyten, Walter; Van Craenenbroeck, Elke; Vandecasteele, Henk; Van Meervelt, Luc

    2006-01-01

    In the domain of crystal engineering, various schemes have been proposed for the classification of hydrogen bonding (H-bonding) patterns observed in 3D crystal structures. In this study, the aim is to complement these schemes with rules that predict H-bonding in crystals from 2D structural information only. Modern computational power and the advances in inductive logic programming (ILP) can now provide computational chemistry with the opportunity for extracting structure-specific rules from large databases that can be incorporated into expert systems. ILP technology is here applied to H-bonding in crystals to develop a self-extracting expert system utilizing data in the Cambridge Structural Database of small molecule crystal structures. A clear increase in performance was observed when the ILP system DMax was allowed to refer to the local structural environment of the possible H-bond donor/acceptor pairs. This ability distinguishes ILP from more traditional approaches that build rules on the basis of global molecular properties.

  18. The Extraction of Heavy Metals by Means of a New Electrolytic Method

    International Nuclear Information System (INIS)

    Guiragossian, Z. G.; Martoyan, G. A.; Injeyan, S. G.; Tonikyan, S. G.; Nalbandyan, G. G.

    2003-01-01

    In known metallurgical methods, the extraction of metals is pursued on the basis of separating as much as possible of the desired metal's content from the ore concentrate in the most economical manner. When these principles are applied to the extraction of heavy metals, however, the related environmental requirements are not readily met. Today, an acceptable extraction technology for metals must achieve deep separation of metals from their source in a manner that is both economical and environmentally safe. This is the direction of our ongoing research and development, which includes the field of environmental remediation. Earlier, we successfully addressed, in an environmentally safe manner, the selective extraction of radioactive isotopes from liquid radioactive wastes produced at Armenia's Metzamor Nuclear Power Plant and implemented a functioning LRW station at the NPP. Currently, we have extended our new electrodialysis-based electrolytic method, at laboratory scale, to the extraction and deep separation of different metals, including the heavy metals. Our new method, its efficiency, economy and full compliance with environmental requirements will be presented

  19. METHODS FOR PORE WATER EXTRACTION FROM UNSATURATED ZONE TUFF, YUCCA MOUNTAIN, NEVADA

    International Nuclear Information System (INIS)

    K.M. SCOFIELD

    2006-01-01

    Assessing the performance of the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, requires an understanding of the chemistry of the water that moves through the host rock. The uniaxial compression method used to extract pore water from samples of tuffaceous borehole core was successful only for nonwelded tuff. An ultracentrifugation method was adopted to extract pore water from samples of the densely welded tuff of the proposed repository horizon. Tests were performed using both methods to determine the efficiency of pore water extraction and the potential effects on pore water chemistry. Test results indicate that uniaxial compression is most efficient for extracting pore water from nonwelded tuff, while ultracentrifugation is more successful in extracting pore water from densely welded tuff. Pore water splits taken from a single nonwelded tuff core during uniaxial compression tests have shown changes in pore water chemistry with increasing pressure for calcium, chloride, sulfate, and nitrate, while the chemistry of pore water splits from welded and nonwelded tuffs using ultracentrifugation indicates that there is no significant fractionation of solutes

  20. A comparison of tissue preparation methods for protein extraction of cocoa (Theobroma cacao L. pod

    Directory of Open Access Journals (Sweden)

    Ascensión Martínez-Márquez

    2017-04-01

    Full Text Available Cocoa (Theobroma cacao L.) is one of the main tropical industrial crops. Cocoa has a very high level of interfering substances, such as polysaccharides and phenolic compounds, that can prevent the isolation of suitable protein. Efficient methods of protein extraction are a priority for the successful application of proteomic analyses. We compared and evaluated two methods (A and B) of tissue preparation for total protein extraction by a phenol/SDS extraction protocol. The difference between the two methods was that extensively washed dry powder of pod tissue was prepared in Method A, whereas crude extract was prepared in Method B. Extracted proteins were examined using one-dimensional electrophoresis (1-D). Results show that each extraction method isolated a unique subset of the cocoa pod proteome. Principal component analysis showed little variation in the data obtained using Method A, while the data obtained using Method B showed low reproducibility, thus demonstrating that Method A is reliable for preparing cocoa pod proteins. The protocol is expected to be applicable to other recalcitrant plant tissues and to be of interest to laboratories involved in plant proteomics analyses. A combination of extraction approaches is recommended for increasing proteome coverage when using gel-based isolation techniques.

  1. Chemical composition of mate tea leaves (Ilex paraguariensis): a study of extraction methods.

    Science.gov (United States)

    Assis Jacques, Rosângela; dos Santos Freitas, Lisiane; Flores Peres, Valéria; Dariva, Cláudio; de Oliveira, José Vladimir; Bastos Caramão, Elina

    2006-12-01

    The objective of this work was to investigate the extraction of Ilex paraguariensis leaves by means of three extraction techniques: pressurized liquid extraction (PLE, also called accelerated solvent extraction, ASE), maceration, and sonication. Samples of mate tea leaves were collected from an experiment conducted under agronomic control at Indústria e Comércio de Erva-Mate Barão LTDA, Brazil. Six solvents with increasing polarities (n-hexane, toluene, dichloromethane, ethyl acetate, acetone, and methanol) were used in this investigation. Chemical analysis of the extracts was performed by GC coupled with a mass spectrometric detector. Identification and quantification were accomplished by coinjection of certified standards. The results showed that no significant differences in the quality of the extracts were noticed with respect to the extraction methods. On the other hand, the PLE technique was found to be more effective for the extraction of caffeine, phytol, and palmitic and stearic acids. The use of PLE led to a significant decrease in the total extraction time, solvent consumption, and sample manipulation compared to the maceration and ultrasound-assisted extraction methods.

  2. QCD Sum Rules, a Modern Perspective

    CERN Document Server

    Colangelo, Pietro; Colangelo, Pietro; Khodjamirian, Alexander

    2001-01-01

    An introduction to the method of QCD sum rules is given for those who want to learn how to use this method. Furthermore, we discuss various applications of sum rules, from the determination of quark masses to the calculation of hadronic form factors and structure functions. Finally, we explain the idea of the light-cone sum rules and outline the recent development of this approach.
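
    For orientation only, the starting point of the method reviewed above can be written schematically as a two-point correlator matched to its dispersion representation (textbook-level notation, up to subtraction terms; not a formula quoted from this particular review):

    ```latex
    \Pi(q^2) = i\int d^4x\, e^{iq\cdot x}\,\langle 0|\,T\{\,j(x)\,j^\dagger(0)\,\}\,|0\rangle,
    \qquad
    \Pi(q^2) = \frac{1}{\pi}\int_{s_{\min}}^{\infty}\frac{\operatorname{Im}\Pi(s)}{s - q^2 - i\epsilon}\,ds .
    ```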

  3. Symbolic methods for the evaluation of sum rules of Bessel functions

    International Nuclear Information System (INIS)

    Babusci, D.; Dattoli, G.; Górska, K.; Penson, K. A.

    2013-01-01

    The use of the umbral formalism allows a significant simplification of the derivation of sum rules involving products of special functions and polynomials. We rederive in this way known sum rules and addition theorems for Bessel functions. Furthermore, we obtain a set of new closed form sum rules involving various special polynomials and Bessel functions. The examples we consider are relevant for applications ranging from plasma physics to quantum optics
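
    As a concrete example of the kind of identities the abstract refers to, a classical Bessel-function addition theorem and the sum rule it implies read, in standard notation (quoted for context, not derived from the paper):

    ```latex
    J_n(x+y) = \sum_{k=-\infty}^{\infty} J_k(x)\,J_{n-k}(y),
    \qquad
    J_0(x)^2 + 2\sum_{k=1}^{\infty} J_k(x)^2 = 1 .
    ```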

  4. Development of a modified cortisol extraction procedure for intermediately sized fish not amenable to whole-body or plasma extraction methods.

    Science.gov (United States)

    Guest, Taylor W; Blaylock, Reginald B; Evans, Andrew N

    2016-02-01

    The corticosteroid hormone cortisol is the central mediator of the teleost stress response. Therefore, the accurate quantification of cortisol in teleost fishes is a vital tool for addressing fundamental questions about an animal's physiological response to environmental stressors. Conventional steroid extraction methods using plasma or whole-body homogenates, however, are inefficient within an intermediate size range of fish that are too small for phlebotomy and too large for whole-body steroid extractions. To assess the potential effects of hatchery-induced stress on survival of fingerling hatchery-reared Spotted Seatrout (Cynoscion nebulosus), we developed a novel extraction procedure for measuring cortisol in intermediately sized fish (50-100 mm in length) that are not amenable to standard cortisol extraction methods. By excising a standardized portion of the caudal peduncle, this tissue extraction procedure allows for a small portion of a larger fish to be sampled for cortisol, while minimizing the potential interference from lipids that may be extracted using whole-body homogenization procedures. Assay precision was comparable to published plasma and whole-body extraction procedures, and cortisol quantification over a wide range of sample dilutions displayed parallelism versus assay standards. Intra-assay %CV was 8.54%, and average recovery of spiked samples was 102%. Also, tissue cortisol levels quantified using this method increase 30 min after handling stress and are significantly correlated with blood values. We conclude that this modified cortisol extraction procedure provides an excellent alternative to plasma and whole-body extraction procedures for intermediately sized fish, and will facilitate the efficient assessment of cortisol in a variety of situations ranging from basic laboratory research to industrial and field-based environmental health applications.

  5. Method for Extracting and Sequestering Carbon Dioxide

    Energy Technology Data Exchange (ETDEWEB)

    Rau, Gregory H.; Caldeira, Kenneth G.

    2005-05-10

    A method and apparatus to extract and sequester carbon dioxide (CO2) from a stream or volume of gas wherein said method and apparatus hydrates CO2, and reacts the resulting carbonic acid with carbonate. Suitable carbonates include, but are not limited to, carbonates of alkali metals and alkaline earth metals, preferably carbonates of calcium and magnesium. Waste products are metal cations and bicarbonate in solution or dehydrated metal salts, which when disposed of in a large body of water provide an effective way of sequestering CO2 from a gaseous environment.
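
    For reference, the chemistry sketched in the abstract is the familiar carbonate-dissolution reaction; written for calcium carbonate (one of the carbonates named in the record), hydrated CO2 reacts as:

    ```latex
    \mathrm{CO_2 + H_2O + CaCO_3 \;\longrightarrow\; Ca^{2+} + 2\,HCO_3^{-}}
    ```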

  6. Study on the Method of Association Rules Mining Based on Genetic Algorithm and Application in Analysis of Seawater Samples

    Directory of Open Access Journals (Sweden)

    Qiuhong Sun

    2014-04-01

    Full Text Available Within data mining research, this paper studies an association rule mining method based on a genetic algorithm. The genetic algorithm is briefly introduced, and two of its important theoretical foundations, the schema (template) theorem and implicit parallelism, are discussed. Focusing on the application of genetic algorithms to association rule mining, the paper proposes improvements to the fitness function structure and the data encoding and, in particular, building on earlier studies, applies improved adaptive crossover and mutation probabilities (Pc, Pm) to the genetic algorithm, thereby improving its efficiency. Finally, the genetic-algorithm-based association rule mining algorithm is applied to data mining of a seawater samples database, which demonstrates its effectiveness.
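
    A small sketch of the two ingredients highlighted above, a rule-quality fitness and adaptive crossover/mutation probabilities, is given below. The support-times-confidence fitness, the adaptive-rate schedule and the toy seawater-style transactions are illustrative assumptions; the paper's exact formulas and encoding are not reproduced.

    ```python
    transactions = [{"pH_high", "salinity_high", "algae_bloom"},
                    {"pH_high", "algae_bloom"},
                    {"salinity_high"},
                    {"pH_high", "salinity_high", "algae_bloom"}]

    def fitness(antecedent, consequent):
        """Rule quality used by the GA: support of the full rule times its confidence."""
        n = len(transactions)
        cover_ante = sum(antecedent <= t for t in transactions)
        cover_rule = sum((antecedent | consequent) <= t for t in transactions)
        if cover_ante == 0:
            return 0.0
        return (cover_rule / n) * (cover_rule / cover_ante)

    def adaptive_rates(f, f_max, f_avg, pc=(0.6, 0.9), pm=(0.01, 0.1)):
        """Adaptive Pc/Pm: below-average individuals keep the full rates, good ones less."""
        if f_max == f_avg or f < f_avg:
            return pc[1], pm[1]
        scale = (f_max - f) / (f_max - f_avg)
        return pc[0] + (pc[1] - pc[0]) * scale, pm[0] + (pm[1] - pm[0]) * scale

    rule = ({"pH_high"}, {"algae_bloom"})
    f = fitness(*rule)
    print(f, adaptive_rates(f, f_max=0.8, f_avg=0.4))
    ```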

  7. A semantic-based method for extracting concept definitions from scientific publications: evaluation in the autism phenotype domain

    OpenAIRE

    Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K

    2013-01-01

    Background A variety of informatics approaches have been developed that use information retrieval, NLP and text-mining techniques to identify biomedical concepts and relations within scientific publications or their sentences. These approaches have not typically addressed the challenge of extracting more complex knowledge such as biomedical definitions. In our efforts to facilitate knowledge acquisition of rule-based definitions of autism phenotypes, we have developed a novel semantic-based t...

  8. Extraction of Stevia rebaudiana bertoni sweetener glycosides by supercritical fluid methods.

    Directory of Open Access Journals (Sweden)

    Juan José Hinojosa-González

    2017-05-01

    Full Text Available Aim. The aim was to evaluate the supercritical carbon dioxide extraction method, with and without the addition of a co-solvent (a water:ethanol mixture) to the system, to obtain the glycosides from leaves of Stevia rebaudiana Bertoni. Methods. An SFT-150 SFE/SFR model with CO2 as the fluid was used for the supercritical extraction. The variables studied were temperature, pressure, extraction time and the presence or absence of the co-solvent (water-ethanol mixture in a concentration of 70:30 v/v), incorporated in different proportions, to determine the effect on yield. The amount of glycoside sweeteners was analyzed by High Performance Liquid Chromatography (HPLC). Results. Pressure was the factor that favored the extraction, which was selective in obtaining Rebaudioside A, with yields no greater than 2%. The inclusion of the co-solvent increased the yield to values of 2.9%. Conclusion. Supercritical CO2, individually and mixed with ethanol-water as a co-solvent, was not efficient to extract Stevia rebaudiana stevioside sweeteners

  9. EXTRACTION OF ASTAXANTHIN ESTERS FROM SHRIMP WASTE BY CHEMICAL AND MICROBIAL METHODS

    Directory of Open Access Journals (Sweden)

    A. Khanafari, A. Saberi, M. Azar, Gh. Vosooghi, Sh. Jamili, B. Sabbaghzadeh

    2007-04-01

    Full Text Available The carotenoid pigments, specifically astaxanthin, have many significant applications in the food, pharmaceutical and cosmetic industries. The goal of this research was the extraction of astaxanthin from the waste of a Persian Gulf shrimp species (Penaeus semisulcatus), and the purification and identification of the pigment by chemical and microbial methods. Microbial fermentation was obtained by inoculation of two Lactobacillus species, Lb. plantarum and Lb. acidophilus, in a culture medium containing shrimp waste powder, with the intervention of lactose sugar, yeast extract, the combination of both, and cold storage (-20 °C). The carotenoids were extracted by an organic solvent system. After purification of astaxanthin by thin layer chromatography, spectrophotometric, NMR and IR analyses confirmed the presence of astaxanthin esters in this specific species of Persian Gulf shrimp. The results of this study showed that storage at -20 °C not only does not amplify the production of astaxanthin but also slightly reduces it. The intervention of lactose sugar was more effective in producing astaxanthin than yeast extract or than the presence of both. The results also indicated that there is little difference in the ability of Lb. plantarum and Lb. acidophilus to produce the pigment, and that the microbial method of astaxanthin extraction is more effective than the chemical method. The pigment extracted from a given amount of shrimp powder was calculated as 23.128 mg/g.

  10. EXTRACTION OF ROOF LINES FROM HIGH-RESOLUTION IMAGES BY A GROUPING METHOD

    Directory of Open Access Journals (Sweden)

    A. P. Dal Poz

    2016-06-01

    Full Text Available This paper proposes a method for extracting groups of straight lines that represent roof boundaries and roof ridgelines from high-resolution aerial images, using corresponding Airborne Laser Scanner (ALS) roof polyhedrons as initial approximations. The proposed method is based on two main steps. First, straight lines that are candidates to represent the roof ridgelines and roof boundaries of a building are extracted from the aerial image. Second, a group of straight lines that represent the roof boundaries and roof ridgelines of a selected building is obtained through the optimization of a Markov Random Field (MRF)-based energy function using the genetic algorithm optimization method. The formulation of this energy function considers several attributes, such as the proximity of the extracted straight lines to the corresponding projected ALS-derived roof polyhedron and rectangularity (extracted straight lines that intersect at nearly 90°). Experimental results are presented and discussed in this paper.

  11. Generating Concise Rules for Human Motion Retrieval

    Science.gov (United States)

    Mukai, Tomohiko; Wakisaka, Ken-Ichi; Kuriyama, Shigeru

    This paper proposes a method for retrieving human motion data with concise retrieval rules based on the spatio-temporal features of motion appearance. Our method first converts motion clip into a form of clausal language that represents geometrical relations between body parts and their temporal relationship. A retrieval rule is then learned from the set of manually classified examples using inductive logic programming (ILP). ILP automatically discovers the essential rule in the same clausal form with a user-defined hypothesis-testing procedure. All motions are indexed using this clausal language, and the desired clips are retrieved by subsequence matching using the rule. Such rule-based retrieval offers reasonable performance and the rule can be intuitively edited in the same language form. Consequently, our method enables efficient and flexible search from a large dataset with simple query language.

  12. New method for extracting tumors in PET/CT images based on the probability distribution

    International Nuclear Information System (INIS)

    Nitta, Shuhei; Hontani, Hidekata; Hukami, Tadanori

    2006-01-01

    In this report, we propose a method for extracting tumors from PET/CT images by referring to the probability distribution of pixel values in the PET image. In the proposed method, first, the organs that normally take up fluorodeoxyglucose (FDG) (e.g., the liver, kidneys, and brain) are extracted. Then, the tumors are extracted from the images. The distribution of pixel values in PET images differs in each region of the body. Therefore, the threshold for detecting tumors is adaptively determined by referring to the distribution. We applied the proposed method to 37 cases and evaluated its performance. This report also presents the results of experiments comparing the proposed method and another method in which the pixel values are normalized for extracting tumors. (author)
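
    A toy version of the adaptive-threshold idea described above: instead of one global cut-off, the threshold for candidate tumor voxels is derived from the distribution of uptake values within each body region. The synthetic uptake map, the region partition and the percentile are invented for illustration; the paper's organ extraction step is not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    suv = rng.gamma(shape=2.0, scale=1.0, size=(64, 64))            # synthetic PET uptake map
    regions = np.arange(64 * 64).reshape(64, 64) // (64 * 16)       # 4 horizontal body "regions"

    tumor_mask = np.zeros_like(suv, dtype=bool)
    for r in np.unique(regions):
        values = suv[regions == r]
        threshold = np.percentile(values, 99)                       # region-specific cut-off
        tumor_mask |= (regions == r) & (suv > threshold)

    print("candidate tumor voxels:", int(tumor_mask.sum()))
    ```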

  13. EFFECTS OF EXTRACTION METHODS ON PHYSICO-CHEMICAL ...

    African Journals Online (AJOL)

    The relative density value ranged from 0.9 to 0.92 at 29°C (room temperature). Both oil samples were in a liquid state at room temperature, and boiling points varied from 94°C to 98°C for the solvent-extracted oil and the hydraulic-press oil, respectively. The results showed that the method of extraction imposed significant changes on ...

  14. Multifocus Image Fusion in Q-Shift DTCWT Domain Using Various Fusion Rules

    Directory of Open Access Journals (Sweden)

    Yingzhong Tian

    2016-01-01

    Full Text Available Multifocus image fusion is a process that integrates a partially focused image sequence into a fused image which is focused everywhere; multiple methods have been proposed in the past decades. The Dual Tree Complex Wavelet Transform (DTCWT) is one of the most precise ones, eliminating two main defects caused by the Discrete Wavelet Transform (DWT). Q-shift DTCWT was proposed afterwards to simplify the construction of filters in DTCWT, producing better fusion effects. A different image fusion strategy based on Q-shift DTCWT is presented in this work. According to the strategy, each image is first decomposed into low- and high-frequency coefficients, which are fused using different rules, and various fusion rules are then innovatively combined in Q-shift DTCWT, such as the Neighborhood Variant Maximum Selectivity (NVMS) and the Sum Modified Laplacian (SML). Finally, the fused coefficients can be well extracted from the source images and reconstructed to produce one fully focused image. This strategy is verified visually and quantitatively against several existing fusion methods in a large set of experiments and yields good results both on standard images and on microscopic images. Hence, we can draw the conclusion that the NVMS rule is better than the others after Q-shift DTCWT decomposition.
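
    For reference, one of the fusion rules named above, the Sum Modified Laplacian (SML), can be computed as in the generic sketch below (window size, test images and the final selection rule are arbitrary choices, not code from the paper).

    ```python
    import numpy as np

    def sum_modified_laplacian(img, window=3):
        """SML focus measure: modified Laplacian summed over a local window."""
        img = img.astype(float)
        ml = np.zeros_like(img)
        ml[1:-1, 1:-1] = (
            np.abs(2 * img[1:-1, 1:-1] - img[:-2, 1:-1] - img[2:, 1:-1]) +
            np.abs(2 * img[1:-1, 1:-1] - img[1:-1, :-2] - img[1:-1, 2:])
        )
        pad = window // 2
        padded = np.pad(ml, pad, mode="edge")
        out = np.zeros_like(img)
        # Sum the modified Laplacian over the window around each pixel.
        for dy in range(window):
            for dx in range(window):
                out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        return out

    # Keep, pixel by pixel, the source whose SML (focus) is larger.
    a = np.random.rand(32, 32)
    b = np.random.rand(32, 32)
    fused = np.where(sum_modified_laplacian(a) >= sum_modified_laplacian(b), a, b)
    ```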

  15. A SIMPLE METHOD FOR THE EXTRACTION AND QUANTIFICATION OF PHOTOPIGMENTS FROM SYMBIODINIUM SPP.

    Science.gov (United States)

    John E. Rogers and Dragoslav Marcovich. Submitted. Simple Method for the Extraction and Quantification of Photopigments from Symbiodinium spp.. Limnol. Oceanogr. Methods. 19 p. (ERL,GB 1192). We have developed a simple, mild extraction procedure using methanol which, when...

  16. Extraction and determination of arsenic species in leafy vegetables: Method development and application.

    Science.gov (United States)

    Ma, Li; Yang, Zhaoguang; Kong, Qian; Wang, Lin

    2017-02-15

    Extraction of arsenic (As) species from leafy vegetables was investigated using different combinations of methods and extractants. The extracted As species were separated and determined by an HPLC-ICP-MS method. The microwave-assisted method using 1% HNO3 as the extractant exhibited satisfactory efficiency (>90%) at 90°C for 1.5 h. The proposed method was applied to extract As species from real leafy vegetables. Thirteen cultivars of leafy vegetables were collected and analyzed. The predominant species in all the investigated vegetable samples were As(III) and As(V). Moreover, both As(III) and As(V) concentrations were significantly positively (p<0.01) correlated with the total As (tAs) concentration. However, the percentage of As(V) decreased with increasing tAs concentration, probably due to the conversion and transformation of As(V) to As(III) after uptake. The hazard quotient results indicated no particular risk to 94.6% of local consumers. A considerable carcinogenic risk from consumption of the leafy vegetables was observed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Rule set transferability for object-based feature extraction

    NARCIS (Netherlands)

    Anders, N.S.; Seijmonsbergen, Arie C.; Bouten, Willem

    2015-01-01

    Cirques are complex landforms resulting from glacial erosion and can be used to estimate Equilibrium Line Altitudes and infer climate history. Automated extraction of cirques may help research on glacial geomorphology and climate change. Our objective was to test the transferability of an

  18. Source of spill ripple in the RF-KO slow-extraction method with FM and AM

    CERN Document Server

    Noda, K; Shibuya, S; Muramatsu, M; Uesugi, T; Kanazawa, M; Torikoshi, M; Takada, E; Yamada, S

    2002-01-01

    The RF-knockout (RF-KO) slow-extraction method with frequency modulation (FM) and amplitude modulation (AM) has brought high-accuracy irradiation to the treatment of a cancer tumor moving with respiration, because of a quick response to beam start/stop. However, a beam spill extracted from a synchrotron ring through RF-KO slow-extraction has a huge ripple with a frequency of around 1 kHz related to the FM. The spill ripple will disturb the lateral dose distribution in the beam scanning methods. Thus, the source of the spill ripple has been investigated through experiments and simulations. There are two tune regions for the extraction process through the RF-KO method: the extraction region and the diffusion region. The particles in the extraction region can be extracted due to amplitude growth through the transverse RF field, only when its frequency matches with the tune in the extraction region. For a large chromaticity, however, the particles in the extraction region can be extracted through the synchrotron ...

  19. Comparison of methods of DNA extraction for real-time PCR in a model of pleural tuberculosis.

    Science.gov (United States)

    Santos, Ana; Cremades, Rosa; Rodríguez, Juan Carlos; García-Pachón, Eduardo; Ruiz, Montserrat; Royo, Gloria

    2010-01-01

    Molecular methods have been reported to have different sensitivities in the diagnosis of pleural tuberculosis and this may in part be caused by the use of different methods of DNA extraction. Our study compares nine DNA extraction systems in an experimental model of pleural tuberculosis. An inoculum of Mycobacterium tuberculosis was added to 23 pleural liquid samples with different characteristics. DNA was subsequently extracted using nine different methods (seven manual and two automatic) for analysis with real-time PCR. Only two methods were able to detect the presence of M. tuberculosis DNA in all the samples: extraction using columns (Qiagen) and automated extraction with the TNAI system (Roche). The automatic method is more expensive, but requires less time. Almost all the false negatives were because of the difficulty involved in extracting M. tuberculosis DNA, as in general, all the methods studied are capable of eliminating inhibitory substances that block the amplification reaction. The method of M. tuberculosis DNA extraction used affects the results of the diagnosis of pleural tuberculosis by molecular methods. DNA extraction systems that have been shown to be effective in pleural liquid should be used.

  20. Decision mining revisited - Discovering overlapping rules

    NARCIS (Netherlands)

    Mannhardt, Felix; De Leoni, Massimiliano; Reijers, Hajo A.; Van Der Aalst, Wil M P

    2016-01-01

    Decision mining enriches process models with rules underlying decisions in processes using historical process execution data. Choices between multiple activities are specified through rules defined over process data. Existing decision mining methods focus on discovering mutually-exclusive rules,

  1. Decision Mining Revisited - Discovering Overlapping Rules

    NARCIS (Netherlands)

    Mannhardt, F.; De Leoni, M.; Reijers, H.A.; van der Aalst, W.M.P.; Nurcan, S.; Soffer, P.; Bajec, M.; Eder, J.

    2016-01-01

    Decision mining enriches process models with rules underlying decisions in processes using historical process execution data. Choices between multiple activities are specified through rules defined over process data. Existing decision mining methods focus on discovering mutually-exclusive rules,

  2. Genomic DNA extraction method from Annona senegalensis Pers ...

    African Journals Online (AJOL)

    Extraction of DNA from many plants is difficult because of the presence of metabolites that interfere with DNA isolation procedures and downstream applications such as DNA restriction, replication, amplification, as well as cloning. A modified procedure based on the hexadecyltrimethyl ammonium bromide (CTAB) method is ...

  3. 7 CFR 51.1179 - Method of juice extraction.

    Science.gov (United States)

    2010-01-01

    7 CFR 51.1179, Section 51.1179: Method of juice extraction. Regulations of the Department of Agriculture, Agricultural Marketing Service (Standards); United States Standards for Grades of Florida Oranges and Tangelos, Standards for Internal Quality. 2010-01-01.

  4. Comparison of Chemical Extraction Methods for Determination of Soil Potassium in Different Soil Types

    Science.gov (United States)

    Zebec, V.; Rastija, D.; Lončarić, Z.; Bensa, A.; Popović, B.; Ivezić, V.

    2017-12-01

    Determining the potassium supply of soil plays an important role in intensive crop production, since it is the basis for balancing nutrients and issuing fertilizer recommendations for achieving high and stable yields within economic feasibility. The aim of this study was to compare different methods for extracting soil potassium from the arable horizon of different soil types with the ammonium lactate (KAL) method, which is frequently used as an analytical method for determining nutrient availability and is a common basis for fertilizer recommendations in many European countries. In addition to the ammonium lactate method (KAL, pH 3.75), potassium was extracted with ammonium acetate (KAA, pH 7), ammonium acetate ethylenediaminetetraacetic acid (KAAEDTA, pH 4.6), Bray (KBRAY, pH 2.6) and barium chloride (KBaCl2, pH 8.1). The analyzed soils were extremely heterogeneous, with a wide range of determined values. Soil pH (in H2O) ranged from 4.77 to 8.75, organic matter content from 1.87 to 4.94% and clay content from 8.03 to 37.07%. Relative to the KAL standard method, the KBaCl2 method extracted on average 12.9% more soil potassium, while KAA, KAAEDTA and KBRAY extracted on average 5.3%, 10.3% and 27.5% less potassium, respectively. The comparisons between the analyzed extraction methods were highly precise; the most reliable agreement with the KAL method was obtained for KAAEDTA, followed by KAA, KBaCl2 and KBRAY. The extremely significant statistical correlations between the different extraction methods for determining soil potassium indicate that any of the methods can be used to accurately predict the potassium concentration in the soil, and that this research can be used to create a prediction model for potassium concentration based on the different extraction methods.

  5. Efficient method for extracting DNA of parasites causing bovine babesiosis from tick vectors

    Science.gov (United States)

    The southern cattle tick, Rhipicephalus (Boophilus) microplus, is an economically important pest costing animal agriculture billions of dollars worldwide. This research focuses on a comparison of three different tick DNA extraction methods: phenol-chloroform extraction (method 1), a modified version...

  6. Impact of two different commercial DNA extraction methods on BK virus viral load

    Directory of Open Access Journals (Sweden)

    Massimiliano Bergallo

    2016-03-01

    Full Text Available Background and aim: BK virus, a member of the human polyomavirus family, is a worldwide-distributed virus characterized by a seroprevalence rate of 70-90% in the adult population. Monitoring of viral replication is performed by quantifying BK DNA with quantitative polymerase chain reaction. Many different methods can be applied for the extraction of nucleic acid from various specimens. The aim of this study was to assess the impact of two different DNA extraction procedures on BK viral load. Materials and methods: The DNA extraction procedures included the Nuclisens easyMAG platform (bioMerieux, Marcy l'Etoile, France) and manual QIAGEN extraction (QIAGEN, Hilden, Germany). BK DNA quantification was performed by real-time TaqMan PCR using a commercial kit. Results and discussion: The sample capacity, cost and time required were compared for both systems. In conclusion, our results demonstrate that the automated nucleic acid extraction method using Nuclisens easyMAG was superior to the manual protocol (QIAGEN Blood Mini kit) for the extraction of BK virus from serum and urine specimens.

  7. A Method of Road Extraction from High-resolution Remote Sensing Images Based on Shape Features

    Directory of Open Access Journals (Sweden)

    LEI Xiaoqi

    2016-02-01

    Full Text Available Road extraction from high-resolution remote sensing images is an important and difficult task. Since remote sensing images contain complicated information, methods that extract roads using spectral, texture and linear features have certain limitations. In addition, many methods need human intervention to obtain road seeds (semi-automatic extraction), which makes them strongly human-dependent and inefficient. A road-extraction method that uses image segmentation based on the principle of local gray consistency and integrates shape features is proposed in this paper. Firstly, the image is segmented, and linear and curved roads are then obtained by using several object shape features, so that methods which only extract linear roads are rectified. Secondly, road extraction is carried out based on region growing: the road seeds are selected automatically and the road network is extracted. Finally, the extracted roads are regularized by combining the edge information. In the experiments, images with both good gray uniformity of the road and poorly illuminated road surfaces were chosen, and the results show that the method of this study is promising.
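
    A much-simplified sketch of the local-gray-consistency region growing described above: starting from a seed pixel, neighbours are added while their gray value stays close to the running region mean. The image, the seed and the tolerance are placeholders, and the paper's automatic seed selection and shape-feature filtering are not reproduced.

    ```python
    import numpy as np
    from collections import deque

    def region_grow(gray, seed, tol=10.0):
        """Grow a region of locally consistent gray values from a seed pixel."""
        h, w = gray.shape
        mask = np.zeros((h, w), dtype=bool)
        queue = deque([seed])
        mask[seed] = True
        total, count = float(gray[seed]), 1
        while queue:
            y, x = queue.popleft()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                    if abs(float(gray[ny, nx]) - total / count) <= tol:
                        mask[ny, nx] = True
                        total += float(gray[ny, nx])
                        count += 1
                        queue.append((ny, nx))
        return mask

    gray = np.full((100, 100), 200, dtype=np.uint8)
    gray[45:55, :] = 90                          # a dark, road-like strip
    road = region_grow(gray, seed=(50, 0), tol=15.0)
    print("pixels grown:", int(road.sum()))      # roughly the area of the strip
    ```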

  8. Novel Approaches to Extraction Methods in Recovery of Capsaicin from Habanero Pepper (CNPH 15.192).

    Science.gov (United States)

    Martins, Frederico S; Borges, Leonardo L; Ribeiro, Claudia S C; Reifschneider, Francisco J B; Conceição, Edemilson C

    2017-07-01

    The objective of this study was to compare three capsaicin extraction methods, Soxhlet, Ultrasound-assisted Extraction (UAE), and Shaker-assisted Extraction (SAE), applied to Habanero pepper, CNPH 15.192. The parameters evaluated were alcohol degree, extraction time, and solid-solvent ratio, using response surface methodology (RSM). The three parameters were found to be significant (p ...). Soxhlet increased the extraction by 10-25%; however, long extraction times (45 minutes) degraded 2% of the capsaicin. The extraction of capsaicin was influenced by the extraction method and by the operating conditions chosen. The optimized conditions provided savings of time, solvent, and herbal material. Prudent choice of the extraction method is essential to ensure an optimal yield of extract, making the study relevant and the knowledge gained useful for further exploitation and application of this resource. Habanero pepper, line CNPH 15.192, possesses capsaicin at higher levels than other species. Higher ethanolic strength is more suitable for obtaining higher levels of capsaicin. The Box-Behnken design is useful for exploring the best conditions for ultrasound-assisted extraction of capsaicin. Abbreviations used: UAE: Ultrasound-assisted Extraction; SAE: Shaker-assisted Extraction.

  9. IN VITRO ANTIMALARIAL ACTIVITY OF THE EXTRACTS OF ...

    African Journals Online (AJOL)

    Administrator

    a significant inhibition in schizont maturation relative to control (P = 0.05). Ethanolic extract .... Animals were handled according to local rules and regulation of ... graph of percentage mortality converted to probit against log-dose of the extract ...

  10. Advanced RF-KO slow-extraction method for the reduction of spill ripple

    CERN Document Server

    Noda, K; Shibuya, S; Uesugi, T; Muramatsu, M; Kanazawa, M; Takada, E; Yamada, S

    2002-01-01

    Two advanced RF-knockout (RF-KO) slow-extraction methods have been developed at HIMAC in order to reduce the spill ripple for accurate heavy-ion cancer therapy: the dual frequency modulation (FM) method and the separated function method. As a result of simulations and experiments, it was verified that the spill ripple could be considerably reduced using these advanced methods, compared with the ordinary RF-KO method. The dual FM method and the separated function method bring about a low spill ripple within standard deviations of around 25% and of 15% during beam extraction within around 2 s, respectively, which are in good agreement with the simulation results.

  11. Highly efficient DNA extraction method from skeletal remains

    Directory of Open Access Journals (Sweden)

    Irena Zupanič Pajnič

    2011-03-01

    Full Text Available Background: This paper describes in detail the DNA extraction method developed to acquire high-quality DNA from Second World War skeletal remains. The same method is also used for the molecular genetic identification of unknown decomposed bodies in routine forensic casework, where only bones and teeth are suitable for DNA typing. We analysed 109 bones and two teeth from WWII mass graves in Slovenia. Methods: We cleaned the bones and teeth, removed surface contaminants and ground the bones into powder using liquid nitrogen. Prior to isolating the DNA in parallel using the BioRobot EZ1 (Qiagen), the powder was decalcified for three days. The nuclear DNA of the samples was quantified by a real-time PCR method. We acquired autosomal genetic profiles and Y-chromosome haplotypes of the bones and teeth by PCR amplification of microsatellites, as well as mtDNA haplotypes. For the purpose of traceability in the event of contamination, we prepared elimination databases including the nuclear and mtDNA genetic profiles of all persons who had been in contact with the skeletal remains in any way. Results: We extracted up to 55 ng DNA/g from the teeth, up to 100 ng DNA/g from the femurs, up to 30 ng DNA/g from the tibias and up to 0.5 ng DNA/g from the humeri. The typing of autosomal and Y-STR loci was successful in all of the teeth, in 98% of the femurs, and in 75% to 81% of the tibias and humeri. The typing of mtDNA was successful in all of the teeth and in 96% to 98% of the bones. Conclusions: We managed to obtain nuclear DNA suitable for successful STR typing from skeletal remains that were over 60 years old. The DNA extraction method described here has proved to be highly efficient. We obtained 0.8 to 100 ng DNA/g of teeth or bones and complete autosomal DNA genetic profiles, Y-STR haplotypes, and mtDNA haplotypes from only 0.5 g bone and teeth samples.

  12. [Study on biopharmaceutics classification system for Chinese materia medica of extract of Huanglian].

    Science.gov (United States)

    Liu, Yang; Yin, Xiu-Wen; Wang, Zi-Yu; Li, Xue-Lian; Pan, Meng; Li, Yan-Ping; Dong, Ling

    2017-11-01

    One of the advantages of the biopharmaceutics classification system of Chinese materia medica (CMMBCS) is that it expands the level of classification research from single ingredients to the multiple components of a Chinese herb, and from multi-component research to holistic research of the Chinese materia medica. In the present paper, the alkaloids of Huanglian extract were chosen as the main research object to explore how their solubility and intestinal permeability change between single-component and multi-component settings, and to determine the biopharmaceutical classification of Huanglian extract at the holistic level. The typical shake-flask method and HPLC were used to determine the solubility of the individual alkaloids from Huanglian extract. The intestinal absorption of the alkaloids was quantified in a single-pass intestinal perfusion experiment, while the permeability coefficient of Huanglian extract was calculated by a self-defined weight coefficient method. Copyright© by the Chinese Pharmaceutical Association.

  13. Genomic DNA extraction method from pearl millet ( Pennisetum ...

    African Journals Online (AJOL)

    DNA extraction is difficult in a variety of plants because of the presence of metabolites that interfere with DNA isolation procedures and downstream applications such as DNA restriction, amplification, and cloning. Here we describe a modified procedure based on the hexadecyltrimethylammonium bromide (CTAB) method to ...

  14. Efficient point cloud data processing in shipbuilding: Reformative component extraction method and registration method

    Directory of Open Access Journals (Sweden)

    Jingyu Sun

    2014-07-01

    Full Text Available To survive in the current shipbuilding industry, it is of vital importance for shipyards to have the accuracy of ship components evaluated efficiently during most of the manufacturing steps. Evaluating component accuracy by comparing each component's point cloud data, scanned by laser scanners, with the ship's design data in CAD format cannot be processed efficiently when (1) the components extracted from the point cloud data inherently include irregular obstacles, or when (2) the registration of the two data sets has no clear initial direction. This paper presents reformative point cloud data processing methods to solve these problems. K-d tree construction of the point cloud data speeds up the neighbor search for each point. A region growing method performed on the neighbor points of the seed point extracts the continuous part of the component, while curved surface fitting and B-spline curve fitting at the edge of the continuous part recognize neighboring domains of the same component divided by obstacles' shadows. The ICP (Iterative Closest Point) algorithm conducts a registration of the two data sets after the proper registration direction is decided by principal component analysis. In experiments conducted at the shipyard, 200 curved shell plates were extracted from the scanned point cloud data, and registrations were conducted between them and the designed CAD data using the proposed methods for an accuracy evaluation. Results show that the methods proposed in this paper efficiently support accuracy-evaluation-oriented point cloud data processing in practice.
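
    A compact sketch of the nearest-neighbour plus rigid-alignment core of the ICP registration step mentioned above, using a k-d tree for correspondence search and an SVD-based pose update. This is the generic algorithm rather than the authors' implementation, and the point sets are synthetic.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def icp_step(src, dst):
        """One ICP iteration: nearest-neighbour matching, then the best rigid fit (SVD)."""
        _, idx = cKDTree(dst).query(src)
        matched = dst[idx]
        src_c, dst_c = src.mean(axis=0), matched.mean(axis=0)
        H = (src - src_c).T @ (matched - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = dst_c - R @ src_c
        return src @ R.T + t

    rng = np.random.default_rng(0)
    design = rng.uniform(size=(500, 3))                # e.g. points sampled from CAD data
    scan = design + np.array([0.05, -0.02, 0.01])      # the "scanned" plate, shifted
    aligned = scan
    for _ in range(10):
        aligned = icp_step(aligned, design)
    print("mean distance to design:", cKDTree(design).query(aligned)[0].mean())
    ```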

  15. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    Full Text Available After summarizing the advantages and disadvantages of current integral methods, a novel vibration signal integral method based on feature information extraction is proposed. This method takes full advantage of the self-adaptive filtering characteristic and waveform correction capability of ensemble empirical mode decomposition in dealing with nonlinear and nonstationary signals. The research merges the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction, and the values of these four indexes are combined into a feature vector. The characteristic components contained in the vibration signal are then accurately extracted by a Euclidean distance search, and the desired integral signals are precisely reconstructed. With this method, the interference from invalid signal components such as trend items and noise, which plagues traditional methods, is effectively removed; the large cumulative error of the traditional time-domain integral is overcome, and the large low-frequency error of the traditional frequency-domain integral is avoided. Compared with traditional integral methods, this method is outstanding at removing noise while retaining useful feature information, and it shows higher accuracy.
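
    The four indexes named above can be packed into one feature vector per candidate component and compared by Euclidean distance; the sketch below does exactly that for synthetic signals. The zero reference for the mean square error, the use of the largest singular value of a simple trajectory matrix and the test signals are illustrative choices, not the paper's exact definitions.

    ```python
    import numpy as np

    def feature_vector(x):
        """[kurtosis, mean square error vs. zero, energy, largest singular value]."""
        x = np.asarray(x, dtype=float)
        centred = x - x.mean()
        kurt = np.mean(centred ** 4) / (np.mean(centred ** 2) ** 2 + 1e-12)
        mse = np.mean(x ** 2)                     # MSE against a zero reference signal
        energy = np.sum(x ** 2)
        traj = np.lib.stride_tricks.sliding_window_view(x, 16)   # simple trajectory matrix
        sv_max = np.linalg.svd(traj, compute_uv=False)[0]
        return np.array([kurt, mse, energy, sv_max])

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 256)
    reference = np.sin(2 * np.pi * 25 * t)
    candidates = [reference + 0.05 * rng.normal(size=t.size),   # useful component
                  0.3 * rng.normal(size=t.size)]                # noise-like component
    ref_vec = feature_vector(reference)
    distances = [np.linalg.norm(feature_vector(c) - ref_vec) for c in candidates]
    print(distances)   # the useful component lies much closer in feature space
    ```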

  16. An effective method for extraction and polymerase chain reaction ...

    African Journals Online (AJOL)

    The PCR amplification with the NaOH and PBS treatment had a success rate of 30 to 100% for both mitochondrial and nuclear markers. The PBS method is the best method for extracting DNA from samples preserved in formalin for longer periods (two to seven years) because of its higher success rate in amplifying mitochondrial ...

  17. [DNA extraction from decomposed tissue by double-digest and magnetic beads methods].

    Science.gov (United States)

    Yang, Dian; Liu, Chao; Liu, Hong

    2011-12-01

    To study the effectiveness of the double-digest and magnetic beads method for DNA extraction from 3 types of decomposed tissues. DNA from the cartilage, nails and joint capsules of 91 highly decomposed corpses, which had not been successfully extracted by the common magnetic beads method, was prepared with the double-digest and magnetic beads method, quantified with the Quantifiler kit, and then amplified with the Sinofiler or Minifiler kit. The DNA concentration extracted from the 91 highly decomposed cartilage, nail and joint capsule samples was 0-0.225 ng/microL. Sixty-two samples whose DNA concentration was more than 0.020 ng/microL yielded 9 or more STR loci successfully, giving a detection rate of 68.13%. The success rate of STR genotyping for the 3 types of decomposed tissues can be significantly improved by the double-digest and magnetic beads method.

  18. Effect of extraction methods on property and bioactivity of water-soluble polysaccharides from Amomum villosum.

    Science.gov (United States)

    Yan, Yajuan; Li, Xia; Wan, Mianjie; Chen, Jingping; Li, Shijie; Cao, Man; Zhang, Danyan

    2015-03-06

    In the present study, the effect of different extraction methods on the properties and bioactivity of water-soluble polysaccharides (WSP) from the seeds of Amomum villosum was investigated. First, four different extraction methods were used to extract WSP: hot water extraction (HWE), ultrasonic-assisted extraction (UAE), microwave-assisted extraction (MAE) and enzyme-assisted extraction (EAE). As a result, four WSP samples, WSP(H), WSP(U), WSP(M) and WSP(E), were obtained. The differences between the four WSP samples in yield, characterization and in vitro antioxidant activity were then compared. Experimental results showed that the four WSP samples had the same monosaccharide composition with only minor differences in content, and all had IR spectra typical of polysaccharides. WSP(U) contained the highest contents of uronic acid and sulfate; its yield was the highest and its antioxidant activity was the best. These results suggest that ultrasonic-assisted extraction was the best extraction method for WSP. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. A MISO-ARX-Based Method for Single-Trial Evoked Potential Extraction

    Directory of Open Access Journals (Sweden)

    Nannan Yu

    2017-01-01

    Full Text Available In this paper, we propose a novel method for solving the single-trial evoked potential (EP estimation problem. In this method, the single-trial EP is considered as a complex containing many components, which may originate from different functional brain sites; these components can be distinguished according to their respective latencies and amplitudes and are extracted simultaneously by multiple-input single-output autoregressive modeling with exogenous input (MISO-ARX. The extraction process is performed in three stages: first, we use a reference EP as a template and decompose it into a set of components, which serve as subtemplates for the remaining steps. Then, a dictionary is constructed with these subtemplates, and EPs are preliminarily extracted by sparse coding in order to roughly estimate the latency of each component. Finally, the single-trial measurement is parametrically modeled by MISO-ARX while characterizing spontaneous electroencephalographic activity as an autoregression model driven by white noise and with each component of the EP modeled by autoregressive-moving-average filtering of the subtemplates. Once optimized, all components of the EP can be extracted. Compared with ARX, our method has greater tracking capabilities of specific components of the EP complex as each component is modeled individually in MISO-ARX. We provide exhaustive experimental results to show the effectiveness and feasibility of our method.
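    The exogenous-input regression at the heart of the approach can be illustrated with a plain least-squares MISO-ARX fit. The sketch below assumes illustrative model orders and synthetic data; the paper's full pipeline (template decomposition, sparse-coding latency estimation, ARMA component filtering) is not reproduced.

```python
# Sketch: ordinary least-squares fit of a multiple-input single-output ARX model
#   y[n] = sum_i a_i*y[n-i] + sum_k sum_j b_kj*x_k[n-j] + e[n]
# Model orders and the toy data are illustrative; the paper's full pipeline
# (template decomposition, sparse-coding latency estimation) is not shown.
import numpy as np

def fit_miso_arx(y, inputs, na=2, nb=2):
    """y: (N,) output; inputs: list of (N,) exogenous signals. Returns (a, b)."""
    N, start = len(y), max(na, nb)
    rows, targets = [], []
    for n in range(start, N):
        past_y = y[n - na:n][::-1]                      # y[n-1] ... y[n-na]
        past_x = [x[n - nb:n][::-1] for x in inputs]    # x_k[n-1] ... x_k[n-nb]
        rows.append(np.concatenate([past_y, *past_x]))
        targets.append(y[n])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta[:na], theta[na:].reshape(len(inputs), nb)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x1, x2 = rng.standard_normal(500), rng.standard_normal(500)
    y = np.zeros(500)
    for n in range(2, 500):                             # known toy system
        y[n] = (0.5 * y[n-1] - 0.2 * y[n-2]
                + 0.8 * x1[n-1] + 0.3 * x2[n-2]
                + 0.01 * rng.standard_normal())
    a, b = fit_miso_arx(y, [x1, x2])
    print("AR coefficients:", np.round(a, 2))           # ~ [0.5, -0.2]
    print("input coefficients:", np.round(b, 2))        # ~ [[0.8, 0], [0, 0.3]]
```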

  20. a Landmark Extraction Method Associated with Geometric Features and Location Distribution

    Science.gov (United States)

    Zhang, W.; Li, J.; Wang, Y.; Xiao, Y.; Liu, P.; Zhang, S.

    2018-04-01

    Landmarks play an important role in spatial cognition and spatial knowledge organization. Significance-measuring models are the main method of landmark extraction, but it is difficult for them to account for the spatial distribution pattern of landmarks because landmark significance is defined in a one-dimensional space. In this paper, starting from the geometric features of ground objects, an extraction method based on target height, target gap and field of view is proposed. Based on the influence regions of a Voronoi diagram, the target gap is described as a geometric representation of the distribution of adjacent targets. A segmentation of the visual domain over Voronoi k-order adjacency is then used to establish the target view under multiple viewpoints; finally, the landmarks are identified through the three weighted geometric features. Comparative experiments show that the results of this method largely coincide with those of a traditional significance-measuring model, which verifies its effectiveness and reliability, and that it reduces the complexity of the landmark extraction process without losing the reference value of the landmarks.
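    An illustrative sketch of combining weighted geometric features into a landmark score, using a Voronoi diagram to describe the gap to adjacent targets. The feature definitions (mean Voronoi-neighbour distance as the gap, footprint area as a visibility proxy) and the weights are assumptions for illustration, not the paper's exact formulation.

```python
# Sketch: weighted landmark scoring from three geometric features.
# Feature definitions (gap = mean distance to Voronoi-adjacent targets,
# visibility proxy = footprint area) and the weights are illustrative
# assumptions, not the paper's exact model.
import numpy as np
from scipy.spatial import Voronoi

def landmark_scores(xy, height, area, weights=(0.4, 0.3, 0.3)):
    vor = Voronoi(xy)
    gap = np.zeros(len(xy))
    count = np.zeros(len(xy))
    for i, j in vor.ridge_points:                 # pairs of Voronoi-adjacent targets
        d = np.linalg.norm(xy[i] - xy[j])
        gap[i] += d
        gap[j] += d
        count[i] += 1
        count[j] += 1
    gap /= np.maximum(count, 1)                   # mean gap to adjacent targets

    def scale(v):                                 # rescale each feature to [0, 1]
        return (v - v.min()) / (np.ptp(v) + 1e-12)

    w_h, w_g, w_a = weights
    return w_h * scale(height) + w_g * scale(gap) + w_a * scale(area)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pts = rng.uniform(0, 100, (30, 2))            # target locations
    h = rng.uniform(5, 60, 30)                    # target heights
    a = rng.uniform(50, 500, 30)                  # footprint areas (visibility proxy)
    scores = landmark_scores(pts, h, a)
    print("top landmark candidates:", np.argsort(scores)[::-1][:5])
```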

  1. Layout-aware text extraction from full-text PDF of scientific articles.

    Science.gov (United States)

    Ramakrishnan, Cartic; Patnia, Abhishek; Hovy, Eduard; Burns, Gully Apc

    2012-05-28

    The Portable Document Format (PDF) is the most commonly used file format for online scientific publications. The absence of effective means to extract text from these PDF files in a layout-aware manner presents a significant challenge for developers of biomedical text mining or biocuration informatics systems that use published literature as an information source. In this paper we introduce the 'Layout-Aware PDF Text Extraction' (LA-PDFText) system to facilitate accurate extraction of text from PDF files of research articles for use in text mining applications. Our paper describes the construction and performance of an open source system that extracts text blocks from PDF-formatted full-text research articles and classifies them into logical units based on rules that characterize specific sections. The LA-PDFText system focuses only on the textual content of the research articles and is meant as a baseline for further experiments into more advanced extraction methods that handle multi-modal content, such as images and graphs. The system works in a three-stage process: (1) Detecting contiguous text blocks using spatial layout processing to locate and identify blocks of contiguous text, (2) Classifying text blocks into rhetorical categories using a rule-based method and (3) Stitching classified text blocks together in the correct order resulting in the extraction of text from section-wise grouped blocks. We show that our system can identify text blocks and classify them into rhetorical categories with Precision = 0.96, Recall = 0.89 and F1 = 0.91. We also present an evaluation of the accuracy of the block detection algorithm used in step 2. Additionally, we have compared the accuracy of the text extracted by LA-PDFText to the text from the Open Access subset of PubMed Central. We then compared this accuracy with that of the text extracted by the PDF2Text system, commonly used to extract text from PDF. Finally, we discuss preliminary error analysis for
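    Step 2 of the pipeline, rule-based classification of text blocks into rhetorical categories, can be illustrated with a toy keyword-rule classifier. The category list and patterns below are simplified assumptions, not LA-PDFText's actual rule set.

```python
# Toy keyword-rule classifier for assigning text blocks to rhetorical categories.
# The categories and patterns are simplified assumptions, not the actual
# LA-PDFText rule set.
import re

RULES = [
    ("abstract",     re.compile(r"^\s*abstract\b", re.I)),
    ("introduction", re.compile(r"^\s*(\d+\.?\s+)?introduction\b", re.I)),
    ("methods",      re.compile(r"^\s*(\d+\.?\s+)?(materials\s+and\s+)?methods\b", re.I)),
    ("results",      re.compile(r"^\s*(\d+\.?\s+)?results\b", re.I)),
    ("discussion",   re.compile(r"^\s*(\d+\.?\s+)?discussion\b", re.I)),
    ("references",   re.compile(r"^\s*(references|bibliography)\b", re.I)),
]

def classify_block(block_text, previous_label="unknown"):
    """Label a block by its heading line; otherwise inherit the label of the
    preceding block (blocks are assumed to arrive in reading order)."""
    first_line = block_text.strip().splitlines()[0] if block_text.strip() else ""
    for label, pattern in RULES:
        if pattern.match(first_line):
            return label
    return previous_label

blocks = ["Abstract\nWe present ...", "Some continuation text ...",
          "2 Methods\nSamples were ...", "References\n[1] ..."]
label = "unknown"
for block in blocks:
    label = classify_block(block, label)
    print(label)            # abstract, abstract, methods, references
```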

  2. A Bayesian Scoring Technique for Mining Predictive and Non-Spurious Rules.

    Science.gov (United States)

    Batal, Iyad; Cooper, Gregory; Hauskrecht, Milos

    Rule mining is an important class of data mining methods for discovering interesting patterns in data. The success of a rule mining method heavily depends on the evaluation function that is used to assess the quality of the rules. In this work, we propose a new rule evaluation score - the Predictive and Non-Spurious Rules (PNSR) score. This score relies on Bayesian inference to evaluate the quality of the rules and considers the structure of the rules to filter out spurious rules. We present an efficient algorithm for finding rules with high PNSR scores. The experiments demonstrate that our method is able to cover and explain the data with a much smaller rule set than existing methods.

  3. Source of spill ripple in the RF-KO slow-extraction method with FM and AM

    International Nuclear Information System (INIS)

    Noda, K.; Furukawa, T.; Shibuya, S.; Muramatsu, M.; Uesugi, T.; Kanazawa, M.; Torikoshi, M.; Takada, E.; Yamada, S.

    2002-01-01

    The RF-knockout (RF-KO) slow-extraction method with frequency modulation (FM) and amplitude modulation (AM) has brought high-accuracy irradiation to the treatment of a cancer tumor moving with respiration, because of a quick response to beam start/stop. However, a beam spill extracted from a synchrotron ring through RF-KO slow-extraction has a huge ripple with a frequency of around 1 kHz related to the FM. The spill ripple will disturb the lateral dose distribution in the beam scanning methods. Thus, the source of the spill ripple has been investigated through experiments and simulations. There are two tune regions for the extraction process through the RF-KO method: the extraction region and the diffusion region. The particles in the extraction region can be extracted due to amplitude growth through the transverse RF field, only when its frequency matches with the tune in the extraction region. For a large chromaticity, however, the particles in the extraction region can be extracted through the synchrotron oscillation, even when the frequency does not match with the tune in the extraction region. Thus, the spill structure during one period of the FM strongly depends on the horizontal chromaticity. They are repeated with the repetition frequency of the FM, which is the very source of the spill ripple in the RF-KO method

  4. Analytical method for Dioxin and Organo-Chlorinated Compounds:(II) Comparison of Extraction Methods of Dioxins from XAD-2 Adsorbent

    International Nuclear Information System (INIS)

    Yang, Jeong Soo; Lee, Sung Kwang; Park, Young Hun; Lee, Dai Woon

    1999-01-01

    Supercritical fluid extraction (SFE), ultrasonic extraction (USE), and accelerated solvent extraction (ASE) were compared with the well-known Soxhlet extraction for the extraction of polychlorinated biphenyls (PCBs) and polychlorinated dibenzo-p-dioxins (PCDDs) from the XAD-2 resin which was used to adsorb PCDDs in the atmosphere. XAD-2 resin spiked with five PCDDs was chosen as a sample. The optimum conditions for the extraction of PCDDs by SFE turned out to be the use of CO2 modified with 10% toluene at 100 °C and 350 atm, with 5 min static extraction followed by 20 min dynamic extraction. SFE gave a good extraction rate with good reproducibility for PCDDs, ranging from 68 to 98%. The ultrasonic extraction of PCDDs from XAD-2 was investigated and compared with the other extractions. A probe-type method was compared with a bath type. Two extraction solvents, toluene and acetone, were compared with their mixture. The use of their mixture in the probe type, with 9 minutes of extraction time, was found to be the optimum condition. The average recovery of the five PCDDs for USE was 82-93%. Accelerated solvent extraction (ASE) with a liquid solvent, a new technique for sample preparation, was performed under elevated temperatures and pressures. The effect of temperature on the efficiency of ASE was investigated. The extraction time for a 10 g sample was less than 15 min when the organic solvent was an n-hexane-acetone mixture (1:1, v/v). Using ASE, the average recoveries of the five PCDDs ranged from 90 to 103%. SFE, USE, and ASE were faster and less laborious than Soxhlet extraction, and the three methods required less solvent than Soxhlet extraction. SFE required no concentration of the solvent extracts. SFE and ASE failed to perform simultaneous parallel extractions because of instrumental limitations.

  5. Comparison of RNA extraction methods in Thai aromatic coconut water

    Directory of Open Access Journals (Sweden)

    Nopporn Jaroonchon

    2015-10-01

    Full Text Available Many studies have reported that nucleic acid in coconut water is in free form and present at very low yields, which makes it difficult to process in molecular studies. Our research compared two extraction methods to obtain a higher yield of total RNA from aromatic coconut water and to monitor its change at various fruit stages. The first method used ethanol and sodium acetate as reagents; the second used lithium chloride. We found that extraction using only lithium chloride gave a higher total RNA yield than the method using ethanol to precipitate nucleic acid. In addition, the total RNA from both methods could be used in amplification of the betaine aldehyde dehydrogenase 2 (Badh2) gene, which is involved in coconut aroma biosynthesis, and could support further studies as expected. The molecular study also showed that the nucleic acid found in coconut water increased with fruit age.

  6. A single-step method for rapid extraction of total lipids from green microalgae.

    Directory of Open Access Journals (Sweden)

    Martin Axelsson

    Full Text Available Microalgae produce a wide range of lipid compounds of potential commercial interest. Total lipid extraction performed by conventional extraction methods, relying on the chloroform-methanol solvent system, is too laborious and time consuming for screening large numbers of samples. In this study, three previous extraction methods devised by Folch et al. (1957), Bligh and Dyer (1959) and Selstam and Öquist (1985) were compared and a faster single-step procedure was developed for extraction of total lipids from green microalgae. In the single-step procedure, 8 ml of a 2:1 chloroform-methanol (v/v) mixture was added to fresh or frozen microalgal paste or pulverized dry algal biomass contained in a glass centrifuge tube. The biomass was manually suspended by vigorously shaking the tube for a few seconds and 2 ml of a 0.73% NaCl water solution was added. Phase separation was facilitated by 2 min of centrifugation at 350 g and the lower phase was recovered for analysis. An uncharacterized microalgal polyculture and the green microalgae Scenedesmus dimorphus, Selenastrum minutum, and Chlorella protothecoides were subjected to the different extraction methods and various techniques of biomass homogenization. The less labour-intensive single-step procedure presented here allowed simultaneous recovery of total lipid extracts from multiple samples of green microalgae with quantitative yields and fatty acid profiles comparable to those of the previous methods. While the single-step procedure is highly correlated in lipid extractability (r² = 0.985) to the previous method of Folch et al. (1957), it allowed at least five times higher sample throughput.

  7. Chiral symmetry breaking parameters from QCD sum rules

    Energy Technology Data Exchange (ETDEWEB)

    Mallik, S. [Karlsruhe Univ. (T.H.) (Germany, F.R.). Inst. fuer Theoretische Kernphysik; Bern Univ. (Switzerland). Inst. fuer Theoretische Physik]

    1982-10-04

    We obtain new QCD sum rules by considering vacuum expectation values of two-point functions, taking all the five quark bilinears into account. These sum rules are employed to extract values of different chiral symmetry breaking parameters in QCD theory. We find masses of light quarks, m = (m_u + m_d)/2 = 8.4 ± 1.2 MeV, m_s = 205 ± 65 MeV. Further, we obtain corrections to certain soft pion (kaon) PCAC relations and the violation of SU(3) flavour symmetry by the non-strange and strange quark-antiquark vacuum condensate.

  8. Incremental Learning of Context Free Grammars by Parsing-Based Rule Generation and Rule Set Search

    Science.gov (United States)

    Nakamura, Katsuhiko; Hoshina, Akemi

    This paper discusses recent improvements and extensions in the Synapse system for inductive inference of context free grammars (CFGs) from sample strings. Synapse uses incremental learning, rule generation based on bottom-up parsing, and search for rule sets. The form of production rules in the previous system is extended from Revised Chomsky Normal Form A→βγ to Extended Chomsky Normal Form, which also includes A→B, where each of β and γ is either a terminal or nonterminal symbol. From the result of bottom-up parsing, a rule generation mechanism synthesizes the minimum production rules required for parsing positive samples. Instead of the inductive CYK algorithm in the previous version of Synapse, the improved version uses a novel rule generation method, called ``bridging,'' which bridges the missing part of the derivation tree for the positive string. The improved version also employs a novel search strategy, called serial search, in addition to the minimum rule set search. The synthesis of grammars by the serial search is faster than the minimum set search in most cases. On the other hand, the size of the generated CFGs is generally larger than that obtained by the minimum set search, and the system can find no appropriate grammar for some CFLs by the serial search. The paper shows experimental results of incremental learning of several fundamental CFGs and compares the methods of rule generation and search strategies.
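    Synapse's rule generation builds on bottom-up parsing of sample strings. A compact CYK recogniser for a grammar in Chomsky Normal Form, sketched below, shows the kind of bottom-up parse table such rule-generation methods work from; the toy grammar for a^n b^n is an illustrative assumption.

```python
# Compact CYK recogniser for a grammar in Chomsky Normal Form, illustrating the
# bottom-up parse table that rule-generation systems such as Synapse build on.
# The toy grammar for { a^n b^n | n >= 1 } is an illustrative assumption.

def cyk(word, terminal_rules, binary_rules, start="S"):
    """terminal_rules: set of (A, 'a') for A -> 'a'; binary_rules: set of
    (A, B, C) for A -> B C. Returns True if `word` is derivable from `start`."""
    n = len(word)
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):                       # length-1 spans
        table[i][i] = {A for (A, t) in terminal_rules if t == ch}
    for span in range(2, n + 1):                        # longer spans, bottom-up
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):                       # split point
                for (A, B, C) in binary_rules:
                    if B in table[i][k] and C in table[k + 1][j]:
                        table[i][j].add(A)
    return start in table[0][n - 1]

# S -> A X | A B,  X -> S B,  A -> 'a',  B -> 'b'
terminals = {("A", "a"), ("B", "b")}
binaries = {("S", "A", "X"), ("S", "A", "B"), ("X", "S", "B")}
for w in ["ab", "aabb", "aab"]:
    print(w, cyk(w, terminals, binaries))               # True, True, False
```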

  9. A Two-Step Resume Information Extraction Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Chen

    2018-01-01

    Full Text Available With the rapid growth of Internet-based recruiting, there are a great number of personal resumes among recruiting systems. To gain more attention from the recruiters, most resumes are written in diverse formats, including varying font size, font colour, and table cells. However, the diversity of format is harmful to data mining, such as resume information extraction, automatic job matching, and candidates ranking. Supervised methods and rule-based methods have been proposed to extract facts from resumes, but they strongly rely on hierarchical structure information and large amounts of labelled data, which are hard to collect in reality. In this paper, we propose a two-step resume information extraction approach. In the first step, raw text of resume is identified as different resume blocks. To achieve the goal, we design a novel feature, Writing Style, to model sentence syntax information. Besides word index and punctuation index, word lexical attribute and prediction results of classifiers are included in Writing Style. In the second step, multiple classifiers are employed to identify different attributes of fact information in resumes. Experimental results on a real-world dataset show that the algorithm is feasible and effective.
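    A toy illustration of the two-step idea: first segment the raw resume text into blocks using heading cues, then extract attributes within each block. The heading list and regexes are simplified assumptions standing in for the paper's Writing Style features and learned classifiers.

```python
# Illustrative two-step pipeline: (1) segment raw resume text into blocks using
# heading cues, (2) extract attributes per block with regexes. The heading list
# and patterns are assumptions standing in for the paper's learned classifiers.
import re

HEADINGS = {"education", "experience", "skills", "contact"}

def segment_blocks(text):
    """Step 1: group lines under the most recent recognised heading."""
    blocks, current = {}, "header"
    for line in text.splitlines():
        key = line.strip().lower().rstrip(":")
        if key in HEADINGS:
            current = key
            continue
        blocks.setdefault(current, []).append(line)
    return {k: "\n".join(v) for k, v in blocks.items()}

def extract_facts(blocks):
    """Step 2: attribute extraction inside each block."""
    facts = {}
    header = blocks.get("header", "") + blocks.get("contact", "")
    email = re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", header)
    phone = re.search(r"\+?\d[\d\s-]{7,}\d", header)
    facts["email"] = email.group(0) if email else None
    facts["phone"] = phone.group(0) if phone else None
    facts["degrees"] = re.findall(r"\b(B\.?Sc|M\.?Sc|Ph\.?D|MBA)\b",
                                  blocks.get("education", ""), re.I)
    return facts

resume = """Jane Doe
jane.doe@example.com  +1 555 123 4567
Education:
MSc Computer Science, 2019
Experience:
Data analyst, 2019-2023
"""
print(extract_facts(segment_blocks(resume)))
```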

  10. Chemical composition and antibacterial activity of Cordia verbenacea extracts obtained by different methods.

    Science.gov (United States)

    Michielin, Eliane M Z; Salvador, Ana A; Riehl, Carlos A S; Smânia, Artur; Smânia, Elza F A; Ferreira, Sandra R S

    2009-12-01

    The present study describes the chemical composition and the antibacterial activity of extracts from Cordia verbenacea DC (Borraginaceae), a traditional medicinal plant that grows widely along the southeastern coast of Brazil. The extracts were obtained using different extraction techniques: high-pressure operations and low-pressure methods. The high-pressure technique was applied to obtain C. verbenacea extracts using pure CO2 and CO2 with co-solvent at pressures up to 30 MPa and temperatures of 30, 40 and 50 degrees C. Organic solvents such as n-hexane, ethyl acetate, ethanol, acetone and dichloromethane were used to obtain extracts by low-pressure processes. The antibacterial activity of the extracts was also subjected to screening against four strains of bacteria using the agar dilution method. The extraction yields were up to 5.0% w/w and up to 8.6% w/w for supercritical fluid extraction with pure CO2 and with ethyl acetate as co-solvent, respectively, while low-pressure extraction gave yields up to 24.0% w/w in Soxhlet extraction using water and a 50% aqueous ethanol mixture as solvents. The inhibitory activity of the extracts against gram-positive bacteria was significantly higher than against gram-negative bacteria. The quantification and the identification of the recovered extracts were accomplished using GC/MS analysis. The most important components identified in the extract were artemetin, beta-sitosterol, alpha-humulene and beta-caryophyllene, among others.

  11. Extraction and Preference Ordering of Multireservoir Water Supply Rules in Dry Years

    Directory of Open Access Journals (Sweden)

    Ling Kang

    2016-01-01

    Full Text Available This paper presents a new methodology combining the nondominated sorting genetic algorithm II (NSGA-II) and the approach of successive elimination of alternatives based on order and degree of efficiency (SEABODE) to identify the most preferred multireservoir water supply rules in dry years. First, the suggested operation rules consist of a two-point type time-varying hedging policy for a single reservoir and a simple proportional allocation policy for the common water demand between two parallel reservoirs. Then, NSGA-II is employed to derive a sufficient set of noninferior operation rules (design alternatives) in terms of two conflicting objectives: (1) minimizing the total deficit ratio (TDR) of all demands of the entire system over the operation horizon, and (2) minimizing the maximum deficit ratio (MDR) of water supply in a single period. Next, SEABODE, a multicriteria decision making (MCDM) procedure, is applied to further eliminate alternatives based on the concept of efficiency of order k with degree p. In SEABODE, reservoir performance indices and water shortage indices are selected as evaluation criteria for preference ordering among the design alternatives obtained by NSGA-II. The proposed methodology was tested on a regional water supply system with three reservoirs located on the Jialing River, China, where the results demonstrate its applicability and merits.
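    The core of NSGA-II is non-dominated sorting of candidate rule sets under the two minimisation objectives (TDR and MDR). A minimal Pareto-front extraction sketch is shown below, with random objective values standing in for simulated rule performance; the full NSGA-II machinery and the SEABODE ordering are omitted.

```python
# Minimal sketch of extracting the non-dominated (Pareto) front over the two
# minimisation objectives above (total deficit ratio, maximum deficit ratio).
# Random values stand in for simulated rule-set performance; crowding distance,
# selection and crossover from the full NSGA-II are omitted.
import numpy as np

def pareto_front(objectives):
    """objectives: (n, m) array, all objectives minimised.
    Returns the indices of non-dominated solutions."""
    n = len(objectives)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        dominated_by_any = (np.all(objectives <= objectives[i], axis=1) &
                            np.any(objectives < objectives[i], axis=1))
        if dominated_by_any.any():
            keep[i] = False
    return np.flatnonzero(keep)

rng = np.random.default_rng(3)
tdr = rng.uniform(0.05, 0.40, 50)          # total deficit ratio of each rule set
mdr = tdr + rng.uniform(0.0, 0.30, 50)     # maximum single-period deficit ratio
front = pareto_front(np.column_stack([tdr, mdr]))
print("non-dominated rule sets:", front)
```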

  12. A Systematic Synthesis Framework for Extractive Distillation Processes

    DEFF Research Database (Denmark)

    Kossack, S.; Kraemer, K.; Gani, Rafiqul

    2008-01-01

    An effective extractive distillation process depends on the choice of the extractive agent. In this contribution, heuristic rules for entrainer selection and the design of entrainers through computer-aided molecular design are reviewed. The potential of the generated alternatives is then evaluated...

  13. Right ventricular volume estimation with cine MRI; A comparative study between Simpson's rule and a new modified area-length method

    Energy Technology Data Exchange (ETDEWEB)

    Sawachika, Takashi (Yamaguchi Univ., Ube (Japan). School of Medicine)

    1993-04-01

    To quantitate right ventricular (RV) volumes easily using cine MRI, we developed a new method called the 'modified area-length method (MOAL method)'. To validate this method, we compared it to the conventional Simpson's rule. A Magnetom H15 (Siemens) was used, and 6 normal volunteers and 21 patients with various RV sizes were imaged with an ECG-triggered gradient echo method (FISP, TR 50 ms, TE 12 ms, slice thickness 9 mm). For Simpson's rule, transverse images of 12 sequential views covering the whole heart were acquired. For the MOAL method, two orthogonal views were imaged: a sagittal view that includes the RV outflow tract and a coronal view defined from the sagittal image to cover the whole RV. From these images the RV areas (As, Ac) and the longest distance between the RV apex and the pulmonary valve (Lmax) were determined. By correlating RV volumes measured by Simpson's rule with As*Ac/Lmax, the RV volume could be estimated as follows: V = 0.85*As*Ac/Lmax + 4.55. Thus the MOAL method demonstrated excellent accuracy in quantitating RV volume, and the acquisition time was reduced to one fifth of that required by Simpson's rule. This should be a highly promising method for routine clinical application. (author).
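    The regression reported above translates directly into a one-line estimator. A small helper is shown below; the input values and the assumption of cm²/cm units (giving ml) are illustrative.

```python
# RV volume estimate from the modified area-length (MOAL) regression reported
# above: V = 0.85 * As * Ac / Lmax + 4.55. The input values and the assumption
# of cm^2 / cm units (giving ml) are illustrative.
def moal_rv_volume(area_sagittal, area_coronal, lmax):
    """Estimated right ventricular volume from the two orthogonal RV areas and
    the longest apex-to-pulmonary-valve distance."""
    return 0.85 * area_sagittal * area_coronal / lmax + 4.55

print(round(moal_rv_volume(28.0, 31.0, 7.5), 1), "ml")   # ~102.9 ml
```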

  14. effects of extraction method on the physicochemical and mycological

    African Journals Online (AJOL)

    DR. AMINU

    Canarium schweinfurthii fruit oil obtained by the improved method of extraction had better quality and stability ... application due to the belief that it enhances the oil flavours, also the fruit ...

  15. Solid phase extraction method for determination of mitragynine in ...

    African Journals Online (AJOL)

    Purpose: To develop a solid phase extraction (SPE) method that utilizes reverse-phase high ...

  16. Totally optimal decision rules

    KAUST Repository

    Amin, Talha

    2017-11-22

    Optimality of decision rules (patterns) can be measured in many ways. One of these is referred to as length. Length signifies the number of terms in a decision rule and is optimally minimized. Another, coverage represents the width of a rule’s applicability and generality. As such, it is desirable to maximize coverage. A totally optimal decision rule is a decision rule that has the minimum possible length and the maximum possible coverage. This paper presents a method for determining the presence of totally optimal decision rules for “complete” decision tables (representations of total functions in which different variables can have domains of differing values). Depending on the cardinalities of the domains, we can either guarantee for each tuple of values of the function that totally optimal rules exist for each row of the table (as in the case of total Boolean functions where the cardinalities are equal to 2) or, for each row, we can find a tuple of values of the function for which totally optimal rules do not exist for this row.

  17. Totally optimal decision rules

    KAUST Repository

    Amin, Talha M.; Moshkov, Mikhail

    2017-01-01

    Optimality of decision rules (patterns) can be measured in many ways. One of these is referred to as length. Length signifies the number of terms in a decision rule and is optimally minimized. Another, coverage represents the width of a rule’s applicability and generality. As such, it is desirable to maximize coverage. A totally optimal decision rule is a decision rule that has the minimum possible length and the maximum possible coverage. This paper presents a method for determining the presence of totally optimal decision rules for “complete” decision tables (representations of total functions in which different variables can have domains of differing values). Depending on the cardinalities of the domains, we can either guarantee for each tuple of values of the function that totally optimal rules exist for each row of the table (as in the case of total Boolean functions where the cardinalities are equal to 2) or, for each row, we can find a tuple of values of the function for which totally optimal rules do not exist for this row.

  18. ANTIOXIDANT AND ANTIBACTERIAL CAPACITY OF LOLOH SEMBUNG (Blumea balsamifera BASED ON EXTRACTION METHOD

    Directory of Open Access Journals (Sweden)

    IGA. Wita Kusumawati

    2016-12-01

    Full Text Available Loloh sembung (Blumea balsamifera) is a traditional herbal drink that can be extracted by boiling or brewing. Loloh sembung was prepared from fresh and dried leaves. Loloh sembung extracted by the different methods differed in total phenolic content, tannin content and antioxidant capacity. Dried leaves extracted by brewing had the highest total phenolic content, at 13.15±0.11 mg GAE/g sample, while dried leaves extracted by boiling had the highest tannin content and antioxidant capacity, at 1.65±0.01 mg TAE/g sample and 5.55±0.01 mg GAE/g sample respectively. Neither fresh nor dried leaf extracts, whether boiled or brewed, showed inhibition against Escherichia coli and Staphylococcus aureus bacteria.

  19. Using Best Practices to Extract, Organize, and Reuse Embedded Decision Support Content Knowledge Rules from Mature Clinical Systems.

    Science.gov (United States)

    DesAutels, Spencer J; Fox, Zachary E; Giuse, Dario A; Williams, Annette M; Kou, Qing-Hua; Weitkamp, Asli; Neal R, Patel; Bettinsoli Giuse, Nunzia

    2016-01-01

    Clinical decision support (CDS) knowledge, embedded over time in mature medical systems, presents an interesting and complex opportunity for information organization, maintenance, and reuse. To have a holistic view of all decision support requires an in-depth understanding of each clinical system as well as expert knowledge of the latest evidence. This approach to clinical decision support presents an opportunity to unify and externalize the knowledge within rules-based decision support. Driven by an institutional need to prioritize decision support content for migration to new clinical systems, the Center for Knowledge Management and Health Information Technology teams applied their unique expertise to extract content from individual systems, organize it through a single extensible schema, and present it for discovery and reuse through a newly created Clinical Support Knowledge Acquisition and Archival Tool (CS-KAAT). CS-KAAT can build and maintain the underlying knowledge infrastructure needed by clinical systems.

  20. Using Best Practices to Extract, Organize, and Reuse Embedded Decision Support Content Knowledge Rules from Mature Clinical Systems

    Science.gov (United States)

    DesAutels, Spencer J.; Fox, Zachary E.; Giuse, Dario A.; Williams, Annette M.; Kou, Qing-hua; Weitkamp, Asli; Neal R, Patel; Bettinsoli Giuse, Nunzia

    2016-01-01

    Clinical decision support (CDS) knowledge, embedded over time in mature medical systems, presents an interesting and complex opportunity for information organization, maintenance, and reuse. To have a holistic view of all decision support requires an in-depth understanding of each clinical system as well as expert knowledge of the latest evidence. This approach to clinical decision support presents an opportunity to unify and externalize the knowledge within rules-based decision support. Driven by an institutional need to prioritize decision support content for migration to new clinical systems, the Center for Knowledge Management and Health Information Technology teams applied their unique expertise to extract content from individual systems, organize it through a single extensible schema, and present it for discovery and reuse through a newly created Clinical Support Knowledge Acquisition and Archival Tool (CS-KAAT). CS-KAAT can build and maintain the underlying knowledge infrastructure needed by clinical systems. PMID:28269846

  1. Evaluating the efficacy of DNA differential extraction methods for sexual assault evidence.

    Science.gov (United States)

    Klein, Sonja B; Buoncristiani, Martin R

    2017-07-01

    Analysis of sexual assault evidence, often a mixture of spermatozoa and victim epithelial cells, represents a significant portion of a forensic DNA laboratory's case load. Successful genotyping of sperm DNA from these mixed cell samples, particularly with low amounts of sperm, depends on maximizing sperm DNA recovery and minimizing non-sperm DNA carryover. For evaluating the efficacy of the differential extraction, we present a method which uses a Separation Potential Ratio (SPRED) to consider both sperm DNA recovery and non-sperm DNA removal as variables for determining separation efficiency. In addition, we describe how the ratio of male-to-female DNA in the sperm fraction may be estimated by using the SPRED of the differential extraction method in conjunction with the estimated ratio of male-to-female DNA initially present on the mixed swab. This approach may be useful for evaluating or modifying differential extraction methods, as we demonstrate by comparing experimental results obtained from the traditional differential extraction and the Erase Sperm Isolation Kit (PTC © ) procedures. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Evaluation of extraction methods for ochratoxin A detection in cocoa beans employing HPLC.

    Science.gov (United States)

    Mishra, Rupesh K; Catanante, Gaëlle; Hayat, Akhtar; Marty, Jean-Louis

    2016-01-01

    Cocoa is an important ingredient for the chocolate industry and for many food products. However, it is prone to contamination by ochratoxin A (OTA), which is highly toxic and potentially carcinogenic to humans. In this work, four different extraction methods were tested and compared based on their recoveries. The best protocol was established which involves an organic solvent-free extraction method for the detection of OTA in cocoa beans using 1% sodium hydrogen carbonate (NaHCO3) in water within 30 min. The extraction method is rapid (as compared with existing methods), simple, reliable and practical to perform without complex experimental set-ups. The cocoa samples were freshly extracted and cleaned-up using immunoaffinity column (IAC) for HPLC analysis using a fluorescence detector. Under the optimised condition, the limit of detection (LOD) and limit of quantification (LOQ) for OTA were 0.62 and 1.25 ng/ml respectively in standard solutions. The method could successfully quantify OTA in naturally contaminated samples. Moreover, good recoveries of OTA were obtained up to 86.5% in artificially spiked cocoa samples, with a maximum relative standard deviation (RSD) of 2.7%. The proposed extraction method could determine OTA at the level 1.5 µg/kg, which surpassed the standards set by the European Union for cocoa (2 µg/kg). In addition, an efficiency comparison of IAC and molecular imprinted polymer (MIP) column was also performed and evaluated.

  3. Choosing the rules: distinct and overlapping frontoparietal representations of task rules for perceptual decisions.

    Science.gov (United States)

    Zhang, Jiaxiang; Kriegeskorte, Nikolaus; Carlin, Johan D; Rowe, James B

    2013-07-17

    Behavior is governed by rules that associate stimuli with responses and outcomes. Human and monkey studies have shown that rule-specific information is widely represented in the frontoparietal cortex. However, it is not known how establishing a rule under different contexts affects its neural representation. Here, we use event-related functional MRI (fMRI) and multivoxel pattern classification methods to investigate the human brain's mechanisms of establishing and maintaining rules for multiple perceptual decision tasks. Rules were either chosen by participants or specifically instructed to them, and the fMRI activation patterns representing rule-specific information were compared between these contexts. We show that frontoparietal regions differ in the properties of their rule representations during active maintenance before execution. First, rule-specific information maintained in the dorsolateral and medial frontal cortex depends on the context in which it was established (chosen vs specified). Second, rule representations maintained in the ventrolateral frontal and parietal cortex are independent of the context in which they were established. Furthermore, we found that the rule-specific coding maintained in anticipation of stimuli may change with execution of the rule: representations in context-independent regions remain invariant from maintenance to execution stages, whereas rule representations in context-dependent regions do not generalize to execution stage. The identification of distinct frontoparietal systems with context-independent and context-dependent task rule representations, and the distinction between anticipatory and executive rule representations, provide new insights into the functional architecture of goal-directed behavior.

  4. Validation of a One-Step Method for Extracting Fatty Acids from Salmon, Chicken and Beef Samples.

    Science.gov (United States)

    Zhang, Zhichao; Richardson, Christine E; Hennebelle, Marie; Taha, Ameer Y

    2017-10-01

    Fatty acid extraction methods are time-consuming and expensive because they involve multiple steps and copious amounts of extraction solvents. In an effort to streamline the fatty acid extraction process, this study compared the standard Folch lipid extraction method to a one-step method involving a column that selectively elutes the lipid phase. The methods were tested on raw beef, salmon, and chicken. Compared to the standard Folch method, the one-step extraction process generally yielded statistically insignificant differences in chicken and salmon fatty acid concentrations, percent composition and weight percent. Initial testing showed that beef stearic, oleic and total fatty acid concentrations were significantly lower by 9-11% with the one-step method as compared to the Folch method, but retesting on a different batch of samples showed a significant 4-8% increase in several omega-3 and omega-6 fatty acid concentrations with the one-step method relative to the Folch. Overall, the findings reflect the utility of a one-step extraction method for routine and rapid monitoring of fatty acids in chicken and salmon. Inconsistencies in beef concentrations, although minor (within 11%), may be due to matrix effects. A one-step fatty acid extraction method has broad applications for rapidly and routinely monitoring fatty acids in the food supply and formulating controlled dietary interventions. © 2017 Institute of Food Technologists®.

  5. MBA: a literature mining system for extracting biomedical abbreviations.

    Science.gov (United States)

    Xu, Yun; Wang, ZhiHao; Lei, YiMing; Zhao, YuZhong; Xue, Yu

    2009-01-09

    The exploding growth of the biomedical literature presents many challenges for biological researchers. One such challenge is the use of a great number of abbreviations. Extracting abbreviations and their definitions accurately is very helpful to biologists and also facilitates biomedical text analysis. Existing approaches fall into four broad categories: rule based, machine learning based, text alignment based and statistically based. State of the art methods either focus exclusively on acronym-type abbreviations, or cannot recognize rare abbreviations. We propose a systematic method to extract abbreviations effectively. First, a scoring method is used to classify the abbreviations into acronym-type and non-acronym-type abbreviations, and then their corresponding definitions are identified by two different methods: a text alignment algorithm for the former and a statistical method for the latter. A literature mining system, MBA, was constructed to extract both acronym-type and non-acronym-type abbreviations. An abbreviation-tagged literature corpus, called the Medstract gold standard corpus, was used to evaluate the system. MBA achieved a recall of 88% at a precision of 91% on the Medstract gold-standard evaluation corpus. We present a new literature mining system MBA for extracting biomedical abbreviations. Our evaluation demonstrates that the MBA system performs better than the others. It can identify the definitions not only of acronym-type abbreviations, including slightly irregular acronym-type abbreviations, but also of non-acronym-type abbreviations.
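    For the acronym-type case, the text-alignment idea can be illustrated with a compact Schwartz/Hearst-style matcher that aligns the characters of a parenthesised short form, right to left, against the preceding words. This simplified sketch is not the MBA system itself and does not handle non-acronym-type abbreviations.

```python
# Compact Schwartz/Hearst-style alignment for acronym-type abbreviations:
# align the characters of a parenthesised short form, right to left, against
# the words preceding the parenthesis. This simplified sketch is not the MBA
# system itself and ignores non-acronym-type abbreviations.
import re

def find_definition(sentence, short_form):
    before = sentence[:sentence.find("(" + short_form)]
    tokens = before.split()
    # Candidate long form: a limited number of words before the parenthesis.
    max_words = min(len(short_form) + 5, len(short_form) * 2)
    candidate = " ".join(tokens[-max_words:])
    s_idx, c_idx = len(short_form) - 1, len(candidate) - 1
    while s_idx >= 0:                                 # right-to-left alignment
        ch = short_form[s_idx].lower()
        # The first short-form character must additionally start a word.
        while c_idx >= 0 and (candidate[c_idx].lower() != ch or
                              (s_idx == 0 and c_idx > 0 and
                               candidate[c_idx - 1].isalnum())):
            c_idx -= 1
        if c_idx < 0:
            return None                               # alignment failed
        s_idx -= 1
        c_idx -= 1
    return candidate[c_idx + 1:]

text = "We analysed polychlorinated biphenyls (PCBs) in sediment samples."
for m in re.finditer(r"\(([A-Za-z][\w-]*)\)", text):
    sf = m.group(1)
    print(sf, "->", find_definition(text, sf))   # PCBs -> polychlorinated biphenyls
```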

  6. Layout-aware text extraction from full-text PDF of scientific articles

    Directory of Open Access Journals (Sweden)

    Ramakrishnan Cartic

    2012-05-01

    Full Text Available Abstract Background The Portable Document Format (PDF) is the most commonly used file format for online scientific publications. The absence of effective means to extract text from these PDF files in a layout-aware manner presents a significant challenge for developers of biomedical text mining or biocuration informatics systems that use published literature as an information source. In this paper we introduce the ‘Layout-Aware PDF Text Extraction’ (LA-PDFText) system to facilitate accurate extraction of text from PDF files of research articles for use in text mining applications. Results Our paper describes the construction and performance of an open source system that extracts text blocks from PDF-formatted full-text research articles and classifies them into logical units based on rules that characterize specific sections. The LA-PDFText system focuses only on the textual content of the research articles and is meant as a baseline for further experiments into more advanced extraction methods that handle multi-modal content, such as images and graphs. The system works in a three-stage process: (1) Detecting contiguous text blocks using spatial layout processing to locate and identify blocks of contiguous text, (2) Classifying text blocks into rhetorical categories using a rule-based method and (3) Stitching classified text blocks together in the correct order resulting in the extraction of text from section-wise grouped blocks. We show that our system can identify text blocks and classify them into rhetorical categories with Precision = 0.96, Recall = 0.89 and F1 = 0.91. We also present an evaluation of the accuracy of the block detection algorithm used in step 2. Additionally, we have compared the accuracy of the text extracted by LA-PDFText to the text from the Open Access subset of PubMed Central. We then compared this accuracy with that of the text extracted by the PDF2Text system, commonly used to extract text from PDF

  7. A simple method of genomic DNA extraction suitable for analysis of bulk fungal strains.

    Science.gov (United States)

    Zhang, Y J; Zhang, S; Liu, X Z; Wen, H A; Wang, M

    2010-07-01

    A simple and rapid method (designated thermolysis) for extracting genomic DNA from bulk fungal strains was described. In the thermolysis method, a few mycelia or yeast cells were first rinsed with pure water to remove potential PCR inhibitors and then incubated in a lysis buffer at 85 degrees C to break down cell walls and membranes. This method was used to extract genomic DNA from large numbers of fungal strains (more than 92 species, 35 genera of three phyla) isolated from different sections of natural Ophiocordyceps sinensis specimens. Regions of interest from high as well as single-copy number genes were successfully amplified from the extracted DNA samples. The DNA samples obtained by this method can be stored at -20 degrees C for over 1 year. The method was effective, easy and fast and allowed batch DNA extraction from multiple fungal isolates. Use of the thermolysis method will allow researchers to obtain DNA from fungi quickly for use in molecular assays. This method requires only minute quantities of starting material and is suitable for diverse fungal species.

  8. Solid phase extraction-electrospray ionization mass spectrometric method for the determination of palladium

    International Nuclear Information System (INIS)

    Pranaw Kumar; Telmore, Vijay M.; Jaison, P.G.; Sarkar, Arnab; Alamelu, D.; Aggarwal, S.K.

    2015-01-01

    Platinum group elements (PGEs) are extensively used as catalysts and anticancer reagents. Due to the soft nature of PGEs, sulphur-based donor ligands are used for the separation of these elements. Studies on the formation of different species are helpful for understanding the separation of these elements from complex matrices. Palladium (Pd) is studied as a representative element, which is also formed by nuclear fission of fissile nuclides. In view of the relatively small amount of solvent required for separation, solid phase extraction is preferred over most other separation methods. A solid phase extraction method using DPX as the stationary phase was previously reported for the separation of Pd in SHLLW using benzoylthiourea as a ligand. However, in the case of large-volume samples, manual extraction by DPX is a tedious task. In the present studies, the feasibility of extraction using benzoylthiourea on an automated solid phase extraction system was investigated for the extraction of Pd

  9. Drug side effect extraction from clinical narratives of psychiatry and psychology patients.

    Science.gov (United States)

    Sohn, Sunghwan; Kocher, Jean-Pierre A; Chute, Christopher G; Savova, Guergana K

    2011-12-01

    To extract physician-asserted drug side effects from electronic medical record clinical narratives. Pattern matching rules were manually developed through examining keywords and expression patterns of side effects to discover an individual side effect and causative drug relationship. A combination of machine learning (C4.5) using side effect keyword features and pattern matching rules was used to extract sentences that contain side effect and causative drug pairs, enabling the system to discover most side effect occurrences. Our system was implemented as a module within the clinical Text Analysis and Knowledge Extraction System. The system was tested in the domain of psychiatry and psychology. The rule-based system extracting side effects and causative drugs produced an F score of 0.80 (0.55 excluding allergy section). The hybrid system identifying side effect sentences had an F score of 0.75 (0.56 excluding allergy section) but covered more side effect and causative drug pairs than individual side effect extraction. The rule-based system was able to identify most side effects expressed by clear indication words. More sophisticated semantic processing is required to handle complex side effect descriptions in the narrative. We demonstrated that our system can be trained to identify sentences with complex side effect descriptions that can be submitted to a human expert for further abstraction. Our system was able to extract most physician-asserted drug side effects. It can be used in either an automated mode for side effect extraction or semi-automated mode to identify side effect sentences that can significantly simplify abstraction by a human expert.
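    The pattern-matching step can be illustrated with a couple of regular-expression rules linking a drug mention to a side-effect keyword within one sentence. The drug lexicon, side-effect keywords and patterns are illustrative assumptions, not the actual rules of the cTAKES-based module, and negation is not handled.

```python
# Illustrative pattern-matching rules linking a causative drug to a side effect
# within one sentence. The drug lexicon, side-effect keywords and patterns are
# simplified assumptions, not the actual rules of the cTAKES-based module.
import re

DRUG = r"(?P<drug>sertraline|lithium|quetiapine)"
EFFECT = r"(?P<effect>nausea|tremor|weight gain|sedation)"

PATTERNS = [
    # "<effect> ... due to / caused by / secondary to ... <drug>"
    re.compile(rf"{EFFECT}.{{0,60}}\b(?:due to|caused by|secondary to)\b.{{0,60}}{DRUG}", re.I),
    # "<drug> ... caused / led to / resulted in / associated with ... <effect>"
    re.compile(rf"{DRUG}.{{0,60}}\b(?:caused|led to|resulted in|associated with)\b.{{0,60}}{EFFECT}", re.I),
]

def extract_pairs(sentence):
    pairs = set()
    for pattern in PATTERNS:
        for m in pattern.finditer(sentence):
            pairs.add((m.group("drug").lower(), m.group("effect").lower()))
    return pairs

notes = [
    "Patient reports tremor which is likely secondary to lithium.",
    "Sertraline was discontinued because it caused significant nausea.",
    "Quetiapine tolerated well; no sedation reported.",   # negation is NOT handled
]
for sentence in notes:
    print(extract_pairs(sentence))
```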

  10. A Generic multi-dimensional feature extraction method using multiobjective genetic programming.

    Science.gov (United States)

    Zhang, Yang; Rockett, Peter I

    2009-01-01

    In this paper, we present a generic feature extraction method for pattern classification using multiobjective genetic programming. This not only evolves the (near-)optimal set of mappings from a pattern space to a multi-dimensional decision space, but also simultaneously optimizes the dimensionality of that decision space. The presented framework evolves vector-to-vector feature extractors that maximize class separability. We demonstrate the efficacy of our approach by making statistically-founded comparisons with a wide variety of established classifier paradigms over a range of datasets and find that for most of the pairwise comparisons, our evolutionary method delivers statistically smaller misclassification errors. At very worst, our method displays no statistical difference in a few pairwise comparisons with established classifier/dataset combinations; crucially, none of the misclassification results produced by our method is worse than any comparator classifier. Although principally focused on feature extraction, feature selection is also performed as an implicit side effect; we show that both feature extraction and selection are important to the success of our technique. The presented method has the practical consequence of obviating the need to exhaustively evaluate a large family of conventional classifiers when faced with a new pattern recognition problem in order to attain a good classification accuracy.

  11. Determination of copper in natural waters and sediments by extraction spectroscopic method

    Digital Repository Service at National Institute of Oceanography (India)

    Sarma, V.V.; Raju, G.R.K.

    A sensitive extraction spectrophotometric method has been developed based on the formation and extraction of Cu(II)-neocuproine-rose bengal into chloroform. The molar extinction coefficient of the system is 55.500 and Beer's law is obeyed up to 15...

  12. Discovering Sentinel Rules for Business Intelligence

    Science.gov (United States)

    Middelfart, Morten; Pedersen, Torben Bach

    This paper proposes the concept of sentinel rules for multi-dimensional data that warns users when measure data concerning the external environment changes. For instance, a surge in negative blogging about a company could trigger a sentinel rule warning that revenue will decrease within two months, so a new course of action can be taken. Hereby, we expand the window of opportunity for organizations and facilitate successful navigation even though the world behaves chaotically. Since sentinel rules are at the schema level as opposed to the data level, and operate on data changes as opposed to absolute data values, we are able to discover strong and useful sentinel rules that would otherwise be hidden when using sequential pattern mining or correlation techniques. We present a method for sentinel rule discovery and an implementation of this method that scales linearly on large data volumes.

  13. Effects of Different Extraction Methods on the Extraction Rates of Five Chemical Ingredients of Swertia mussotii Franch by UPLC-ESI-MS/MS

    Science.gov (United States)

    Xiong, Yaokun; Zhou, Lifen; Zhao, Yonghong; Liu, Yun; Liu, Xia; Zhu, Genhua; Yan, Zhihong; Liu, Zhiyong

    2018-01-01

    Objective: To compare the effects of three extraction methods (ultrasound, reflux and percolation) on the contents of gentiopicroside, mangiferin, swertiamarin, sweroside and oleanolic acid in Swertia mussotii Franch. Method: The contents of the five components were determined by UPLC-ESI/MS. In the solvent system, eluent A was 0.05% (v/v) formic acid with 1 mmol/L ammonium acetate aqueous solution and eluent B was acetonitrile. Chromatographic separations were achieved using an Agilent EC-C18 column (4.6×100 mm, 2.7 µm) at 30 °C. The flow rate was set at 0.8 mL/min. Ionization was performed in negative mode by electrospray ionization (ESI). Quantification was performed in multiple reaction monitoring (MRM) mode. Results: The linear ranges of swertiamarin, gentiopicroside, mangiferin, sweroside and oleanolic acid are 80∼7450 ng/mL, 103∼6600 ng/mL, 100∼8000 ng/mL, 130∼8450 ng/mL and 100∼7000 ng/mL, respectively. The content of oleanolic acid was highest with ultrasonic extraction and the content of mangiferin was highest with reflux extraction. For percolation extraction, the contents of the five components fell between those of ultrasonic and reflux extraction. Conclusion: For the five components, there are significant differences among the three extraction methods. The results could provide a reference for the quality control of Swertia mussotii Franch and the research and development of new drugs.

  14. Feature extraction from mammographic images using fast marching methods

    International Nuclear Information System (INIS)

    Bottigli, U.; Golosio, B.

    2002-01-01

    Feature extraction from medical images represents a fundamental step for shape recognition and diagnostic support. The present work addresses the detection of large features, such as massive lesions and organ contours, in mammographic images. The regions of interest are often characterized by an average grayness intensity that differs from their surroundings. In most cases, however, the desired features cannot be extracted by simple gray level thresholding, because of image noise and the non-uniform density of the surrounding tissue. In this work, edge detection is achieved through the fast marching method (Level Set Methods and Fast Marching Methods, Cambridge University Press, Cambridge, 1999), which is based on the theory of interface evolution. Starting from a seed point in the shape of interest, a front is generated which evolves according to an appropriate speed function. This function is expressed in terms of geometric properties of the evolving interface and of image properties, and should become zero when the front reaches the desired boundary. Some examples of the application of this method to mammographic images from the CALMA database (Nucl. Instr. and Meth. A 460 (2001) 107) are presented here and discussed
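    Assuming the scikit-fmm package is available, the front-evolution idea can be sketched as a travel-time computation from a seed point, with an image-dependent speed that drops near strong gradients so the front slows at a feature boundary. The synthetic 'lesion' image, the speed function and the final threshold are illustrative, not the CALMA-specific implementation.

```python
# Sketch of the front-evolution idea with the fast marching method, assuming the
# scikit-fmm package ("import skfmm") is available. The synthetic image, the
# gradient-based speed function and the final threshold are illustrative, not
# the CALMA-specific implementation.
import numpy as np
import skfmm   # assumed available (pip install scikit-fmm)

# Synthetic image: a bright disc (the "feature") on a noisy background.
y, x = np.mgrid[0:128, 0:128]
image = 1.0 * ((x - 64) ** 2 + (y - 64) ** 2 < 20 ** 2)
image += 0.1 * np.random.default_rng(0).standard_normal(image.shape)

# Speed: close to 1 in flat regions, small where the gradient is large, so the
# front expands freely inside the feature and slows down at its boundary.
gy, gx = np.gradient(image)
speed = 1.0 / (1.0 + 50.0 * (gx ** 2 + gy ** 2)) + 1e-6

# Initial front: a seed point inside the feature (phi < 0 marks the seed).
phi = np.ones_like(image)
phi[64, 64] = -1.0

t = skfmm.travel_time(phi, speed)          # arrival time of the evolving front
feature = t < np.percentile(t, 8)          # crude: keep earliest-reached ~8% of pixels
print("pixels assigned to the feature:", int(feature.sum()))
```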

  15. Analytical Methods Development in Support of the Caustic Side Solvent Extraction System

    International Nuclear Information System (INIS)

    Maskarinec, M.P.

    2001-01-01

    The goal of the project reported herein was to develop and apply methods for the analysis of the major components of the solvent system used in the Caustic-Side Solvent Extraction Process (CSSX). These include the calix(4)arene, the modifier, 1-(2,2,3,3-tetrafluoropropoxy)-3-(4-sec-butylphenoxy)-2-propanol, and tri-n-octylamine. In addition, it was an objective to develop methods that would allow visualization of other components under process conditions. These analyses would include quantitative laboratory methods for each of the components, quantitative analysis of expected breakdown products (4-sec-butylphenol and di-n-octylamine), and qualitative investigations of possible additional breakdown products under a variety of process extremes. These methods would also provide a framework for process analysis should a pilot facility be developed. Two methods were implemented for sample preparation of aqueous phases. The first involves solid-phase extraction and produces quantitative recovery of the solvent components and degradation products from the various aqueous streams. This method can be automated and is suitable for use in radiation shielded facilities. The second is a variation of an established EPA liquid-liquid extraction procedure. This method is also quantitative and results in a final extract amenable to virtually any instrumental analysis. Two HPLC methods were developed for quantitative analysis. The first is a reverse-phase system with variable-wavelength UV detection. This method is excellent from a quantitative point of view. The second method is a size-exclusion method coupled with dual UV and evaporative light scattering detectors. This method is much faster than the reverse-phase method and allows for qualitative analysis of other components of the waste. For tri-n-octylamine and other degradation products, a GC method was developed and subsequently extended to GC/MS. All methods have precision better than 5%. The combination of these methods

  16. Effect of Linked Rules on Business Process Model Understanding

    DEFF Research Database (Denmark)

    Wang, Wei; Indulska, Marta; Sadiq, Shazia

    2017-01-01

    Business process models are widely used in organizations by information systems analysts to represent complex business requirements and by business users to understand business operations and constraints. This understanding is extracted from graphical process models as well as business rules. Prior...

  17. Comparison of Two Simplification Methods for Shoreline Extraction from Digital Orthophoto Images

    Science.gov (United States)

    Bayram, B.; Sen, A.; Selbesoglu, M. O.; Vārna, I.; Petersons, P.; Aykut, N. O.; Seker, D. Z.

    2017-11-01

    Coastal ecosystems are very sensitive to external influences. Coastal resources such as sand dunes, coral reefs and mangroves have vital importance in preventing coastal erosion. Human-based effects also threaten coastal areas. Therefore, changes in coastal areas should be monitored. Up-to-date, accurate shoreline information is indispensable for coastal managers and decision makers. Remote sensing and image processing techniques offer a great opportunity to obtain reliable shoreline information. In the presented study, NIR bands of seven 1:5000 scaled digital orthophoto images of Riga Bay, Latvia have been used. The object-oriented Simple Linear Clustering method has been utilized to extract the shoreline of Riga Bay. The Bend and Douglas-Peucker methods have been used to simplify the extracted shoreline in order to test the effect of both methods. A photogrammetrically digitized shoreline has been taken as reference data to compare the obtained results. The accuracy assessment has been carried out with the Digital Shoreline Analysis tool. As a result, the shoreline produced by the Bend method has been found to be closer to the shoreline extracted with the Simple Linear Clustering method.

  18. COMPARISON OF TWO SIMPLIFICATION METHODS FOR SHORELINE EXTRACTION FROM DIGITAL ORTHOPHOTO IMAGES

    Directory of Open Access Journals (Sweden)

    B. Bayram

    2017-11-01

    Full Text Available Coastal ecosystems are very sensitive to external influences. Coastal resources such as sand dunes, coral reefs and mangroves have vital importance in preventing coastal erosion. Human-based effects also threaten coastal areas. Therefore, changes in coastal areas should be monitored. Up-to-date, accurate shoreline information is indispensable for coastal managers and decision makers. Remote sensing and image processing techniques offer a great opportunity to obtain reliable shoreline information. In the presented study, NIR bands of seven 1:5000 scaled digital orthophoto images of Riga Bay, Latvia have been used. The object-oriented Simple Linear Clustering method has been utilized to extract the shoreline of Riga Bay. The Bend and Douglas-Peucker methods have been used to simplify the extracted shoreline in order to test the effect of both methods. A photogrammetrically digitized shoreline has been taken as reference data to compare the obtained results. The accuracy assessment has been carried out with the Digital Shoreline Analysis tool. As a result, the shoreline produced by the Bend method has been found to be closer to the shoreline extracted with the Simple Linear Clustering method.
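    One of the two simplification methods compared above, Douglas-Peucker, is compact enough to sketch directly. The tolerance and the sample polyline below are illustrative.

```python
# Compact recursive Douglas-Peucker polyline simplification, one of the two
# shoreline simplification methods compared above. The tolerance and the sample
# polyline are illustrative.
import numpy as np

def douglas_peucker(points, tolerance):
    """points: (n, 2) array of vertices; returns the simplified vertex array."""
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points
    start, end = points[0], points[-1]
    chord = end - start
    chord_len = np.hypot(*chord)
    rel = points - start
    if chord_len == 0:
        dists = np.linalg.norm(rel, axis=1)
    else:   # perpendicular distance of each vertex to the chord start-end
        dists = np.abs(chord[0] * rel[:, 1] - chord[1] * rel[:, 0]) / chord_len
    idx = int(np.argmax(dists[1:-1])) + 1
    if dists[idx] > tolerance:              # keep the farthest vertex and recurse
        left = douglas_peucker(points[:idx + 1], tolerance)
        right = douglas_peucker(points[idx:], tolerance)
        return np.vstack([left[:-1], right])
    return np.vstack([start, end])          # all interior vertices dropped

line = np.array([[0, 0], [1, 0.1], [2, -0.1], [3, 5.0], [4, 6.0], [5, 7.1], [6, 8.1]])
print(douglas_peucker(line, tolerance=0.5))
```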

  19. A rule-based fault detection method for air handling units

    Energy Technology Data Exchange (ETDEWEB)

    Schein, J.; Bushby, S. T.; Castro, N. S. [National Institute of Standards and Technology, Gaithersburg, MD (United States); House, J. M. [Iowa Energy Center, Ankeny, IA (United States)

    2006-07-01

    Air handling unit performance assessment rules (APAR) is a fault detection tool that uses a set of expert rules derived from mass and energy balances to detect faults in air handling units (AHUs). Control signals are used to determine the mode of operation of the AHU. A subset of the expert rules which correspond to that mode of operation are then evaluated to determine whether a fault exists. APAR is computationally simple enough that it can be embedded in commercial building automation and control systems and relies only upon the sensor data and control signals that are commonly available in these systems. APAR was tested using data sets collected from a 'hardware-in-the-loop' emulator and from several field sites. APAR was also embedded in commercial AHU controllers and tested in the emulator. (author)
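
    To make the rule-evaluation idea concrete, the sketch below shows a mode-dependent expert rule of the kind APAR evaluates; the rule wording and threshold are illustrative assumptions, not the published APAR rule set.

      def check_mechanical_cooling_rules(mode, t_supply, t_mixed, threshold=1.0):
          """Evaluate one simple AHU expert rule for the mechanical-cooling mode.

          In mechanical cooling the supply-air temperature should sit below the
          mixed-air temperature; a persistent violation points to a coil or valve
          fault. (Illustrative rule only, not the published APAR rule set.)
          """
          faults = []
          if mode == "mechanical_cooling" and t_supply > t_mixed - threshold:
              faults.append("supply air not cooler than mixed air in cooling mode")
          return faults

      print(check_mechanical_cooling_rules("mechanical_cooling", t_supply=18.2, t_mixed=18.5))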

  20. Spatio-Temporal Rule Mining

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach

    2005-01-01

    Recent advances in communication and information technology, such as the increasing accuracy of GPS technology and the miniaturization of wireless communication devices, pave the road for Location-Based Services (LBS). To achieve high quality for such services, spatio-temporal data mining techniques are needed. In this paper, we describe experiences with spatio-temporal rule mining in a Danish data mining company. First, a number of real world spatio-temporal data sets are described, leading to a taxonomy of spatio-temporal data. Second, the paper describes a general methodology that transforms the spatio-temporal rule mining task to the traditional market basket analysis task and applies it to the described data sets, enabling traditional association rule mining methods to discover spatio-temporal rules for LBS. Finally, unique issues in spatio-temporal rule mining are identified and discussed.
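
    A minimal sketch of the kind of transformation described here, assuming a simple log layout: spatio-temporal observations are discretised into grid-cell/time items and grouped into baskets so that ordinary association rule mining can be applied. The record format, grid size and per-user grouping are assumptions for illustration only.

      from collections import defaultdict

      # Hypothetical (user, x, y, hour) observations from an LBS log
      records = [
          ("u1", 120.5, 48.2, 8), ("u1", 121.0, 48.3, 9),
          ("u2", 120.6, 48.1, 8), ("u2", 300.2, 90.0, 18),
      ]

      def to_item(x, y, hour, cell=10.0):
          # Discretise space into grid cells and time into hours -> one "item"
          return (int(x // cell), int(y // cell), hour)

      # One basket per user: the set of (cell, hour) items that user visited
      baskets = defaultdict(set)
      for user, x, y, hour in records:
          baskets[user].add(to_item(x, y, hour))
      print(dict(baskets))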

  1. Rules and routines in organizations and the management of safety rules

    Energy Technology Data Exchange (ETDEWEB)

    Weichbrodt, J. Ch.

    2013-07-01

    participation in rule creation. Paper 2 makes use of extensive empirical data collected at three different fields of work within SBB (signaling, shunting, and construction and maintenance). Using both interviews and observation methods, four cases of contested safety rules (i.e., rules that are often bent or broken) are analyzed in detail. For each case, the different aspects of the rule and the routine are disentangled and put in relation to one another. In this manner, the precise way in which rules influence routines (as well as the limits of this influence) is uncovered. Additionally, different ways of sensemaking of rules in the different fields of work are identified and put in relation to the cases of contested rules. Finally, in paper 3, most of the research covered so far is built upon in order to address the question of what should be done to adequately manage safety rules in high-risk organizations. Drawing from organization theory, safety rules are conceptualized as instruments for organizational control, as coordination mechanisms, and as codified forms of organizational knowledge. With these three functions in mind, four common challenges with safety rules are outlined, as well as four typical measures of good rules management. The relationship between these measures and the challenges and their implications for rules as control, coordination and knowledge are discussed. (author)

  2. Rules and routines in organizations and the management of safety rules

    International Nuclear Information System (INIS)

    Weichbrodt, J. Ch.

    2013-01-01

    participation in rule creation. Paper 2 makes use of extensive empirical data collected at three different fields of work within SBB (signaling, shunting, and construction and maintenance). Using both interviews and observation methods, four cases of contested safety rules (i.e., rules that are often bent or broken) are analyzed in detail. For each case, the different aspects of the rule and the routine are disentangled and put in relation to one another. In this manner, the precise way in which rules influence routines (as well as the limits of this influence) is uncovered. Additionally, different ways of sensemaking of rules in the different fields of work are identified and put in relation to the cases of contested rules. Finally, in paper 3, most of the research covered so far is built upon in order to address the question of what should be done to adequately manage safety rules in high-risk organizations. Drawing from organization theory, safety rules are conceptualized as instruments for organizational control, as coordination mechanisms, and as codified forms of organizational knowledge. With these three functions in mind, four common challenges with safety rules are outlined, as well as four typical measures of good rules management. The relationship between these measures and the challenges and their implications for rules as control, coordination and knowledge are discussed. (author)

  3. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.

    Science.gov (United States)

    Yang, Wei; Ai, Tinghua; Lu, Wei

    2018-04-19

    Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge. Then, the road boundary detection model is established by integrating the boundary descriptors and trajectory movement features (e.g., direction) within the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information was shown to be of higher quality.
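
    The descriptor idea can be sketched in a few lines of Python: build the Delaunay triangulation of the tracking points with scipy and flag unusually long edges, which tend to bridge across the road boundary. The points, threshold and edge-length criterion are illustrative assumptions rather than the paper's exact model.

      import numpy as np
      from scipy.spatial import Delaunay

      # Hypothetical interpolated GPS tracking points (metres)
      points = np.random.default_rng(0).uniform(0, 100, size=(200, 2))
      tri = Delaunay(points)

      # Collect unique triangle edges and their lengths
      edge_set = set()
      for simplex in tri.simplices:
          for i in range(3):
              a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
              edge_set.add((a, b))
      edges = sorted(edge_set)
      lengths = np.array([np.linalg.norm(points[a] - points[b]) for a, b in edges])

      # Unusually long edges are candidate "boundary-bridging" edges (a simple descriptor)
      candidates = [e for e, l in zip(edges, lengths) if l > lengths.mean() + 2 * lengths.std()]
      print(len(candidates), "candidate boundary-bridging edges")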

  4. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2018-04-01

    Full Text Available Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge. Then, the road boundary detection model is established by integrating the boundary descriptors and trajectory movement features (e.g., direction) within the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information was shown to be of higher quality.

  5. Quality and characteristics of fermented ginseng seed oil based on bacterial strain and extraction method

    Directory of Open Access Journals (Sweden)

    Myung-Hee Lee

    2017-07-01

    Results and Conclusion: The color of the fermented ginseng seed oil did not differ greatly according to the fermentation or extraction method. The highest phenolic compound content was recovered with the use of supercritical fluid extraction combined with fermentation using the Bacillus subtilis Korea Food Research Institute (KFRI) 1127 strain. The fatty acid composition did not differ greatly according to fermentation strain and extraction method. The phytosterol content of ginseng seed oil fermented with Bacillus subtilis KFRI 1127 and extracted using the supercritical fluid method was the highest, at 983.58 mg/100 g. Therefore, our results suggested that ginseng seed oil fermented with Bacillus subtilis KFRI 1127 and extracted using the supercritical fluid method can yield a higher content of bioactive ingredients, such as phenolics and phytosterols, without affecting the color or fatty acid composition of the product.

  6. Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa 1

    OpenAIRE

    Siegel, Chloe S.; Stevenson, Florence O.; Zimmer, Elizabeth A.

    2017-01-01

    Premise of the study: An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)-based extraction methods from silica-dried samples. Methods: DNA was extracted using FTA cards according to the manufacturer's protocol. In parallel, CTAB-based extractions were done using the a...

  7. Study the Three Extraction Methods for HBV DNA to Use in PCR

    Directory of Open Access Journals (Sweden)

    N. Sheikh

    2004-07-01

    Full Text Available Diagnosis of hepatitis B is important because of its high prevalence. Recently, the PCR method has attracted greater interest among the different diagnostic methods. Several reports emphasize that false negative results occur in some laboratories using PCR. The aim of this study was to compare three different procedures for HBV DNA extraction. A total of 30 serum samples were received from Shariati Hospital. Sera were taken from patients with chronic hepatitis who were HBs antigen positive and HBe antigen negative. The sensitivity of the guanidium hydrochloride method for extracting HBV DNA from serum was evaluated and compared with the phenol-chloroform and boiling methods. A diagnostic PCR kit obtained from Cynagene contained Taq polymerase, reaction mixture, dNTPs, and buffer for the reaction. A 353 bp product was amplified using the amplification program provided in the PCR protocol. The comparison of results indicated that the procedure was successful for amplification of the designed products from hepatitis B in sera. The numbers of positive results were 16, 19, and 23, and the numbers of negative results were 14, 11, and 7 for the boiling, phenol-chloroform and guanidium hydrochloride extraction methods, respectively. PCR is the fastest and most accurate diagnostic procedure to identify hepatitis B. The guanidium hydrochloride method was the most successful procedure studied in this survey for extracting viral DNA.

  8. Efficiency of solvent extraction methods for the determination of methyl mercury in forest soils

    Energy Technology Data Exchange (ETDEWEB)

    Qian, J. [Department of Forest Ecology, Swedish University of Agricultural Sciences, Umeaa (Sweden); Dept. of Analytical Chemistry, Umeaa Univ. (Sweden); Skyllberg, U. [Department of Forest Ecology, Swedish University of Agricultural Sciences, Umeaa (Sweden); Tu, Q.; Frech, W. [Dept. of Analytical Chemistry, Umeaa Univ. (Sweden); Bleam, W.F. [Dept. of Soil Science, University of Wisconsin, Madison, WI (United States)

    2000-07-01

    Methyl mercury was determined by gas chromatography, microwave induced plasma, atomic emission spectrometry (GC-MIP-AES) using two different methods. One was based on extraction of mercury species into toluene, pre-concentration by evaporation and butylation of methyl mercury with a Grignard reagent followed by determination. With the other, methyl mercury was extracted into dichloromethane and back extracted into water followed by in situ ethylation, collection of ethylated mercury species on Tenax and determination. The accuracy of the entire procedure based on butylation was validated for the individual steps involved in the method. Methyl mercury added to various types of soil samples showed an overall average recovery of 87.5%. Reduced recovery was only caused by losses of methyl mercury during extraction into toluene and during pre-concentration by evaporation. The extraction of methyl mercury added to the soil was therefore quantitative. Since it is not possible to directly determine the extraction efficiency of incipient methyl mercury, the extraction efficiency of total mercury with an acidified solution containing CuSO{sub 4} and KBr was compared with high-pressure microwave acid digestion. The solvent extraction efficiency was 93%. For the IAEA 356 sediment certified reference material, mercury was less efficiently extracted and determined methyl mercury concentrations were below the certified value. Incomplete extraction could be explained by the presence of a large part of inorganic sulfides, as determined by x-ray absorption near-edge structure spectroscopy (XANES). Analyses of sediment reference material CRM 580 gave results in agreement with the certified value. The butylation method gave a detection limit for methyl mercury of 0.1 ng g{sup -1}, calculated as three times the standard deviation for repeated analysis of soil samples. Lower values were obtained with the ethylation method. The precision, expressed as RSD for concentrations 20 times

  9. THE EXISTENCE OF THE CONCEPT OF RULE BY LAW (A STATE BASED ON LAW) WITHIN THE THEORY OF THE RULE OF LAW STATE

    Directory of Open Access Journals (Sweden)

    Made Hendra Wijaya

    2013-11-01

    Full Text Available This research, titled "The existence of the concept of rule by law (a state based on law) within the theory of the rule of law state", addresses two problems: What are the advantages of the concept of rule by law within the theory of the rule of law? What are the disadvantages of the concept of rule by law within the theory of the rule of law? The research uses the normative legal method, examining written laws from various aspects, i.e., theory, history, philosophy, comparison, structure and composition, scope and content, consistency, general overview and chapter by chapter, formality, the binding force of a law, and the legal language used; it does not examine applied or implementation aspects. The approaches used are historical analysis and legal conceptual analysis. The research finds that the advantage of the concept of rule by law lies in the certainty it provides; it can also serve as social control for the community, thus ensuring orderly reciprocal relationships among all citizens. The disadvantage of the concept of rule by law is that, if the law that legalizes state action is not supported by democracy, human rights, and the principles of justice, there will be denial of human rights, widespread poverty, and racial segregation; and if the law is used by the authorities merely as a means to legalize all forms of actions that violate human rights, it can give the ruling power a totalitarian character.

  10. Analyzing large gene expression and methylation data profiles using StatBicRM: statistical biclustering-based rule mining.

    Directory of Open Access Journals (Sweden)

    Ujjwal Maulik

    Full Text Available Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining), to identify special types of rules and potential biomarkers using integrated approaches of statistical and binary inclusion-maximal biclustering techniques from biological datasets. At first, a novel statistical strategy has been utilized to eliminate the insignificant/low-significant/redundant genes in such a way that the significance level must satisfy the data distribution property (viz., either normal distribution or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. Corresponding special types of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than the other rule mining algorithms as it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to know how much the evolved rules are able to describe accurately the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy, and other related factors, with other rule-based classifiers. Statistical significance tests are also performed for verifying the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers is also starting with the same post

  11. Analyzing large gene expression and methylation data profiles using StatBicRM: statistical biclustering-based rule mining.

    Science.gov (United States)

    Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra

    2015-01-01

    Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining) to identify special types of rules and potential biomarkers using integrated approaches of statistical and binary inclusion-maximal biclustering techniques from biological datasets. At first, a novel statistical strategy has been utilized to eliminate the insignificant/low-significant/redundant genes in such a way that the significance level must satisfy the data distribution property (viz., either normal distribution or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. Corresponding special types of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than the other rule mining algorithms as it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time, and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to know how much the evolved rules are able to describe accurately the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy, and other related factors, with other rule-based classifiers. Statistical significance tests are also performed for verifying the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers is also starting with the same post-discretized data
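
    For readers unfamiliar with the rule mining step, the sketch below shows a plain frequent-itemset/association-rule baseline over a hypothetical post-discretised matrix; it is not the StatBicRM biclustering pipeline itself, and the gene items, support and confidence thresholds are invented.

      from itertools import combinations

      # Hypothetical post-discretised matrix: one transaction (set of up-regulated
      # gene "items") per sample; items and thresholds are invented for illustration
      transactions = [
          {"GENE_A_up", "GENE_B_up", "GENE_C_up"},
          {"GENE_A_up", "GENE_B_up"},
          {"GENE_A_up", "GENE_C_up"},
          {"GENE_B_up", "GENE_C_up"},
      ]

      def support(itemset):
          return sum(itemset <= t for t in transactions) / len(transactions)

      min_support, min_conf = 0.5, 0.6
      items = sorted(set().union(*transactions))
      for a, b in combinations(items, 2):
          s = support({a, b})
          if s >= min_support:
              conf = s / support({a})            # confidence of the rule a -> b
              if conf >= min_conf:
                  print(f"{a} -> {b}  support={s:.2f} confidence={conf:.2f}")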

  12. Research on Weak Fault Extraction Method for Alleviating the Mode Mixing of LMD

    Directory of Open Access Journals (Sweden)

    Lin Zhang

    2018-05-01

    Full Text Available Compared with the strong background noise, the energy entropy of early fault signals of bearings is weak under actual working conditions. Therefore, extracting bearings' early fault features has always been a major difficulty in the fault diagnosis of rotating machinery. To address this problem, the masking method is introduced into the Local Mean Decomposition (LMD) process, and a weak fault extraction method based on LMD and a mask signal (MS) is proposed. Due to the mode mixing of the product function (PF) components decomposed by LMD against a noisy background, it is difficult to distinguish the authenticity of the fault frequency. Therefore, the MS method is introduced to deal with the PF components that are decomposed by LMD and have a strong correlation with the original signal, so as to suppress the mode-mixing phenomenon and extract the fault frequencies. In this paper, an actual fault signal of a rolling bearing is analyzed. By combining the MS method with the LMD method, the fault signal mixed with noise is processed. The kurtosis value at the fault frequency is increased eight-fold, and the signal-to-noise ratio (SNR) is increased by 19.1%. The fault signal is successfully extracted by the proposed composite method.

  13. AUTOMATIC EXTRACTION AND TOPOLOGY RECONSTRUCTION OF URBAN VIADUCTS FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    Y. Wang

    2015-08-01

    Full Text Available Urban viaducts are important infrastructures for the transportation system of a city. In this paper, an original method is proposed to automatically extract urban viaducts and reconstruct the topology of the viaduct network using only airborne LiDAR point cloud data. It greatly simplifies the labor-intensive procedure of viaduct extraction and reconstruction. In our method, the point cloud is first filtered to divide all the points into ground points and non-ground points. A region-growing algorithm is adopted to find the viaduct points among the non-ground points using features derived from general prescriptive design rules. Then, the viaduct points are projected onto 2D images to extract the centerline of every viaduct, and cubic functions representing the passages of the viaducts are generated by least-squares fitting, with which the topology of the viaduct network can be rebuilt by combining the height information. Finally, a topological graph of the viaduct network is produced. This fully automatic method can potentially benefit urban navigation applications and city model reconstruction.
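
    The least-squares fitting step can be illustrated in a few lines: a cubic polynomial is fitted to hypothetical projected centerline points with numpy. The coordinates are invented, and the rest of the pipeline (filtering, region growing, projection) is not reproduced here.

      import numpy as np

      # Hypothetical projected viaduct centerline points (metres)
      x = np.array([0.0, 25.0, 50.0, 75.0, 100.0, 125.0, 150.0])
      y = np.array([0.0, 2.1, 3.9, 4.2, 3.1, 0.8, -2.5])

      # Least-squares cubic fit representing one viaduct passage
      passage = np.poly1d(np.polyfit(x, y, deg=3))
      print("y(60 m) =", round(float(passage(60.0)), 2))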

  14. Decision Mining Revisited – Discovering Overlapping Rules

    NARCIS (Netherlands)

    Mannhardt, F.; de Leoni, M.; Reijers, H.A.; van der Aalst, W.M.P.

    2016-01-01

    Decision mining enriches process models with rules underlying decisions in processes using historical process execution data. Choices between multiple activities are specified through rules defined over process data. Existing decision mining methods focus on discovering mutually-exclusive rules,
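
    The classical way to discover such decision rules is to train a decision tree on the process data recorded at a choice point, which by construction yields mutually exclusive rules; the sketch below shows this baseline with scikit-learn on invented attributes and activity labels, and is not the overlapping-rule technique proposed in this paper.

      from sklearn.tree import DecisionTreeClassifier, export_text

      # Hypothetical data at a choice point: [amount, customer_score] -> chosen activity
      X = [[100, 0.2], [2500, 0.9], [300, 0.4], [4000, 0.7], [150, 0.1], [3200, 0.8]]
      y = ["auto_approve", "manual_review", "auto_approve", "manual_review",
           "auto_approve", "manual_review"]

      # A shallow tree gives readable, mutually exclusive decision rules
      tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
      print(export_text(tree, feature_names=["amount", "customer_score"]))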

  15. A New Method of Chinese Address Extraction Based on Address Tree Model

    Directory of Open Access Journals (Sweden)

    KANG Mengjun

    2015-01-01

    Full Text Available An address is a spatial location encoding method for an individual geographical area. In China, address planning has lagged behind the rapid development of cities, resulting in the presence of a large number of non-standard addresses. The spatial constraint relationships of the standard address model are analyzed in this paper, and a new method of standard address extraction based on a tree model is proposed, which regards the topological relationship as the consistency criterion for spatial constraints. With this method, standard addresses can be extracted and errors can be excluded from non-standard addresses. Results indicate that a higher match rate can be obtained with this method.
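
    A minimal sketch of the tree idea, assuming a toy containment hierarchy and a pre-tokenised address: walking the tree level by level enforces the topological (containment) constraint, and an address that skips or contradicts a level fails to match. The place names and tokenisation are invented.

      # Hypothetical address tree: each level is one spatial containment step
      ADDRESS_TREE = {
          "Hubei Province": {
              "Wuhan City": {
                  "Wuchang District": {"Luoyu Road": {}},
                  "Hongshan District": {"Luoyu Road": {}},
              }
          }
      }

      def match(tokens, node=ADDRESS_TREE, path=()):
          """Walk the tree with the tokenised address; a full walk means a standard address."""
          if not tokens:
              return path
          head, *rest = tokens
          if head in node:
              return match(rest, node[head], path + (head,))
          return None  # containment constraint violated -> non-standard address

      print(match(["Hubei Province", "Wuhan City", "Wuchang District", "Luoyu Road"]))
      print(match(["Hubei Province", "Wuchang District", "Luoyu Road"]))  # skips a level -> None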

  16. Effects of extraction methods of phenolic compounds from Xanthium strumarium L. and their antioxidant activity

    Directory of Open Access Journals (Sweden)

    R. Scherer

    2014-03-01

    Full Text Available The effect of extraction methods and solvents on the overall yield, total phenolic content, antioxidant activity, and composition of the phenolic compounds in Xanthium strumarium extracts was studied. The antioxidant activity was determined using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical, and the composition of the phenolic compounds was determined by HPLC-DAD and LC/MS. All results were affected by the extraction method, especially by the solvent used, and the best results were obtained with the methanol extract. The methanolic and ethanolic extracts exhibited strong antioxidant activity, and chlorogenic and ferulic acids were the most abundant phenolic compounds in the extracts.

  17. DNA extraction methods for panbacterial and panfungal PCR detection in intraocular fluids.

    Science.gov (United States)

    Mazoteras, Paloma; Bispo, Paulo José Martins; Höfling-Lima, Ana Luisa; Casaroli-Marano, Ricardo P

    2015-07-01

    Three different methods of DNA extraction from intraocular fluids were compared with subsequent detection for bacterial and fungal DNA by universal PCR amplification. Three DNA extraction methods, from aqueous and vitreous humors, were evaluated to compare their relative efficiency. Bacterial (Gram positive and negative) and fungal strains were used in this study: Escherichia coli, Staphylococcus epidermidis and Candida albicans. The quality, quantification, and detection limit for DNA extraction and PCR amplification were analyzed. Validation procedures for 13 aqueous humor and 14 vitreous samples, from 20 patients with clinically suspected endophthalmitis were carried out. The column-based extraction method was the most time-effective, achieving DNA detection limits ≥10² and 10³ CFU/100 µL for bacteria and fungi, respectively. PCR amplification detected 100 fg, 1 pg and 10 pg of genomic DNA of E. coli, S. epidermidis and C. albicans respectively. PCR detected 90.0% of the causative agents from 27 intraocular samples collected from 20 patients with clinically suspected endophthalmitis, while standard microbiological techniques could detect only 60.0%. The most frequently found organisms were Streptococcus spp. in 38.9% (n = 7) of patients and Staphylococcus spp. found in 22.2% (n = 4). The column-based extraction method for very small inocula in small volume samples (50-100 µL) of aqueous and/or vitreous humors allowed PCR amplification in all samples with sufficient quality for subsequent sequencing and identification of the microorganism in the majority of them.

  18. Ions extraction and collection using the RF resonance method and taking into consideration the sputtering loss

    International Nuclear Information System (INIS)

    Xie Guofeng; Wang Dewu; Ying Chuntong

    2005-01-01

    One-dimensional ion extraction and collection using the RF resonance method is studied by PIC-MCC simulation. The energy and angle distributions of the extracted ions are recorded and the sputtering loss is calculated. The results show that, compared with the parallel electrode method, the RF resonance method has advantages such as shorter extraction time, lower collision loss and sputtering loss, and a higher collection ratio; the extraction time and collision loss decrease with increasing extraction voltage, but the sputtering loss increases and the collection ratio decreases; the collision loss decreases with increasing magnetic field, but the sputtering loss increases and the collection ratio decreases. (authors)

  19. Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa.

    Science.gov (United States)

    Siegel, Chloe S; Stevenson, Florence O; Zimmer, Elizabeth A

    2017-02-01

    An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)-based extraction methods from silica-dried samples. DNA was extracted using FTA cards according to the manufacturer's protocol. In parallel, CTAB-based extractions were done using the automated AutoGen DNA isolation system. DNA quality for both methods was determined for 15 non-agricultural species collected in situ, by gel separation, spectrophotometry, fluorometry, and successful amplification and sequencing of nuclear and chloroplast gene markers. The FTA card extraction method yielded less concentrated, but also less fragmented samples than the CTAB-based technique. The card-extracted samples provided DNA that could be successfully amplified and sequenced. The FTA cards are also useful because the collected samples do not require refrigeration, extensive laboratory expertise, or as many hazardous chemicals as extractions using the CTAB-based technique. The relative success of the FTA card method in our study suggested that this method could be a valuable tool for studies in plant population genetics and conservation biology that may involve screening of hundreds of individual plants. The FTA cards, like the silica gel samples, do not contain plant material capable of propagation, and therefore do not require permits from the U.S. Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) for transportation.

  20. Study on Angelica and its different extracts by Fourier transform infrared spectroscopy and two-dimensional correlation IR spectroscopy

    Science.gov (United States)

    Liu, Hong-xia; Sun, Su-qin; Lv, Guang-hua; Chan, Kelvin K. C.

    2006-05-01

    In order to develop a rapid and effective analysis method for studying integrally the main constituents in medicinal materials and their extracts, discriminating extracts from different extraction processes, comparing the categories of chemical constituents in the different extracts, and monitoring the quality of medicinal materials, we applied Fourier transform infrared spectroscopy (FT-IR) together with second-derivative infrared spectroscopy and two-dimensional correlation infrared spectroscopy (2D-IR) to study the main constituents in the traditional Chinese medicine Angelica and its different extracts (extracted by petroleum ether, ethanol and water in turn). The findings indicated that the FT-IR spectrum can provide many holistic variation rules of the chemical constituents. Use of the macroscopic fingerprint characters of the FT-IR and 2D-IR spectra can not only identify the main chemical constituents in medicinal materials and their different extracts, but also compare the differences in components among similar samples. This analytical method is rapid, effective, visual and accurate for pharmaceutical research.

  1. A method for real-time implementation of HOG feature extraction

    Science.gov (United States)

    Luo, Hai-bo; Yu, Xin-rong; Liu, Hong-mei; Ding, Qing-hai

    2011-08-01

    Histogram of oriented gradients (HOG) is an efficient feature extraction scheme, and HOG descriptors are feature descriptors widely used in computer vision and image processing for purposes such as biometrics, target tracking, automatic target detection (ATD) and automatic target recognition (ATR). However, the computation of HOG feature extraction is unsuitable for hardware implementation since it includes complicated operations. In this paper, an optimal design method and theoretical framework for real-time HOG feature extraction based on FPGA are proposed. The main principle is as follows: firstly, a parallel gradient computing unit circuit based on a parallel pipeline structure was designed. Secondly, the calculation of the arctangent and square root operations was simplified. Finally, a histogram generator based on a parallel pipeline structure was designed to calculate the histogram of each sub-region. Experimental results showed that the HOG extraction can be implemented in one pixel period by these computing units.
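
    For reference, the software pipeline that such hardware implements (gradients, per-cell orientation histograms, block normalisation) can be sketched with scikit-image, assuming that library is available; the window size and parameters follow common practice and are not taken from this paper.

      import numpy as np
      from skimage.feature import hog

      # Hypothetical 128x64 grayscale detection window
      window = np.random.default_rng(1).random((128, 64))

      # Gradients -> 9-bin orientation histograms per 8x8 cell -> 2x2-cell block normalisation
      features = hog(window, orientations=9, pixels_per_cell=(8, 8),
                     cells_per_block=(2, 2), block_norm="L2-Hys")
      print(features.shape)  # (3780,) for these settings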

  2. A comparative study of Averrhoa bilimbi extraction method

    Science.gov (United States)

    Zulhaimi, H. I.; Rosli, I. R.; Kasim, K. F.; Akmal, H. Muhammad; Nuradibah, M. A.; Sam, S. T.

    2017-09-01

    In recent years, bioactive compounds in plants have come into the limelight in the food and pharmaceutical markets, leading to research interest in implementing effective technologies for extracting bioactive substances. Therefore, this study focuses on the extraction of Averrhoa bilimbi by different extraction techniques, namely maceration and ultrasound-assisted extraction. A few plant parts of Averrhoa bilimbi were taken as extraction samples, namely fruits, leaves and twigs. Different solvents such as methanol, ethanol and distilled water were utilized in the process. Fruit extracts resulted in the highest extraction yield compared to other plant parts. Ethanol and distilled water played a more significant role than methanol in all parts and in both extraction techniques. The results also show that ultrasound-assisted extraction gave results comparable to maceration. In addition, the shorter extraction time is useful with regard to industrial implementation.

  3. Comparison of mobility extraction methods based on field-effect measurements for graphene

    Directory of Open Access Journals (Sweden)

    Hua Zhong

    2015-05-01

    Full Text Available Carrier mobility extraction methods for graphene based on field-effect measurements are explored and compared according to theoretical analysis and experimental results. A group of graphene devices with different channel lengths were fabricated and measured, and carrier mobility was extracted from their electrical transfer curves using three different methods. The accuracy and applicability of these methods were compared. The transfer length method (TLM) can obtain an accurate density-dependent mobility and contact resistance at relatively high carrier density based on data from a group of devices, and can therefore act as a standard method to verify the other methods. As two of the most popular methods, the direct transconductance method (DTM) and the fitting method (FTM) can extract mobility easily from the transfer curve of a single graphene device. DTM gives an underestimated mobility at any carrier density owing to the neglect of contact resistances, and its accuracy can be improved by fabricating field-effect transistors with long channels and good contacts. FTM assumes a constant mobility independent of carrier density, and can obtain estimates of the mobility, contact resistance and residual density by fitting a transfer curve. However, FTM tends to obtain a mobility value near the Dirac point and thus overestimates the carrier mobility of graphene. Compared with DTM and FTM, TLM offers a much more accurate, carrier-density-dependent mobility that reflects the full behavior of carrier mobility in graphene.
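
    As an illustration of the fitting method (FTM) mentioned above, the sketch below fits a commonly used constant-mobility model, R_tot = R_c + (L/W)/(e*mu*sqrt(n0^2 + n^2)) with n = C_ox(Vg - V_Dirac)/e, to a synthetic transfer curve with scipy; the gate capacitance, geometry and data are assumptions, not values from this paper.

      import numpy as np
      from scipy.optimize import curve_fit

      E = 1.602e-19        # elementary charge (C)
      C_OX = 1.15e-8       # assumed gate capacitance per area (F/cm^2), ~300 nm SiO2
      L_W = 2.0            # assumed channel length/width ratio

      def total_resistance(vg, mu, n0, r_c, v_dirac):
          """Constant-mobility model: R_tot = R_c + (L/W) / (e * mu * sqrt(n0^2 + n^2))."""
          n = C_OX * (vg - v_dirac) / E   # gate-induced carrier density (cm^-2)
          return r_c + L_W / (E * mu * np.sqrt(n0 ** 2 + n ** 2))

      # Synthetic "measured" transfer curve (gate voltage in V, resistance in ohm)
      vg = np.linspace(-30, 30, 61)
      r_meas = total_resistance(vg, mu=3000.0, n0=5e11, r_c=500.0, v_dirac=2.0)
      r_meas *= 1 + 0.01 * np.random.default_rng(2).standard_normal(vg.size)

      popt, _ = curve_fit(total_resistance, vg, r_meas, p0=[2000.0, 1e12, 300.0, 0.0])
      print("fitted mobility = %.0f cm^2/Vs" % popt[0])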

  4. Validation of an HPLC method for quantification of total quercetin in Calendula officinalis extracts

    International Nuclear Information System (INIS)

    Muñoz Muñoz, John Alexander; Morgan Machado, Jorge Enrique; Trujillo González, Mary

    2015-01-01

    Introduction: Calendula officinalis extracts are used as natural raw material in a wide range of pharmaceutical and cosmetic preparations; however, there are no official methods for quality control of these extracts. Objective: to validate an HPLC-based analytical method for the quantification of total quercetin in glycolic and hydroalcoholic extracts of Calendula officinalis. Methods: to quantify the total quercetin content in the matrices, it was necessary to hydrolyze the flavonoid glycosides under optimal conditions. The chromatographic separation was performed on a C-18 SiliaChrom 4.6x150 mm 5 µm column, fitted with a SiliaChrom 5 µm C-18 4.6x10 mm precolumn, with UV detection at 370 nm. Gradient elution was performed with a mobile phase consisting of methanol (MeOH) and phosphoric acid (H3PO4) (0.08 % w/v). Quantification was performed through the external standard method and comparison with a quercetin reference standard. Results: the selectivity study of the method against extract components and degradation products under acid/basic hydrolysis, oxidation and light exposure conditions showed no signals that interfere with quercetin quantification. It was statistically proved that the method is linear from 1.0 to 5.0 mg/mL. Intermediate precision expressed as a variation coefficient was 1.8 % and 1.74 %, and the recovery percentage was 102.15 % and 101.32 %, for the glycolic and hydroalcoholic extracts, respectively. Conclusions: the suggested methodology meets the quality parameters required for quantifying total quercetin, which makes it a useful tool for quality control of C. officinalis extracts. (author)

  5. Optimization of Ultrasonic-Assisted Extraction of Flavonoid Compounds and Antioxidants from Alfalfa Using Response Surface Method.

    Science.gov (United States)

    Jing, Chang-Liang; Dong, Xiao-Fang; Tong, Jian-Ming

    2015-08-26

    Ultrasonic-assisted extraction (UAE) was used to extract flavonoid-enriched antioxidants from the alfalfa aerial part. Response surface methodology (RSM), based on a four-factor, five-level central composite design (CCD), was employed to obtain the optimal extraction parameters at which the flavonoid content was maximal and the antioxidant activity of the extracts was strongest. The radical scavenging capacity of the extracts, which represents the amount of antioxidants in alfalfa, was determined using the 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid) (ABTS) and 2,2'-diphenyl-1-picrylhydrazyl (DPPH) methods. The results showed good fit with the proposed models for the total flavonoid extraction (R² = 0.9849) and for the antioxidant extraction assayed by the ABTS method (R² = 0.9764) and by the DPPH method (R² = 0.9806). The optimized extraction conditions for total flavonoids were a liquid-to-solid ratio of 57.16 mL/g, 62.33 °C, 57.08 min, and 52.14% ethanol. The optimal extraction parameters for the highest antioxidant activity were a liquid-to-solid ratio of 60.3 mL/g, 54.56 °C, 45.59 min, and 46.67% ethanol by the DPPH method, and a liquid-to-solid ratio of 47.29 mL/g, 63.73 °C, 51.62 min, and 60% ethanol by the ABTS assay. Our work offers optimal extraction conditions for total flavonoids and antioxidants from alfalfa.
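
    The response-surface step can be sketched as fitting a second-order polynomial model to design data and locating its optimum; the sketch below uses scikit-learn on synthetic data (random design points standing in for the actual CCD runs, and an invented yield function), so the numbers have nothing to do with this paper's results.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.preprocessing import PolynomialFeatures

      rng = np.random.default_rng(3)

      # Hypothetical design: [liquid/solid ratio, temperature, time, ethanol %] in coded units
      X = rng.uniform(-2, 2, size=(30, 4))
      # Synthetic flavonoid yield with a quadratic optimum plus noise
      y = (16 - 0.8 * (X[:, 0] - 0.5) ** 2 - 0.5 * (X[:, 1] + 0.3) ** 2
           + 0.3 * X[:, 2] + rng.normal(0, 0.2, 30))

      # Second-order (quadratic + interaction) polynomial model, the standard RSM form
      quad = PolynomialFeatures(degree=2, include_bias=False)
      model = LinearRegression().fit(quad.fit_transform(X), y)
      print("R^2 =", round(model.score(quad.fit_transform(X), y), 3))

      # Predicted optimum over a coarse grid of coded factor settings
      grid = np.array(np.meshgrid(*[np.linspace(-2, 2, 9)] * 4)).reshape(4, -1).T
      print("predicted optimum (coded units):", grid[np.argmax(model.predict(quad.transform(grid)))])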

  6. Evaluation of DNA extraction methods for the detection of Cytomegalovirus in dried blood spots

    Science.gov (United States)

    Koontz, D.; Baecher, K.; Amin, M.; Nikolova, S.; Gallagher, M.; Dollard, S.

    2015-01-01

    Background Dried blood spots (DBS) are collected universally from newborns and may be valuable for the diagnosis of congenital Cytomegalovirus (CMV) infection. The reported analytical sensitivity for DBS testing compared to urine or saliva varies greatly across CMV studies. The purpose of this study was to directly compare the performance of various DNA extraction methods for identification of CMV in DBS including those used most often in CMV studies. Study design Whatman® Grade 903 filter paper cards were spotted with blood samples from 25 organ transplant recipients who had confirmed CMV viremia. Six DNA extraction methods were compared for relative yield of viral and cellular DNA: 2 manual solution-based methods (Gentra Puregene, thermal shock), 2 manual silica column-based methods (QIAamp DNA Mini, QIAamp DNA Investigator), and 2 automated methods (M48 MagAttract Mini, QIAcube Investigator). DBS extractions were performed in triplicate followed by real-time quantitative PCR (qPCR). Results For extraction of both viral and cellular DNA, two methods (QIAamp DNA Investigator and thermal shock) consistently gave the highest yields, and two methods (M48 MagAttract Mini and QIAamp DNA Mini) consistently gave the lowest yields. There was an average 3-fold difference in DNA yield between the highest and lowest yield methods. Conclusion The choice of DNA extraction method is a major factor in the ability to detect low levels of CMV in DBS and can largely account for the wide range of DBS sensitivities reported in studies to date. PMID:25866346

  7. Evolving temporal association rules with genetic algorithms

    OpenAIRE

    Matthews, Stephen G.; Gongora, Mario A.; Hopgood, Adrian A.

    2010-01-01

    A novel framework for mining temporal association rules by discovering itemsets with a genetic algorithm is introduced. Metaheuristics have been applied to association rule mining; we show the efficacy of extending this to another variant - temporal association rule mining. Our framework is an enhancement to existing temporal association rule mining methods as it employs a genetic algorithm to simultaneously search the rule space and temporal space. A methodology for validating the ability of...

  8. RNA preservation agents and nucleic acid extraction method bias perceived bacterial community composition.

    Directory of Open Access Journals (Sweden)

    Ann McCarthy

    Full Text Available Bias is a pervasive problem when characterizing microbial communities. An important source is the difference in lysis efficiencies of different populations, which vary depending on the extraction protocol used. To avoid such biases impacting comparisons between gene and transcript abundances in the environment, the use of one protocol that simultaneously extracts both types of nucleic acids from microbial community samples has gained popularity. However, knowledge regarding tradeoffs to combined nucleic acid extraction protocols is limited, particularly regarding yield and biases in the observed community composition. Here, we evaluated a commercially available protocol for simultaneous extraction of DNA and RNA, which we adapted for freshwater microbial community samples that were collected on filters. DNA and RNA yields were comparable to other commonly used, but independent DNA and RNA extraction protocols. RNA protection agents benefited RNA quality, but decreased DNA yields significantly. Choice of extraction protocol influenced the perceived bacterial community composition, with strong method-dependent biases observed for specific phyla such as the Verrucomicrobia. The combined DNA/RNA extraction protocol detected significantly higher levels of Verrucomicrobia than the other protocols, and those higher numbers were confirmed by microscopic analysis. Use of RNA protection agents as well as independent sequencing runs caused a significant shift in community composition as well, albeit smaller than the shift caused by using different extraction protocols. Despite methodological biases, sample origin was the strongest determinant of community composition. However, when the abundance of specific phylogenetic groups is of interest, researchers need to be aware of the biases their methods introduce. This is particularly relevant if different methods are used for DNA and RNA extraction, in addition to using RNA protection agents only for RNA

  9. Method Development for Extraction and Quantification of Glycosides in Leaves of Stevia Rebaudiana

    International Nuclear Information System (INIS)

    Salmah Moosa; Hazlina Ahmad Hassali; Norazlina Noordin

    2015-01-01

    A solid-liquid extraction and an UHPLC method for the determination of glycosides from the leaf parts of Stevia rebaudiana were developed. Steviol glycosides found in the leaves of Stevia are natural sweeteners and are commercially sold as sugar substitutes. Extraction of the glycosides consisted of solvent extraction of leaf powder using various solvents, followed by concentration using a rotary evaporator and analysis using Ultra High Performance Liquid Chromatography (UHPLC). Existing analytical methods are mainly focused on the quantification of either rebaudioside A or stevioside, whereas other glycosides, such as rebaudioside B and rebaudioside D, present in the leaves also contribute to sweetness or its biological activity. Therefore, we developed an improved method by changing the UHPLC conditions to enable a rapid and reliable determination of four steviol glycosides rather than just two using an isocratic UHPLC method. (author)

  10. Evaluation of two methods DNA extraction from formalin-fixed, paraffin-embedded tissues on non-optimal conditions

    International Nuclear Information System (INIS)

    Bustamante, Javier Andres; Astudillo, Miryam; Pazos, Alvaro Jairo; Bravo, Luis Eduardo

    2011-01-01

    Paraffin wax embedded tissues are an invaluable material for retrospective studies requiring the application of molecular analysis. Multiple methods are available to extract DNA from these kinds of samples. However, the most common methods are slow and the reagents often contribute to the fragmentation of the genetic material. In order to optimize the procedure, two methods for DNA extraction from paraffin-embedded tissue stored under non-optimal conditions were used. Forty-seven blocks containing paraffin-embedded biopsies of pleura, lung and pericardium from 24 patients (66.6% males) older than 18 years, with biopsy-proven chronic granulomatous inflammation, referred to the department of pathology at the University Hospital of Valle between 2002 and 2007, were selected. Each sample was subjected to 10 cuts and to two methods of DNA extraction: 1. conventional and 2. QIAamp DNA mini kit. The efficiency of the extracted DNA was assessed by spectrophotometry and PCR amplification of a fragment of the housekeeping gene GAPDH. The concentration of the DNA samples extracted by the conventional method was 65.52 ng/µL ± 11.47 (mean ± SE) and the 260/280 absorbance ratio ranged between 0.52 and 2.30. The average DNA concentration of the samples extracted by the commercial method was 60.89 ng/µL ± 6.02 (mean ± SE), with an absorbance ratio that fluctuated between 0 and 2.64. The DNA obtained was amplified by PCR; of the 47 samples extracted by the two methods, the GAPDH gene amplified successfully in 25 and 23, respectively. The methods used to obtain DNA showed similar performance, highlighting the potential utility of both extraction methods for retrospective studies from paraffin-embedded tissues stored in unsuitable conditions.

  11. Mechanomyographic Parameter Extraction Methods: An Appraisal for Clinical Applications

    Directory of Open Access Journals (Sweden)

    Morufu Olusola Ibitoye

    2014-12-01

    Full Text Available The research conducted in the last three decades has collectively demonstrated that skeletal muscle performance can be alternatively assessed by mechanomyographic signal (MMG) parameters. Indices of muscle performance, not limited to force, power, work, endurance and the related physiological processes underlying muscle activities during contraction, have been evaluated in the light of the signal features. As MMG is a non-stationary signal that reflects several distinctive patterns of muscle actions, the illustrations obtained from the literature support its reliability in the analysis of muscles under voluntary and stimulus-evoked contractions. An appraisal of the standard practice, including the measurement theories of the methods used to extract parameters of the signal, is vital to the application of the signal during experimental and clinical practices, especially in areas where electromyograms are contraindicated or have limited application. As we highlight the underpinning technical guidelines and domains where each method is well-suited, the limitations of the methods are also presented to position the state of the art in MMG parameter extraction, thus providing the theoretical framework for improvement on the current practices to widen the opportunity for new insights and discoveries. Since the signal modality has not been widely deployed due partly to the limited information extractable from the signals when compared with other classical techniques used to assess muscle performance, this survey is particularly relevant to the projected future of MMG applications in the realm of musculoskeletal assessments and in the real-time detection of muscle activity.

  12. Comparative analysis of different methods of extraction of present hydrocarbons in industrial residual waters

    International Nuclear Information System (INIS)

    Santa, Judith Rocio; Serrano, Martin; Stashenko, Elena

    2002-01-01

    A comparison among four extraction techniques was carried out: continuous and batch liquid-liquid extraction (LLE), solid-phase extraction (SPE), solid-phase microextraction (SPME) and static headspace (S-HS). The main purpose of this research was to determine the highest recovery efficiencies and how reproducible the tests are while varying parameters such as time, extraction technique, type of solvent and others. Chromatographic parameters were optimized in order to carry out the analyses. Quantification of hydrocarbons in residual waters was achieved using high-resolution gas chromatography with flame ionization detection (HRGC-FID). Validation of the method was carried out by analyzing real samples taken at different sampling points of the residual water treatment plant of Ecopetrol - Barrancabermeja. The use of extraction methods that require large solvent quantities and long analysis times is losing validity day by day. Techniques such as HS-SPME and static HS are offered as alternatives for quantifying hydrocarbons: they require no solvents, show high sensitivity and selectivity, and are reproducible. Solid-phase microextraction (SPME) and static headspace (static HS) were chosen as the extraction techniques to validate the method on real samples. Both techniques showed similar results for the determination of total hydrocarbons (in the gasoline range)

  13. Validated Method for the Characterization and Quantification of Extractable and Nonextractable Ellagitannins after Acid Hydrolysis in Pomegranate Fruits, Juices, and Extracts.

    Science.gov (United States)

    García-Villalba, Rocío; Espín, Juan Carlos; Aaby, Kjersti; Alasalvar, Cesarettin; Heinonen, Marina; Jacobs, Griet; Voorspoels, Stefan; Koivumäki, Tuuli; Kroon, Paul A; Pelvan, Ebru; Saha, Shikha; Tomás-Barberán, Francisco A

    2015-07-29

    Pomegranates are one of the main highly valuable sources of ellagitannins. Despite the potential health benefits of these compounds, reliable data on their content in pomegranates and derived extracts and food products is lacking, as it is usually underestimated due to their complexity, diversity, and lack of commercially available standards. This study describes a new method for the analysis of the extractable and nonextractable ellagitannins based on the quantification of the acid hydrolysis products that include ellagic acid, gallic acid, sanguisorbic acid dilactone, valoneic acid dilactone, and gallagic acid dilactone in pomegranate samples. The study also shows the occurrence of ellagitannin C-glycosides in pomegranates. The method was optimized using a pomegranate peel extract. To quantify nonextractable ellagitannins, freeze-dried pomegranate fruit samples were directly hydrolyzed with 4 M HCl in water at 90 °C for 24 h followed by extraction of the pellet with dimethyl sulfoxide/methanol (50:50, v/v). The method was validated and reproducibility was assessed by means of an interlaboratory trial, showing high reproducibility across six laboratories with relative standard deviations below 15%. Their applicability was demonstrated in several pomegranate extracts, different parts of pomegranate fruit (husk, peels, and mesocarp), and commercial juices. A large variability has been found in the ellagitannin content (150-750 mg of hydrolysis products/g) and type (gallagic acid/ellagic acid ratios between 4 and 0.15) of the 11 pomegranate extracts studied.

  14. Effect of soil contaminant extraction method in determining toxicity using the Microtox® assay

    International Nuclear Information System (INIS)

    Harkey, G.A.; Young, T.M.

    2000-01-01

    This project examined the influence of different extraction methods on the measured toxicity of contaminated soils collected from manufactured gas plant (MGP) sites differing in soil composition and contaminant concentration. Aged soils from a number of MGP sites were extracted using a saline solution, supercritical fluid extraction (SFE), and Soxhlet extraction. Toxicity was assessed using two forms of Microtox tests: acute aqueous tests on saline and SFE soil extracts and solid-phase tests (SPTs) on soil particles. Microtox SPTs were performed on soils before and after SFE to determine resulting toxicity reduction. Three hypotheses were tested: (1) Toxicity of soil extracts is related to contaminant concentrations of the extracts, (2) measured toxicity significantly decreases with less vigorous methods of extraction, and (3) supercritical fluid extractability correlates with measured toxicity. The EC50s for SPTs performed before and after SFE were not different for some soils but were significantly greater after extraction for other soils tested. The most significant toxicity reductions were observed for soils exhibiting the highest toxicity in both preextraction SPTs and acute aqueous tests. Acute Microtox tests performed on SFE extracts showed significantly lower EC50s than those reported from saline-based extraction procedures. Toxicity of the soils measured by Microtox SPTs was strongly correlated with both SFE efficiency and measures of contaminant aging. Data from this project provide evidence of sequestration and reduced availability of polycyclic aromatic hydrocarbons (PAHs) from soils extracted via physiologically based procedures compared to vigorous physical extraction protocols

  15. Social network extraction based on Web: 3. the integrated superficial method

    Science.gov (United States)

    Nasution, M. K. M.; Sitompul, O. S.; Noah, S. A.

    2018-03-01

    The Web as a source of information has become part of social behavior information. Although it involves only the limited information disclosed by search engines in the form of hit counts, snippets, and URL addresses of web pages, the integrated extraction method produces a social network that is not only trusted but also enriched. Unintegrated extraction methods may produce social networks without explanation, resulting in poor supplemental information, or may produce a social network laden with surmise and consequently unrepresentative social structures. The integrated superficial method, in addition to generating the core social network, also generates an expanded network so as to reach the scope of relation clues, with a number of edges computationally close to n(n - 1)/2 for n social actors.
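
    The superficial ingredient of such web-based extraction is a relation strength computed purely from search-engine hit counts; the sketch below computes a Normalised Google Distance style score and a Jaccard overlap for one hypothetical actor pair, with invented hit counts and index size, and without making any actual search-engine calls. It is an illustration of the general idea, not this paper's exact formulation.

      import math

      N = 5e10                                            # assumed size of the indexed Web (pages)
      hits_a, hits_b, hits_ab = 120_000, 45_000, 2_300    # hypothetical hit counts

      # Normalised Google Distance style dissimilarity built only from hit counts
      ngd = (max(math.log(hits_a), math.log(hits_b)) - math.log(hits_ab)) / (
          math.log(N) - min(math.log(hits_a), math.log(hits_b))
      )
      # Simple overlap measure usable as an edge weight between two actors
      jaccard = hits_ab / (hits_a + hits_b - hits_ab)
      print(f"NGD = {ngd:.3f}, Jaccard overlap = {jaccard:.4f}")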

  16. Comparison of different methods for extraction from Tetraclinis articulata: yield, chemical composition and antioxidant activity.

    Science.gov (United States)

    Herzi, Nejia; Bouajila, Jalloul; Camy, Séverine; Romdhane, Mehrez; Condoret, Jean-Stéphane

    2013-12-15

    In the present study, three extraction techniques, hydrodistillation (HD), solvent extraction (conventional 'Soxhlet' technique) and an innovative technique, supercritical fluid extraction (SFE), were applied to ground Tetraclinis articulata leaves and compared for extraction duration, extraction yield, and chemical composition of the extracts as well as their antioxidant activities. The extracts were analyzed by GC-FID and GC-MS. The antioxidant activity was measured using two methods: ABTS(•+) and DPPH(•). The yields obtained using HD, SFE, and hexane and ethanol Soxhlet extraction were found to be 0.6, 1.6, 40.4 and 21.2-27.4 g/kg, respectively. An original result of this study is that the best antioxidant activity was obtained with an SFE extract (41 mg/L). The SFE method offers some noteworthy advantages over traditional alternatives, such as shorter extraction times, low environmental impact, and a clean, non-thermally-degraded final product. Also, a good correlation between the phenolic contents and the antioxidant activity was observed with extracts obtained by SFE at 9 MPa. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. The GC/MS Analysis of Volatile Components Extracted by Different Methods from Exocarpium Citri Grandis

    Directory of Open Access Journals (Sweden)

    Zhisheng Xie

    2013-01-01

    Full Text Available Volatile components from Exocarpium Citri Grandis (ECG) were extracted by three methods: steam distillation (SD), headspace solid-phase microextraction (HS-SPME), and solvent extraction (SE). A total of 81 compounds were identified by gas chromatography-mass spectrometry, including 77 (SD), 56 (HS-SPME), and 48 (SE) compounds, respectively. Regardless of the extraction method, terpenes (39.98~57.81%) were the main volatile components of ECG, mainly germacrene-D, limonene, 2,6,8,10,14-hexadecapentaene, 2,6,11,15-tetramethyl-, (E,E,E)-, and trans-caryophyllene. The three methods were compared in terms of extraction profile and properties. SD gave a relatively complete profile of the volatiles in ECG at the cost of a long extraction time; SE enabled the analysis of low-volatility and high-molecular-weight compounds but lost some volatile components; HS-SPME gave satisfactory extraction efficiency and results similar to those of SD at the analytical level while consuming less sample, a shorter extraction time, and a simpler procedure. Although SD and SE are treated as traditional preparative extraction techniques for volatiles in both small batches and large scale, HS-SPME coupled with GC/MS could be useful and appropriate for the rapid extraction and qualitative analysis of volatile components from medicinal plants at the analytical level.

  18. Extraction of Th and U from Swiss granites

    International Nuclear Information System (INIS)

    Bajo, C.

    1980-12-01

    The extraction, at the laboratory level, of U and Th from Swiss granites is discussed. The Mittagfluh, Bergell and Rotondo granites and the Giuv syenite offered a wide range of U and Th concentrations: 7.7 to 20.0 ppm U and 25.5 to 67.0 ppm Th. U and Th were determined in the leach solutions by the fission track method and by spectrophotometry, respectively. Samples containing less than 0.3 μg U and 4 μg Th could be measured with an accuracy of 10% for U and 5% for Th. Leach tests were performed during which the following parameters were varied: granite type, grain size, acid type, acid concentration, temperature and time. There were very great leaching differences between the granites studied. Temperature was the most important parameter. Sharp differences in extraction occurred between 20 °C, 50 °C and 80 °C. At 80 °C, more than 85% U and Th were extracted. The extraction curve (percent extracted as a function of time) of aliquots sampled after 1, 2, 4, 8, 12 and 24 hours showed a plateau after 8 hours. The half life of the reaction was between one and two hours. As a general rule, Th was better extracted than U. (Auth.)
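
    The reported extraction curve (a plateau after about 8 hours and a reaction half-life between one and two hours) is consistent with a simple first-order approach-to-plateau model. The sketch below only illustrates that reading; the model form and the F_max and half-life values are assumptions, not taken from the paper.

    ```python
    import numpy as np

    def fraction_extracted(t_hours, f_max=0.85, half_life_h=1.5):
        """First-order approach-to-plateau model (an assumption, not stated in the paper):
        F(t) = F_max * (1 - exp(-k t)), with k = ln(2) / half-life."""
        k = np.log(2) / half_life_h
        return f_max * (1.0 - np.exp(-k * t_hours))

    # Aliquot times used in the leach tests (hours)
    for t in (1, 2, 4, 8, 12, 24):
        print(f"t = {t:>2} h -> extracted ~ {100 * fraction_extracted(t):.1f} %")
    ```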

  19. Extracting Characteristics of the Study Subjects from Full-Text Articles.

    Science.gov (United States)

    Demner-Fushman, Dina; Mork, James G

    Characteristics of the subjects of biomedical research are important in determining if a publication describing the research is relevant to a search. To facilitate finding relevant publications, MEDLINE citations provide Medical Subject Headings that describe the subjects' characteristics, such as their species, gender, and age. We seek to improve the recommendation of these headings by the Medical Text Indexer (MTI) that supports manual indexing of MEDLINE. To that end, we explore the potential of the full text of the publications. Using simple recall-oriented rule-based methods we determined that adding sentences extracted from the methods sections and captions to the abstracts prior to MTI processing significantly improved recall and F1 score with only a slight drop in precision. Improvements were also achieved in directly assigning several headings extracted from the full text. These results indicate the need for further development of automated methods capable of leveraging the full text for indexing.
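
    The recall, precision and F1 figures quoted above are the standard set-overlap measures between recommended and manually assigned headings. A minimal sketch, assuming headings are represented as simple sets of identifiers (the example headings are hypothetical):

    ```python
    def precision_recall_f1(recommended, gold):
        """Standard evaluation of recommended headings against the manual (gold) indexing."""
        recommended, gold = set(recommended), set(gold)
        tp = len(recommended & gold)
        precision = tp / len(recommended) if recommended else 0.0
        recall = tp / len(gold) if gold else 0.0
        f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
        return precision, recall, f1

    # Hypothetical example; the headings are illustrative, not from the study
    print(precision_recall_f1({"Humans", "Female", "Adult", "Mice"},
                              {"Humans", "Female", "Adult", "Aged"}))
    ```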

  20. Adaptation of the Bligh & Dyer Method for Lipid Extraction from Colombian Microalgae for Third-Generation Biodiesel Production

    Directory of Open Access Journals (Sweden)

    González Delgado Ángel

    2012-06-01

    Full Text Available In the biodiesel production process from microalgae, the cell disruption and lipid extraction stages are important for obtaining triglycerides that can be transesterified to biodiesel and glycerol. In this work, the Bligh & Dyer method was adapted for lipid extraction from native microalgae, using organosolv pretreatment or acid hydrolysis as the cell disruption mechanism to improve the extraction process. Chloroform, methanol and water are the solvents employed in the Bligh & Dyer extraction method. The microalgae species Botryococcus braunii, Nannocloropsis, Closterium, Guinardia and Amphiprora were employed for the experimental part. Adaptation of the method yielded the best extraction conditions, which were: a 1:20 biomass/solvent ratio, an initial solvent ratio CHCl3:CH3OH:H2O of 1:2:0, stirring at 5000 rpm for 14 minutes and centrifugation at 3400 rpm for 15 minutes. The cell disruption mechanisms allowed extracts with high lipid content to be obtained after performing the extraction with the Bligh & Dyer method, but significantly decreased the total extraction yield. Finally, the fatty acid profiles showed that the Botryococcus braunii species contains a higher acylglycerol percentage area, suitable for the production of biodiesel.

  1. An effective placental cotyledons proteins extraction method for 2D gel electrophoresis.

    Science.gov (United States)

    Tan, Niu J; Daim, Leona D J; Jamil, Amilia A M; Mohtarrudin, Norhafizah; Thilakavathy, Karuppiah

    2017-03-01

    Effective protein extraction is essential, especially for producing a well-resolved proteome on 2D gels. A well-resolved placental cotyledon proteome, with good reproducibility, has allowed researchers to study the proteins underlying the physiology and pathophysiology of pregnancy. The aim of this study was to determine the best protocol for the extraction of protein from placental cotyledon tissues for two-dimensional gel electrophoresis (2D-GE). Based on widely used protein extraction strategies, 12 different extraction methodologies were carefully selected, which included one chemical extraction, two mechanical extractions coupled with protein precipitation, and nine chemical extractions coupled with protein precipitation. Extracted proteins were resolved by one-dimensional gel electrophoresis and 2D-GE and then compared against set criteria: extraction efficacy, protein resolution, reproducibility, and recovery efficiency. Our results revealed that a better profile was obtained by chemical extraction in comparison to mechanical extraction. We further compared the chemical extraction coupled protein precipitation methodologies, where the DNase/lithium chloride-dense sucrose homogenization coupled dichloromethane-methanol precipitation (DNase/LiCl-DSH-D/MPE) method showed good protein extraction efficiency. It also gave the best protein resolution and proteome reproducibility on 2D gels. DNase/LiCl-DSH-D/MPE was efficient in the extraction of proteins from placental cotyledon tissues. In addition, this methodology could hypothetically allow protein extraction from any tissue that contains highly abundant lipid and glycogen. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. The 1% Rule in Four Digital Health Social Networks: An Observational Study

    Science.gov (United States)

    2014-01-01

    Background In recent years, cyberculture has informally reported a phenomenon named the 1% rule, or 90-9-1 principle, which seeks to explain participatory patterns and network effects within Internet communities. The rule states that 90% of actors observe and do not participate, 9% contribute sparingly, and 1% of actors create the vast majority of new content. This 90%, 9%, and 1% are also known as Lurkers, Contributors, and Superusers, respectively. To date, very little empirical research has been conducted to verify the 1% rule. Objective The 1% rule is widely accepted in digital marketing. Our goal was to determine if the 1% rule applies to moderated Digital Health Social Networks (DHSNs) designed to facilitate behavior change. Methods To help gain insight into participatory patterns, descriptive data were extracted from four long-standing DHSNs: the AlcoholHelpCenter, DepressionCenter, PanicCenter, and StopSmokingCenter sites. Results During the study period, 63,990 actors created 578,349 posts. Less than 25% of actors made one or more posts. The applicability of the 1% rule was confirmed as Lurkers, Contributors, and Superusers accounted for a weighted average of 1.3% (n=4668), 24.0% (n=88,732), and 74.7% (n=276,034) of content. Conclusions The 1% rule was consistent across the four DHSNs. As social network sustainability requires fresh content and timely interactions, these results are important for organizations actively promoting and managing Internet communities. Superusers generate the vast majority of traffic and create value, so their recruitment and retention is imperative for long-term success. Although Lurkers may benefit from observing interactions between Superusers and Contributors, they generate limited or no network value. The results of this study indicate that DHSNs may be optimized to produce network effects, positive externalities, and bandwagon effects. Further research in the development and expansion of DHSNs is required. PMID:24496109
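
    As an illustration of how participation classes of this kind are typically derived, the sketch below bins actors by post count and reports each class's share of actors and of content. The thresholds and the example data are assumptions for illustration only, not the study's classification code.

    ```python
    from collections import Counter

    def participation_breakdown(posts_per_actor, lurker_max=0, contributor_max=9):
        """Classify actors as Lurkers / Contributors / Superusers by post count
        (thresholds are illustrative assumptions) and report shares of actors and content."""
        classes = Counter()
        content = Counter()
        for n_posts in posts_per_actor:
            if n_posts <= lurker_max:
                label = "Lurker"
            elif n_posts <= contributor_max:
                label = "Contributor"
            else:
                label = "Superuser"
            classes[label] += 1
            content[label] += n_posts
        total_actors = sum(classes.values())
        total_posts = sum(content.values()) or 1
        return {label: (classes[label] / total_actors, content[label] / total_posts)
                for label in ("Lurker", "Contributor", "Superuser")}

    # Hypothetical post counts for a tiny community: 90 lurkers, 9 contributors, 1 superuser
    print(participation_breakdown([0] * 90 + [3] * 9 + [200]))
    ```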

  3. A novel method of genomic DNA extraction for Cactaceae1

    Science.gov (United States)

    Fehlberg, Shannon D.; Allen, Jessica M.; Church, Kathleen

    2013-01-01

    • Premise of the study: Genetic studies of Cactaceae can at times be impeded by difficult sampling logistics and/or high mucilage content in tissues. Simplifying sampling and DNA isolation through the use of cactus spines has not previously been investigated. • Methods and Results: Several protocols for extracting DNA from spines were tested and modified to maximize yield, amplification, and sequencing. Sampling of and extraction from spines resulted in a simplified protocol overall and complete avoidance of mucilage as compared to typical tissue extractions. Sequences from one nuclear and three plastid regions were obtained across eight genera and 20 species of cacti using DNA extracted from spines. • Conclusions: Genomic DNA useful for amplification and sequencing can be obtained from cactus spines. The protocols described here are valuable for any cactus species, but are particularly useful for investigators interested in sampling living collections, extensive field sampling, and/or conservation genetic studies. PMID:25202521

  4. A sensitive method to extract DNA from biological traces present on ammunition for the purpose of genetic profiling.

    Science.gov (United States)

    Dieltjes, Patrick; Mieremet, René; Zuniga, Sofia; Kraaijenbrink, Thirsa; Pijpe, Jeroen; de Knijff, Peter

    2011-07-01

    Exploring technological limits is a common practice in forensic DNA research. Reliable genetic profiling based on only a few cells isolated from trace material retrieved from a crime scene is nowadays more and more the rule rather than the exception. On many crime scenes, cartridges, bullets, and casings (jointly abbreviated as CBCs) are regularly found, and even after firing, these potentially carry trace amounts of biological material. Since 2003, the Forensic Laboratory for DNA Research is routinely involved in the forensic investigation of CBCs in the Netherlands. Reliable DNA profiles were frequently obtained from CBCs and used to match suspects, victims, or other crime scene-related DNA traces. In this paper, we describe the sensitive method developed by us to extract DNA from CBCs. Using PCR-based genotyping of autosomal short tandem repeats, we were able to obtain reliable and reproducible DNA profiles in 163 out of 616 criminal cases (26.5%) and in 283 out of 4,085 individual CBC items (6.9%) during the period January 2003-December 2009. We discuss practical aspects of the method and the sometimes unexpected effects of using cell lysis buffer on the subsequent investigation of striation patterns on CBCs.

  5. Effect of Different Sterilization Methods on the Extracted Oil from Oil Palm Fruit

    International Nuclear Information System (INIS)

    Hasimah Kasmin; Hasimah Kasmin; Roila Awang; Azwan Mat Lazim

    2015-01-01

    Sterilization is an important process during the processing of oil palm fruits in order to produce crude palm oil (CPO). This process can be carried out using steam (the conventional method), dry heating or wet heating. In this study, the effectiveness of the dry heating and wet heating methods for sterilization and solvent extraction was investigated. The sterilization time of these two methods was varied at 30, 60 and 90 min in order to determine their effectiveness on oil extraction and oil quality. Results showed that, at 30 min of sterilization, wet heating produced a higher percentage of oil extraction than the conventional and dry heating methods, with averages of 27.65 %, 19.01 % and 20.21 % respectively. In comparison with the conventional method, both proposed sterilization methods gave better FFA and DOBI results: the average free fatty acid (FFA) content for the proposed sterilization methods was between 0.37 % and 0.93 %, while the average deterioration of bleachability index (DOBI) ranged from 4.89 to 6.12. The average carotene content was in agreement with the conventional method, at a range of 644.64 ppm to 764.80 ppm. (author)

  6. Optimization of Ultrasonic-Assisted Extraction of Flavonoid Compounds and Antioxidants from Alfalfa Using Response Surface Method

    Directory of Open Access Journals (Sweden)

    Chang-Liang Jing

    2015-08-01

    Full Text Available Ultrasonic-assisted extraction (UAE) was used to extract flavonoid-enriched antioxidants from the alfalfa aerial part. Response surface methodology (RSM), based on a four-factor, five-level central composite design (CCD), was employed to obtain the optimal extraction parameters at which the flavonoid content was maximal and the antioxidant activity of the extracts was strongest. Radical scavenging capacity of the extracts, which represents the amount of antioxidants in alfalfa, was determined using the 2,2′-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid) (ABTS) and 2,2′-diphenyl-1-picrylhydrazyl (DPPH) methods. The results showed good fit with the proposed models for the total flavonoid extraction (R2 = 0.9849), for the antioxidant extraction assayed by the ABTS method (R2 = 0.9764), and by the DPPH method (R2 = 0.9806). The optimized extraction conditions for total flavonoids were a liquid-to-solid ratio of 57.16 mL/g, 62.33 °C, 57.08 min, and 52.14% ethanol. The optimal extraction parameters for the highest antioxidant activity by the DPPH method were a liquid-to-solid ratio of 60.3 mL/g, 54.56 °C, 45.59 min, and 46.67% ethanol, and by the ABTS assay a liquid-to-solid ratio of 47.29 mL/g, 63.73 °C, 51.62 min, and 60% ethanol concentration. Our work offers optimal extraction conditions for total flavonoids and antioxidants from alfalfa.
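
    Response surface methodology of this kind usually fits a full quadratic model to the CCD responses and then locates the optimum on the fitted surface. The sketch below shows such a fit by ordinary least squares; the factor names follow the abstract, but the coded runs, responses and model are assumptions for illustration, not the authors' calculation.

    ```python
    import itertools
    import numpy as np

    def quadratic_design_matrix(X):
        """Columns: intercept, linear, squared, and pairwise interaction terms
        for coded factors (a standard full second-order RSM model)."""
        n, k = X.shape
        cols = [np.ones(n)] + [X[:, i] for i in range(k)] + [X[:, i] ** 2 for i in range(k)]
        cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(k), 2)]
        return np.column_stack(cols)

    # Hypothetical coded runs (liquid/solid ratio, temperature, time, ethanol %) and responses
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(29, 4))                       # 29 runs, 4 coded factors
    y = 80 - 5 * (X ** 2).sum(axis=1) + rng.normal(0, 1, 29)   # fake flavonoid yield

    D = quadratic_design_matrix(X)
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)               # least-squares fit of the model
    y_hat = D @ beta
    r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
    print(f"R^2 of the fitted response surface: {r2:.4f}")
    ```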

  7. Risk-based rules for crane safety systems

    Energy Technology Data Exchange (ETDEWEB)

    Ruud, Stian [Section for Control Systems, DNV Maritime, 1322 Hovik (Norway)], E-mail: Stian.Ruud@dnv.com; Mikkelsen, Age [Section for Lifting Appliances, DNV Maritime, 1322 Hovik (Norway)], E-mail: Age.Mikkelsen@dnv.com

    2008-09-15

    The International Maritime Organisation (IMO) has recommended a method called formal safety assessment (FSA) for future development of rules and regulations. The FSA method has been applied in a pilot research project for development of risk-based rules and functional requirements for systems and components for offshore crane systems. This paper reports some developments in the project. A method for estimating target reliability for the risk-control options (safety functions) by means of the cost/benefit decision criterion has been developed in the project and is presented in this paper. Finally, a structure for risk-based rules is proposed and presented.

  8. Risk-based rules for crane safety systems

    International Nuclear Information System (INIS)

    Ruud, Stian; Mikkelsen, Age

    2008-01-01

    The International Maritime Organisation (IMO) has recommended a method called formal safety assessment (FSA) for future development of rules and regulations. The FSA method has been applied in a pilot research project for development of risk-based rules and functional requirements for systems and components for offshore crane systems. This paper reports some developments in the project. A method for estimating target reliability for the risk-control options (safety functions) by means of the cost/benefit decision criterion has been developed in the project and is presented in this paper. Finally, a structure for risk-based rules is proposed and presented

  9. A two-stage rule-constrained seedless region growing approach for mandibular body segmentation in MRI.

    Science.gov (United States)

    Ji, Dong Xu; Foong, Kelvin Weng Chiong; Ong, Sim Heng

    2013-09-01

    Extraction of the mandible from 3D volumetric images is frequently required for surgical planning and evaluation. Image segmentation from MRI is more complex than CT due to lower bony signal-to-noise. An automated method to extract the human mandible body shape from magnetic resonance (MR) images of the head was developed and tested. Anonymous MR images data sets of the head from 12 subjects were subjected to a two-stage rule-constrained region growing approach to derive the shape of the body of the human mandible. An initial thresholding technique was applied followed by a 3D seedless region growing algorithm to detect a large portion of the trabecular bone (TB) regions of the mandible. This stage is followed with a rule-constrained 2D segmentation of each MR axial slice to merge the remaining portions of the TB regions with lower intensity levels. The two-stage approach was replicated to detect the cortical bone (CB) regions of the mandibular body. The TB and CB regions detected from the preceding steps were merged and subjected to a series of morphological processes for completion of the mandibular body region definition. Comparisons of the accuracy of segmentation between the two-stage approach, conventional region growing method, 3D level set method, and manual segmentation were made with Jaccard index, Dice index, and mean surface distance (MSD). The mean accuracy of the proposed method is [Formula: see text] for Jaccard index, [Formula: see text] for Dice index, and [Formula: see text] mm for MSD. The mean accuracy of CRG is [Formula: see text] for Jaccard index, [Formula: see text] for Dice index, and [Formula: see text] mm for MSD. The mean accuracy of the 3D level set method is [Formula: see text] for Jaccard index, [Formula: see text] for Dice index, and [Formula: see text] mm for MSD. The proposed method shows improvement in accuracy over CRG and 3D level set. Accurate segmentation of the body of the human mandible from MR images is achieved with the
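
    The Jaccard and Dice indices reported above (their values are hidden behind the journal's [Formula: see text] placeholders) are standard overlap measures between a segmentation mask and a reference mask. A minimal sketch for binary 3D masks follows; the mean surface distance is omitted because it additionally requires a surface-distance computation.

    ```python
    import numpy as np

    def jaccard_dice(seg, ref):
        """Overlap agreement between a binary segmentation and a reference (ground-truth) mask."""
        seg = seg.astype(bool)
        ref = ref.astype(bool)
        intersection = np.logical_and(seg, ref).sum()
        union = np.logical_or(seg, ref).sum()
        jaccard = intersection / union if union else 1.0
        dice = 2 * intersection / (seg.sum() + ref.sum()) if (seg.sum() + ref.sum()) else 1.0
        return jaccard, dice

    # Toy example on a small volume (not MRI data)
    seg = np.zeros((10, 10, 10), dtype=bool); seg[2:7, 2:7, 2:7] = True
    ref = np.zeros((10, 10, 10), dtype=bool); ref[3:8, 3:8, 3:8] = True
    print(jaccard_dice(seg, ref))
    ```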

  10. Screening of extraction methods for glycoproteins from jellyfish ( Rhopilema esculentum) oral-arms by high performance liquid chromatography

    Science.gov (United States)

    Ren, Guoyan; Li, Bafang; Zhao, Xue; Zhuang, Yongliang; Yan, Mingyan; Hou, Hu; Zhang, Xiukun; Chen, Li

    2009-03-01

    In order to select an optimum extraction method for the target glycoprotein (TGP) from jellyfish (Rhopilema esculentum) oral-arms, a high performance liquid chromatography (HPLC) assay for the determination of the TGP was developed. Purified target glycoprotein was taken as a standard glycoprotein. The results showed that the calibration curves for peak area plotted against concentration for TGP were linear (r = 0.9984, y = 4.5895x + 47.601) over concentrations ranging from 50 to 400 mg L-1. The mean extraction recovery was 97.84% (CV 2.60%). The fractions containing TGP were isolated from jellyfish (R. esculentum) oral-arms by four extraction methods: 1) water extraction (WE), 2) phosphate buffer solution (PBS) extraction (PE), 3) ultrasound-assisted water extraction (UA-WE), 4) ultrasound-assisted PBS extraction (UA-PE). The lyophilized extract was dissolved in Milli-Q water and analyzed directly on a short TSK-GEL G4000PWXL (7.8 mm×300 mm) column. Our results indicated that the UA-PE method was the optimum extraction method as selected by HPLC.
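
    Given the reported calibration line (peak area y = 4.5895x + 47.601 over 50-400 mg/L), the TGP concentration of an extract follows by inverting the line and correcting for the mean recovery. A minimal sketch; the sample peak area used below is a hypothetical value:

    ```python
    SLOPE, INTERCEPT = 4.5895, 47.601   # calibration line from the abstract (peak area vs mg/L)
    MEAN_RECOVERY = 0.9784              # 97.84% mean extraction recovery

    def tgp_concentration(peak_area, correct_recovery=True):
        """Back-calculate TGP concentration (mg/L) from an HPLC peak area."""
        conc = (peak_area - INTERCEPT) / SLOPE
        if correct_recovery:
            conc /= MEAN_RECOVERY       # compensate for incomplete extraction recovery
        return conc

    print(f"{tgp_concentration(965.0):.1f} mg/L")  # hypothetical peak area
    ```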

  11. Use of laminar chromatographic methods for determination of separation conditions in column extraction chromatography

    International Nuclear Information System (INIS)

    Ghersini, G.; Cerrai, E.

    1978-01-01

    Possibilities of using laminar chromatographic methods (paper and thin-layer chromatography) to determine optimal separation conditions in column extraction chromatography are analysed. Most of the laminar data given are presented as Rf-spectra, i.e. as dependences of the experimentally found Rf on the concentration of the eluting solution components. The interrelation between Rf, the distribution coefficients of the corresponding liquid extraction systems, and the retention volumes of chromatographic columns is considered. Literature data on extraction paper and thin-layer chromatography of elements with various stationary phases are presented.

  12. Comparison of Three Different DNA Extraction Methods for Linguatula serrata as a Food Born Pathogen

    Directory of Open Access Journals (Sweden)

    Gilda ESLAMI

    2017-06-01

    Full Text Available Background: One of the most important requirements in the molecular characterization of food-borne pathogens is high-quality genomic DNA. In this study, we investigated three protocols for extracting genomic DNA from Linguatula serrata and compared their simplicity, duration and costs. Methods: The larvae were collected from the visceral organs of sheep at the Yazd Slaughterhouse during May 2013. DNA extraction was done by three different methods: a commercial DNA extraction kit, phenol-chloroform-isoamyl alcohol (PCI), and salting out. DNA extracted by each method was assessed for quantity and quality using spectrophotometry and agarose gel electrophoresis, respectively. Results: The shortest duration was obtained with the commercial DNA extraction kit, followed by the salting-out protocol. The most cost-effective method was salting out, followed by PCI. The best yield was obtained with PCI, at 72.20±29.20 ng/μl and an OD260/OD280 purity of 1.76±0.947. Agarose gel electrophoresis showed the quality of all extracts to be similar. Conclusion: Salting out is introduced as the best method for DNA extraction from L. serrata as a food-borne pathogen, with the least cost and appropriate purity. Although the best purity was obtained with PCI, PCI is not as safe as salting out, and the duration of salting out was shorter than that of PCI. The shortest duration was seen with the commercial DNA extraction kit, but it is expensive and therefore not recommended for developing countries where consumption of offal is common.

  13. Aggressive oil extraction and precautionary saving: Coping with volatility

    NARCIS (Netherlands)

    van der Ploeg, F.

    2010-01-01

    The effects of stochastic oil demand on optimal oil extraction paths and tax, spending and government debt policies are analyzed when the oil demand schedule is linear and preferences quadratic. Without prudence, optimal oil extraction is governed by the Hotelling rule and optimal budgetary policies
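
    For reference, the Hotelling rule invoked above has a compact textbook statement in the deterministic case (background only, not the paper's stochastic extension): along an optimal extraction path the scarcity rent rises at the rate of interest.

    ```latex
    % Textbook Hotelling rule: with interest rate r, resource price p(t) and
    % (constant) marginal extraction cost c, the rent p - c must grow at rate r.
    \[
      \frac{\dot{p}(t)}{p(t)} = r
      \quad\text{(costless extraction)},
      \qquad
      \frac{\tfrac{d}{dt}\,[\,p(t) - c\,]}{p(t) - c} = r
      \quad\text{(constant marginal cost } c\text{)}.
    \]
    ```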

  14. Biodiesel Production from Microalgae by Extraction – Transesterification Method

    Directory of Open Access Journals (Sweden)

    Nguyen Thi Phuong Thao

    2013-11-01

    Full Text Available The environmental impact of using petroleum fuels has led to a quest to find a suitable alternative fuel source. In this study, microalgae were explored as a highly potential feedstock for producing biodiesel fuel. Firstly, algal oil is extracted from algal biomass using an organic solvent (n-hexane); lipids can make up to 60% of the weight of microalgae. Biodiesel is then created through a chemical reaction known as transesterification between the algal oil and an alcohol (methanol) with a strong acid (such as H2SO4) as the catalyst. The extraction-transesterification method resulted in a high biodiesel yield (10% of algal biomass) and high FAME content (5.2% of algal biomass). Biodiesel production from microalgae was studied through experimental investigation of transesterification conditions such as reaction time, methanol-to-oil ratio and catalyst dosage, which are deemed to have the main impact on reaction conversion efficiency. All the parameters characterized for the purified biodiesel, such as free glycerin, total glycerin, flash point and sulfur content, were analyzed according to ASTM standards. Doi: http://dx.doi.org/10.12777/wastech.1.1.6-9. Citation: Thao, N.T.P., Tin, N.T., and Thanh, B.X. 2013. Biodiesel Production from Microalgae by Extraction – Transesterification Method. Waste Technology 1(1):6-9.

  15. Development and validation of a simple method for the extraction of human skin melanocytes.

    Science.gov (United States)

    Wang, Yinjuan; Tissot, Marion; Rolin, Gwenaël; Muret, Patrice; Robin, Sophie; Berthon, Jean-Yves; He, Li; Humbert, Philippe; Viennet, Céline

    2018-03-21

    Primary melanocytes in culture are useful models for studying epidermal pigmentation and the efficacy of melanogenic compounds, or for developing advanced therapy medicinal products. Cell extraction is an inevitable and critical step in the establishment of cell cultures. Many enzymatic methods for extracting and growing cells derived from human skin, such as melanocytes, are described in the literature. They are usually based on two enzymatic steps, Trypsin in combination with Dispase, in order to separate dermis from epidermis and subsequently provide a suspension of epidermal cells. The objective of this work was to develop and validate an extraction method for human skin melanocytes that is simple, effective and applicable to smaller skin samples, and that avoids animal-derived reagents. The TrypLE™ product was tested on a very limited size of human skin, equivalent to multiple 3-mm punch biopsies, and was compared to the Trypsin/Dispase enzymes. Functionality of the extracted cells was evaluated by analysis of viability, morphology and melanin production. In comparison with the Trypsin/Dispase incubation method, the main advantages of the TrypLE™ incubation method were the easier separation between dermis and epidermis and the higher population of melanocytes after extraction. Both protocols preserved the morphological and biological characteristics of the melanocytes. The minimum size of skin sample that allowed the extraction of functional cells was 6 × 3-mm punch biopsies (e.g., 42 mm²) whatever the method used. In conclusion, this new procedure based on TrypLE™ incubation would be suitable for the establishment of optimal primary melanocyte cultures for clinical applications and research.

  16. A Bayesian Scoring Technique for Mining Predictive and Non-Spurious Rules

    OpenAIRE

    Batal, Iyad; Cooper, Gregory; Hauskrecht, Milos

    2012-01-01

    Rule mining is an important class of data mining methods for discovering interesting patterns in data. The success of a rule mining method heavily depends on the evaluation function that is used to assess the quality of the rules. In this work, we propose a new rule evaluation score - the Predictive and Non-Spurious Rules (PNSR) score. This score relies on Bayesian inference to evaluate the quality of the rules and considers the structure of the rules to filter out spurious rules. We present ...

  17. Comparison of Different Protein Extraction Methods for Gel-Based Proteomic Analysis of Ganoderma spp.

    Science.gov (United States)

    Al-Obaidi, Jameel R; Saidi, Noor Baity; Usuldin, Siti Rokhiyah Ahmad; Hussin, Siti Nahdatul Isnaini Said; Yusoff, Noornabeela Md; Idris, Abu Seman

    2016-04-01

    Ganoderma species are a group of fungi that have the ability to degrade lignin polymers and cause severe diseases such as stem and root rot and can infect economically important plants and perennial crops such as oil palm, especially in tropical countries such as Malaysia. Unfortunately, very little is known about the complex interplay between oil palm and Ganoderma in the pathogenesis of the diseases. Proteomic technologies are simple yet powerful tools in comparing protein profile and have been widely used to study plant-fungus interaction. A critical step to perform a good proteome research is to establish a method that gives the best quality and a wide coverage of total proteins. Despite the availability of various protein extraction protocols from pathogenic fungi in the literature, no single extraction method was found suitable for all types of pathogenic fungi. To develop an optimized protein extraction protocol for 2-DE gel analysis of Ganoderma spp., three previously reported protein extraction protocols were compared: trichloroacetic acid, sucrose and phenol/ammonium acetate in methanol. The third method was found to give the most reproducible gels and highest protein concentration. Using the later method, a total of 10 protein spots (5 from each species) were successfully identified. Hence, the results from this study propose phenol/ammonium acetate in methanol as the most effective protein extraction method for 2-DE proteomic studies of Ganoderma spp.

  18. Evaluation of DNA Extraction Methods Suitable for PCR-based Detection and Genotyping of Clostridium botulinum

    DEFF Research Database (Denmark)

    Auricchio, Bruna; Anniballi, Fabrizio; Fiore, Alfonsina

    2013-01-01

    Sufficient quality and quantity of extracted DNA is critical to detecting and performing genotyping of Clostridium botulinum by means of PCR-based methods. An ideal extraction method has to optimize DNA yield, minimize DNA degradation, allow multiple samples to be extracted, and be efficient in terms of cost, time, labor, and supplies. Eleven botulinum toxin–producing clostridia strains and 25 samples (10 food, 13 clinical, and 2 environmental samples) naturally contaminated with botulinum toxin–producing clostridia were used to compare 4 DNA extraction procedures: Chelex® 100 matrix, Phenol…

  19. A comparison of Heuristic method and Llewellyn’s rules for identification of redundant constraints

    Science.gov (United States)

    Estiningsih, Y.; Farikhin; Tjahjana, R. H.

    2018-03-01

    An important technique in linear programming is the modelling and solving of practical optimization problems. Redundant constraints are considered for their effects on general linear programming problems. Identifying and removing redundant constraints avoids all the calculations associated with them when solving the associated linear programming problems. Many approaches for the identification of redundant constraints have been proposed by researchers. This paper presents a comparison of the Heuristic method and Llewellyn's rules for the identification of redundant constraints.
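
    A common way to test whether a single inequality constraint is redundant (the general idea behind methods of the kind compared here, though not necessarily either of the two procedures in the paper) is to maximize its left-hand side subject to all the other constraints: if the optimum never exceeds its right-hand side, the constraint can be dropped. A sketch using scipy.optimize.linprog:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def is_redundant(A, b, i, bounds=(0, None)):
        """Constraint i of A x <= b is redundant if max a_i.x over the remaining
        constraints (plus variable bounds) does not exceed b_i."""
        mask = np.arange(len(b)) != i
        res = linprog(c=-A[i], A_ub=A[mask], b_ub=b[mask],
                      bounds=bounds, method="highs")
        if res.status == 3:          # unbounded: a_i.x can grow past b_i
            return False
        return res.success and -res.fun <= b[i] + 1e-9

    # x1 + x2 <= 4, x1 <= 3, x1 + x2 <= 10 (the last is clearly redundant), x >= 0
    A = np.array([[1.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
    b = np.array([4.0, 3.0, 10.0])
    print([is_redundant(A, b, i) for i in range(len(b))])
    ```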

  20. PPI-IRO: A two-stage method for protein-protein interaction extraction based on interaction relation ontology

    KAUST Repository

    Li, Chuanxi

    2014-01-01

    Mining Protein-Protein Interactions (PPIs) from the fast-growing biomedical literature resources has been proven as an effective approach for the identification of biological regulatory networks. This paper presents a novel method based on the idea of Interaction Relation Ontology (IRO), which specifies and organises words of various proteins interaction relationships. Our method is a two-stage PPI extraction method. At first, IRO is applied in a binary classifier to determine whether sentences contain a relation or not. Then, IRO is taken to guide PPI extraction by building sentence dependency parse tree. Comprehensive and quantitative evaluations and detailed analyses are used to demonstrate the significant performance of IRO on relation sentences classification and PPI extraction. Our PPI extraction method yielded a recall of around 80% and 90% and an F1 of around 54% and 66% on corpora of AIMed and Bioinfer, respectively, which are superior to most existing extraction methods. Copyright © 2014 Inderscience Enterprises Ltd.

  1. Extraction method for the determination of inorganic iodides in Rose Bengal labelled with 131I

    International Nuclear Information System (INIS)

    Lengyel, J.; Krtil, J.; Vecernik, J.

    1982-01-01

    An extraction method for the determination of inorganic iodides in Rose Bengal preparations labelled with 131 I is described. The method is based on the quantitative extraction of Rose Bengal into chloroform from acidic medium while the inorganic iodides remain in the aqueous phase. The method is simple, rapid, and reproducible. (author)

  2. Measurement of radical scavenging activity of irradiated Kampo extracts using ESR spin-trap method

    International Nuclear Information System (INIS)

    Ohta, Yui; Kawamura, Shoei; Ukai, Mitsuko; Nakamura, Hideo; Kikuchi, Masahiro; Kobayashi, Yasuhiko

    2014-01-01

    The radical scavenging activity (RSA) of 13 kinds of γ-ray irradiated Kampo extracts was studied by the ESR spin-trap method. The RSA against the alkoxy radical and the hydroxyl radical was measured using the new spin-trapping reagent CYPMPO and was evaluated against standard antioxidant reagents, GSH for alkoxy RSA and L-ascorbic acid for hydroxyl RSA. We revealed that a few Kampo extracts showed high RSA against the alkoxy radical and also the hydroxyl radical, and that this RSA was changed by the γ-ray irradiation treatment. It is concluded that, using the ESR spin-trap method, the effect of radiation treatment on the RSA of Kampo extracts can be detected. (author)

  3. Optimization of cloud point extraction and solid phase extraction methods for speciation of arsenic in natural water using multivariate technique.

    Science.gov (United States)

    Baig, Jameel A; Kazi, Tasneem G; Shah, Abdul Q; Arain, Mohammad B; Afridi, Hassan I; Kandhro, Ghulam A; Khan, Sumaira

    2009-09-28

    The simple and rapid pre-concentration techniques cloud point extraction (CPE) and solid phase extraction (SPE) were applied for the determination of As(3+) and total inorganic arsenic (iAs) in surface and ground water samples. The As(3+) formed a complex with ammonium pyrrolidinedithiocarbamate (APDC) and was extracted into the surfactant-rich phase of the non-ionic surfactant Triton X-114; after centrifugation, the surfactant-rich phase was diluted with 0.1 mol L(-1) HNO(3) in methanol. Total iAs in the water samples was adsorbed on titanium dioxide (TiO(2)); after centrifugation, the solid phase was prepared as a slurry for determination. The extracted As species were determined by electrothermal atomic absorption spectrometry. A multivariate strategy was applied to estimate the optimum values of the experimental factors for the recovery of As(3+) and total iAs by CPE and SPE. The standard addition method was used to validate the optimized methods. The obtained results showed sufficient recoveries for As(3+) and iAs (>98.0%). The concentration factor in both cases was found to be 40.

  4. INVESTIGATION OF METHODS OF DNA EXTRACTION FROM PLANT ORIGIN OBJECTS AND FOODS BASED ON THEM

    Directory of Open Access Journals (Sweden)

    L. S. Dyshlyuk

    2014-01-01

    Full Text Available Over the last decades, modern and highly efficient methods of determining the quality and safety of food products, based on the latest scientific achievements, have been developed around the world. A special place is given to methods based on molecular biology and genetics. At the present stage of development in the assessment of the quality of raw materials and processed food products, much attention is given to highly accurate, sensitive and specific research methods, among which the polymerase chain reaction (PCR) occupies a leading place. PCR is a sophisticated method that simulates natural DNA replication and allows the detection of a single specific DNA molecule in the presence of millions of other molecules. The key point in the preparation of material for PCR is the extraction of nucleic acids. The low content of DNA in plant material and the high concentration of secondary metabolites complicate the extraction process. The solution to this problem is a highly effective extraction method that yields DNA of adequate quality and purity. In this study, a comparative analysis of methods for the extraction of nucleic acids from fruit raw materials and products based on them was carried out. General analysis of the experimental data allowed us to determine the most efficient method for extracting DNA. The comparative analysis showed that the "Sorb-GMO-A" reagent kit is the most suitable for extracting DNA from plant raw materials and food products prepared on their basis. The approach described makes it possible to obtain deoxyribonucleic acid of proper quality and purity.

  5. Extraction of basil leaves (ocimum canum) oleoresin with ethyl acetate solvent by using soxhletation method

    Science.gov (United States)

    Tambun, R.; Purba, R. R. H.; Ginting, H. K.

    2017-09-01

    The goal of this research was to produce oleoresin from basil leaves (Ocimum canum) by the soxhletation method using ethyl acetate as solvent. Basil is commonly used in cooking as a fresh vegetable. Basil contains essential oils and oleoresin that are used as flavouring agents in food, in cosmetics and as ingredients in traditional medicine. The extraction method commonly used to obtain oleoresin is maceration. The drawbacks of this method are the large amount of solvent required and the long time needed to extract the raw material. To resolve these problems and to produce more oleoresin, we used the soxhletation method with different combinations of extraction time and material-to-solvent ratio. The analysis covered yield, density, refractive index, and essential oil content. The best conditions for basil leaf oleoresin extraction were a material-to-solvent ratio of 1:6 (w/v) and an extraction time of 6 hours. Under these conditions, the yield of basil oleoresin was 20.152%, the density 0.9688 g/cm3, the refractive index 1.502, the essential oil content 15.77%, and the colour of the oleoresin product was dark green.

  6. A new extraction method of bioflavanoids from poisonous plant (Gratiola Officinalis L.

    Directory of Open Access Journals (Sweden)

    Natalya V. Polukonova

    2014-09-01

    Full Text Available A method of extracting vegetable raw material that makes it possible to obtain a non-toxic composition of biologically active agents from poisonous plants such as Gratiola officinalis L. is described. The alkaloid yield changes as the ethyl alcohol percentage increases (from 15% to 96%). The extract obtained using 96% ethanol did not give a positive qualitative reaction for alkaloids. The chemical composition of this new non-toxic biologically active composition of the Gratiola officinalis L. extract was investigated. The extract contains the plant bioflavonoid quercetin, not previously reported for this plant. The average content of quercetin in this extract, determined using the calibration curve of a quercetin standard (98%, Sigma), is 0.66%. In the dry residue of extractive substances of Gratiola officinalis L., the quantity of quercetin was 350 μg (obtained from 10 g of dry herb), as established by liquid chromatography.

  7. Alternative oil extraction methods from Echium plantagineum L. seeds using advanced techniques and green solvents.

    Science.gov (United States)

    Castejón, Natalia; Luna, Pilar; Señoráns, Francisco J

    2018-04-01

    The edible oil processing industry involves large losses of organic solvent into the atmosphere and long extraction times. In this work, fast and environmentally friendly alternatives for the production of echium oil using green solvents are proposed. Advanced extraction techniques such as Pressurized Liquid Extraction (PLE), Microwave Assisted Extraction (MAE) and Ultrasound Assisted Extraction (UAE) were evaluated to efficiently extract omega-3 rich oil from Echium plantagineum seeds. Extractions were performed with ethyl acetate, ethanol, water and ethanol:water to develop a hexane-free processing method. Optimal PLE conditions with ethanol at 150 °C during 10 min produced a very similar oil yield (31.2%) to Soxhlet using hexane for 8 h (31.3%). UAE optimized method with ethanol at mild conditions (55 °C) produced a high oil yield (29.1%). Consequently, advanced extraction techniques showed good lipid yields and furthermore, the produced echium oil had the same omega-3 fatty acid composition than traditionally extracted oil. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Comparison of Some Extraction Methods for Isolation of Catechins and Caffeine from Turkish Green Tea

    Directory of Open Access Journals (Sweden)

    Ezgi DEMİR

    2015-07-01

    Full Text Available The effective extraction of anticancer and antioxidant principles from Turkish green tea was the main purpose of this work. A pre-optimized experimental condition for liquid extraction was employed for comparative appraisal. Not only the extraction methods but also the nature of the green tea samples (fresh, dried or frozen) and the quantitative yields related to collection periods were investigated. After extraction of the green tea by various techniques, the extract was partitioned with chloroform to remove caffeine and then with ethyl acetate to obtain the catechin mixture. Quantification of individual catechins was carried out by HPLC, and the analysis results showed that epigallocatechin gallate (EGCG) was the main catechin species present in all extracts. The results indicate that hot water extraction (at 80 °C) provides a higher catechin yield than the other methods. The highest extract yields were obtained with dried leaves collected in the second collection period. The crude catechin mixture contains a high amount of EGCG and might be used as a raw material for the production of plant remedies at industrial scale.

  9. The N-end rule pathway catalyzes a major fraction of the protein degradation in skeletal muscle

    Science.gov (United States)

    Solomon, V.; Lecker, S. H.; Goldberg, A. L.

    1998-01-01

    In skeletal muscle, overall protein degradation involves the ubiquitin-proteasome system. One property of a protein that leads to rapid ubiquitin-dependent degradation is the presence of a basic, acidic, or bulky hydrophobic residue at its N terminus. However, in normal cells, substrates for this N-end rule pathway, which involves ubiquitin carrier protein (E2) E214k and ubiquitin-protein ligase (E3) E3alpha, have remained unclear. Surprisingly, in soluble extracts of rabbit muscle, we found that competitive inhibitors of E3alpha markedly inhibited the 125I-ubiquitin conjugation and ATP-dependent degradation of endogenous proteins. These inhibitors appear to selectively inhibit E3alpha, since they blocked degradation of 125I-lysozyme, a model N-end rule substrate, but did not affect the degradation of proteins whose ubiquitination involved other E3s. The addition of several E2s or E3alpha to the muscle extracts stimulated overall proteolysis and ubiquitination, but only the stimulation by E3alpha or E214k was sensitive to these inhibitors. A similar general inhibition of ubiquitin conjugation to endogenous proteins was observed with a dominant negative inhibitor of E214k. Certain substrates of the N-end rule pathway are degraded after their tRNA-dependent arginylation. We found that adding RNase A to muscle extracts reduced the ATP-dependent proteolysis of endogenous proteins, and supplying tRNA partially restored this process. Finally, although in muscle extracts the N-end rule pathway catalyzes most ubiquitin conjugation, it makes only a minor contribution to overall protein ubiquitination in HeLa cell extracts.

  10. Extraction and Determination of Cyproheptadine in Human Urine by DLLME-HPLC Method.

    Science.gov (United States)

    Maham, Mehdi; Kiarostami, Vahid; Waqif-Husain, Syed; Abroomand-Azar, Parviz; Tehrani, Mohammad Saber; Khoeini Sharifabadi, Malihe; Afrouzi, Hossein; Shapouri, Mahmoudreza; Karami-Osboo, Rouhollah

    2013-01-01

    Novel dispersive liquid-liquid microextraction (DLLME), coupled with high performance liquid chromatography with photodiode array detection (HPLC-DAD) has been applied for the extraction and determination of cyproheptadine (CPH), an antihistamine, in human urine samples. In this method, 0.6 mL of acetonitrile (disperser solvent) containing 30 μL of carbon tetrachloride (extraction solvent) was rapidly injected by a syringe into 5 mL urine sample. After centrifugation, the sedimented phase containing enriched analyte was dissolved in acetonitrile and an aliquot of this solution injected into the HPLC system for analysis. Development of DLLME procedure includes optimization of some important parameters such as kind and volume of extraction and disperser solvent, pH and salt addition. The proposed method has good linearity in the range of 0.02-4.5 μg mL(-1) and low detection limit (13.1 ng mL(-1)). The repeatability of the method, expressed as relative standard deviation was 4.9% (n = 3). This method has also been applied to the analysis of real urine samples with satisfactory relative recoveries in the range of 91.6-101.0%.

  11. Vanadium extraction from slimes by the lime-bicarbonate method

    International Nuclear Information System (INIS)

    Lishchenko, T.V.; Vdovina, L.V.; Slobodchikova, R.I.

    1978-01-01

    Some main parameters of the lime-bicarbonate method of extracting vanadium from residues obtained from the washing waters of mazut boilers at thermal power stations have been determined. To study the process of vanadium extraction during caking of the residues with lime and subsequent leaching of water-soluble vanadium, a ''Minsk-22'' computer was used for the computations. Analysis of the derived equation has shown that changes in the vanadium leaching temperature, the pulp density, and the kind of heating of the charge affect the process only slightly. It has also been shown that the calcination temperature should be kept above 850 deg C and that the consumption of lime must not exceed 20% of the residue weight. Bicarbonate consumption exerts a decisive influence on the completeness of vanadium extraction and must be increased to >35%; the duration of leaching should be raised to 30-45 minutes. With increasing calcination temperature the duration of leaching decreases. When the temperature and duration of calcination increase, the formation of water-soluble vanadium intensifies. With the aid of an optimization program, seven variants have been chosen which ensure vanadium extraction into solution of 95-100%.

  12. Comparison of some classification algorithms based on deterministic and nondeterministic decision rules

    KAUST Repository

    Delimata, Paweł

    2010-01-01

    We discuss two, in a sense extreme, kinds of nondeterministic rules in decision tables. The first kind of rules, called as inhibitory rules, are blocking only one decision value (i.e., they have all but one decisions from all possible decisions on their right hand sides). Contrary to this, any rule of the second kind, called as a bounded nondeterministic rule, can have on the right hand side only a few decisions. We show that both kinds of rules can be used for improving the quality of classification. In the paper, two lazy classification algorithms of polynomial time complexity are considered. These algorithms are based on deterministic and inhibitory decision rules, but the direct generation of rules is not required. Instead of this, for any new object the considered algorithms extract from a given decision table efficiently some information about the set of rules. Next, this information is used by a decision-making procedure. The reported results of experiments show that the algorithms based on inhibitory decision rules are often better than those based on deterministic decision rules. We also present an application of bounded nondeterministic rules in construction of rule based classifiers. We include the results of experiments showing that by combining rule based classifiers based on minimal decision rules with bounded nondeterministic rules having confidence close to 1 and sufficiently large support, it is possible to improve the classification quality. © 2010 Springer-Verlag.

  13. Various Extraction Methods for Obtaining Stilbenes from Grape Cane of Vitis vinifera L.

    Directory of Open Access Journals (Sweden)

    Ivo Soural

    2015-04-01

    Full Text Available Grape cane, leaves and grape marc are waste products from viticulture, which can be used to obtain secondary stilbene derivatives with high antioxidant value. The presented work compares several extraction methods: maceration at laboratory temperature, extraction at elevated temperature, fluidized-bed extraction, Soxhlet extraction, microwave-assisted extraction, and accelerated solvent extraction. To obtain trans-resveratrol, trans-ε-viniferin and r2-viniferin from grape cane of the V. vinifera variety Cabernet Moravia, various conditions were studied: different solvents, using powdered versus cut cane material, different extraction times, and one-step or multiple extractions. The largest concentrations found were 6030 ± 680 µg/g dry weight (d.w.) for trans-resveratrol, 2260 ± 90 µg/g d.w. for trans-ε-viniferin, and 510 ± 40 µg/g d.w. for r2-viniferin. The highest amounts of stilbenes (8500 ± 1100 µg/g d.w.) were obtained using accelerated solvent extraction in methanol.

  14. Antioxidative activity of ethanol extracts from Spirulina platensis and Nostoc linckia measured by various methods

    Directory of Open Access Journals (Sweden)

    Liliana CEPOI

    2009-11-01

    Full Text Available The goal of this work is to determine the level of antioxidative activity of various ethanol extracts from Spirulina platensis and Nostoc linckia biomass, and also to demonstrate the possibility of selecting suitable methods for evaluating these characteristics. The methods for the determination of antioxidative activity were selected with regard to their possible use for complex preparations: the phosphomolybdenum method for evaluation of antioxidant capacity (PMRC), radical-scavenging activity by the DPPH method (DPPH), antioxidant activity by the ABTS+ radical cation assay (ABTS), and Folin-Ciocalteu reducing capacity (FCRC). We showed the presence of antioxidative substances in ethanol extracts from the two species of cyanobacteria, and the possibility of increasing their activity by varying the ethanol concentration, which facilitates the extraction of both water- and lipid-soluble components from the biomass. Of the methods for determining antioxidative activity, we used only those based on electron-transfer reactions, which are widely used nowadays in vitro. The results obtained in different ways demonstrate the high reducing capacity of the extracts and the possibility of selecting suitable analytical methods for each case.

  15. Validation for Vegetation Green-up Date Extracted from GIMMS NDVI and NDVI3g Using Variety of Methods

    Science.gov (United States)

    Chang, Q.; Jiao, W.

    2017-12-01

    Phenology is a sensitive and critical feature of vegetation change that has been regarded as a good indicator in climate change studies. So far, a variety of remote sensing data sources and phenology extraction methods have been developed to study the spatial-temporal dynamics of vegetation phenology. However, the differences between vegetation phenology results caused by the various satellite datasets and phenology extraction methods are not clear, and the reliability of the phenology results extracted from remote sensing datasets has not been verified and compared using ground observation data. Based on the three most popular remote sensing phenology extraction methods, this research calculated the start of the growing season (SOS) for each pixel in the Northern Hemisphere for two long time series satellite datasets: GIMMS NDVIg (SOSg) and GIMMS NDVI3g (SOS3g). The three methods used in this research are the maximum increase method, the dynamic threshold method and the midpoint method. This study then used SOS calculated from NEE datasets (SOS_NEE) monitored by 48 eddy flux tower sites from the global flux website to validate the reliability of the six phenology results calculated from the remote sensing datasets. Results showed that both SOSg and SOS3g extracted by the maximum increase method are not correlated with the ground-observed phenology metrics. SOSg and SOS3g extracted by the dynamic threshold method and the midpoint method are both significantly correlated with SOS_NEE. Compared with SOSg extracted by the dynamic threshold method, SOSg extracted by the midpoint method has a stronger correlation with SOS_NEE, and the same holds for SOS3g. Additionally, SOSg showed a stronger correlation with SOS_NEE than SOS3g extracted by the same method. SOS extracted by the midpoint method from the GIMMS NDVIg dataset seemed to be the most reliable result when validated with SOS_NEE. These results can be used as a reference for data and method selection in future phenology studies.
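
    Of the three extraction approaches compared, the dynamic threshold method is the simplest to sketch: SOS is the day of year at which the NDVI curve first rises above a fixed fraction of its seasonal amplitude. The code below is a simplified illustration of that idea with an assumed 20% threshold and synthetic data; it is not the authors' implementation.

    ```python
    import numpy as np

    def sos_dynamic_threshold(doy, ndvi, threshold=0.2):
        """Start of season: first day of year where NDVI crosses
        ndvi_min + threshold * (ndvi_max - ndvi_min) on the green-up limb."""
        ndvi = np.asarray(ndvi, dtype=float)
        level = ndvi.min() + threshold * (ndvi.max() - ndvi.min())
        for j in range(1, len(ndvi)):
            if ndvi[j - 1] < level <= ndvi[j]:
                # linear interpolation between the two composite dates
                frac = (level - ndvi[j - 1]) / (ndvi[j] - ndvi[j - 1])
                return doy[j - 1] + frac * (doy[j] - doy[j - 1])
        return None

    # Synthetic 15-day composites over one year (idealized seasonal curve, not GIMMS data)
    doy = np.arange(1, 366, 15)
    ndvi = 0.2 + 0.5 * np.exp(-((doy - 200) / 60.0) ** 2)
    print(f"Estimated SOS: day {sos_dynamic_threshold(doy, ndvi):.0f}")
    ```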

  16. Using deuterated PAH amendments to validate chemical extraction methods to predict PAH bioavailability in soils

    International Nuclear Information System (INIS)

    Gomez-Eyles, Jose L.; Collins, Chris D.; Hodson, Mark E.

    2011-01-01

    Validating chemical methods to predict bioavailable fractions of polycyclic aromatic hydrocarbons (PAHs) by comparison with accumulation bioassays is problematic. Concentrations accumulated in soil organisms not only depend on the bioavailable fraction but also on contaminant properties. A historically contaminated soil was freshly spiked with deuterated PAHs (dPAHs). dPAHs have a similar fate to their respective undeuterated analogues, so chemical methods that give good indications of bioavailability should extract the fresh more readily available dPAHs and historic more recalcitrant PAHs in similar proportions to those in which they are accumulated in the tissues of test organisms. Cyclodextrin and butanol extractions predicted the bioavailable fraction for earthworms (Eisenia fetida) and plants (Lolium multiflorum) better than the exhaustive extraction. The PAHs accumulated by earthworms had a larger dPAH:PAH ratio than that predicted by chemical methods. The isotope ratio method described here provides an effective way of evaluating other chemical methods to predict bioavailability. - Research highlights: → Isotope ratios can be used to evaluate chemical methods to predict bioavailability. → Chemical methods predicted bioavailability better than exhaustive extractions. → Bioavailability to earthworms was still far from that predicted by chemical methods. - A novel method using isotope ratios to assess the ability of chemical methods to predict PAH bioavailability to soil biota.

  17. Using deuterated PAH amendments to validate chemical extraction methods to predict PAH bioavailability in soils

    Energy Technology Data Exchange (ETDEWEB)

    Gomez-Eyles, Jose L., E-mail: j.l.gomezeyles@reading.ac.uk [University of Reading, School of Human and Environmental Sciences, Soil Research Centre, Reading, RG6 6DW Berkshire (United Kingdom); Collins, Chris D.; Hodson, Mark E. [University of Reading, School of Human and Environmental Sciences, Soil Research Centre, Reading, RG6 6DW Berkshire (United Kingdom)

    2011-04-15

    Validating chemical methods to predict bioavailable fractions of polycyclic aromatic hydrocarbons (PAHs) by comparison with accumulation bioassays is problematic. Concentrations accumulated in soil organisms not only depend on the bioavailable fraction but also on contaminant properties. A historically contaminated soil was freshly spiked with deuterated PAHs (dPAHs). dPAHs have a similar fate to their respective undeuterated analogues, so chemical methods that give good indications of bioavailability should extract the fresh more readily available dPAHs and historic more recalcitrant PAHs in similar proportions to those in which they are accumulated in the tissues of test organisms. Cyclodextrin and butanol extractions predicted the bioavailable fraction for earthworms (Eisenia fetida) and plants (Lolium multiflorum) better than the exhaustive extraction. The PAHs accumulated by earthworms had a larger dPAH:PAH ratio than that predicted by chemical methods. The isotope ratio method described here provides an effective way of evaluating other chemical methods to predict bioavailability. - Research highlights: > Isotope ratios can be used to evaluate chemical methods to predict bioavailability. > Chemical methods predicted bioavailability better than exhaustive extractions. > Bioavailability to earthworms was still far from that predicted by chemical methods. - A novel method using isotope ratios to assess the ability of chemical methods to predict PAH bioavailability to soil biota.

  18. Ecosystem-based design rules for marine sand extraction sites

    NARCIS (Netherlands)

    de Jong, Maarten F.; Borsje, Bas W.; Baptist, Martin J.; van der Wal, Jan Tjalling; Lindeboom, Han J.; Hoekstra, Piet

    2016-01-01

    The demand for marine sand in the Netherlands as well as globally is increasing. Over the last decades, only shallow sand extraction of 2m below the seabed was allowed on the Dutch Continental Shelf (DCS). To guarantee sufficient supply and to decrease the surface area of direct impact, the Dutch

  19. Comparison of nine DNA extraction methods for the diagnosis of bovine tuberculosis by real time PCR

    OpenAIRE

    Moura, André; Hodon, Mikael Arrais; Soares Filho, Paulo Martins; Issa, Marina de Azevedo; Oliveira, Ana Paula Ferreira de; Fonseca Júnior, Antônio Augusto

    2016-01-01

    ABSTRACT: Bovine tuberculosis is an infectious disease with a high impact on the cattle industry, particularly in developing countries. PCR is a very sensitive method for detection of infectious agents, but the sensitivity of molecular diagnosis is largely dependent on the efficiency of the DNA extraction methods. The objective of this study was to evaluate DNA extraction methods for direct detection of Mycobacterium bovis in bovine tissue. Nine commercial kits for DNA extraction were evalua...

  20. RANWAR: rank-based weighted association rule mining from gene expression and methylation data.

    Science.gov (United States)

    Mallik, Saurav; Mukhopadhyay, Anirban; Maulik, Ujjwal

    2015-01-01

    Ranking of association rules is currently an interesting topic in data mining and bioinformatics. The huge number of rules over items (or genes) produced by association rule mining (ARM) algorithms confuses the decision maker. In this article, we propose a weighted rule-mining technique, RANWAR (rank-based weighted association rule mining), which addresses this problem by ranking the rules with two novel rule-interestingness measures: rank-based weighted condensed support (wcs) and weighted condensed confidence (wcc). These measures depend on the rank of the items (genes); using the rank, we assign a weight to each item. RANWAR generates far fewer frequent itemsets than state-of-the-art association rule mining algorithms and therefore reduces execution time. We run RANWAR on gene expression and methylation datasets. The genes in the top rules are biologically validated by Gene Ontology (GO) and KEGG pathway analyses. Many top-ranked rules extracted by RANWAR that hold poor ranks in traditional Apriori are highly biologically significant for the related diseases. Finally, the top rules produced by RANWAR that are not found by Apriori are reported.
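
    The exact definitions of wcs and wcc belong to the RANWAR paper; as a rough illustration only, the Python sketch below implements one plausible reading of rank-based weighting, in which each item's weight is derived from its rank and an itemset's plain support is scaled by the mean weight of its members. The weighting scheme, the toy gene "transactions" and all names are assumptions, not the paper's actual formulas.

      # Illustrative sketch only: a rank-weighted support/confidence in the spirit
      # of RANWAR's wcs/wcc measures; the weighting scheme here is an assumption.

      def item_weights(ranked_items):
          # rank 1 (first in the list) gets the largest weight, the last gets 1/n
          n = len(ranked_items)
          return {item: (n - idx) / n for idx, item in enumerate(ranked_items)}

      def weighted_support(itemset, transactions, weights):
          support = sum(1 for t in transactions if itemset <= t) / len(transactions)
          mean_w = sum(weights[i] for i in itemset) / len(itemset)
          return support * mean_w

      def weighted_confidence(antecedent, consequent, transactions, weights):
          num = weighted_support(antecedent | consequent, transactions, weights)
          den = weighted_support(antecedent, transactions, weights)
          return num / den if den else 0.0

      # toy "transactions": genes flagged as differentially expressed per sample
      transactions = [{"g1", "g2", "g3"}, {"g1", "g2"}, {"g2", "g3"}, {"g1", "g3"}]
      weights = item_weights(["g2", "g1", "g3"])   # g2 ranked most interesting
      print(weighted_support({"g1", "g2"}, transactions, weights))
      print(weighted_confidence({"g1"}, {"g2"}, transactions, weights))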

  1. Extraction optimization and UHPLC method development for determination of the 20-hydroxyecdysone in Sida tuberculata leaves.

    Science.gov (United States)

    da Rosa, Hemerson S; Koetz, Mariana; Santos, Marí Castro; Jandrey, Elisa Helena Farias; Folmer, Vanderlei; Henriques, Amélia Teresinha; Mendez, Andreas Sebastian Loureiro

    2018-04-01

    Sida tuberculata (ST) is a Malvaceae species widely distributed in Southern Brazil. In traditional medicine, ST has been employed as a hypoglycemic, hypocholesterolemic, anti-inflammatory and antimicrobial agent. The species is chemically characterized mainly by flavonoids, alkaloids and phytoecdysteroids. The present work aimed to optimize the extraction technique and to validate a UHPLC method for the determination of 20-hydroxyecdysone (20HE) in ST leaves. A Box-Behnken design (BBD) was used for method optimization. The extraction methods tested were static and dynamic maceration, ultrasound, ultra-turrax and reflux. In the Box-Behnken design, three parameters were evaluated at three levels (-1, 0, +1): particle size, time and plant:solvent ratio. For method validation, the parameters of selectivity, specificity, linearity, limits of detection and quantification (LOD, LOQ), precision, accuracy and robustness were evaluated. The results indicate static maceration as the better technique for maximizing the 20HE peak area in the ST extract. The optimal extraction from response surface methodology was achieved with a granulometry of 710 nm, 9 days of maceration and a plant:solvent ratio of 1:54 (w/v). The developed UHPLC-PDA analytical method proved fully viable, being selective, linear, precise, accurate and robust for 20HE detection in ST leaves. The average content of 20HE was 0.56% per dry extract. Thus, the optimization of the extraction method for ST leaves increased the concentration of 20HE in the crude extract, and a reliable method was successfully developed according to validation requirements and in agreement with current legislation. Copyright © 2018 Elsevier Inc. All rights reserved.
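
    As a hedged illustration of the Box-Behnken workflow described above (not the authors' code), the sketch below builds the 15-run, three-factor design in coded units (-1, 0, +1 standing in for particle size, maceration time and plant:solvent ratio) and fits a full quadratic response surface with NumPy; the response values are placeholders for measured 20HE peak areas.

      # Minimal sketch: 3-factor Box-Behnken design plus a quadratic response-surface fit.
      import numpy as np

      def box_behnken_3(center_points=3):
          # 12 edge runs (+/-1 on two factors, 0 on the third) plus center points
          runs = []
          for i, j in [(0, 1), (0, 2), (1, 2)]:
              for a in (-1.0, 1.0):
                  for b in (-1.0, 1.0):
                      x = [0.0, 0.0, 0.0]
                      x[i], x[j] = a, b
                      runs.append(x)
          return np.array(runs + [[0.0, 0.0, 0.0]] * center_points)

      def quadratic_model_matrix(X):
          x1, x2, x3 = X.T
          return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                  x1 * x2, x1 * x3, x2 * x3,
                                  x1 ** 2, x2 ** 2, x3 ** 2])

      X = box_behnken_3()
      y = np.random.default_rng(0).normal(size=len(X))   # placeholder peak areas
      beta, *_ = np.linalg.lstsq(quadratic_model_matrix(X), y, rcond=None)
      print(beta)   # coefficients of the fitted response surface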

  2. Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa1

    Science.gov (United States)

    Siegel, Chloe S.; Stevenson, Florence O.; Zimmer, Elizabeth A.

    2017-01-01

    Premise of the study: An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)–based extraction methods from silica-dried samples. Methods: DNA was extracted using FTA cards according to the manufacturer’s protocol. In parallel, CTAB-based extractions were done using the automated AutoGen DNA isolation system. DNA quality for both methods was determined for 15 non-agricultural species collected in situ, by gel separation, spectrophotometry, fluorometry, and successful amplification and sequencing of nuclear and chloroplast gene markers. Results: The FTA card extraction method yielded less concentrated, but also less fragmented samples than the CTAB-based technique. The card-extracted samples provided DNA that could be successfully amplified and sequenced. The FTA cards are also useful because the collected samples do not require refrigeration, extensive laboratory expertise, or as many hazardous chemicals as extractions using the CTAB-based technique. Discussion: The relative success of the FTA card method in our study suggested that this method could be a valuable tool for studies in plant population genetics and conservation biology that may involve screening of hundreds of individual plants. The FTA cards, like the silica gel samples, do not contain plant material capable of propagation, and therefore do not require permits from the U.S. Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) for transportation. PMID:28224056

  3. Comparative efficiency of different methods of gluten extraction in indigenous varieties of wheat

    OpenAIRE

    Imran, Samra; Hussain, Zaib; Ghafoor, Farkhanda; Ahmad Nagra, Saeed; Ashbeal Ziai, Naheeda

    2013-01-01

    The present study investigated six varieties of locally grown wheat (Lasani, Sehar, Miraj-08, Chakwal-50, Faisalabad-08 and Inqlab) procured from Punjab Seed Corporation, Lahore, Pakistan for their proximate contents. On the basis of protein content and ready availability, Faisalabad-08 (FD-08) was selected to be used for the assessment of comparative efficiency of various methods used for gluten extraction. Three methods, mechanical, chemical and microbiological were used for the extraction ...

  4. Rapid Solid-Liquid Dynamic Extraction (RSLDE): a New Rapid and Greener Method for Extracting Two Steviol Glycosides (Stevioside and Rebaudioside A) from Stevia Leaves.

    Science.gov (United States)

    Gallo, Monica; Vitulano, Manuela; Andolfi, Anna; DellaGreca, Marina; Conte, Esterina; Ciaravolo, Martina; Naviglio, Daniele

    2017-06-01

    Stevioside and rebaudioside A are the main diterpene glycosides present in the leaves of the Stevia rebaudiana plant, which is used in the production of foods and low-calorie beverages. The difficulties associated with their extraction and purification are currently a problem for the food processing industries. The objective of this study was to develop an effective and economically viable method to obtain a high-quality product while trying to overcome the disadvantages derived from the conventional transformation processes. For this reason, extractions were carried out using a conventional maceration (CM) and a cyclically pressurized extraction known as rapid solid-liquid dynamic extraction (RSLDE) by the Naviglio extractor (NE). After only 20 min of extraction using the NE, a quantity of rebaudioside A and stevioside equal to 1197.8 and 413.6 mg/L was obtained, respectively, while for the CM, the optimum time was 90 min. From the results, it can be stated that the extraction process by NE and its subsequent purification developed in this study is a simple, economical, environmentally friendly method for producing steviol glycosides. Therefore, this method constitutes a valid alternative to conventional extraction by reducing the extraction time and the consumption of toxic solvents and favouring the use of the extracted metabolites as food additives and/or nutraceuticals. As an added value and of local interest, the experiment was carried out on stevia leaves from the Benevento area (Italy), where a high content of rebaudioside A was observed, which exhibits a sweet taste compared to stevioside, which has a significant bitter aftertaste.

  5. Rapid and efficient method to extract metagenomic DNA from estuarine sediments.

    Science.gov (United States)

    Shamim, Kashif; Sharma, Jaya; Dubey, Santosh Kumar

    2017-07-01

    Metagenomic DNA from sediments of selected estuaries of Goa, India was extracted using a simple, fast, efficient and environmentally friendly method. The recovery of pure metagenomic DNA with our method was significantly higher than with other well-known methods, with concentrations of recovered metagenomic DNA ranging from 1185.1 to 4579.7 µg/g of sediment. The purity of the metagenomic DNA was also considerably high, as the ratio of absorbance at 260 and 280 nm ranged from 1.88 to 1.94. The recovered metagenomic DNA was therefore used directly in various molecular biology experiments, viz. restriction digestion, PCR amplification, cloning and metagenomic library construction. This clearly showed that our silica gel-based protocol for metagenomic DNA extraction efficiently removed contaminants and prevented shearing of the metagenomic DNA. Thus, this modified method can be used to recover pure metagenomic DNA from various estuarine sediments in a rapid, efficient and eco-friendly manner.

  6. Integrated Case Based and Rule Based Reasoning for Decision Support

    OpenAIRE

    Eshete, Azeb Bekele

    2009-01-01

    This project is a continuation of my specialization project, which focused on studying theoretical concepts related to the case-based reasoning method, the rule-based reasoning method and their integration. The integration of rule-based and case-based reasoning methods has shown a substantial improvement in performance over the individual methods. Verdande Technology AS wants to try integrating the rule-based reasoning method with an existing case-based system. This project focu...

  7. A RAPID DNA EXTRACTION METHOD FOR PCR IDENTIFICATION OF FUNGAL INDOOR AIR CONTAMINANTS

    Science.gov (United States)

    Following air sampling, fungal DNA needs to be extracted and purified to a state suitable for laboratory use. Our laboratory has developed a simple method of extraction and purification of fungal DNA appropriate for enzymatic manipulation and polymerase chain reaction (PCR) appli...

  8. Ecosystem-based design rules for marine sand extraction sites

    NARCIS (Netherlands)

    Jong, de Maarten F.; Borsje, Bas W.; Baptist, Martin J.; Wal, van der Jan Tjalling; Lindeboom, Han J.; Hoekstra, Piet

    2016-01-01

    The demand for marine sand in the Netherlands as well as globally is increasing. Over the last decades, only shallow sand extraction of 2m below the seabed was allowed on the Dutch Continental Shelf (DCS). To guarantee sufficient supply and to decrease the surface area of direct impact, the Dutch

  9. An Association Rule Based Method to Integrate Metro-Public Bicycle Smart Card Data for Trip Chain Analysis

    Directory of Open Access Journals (Sweden)

    De Zhao

    2018-01-01

    Full Text Available Smart card data provide valuable insights and massive samples for enhancing the understanding of transfer behavior between metro and public bicycle. However, smart cards for metro and public bicycle are often issued and managed by independent companies, and this results in the same commuter having different identity tags in the metro and public bicycle smart card systems. The primary objective of this study is to develop a data fusion methodology for matching the metro and public bicycle smart cards of the same commuter using historical smart card data. A novel association rule based method for matching the data derived from the two systems is proposed and validated. The results showed that the proposed method successfully matched 573 pairs of smart cards with an accuracy of 100%. We also validated the association rule method through visualization of individual metro and public bicycle trips. Based on the matched cards, interesting findings on metro-bicycle transfer have been derived, including the spatial pattern of the public bicycle as a first/last-mile solution as well as the duration of a metro trip chain.
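
    A minimal sketch (assumptions only, not the paper's formulation) of the underlying idea: a metro card and a public bicycle card are paired when bicycle rentals repeatedly start at the same station shortly after that metro card exits, and the pair is kept only if the co-occurrence count and confidence clear hypothetical thresholds.

      # Illustrative card-matching sketch; thresholds and field names are invented.
      from collections import defaultdict
      from datetime import timedelta

      def match_cards(metro_exits, bike_rentals, window_min=10, min_events=20, min_conf=0.9):
          """metro_exits / bike_rentals: iterables of (card_id, station, datetime)."""
          rentals_by_station = defaultdict(list)
          for card, station, t in bike_rentals:
              rentals_by_station[station].append((card, t))

          co_counts = defaultdict(lambda: defaultdict(int))   # metro card -> bike card -> count
          exit_counts = defaultdict(int)
          for m_card, station, t_exit in metro_exits:
              exit_counts[m_card] += 1
              for b_card, t_rent in rentals_by_station[station]:
                  if timedelta(0) <= t_rent - t_exit <= timedelta(minutes=window_min):
                      co_counts[m_card][b_card] += 1

          matches = {}
          for m_card, candidates in co_counts.items():
              b_card, n = max(candidates.items(), key=lambda kv: kv[1])
              confidence = n / exit_counts[m_card]   # share of exits followed by this bike card
              if n >= min_events and confidence >= min_conf:
                  matches[m_card] = (b_card, confidence)
          return matches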

  10. Review on the Extraction Methods of Crude oil from all Generation Biofuels in last few Decades

    Science.gov (United States)

    Bhargavi, G.; Nageswara Rao, P.; Renganathan, S.

    2018-03-01

    The ever-growing demand for energy fuels, the economics of oil, the depletion of energy resources and environmental protection are inevitable challenges that must be addressed carefully in the coming decades in order to sustain the life of humans and other creatures. Switching to alternative fuels that are renewable, biodegradable, and economically and environmentally friendly can meet at least part of the fuel demand while also mitigating climate change. Biofuel production has therefore gained prominence. The term biofuels broadly refers to fuels derived from living matter, either animal or plant. Among these, biodiesel is one of the most promising alternatives for diesel engines. Biodiesel is renewable, environmentally friendly, biodegradable and safe to use in a wide range of applications, which is why it has become a major focus of intensive global research and development on alternative energy. The present review focuses specifically on biodiesel. In biodiesel production, the major steps include lipid extraction followed by esterification/transesterification. For the extraction of lipids, several techniques have been put forward across the different feedstock generations. This review provides a theoretical background on the two major extraction approaches, mechanical and chemical extraction, and discusses the practical issues of each, such as extraction efficiency, extraction time, oil sources, and their pros and cons. Gathering information on oil extraction methods in this way should support further research advances that ease biofuel production.

  11. Simple, rapid and cost-effective method for high quality nucleic acids extraction from different strains of Botryococcus braunii.

    Directory of Open Access Journals (Sweden)

    Byung-Hyuk Kim

    Full Text Available This study deals with an effective nucleic acid extraction method for various strains of Botryococcus braunii, which possesses an extensive extracellular matrix. A method combining freeze/thaw and bead-beating with silica/zirconia beads of heterogeneous diameter was optimized to isolate DNA and RNA from microalgae, especially from B. braunii. The Eukaryotic Microalgal Nucleic Acids Extraction (EMNE) method developed in this study showed at least 300 times higher DNA yield with high integrity in all strains of B. braunii, and a 50-fold reduced working volume, compared to commercially available DNA extraction kits. High-quality RNA was also extracted using this method, with more than twice the yield of existing methods. Real-time experiments confirmed the quality and quantity of the input DNA and RNA extracted using the EMNE method. The method was also applied to other eukaryotic microalgae, such as diatoms, Chlamydomonas sp., Chlorella sp., and Scenedesmus sp., resulting in higher efficiencies. A cost-effectiveness analysis of DNA extraction by various methods revealed that the EMNE method was superior to commercial kits and other reported methods by >15%. This method would contribute greatly to the area of microalgal genomics.

  12. A small-scale, portable method for extracting microplastics from marine sediments.

    Science.gov (United States)

    Coppock, Rachel L; Cole, Matthew; Lindeque, Penelope K; Queirós, Ana M; Galloway, Tamara S

    2017-11-01

    Microplastics (plastic particles, 0.1 μm-5 mm in size) are widespread marine pollutants, accumulating in benthic sediments and shorelines the world over. To gain a clearer understanding of microplastic availability to marine life, and the risks they pose to the health of benthic communities, ecological processes and food security, it is important to obtain accurate measures of microplastic abundance in marine sediments. To date, methods for extracting microplastics from marine sediments have been disadvantaged by complexity, expense, low extraction efficiencies and incompatibility with very fine sediments. Here we present a new, portable method to separate microplastics from sediments of differing types, using the principle of density floatation. The Sediment-Microplastic Isolation (SMI) unit is a custom-built apparatus which consistently extracted microplastics from sediments in a single step, with a mean efficiency of 95.8% (±SE 1.6%; min 70%, max 100%). Zinc chloride, at a density of 1.5 g cm-3, was deemed an effective and relatively inexpensive floatation medium, allowing fine sediment to settle whilst simultaneously enabling floatation of dense polymers. The method was validated by artificially spiking sediment with low- and high-density microplastics, and its environmental relevance was further tested by extracting plastics present in natural sediment samples from sites ranging in sediment type, from fine silt/clay (mean size 10.25 ± SD 3.02 μm) to coarse sand (mean size 149.3 ± SD 49.9 μm). The method presented here is cheap, reproducible and easily portable, lending itself to use in the laboratory and in the field, e.g. on board research vessels. By employing this method, accurate estimates of microplastic type, distribution and abundance in natural sediments can be achieved, with the potential to further our understanding of the availability of microplastics to benthic organisms. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Revisiting chlorophyll extraction methods in biological soil crusts - methodology for determination of chlorophyll a and chlorophyll a + b as compared to previous methods

    Science.gov (United States)

    Caesar, Jennifer; Tamm, Alexandra; Ruckteschler, Nina; Lena Leifke, Anna; Weber, Bettina

    2018-03-01

    Chlorophyll concentrations of biological soil crust (biocrust) samples are commonly determined to quantify the relevance of photosynthetically active organisms within these surface soil communities. Whereas chlorophyll extraction methods for freshwater algae and leaf tissues of vascular plants are well established, there is still some uncertainty regarding the optimal extraction method for biocrusts, where organism composition is highly variable and samples comprise major amounts of soil. In this study we analyzed the efficiency of two different chlorophyll extraction solvents, the effect of grinding the soil samples prior to the extraction procedure, and the impact of shaking as an intermediate step during extraction. The analyses were conducted on four different types of biocrusts. Our results show that for all biocrust types chlorophyll contents obtained with ethanol were significantly lower than those obtained using dimethyl sulfoxide (DMSO) as a solvent. Grinding of biocrust samples prior to analysis caused a highly significant decrease in chlorophyll content for green algal lichen- and cyanolichen-dominated biocrusts, and a tendency towards lower values for moss- and algae-dominated biocrusts. Shaking of the samples after each extraction step had a significant positive effect on the chlorophyll content of green algal lichen- and cyanolichen-dominated biocrusts. Based on our results we confirm a DMSO-based chlorophyll extraction method without grinding pretreatment and suggest the addition of an intermediate shaking step for complete chlorophyll extraction (see Supplement S6 for detailed manual). Determination of a universal chlorophyll extraction method for biocrusts is essential for the inter-comparability of publications conducted across all continents.

  14. Comparison between Different Extraction Methods for Determination of Primary Aromatic Amines in Food Simulant

    Directory of Open Access Journals (Sweden)

    Morteza Shahrestani

    2018-01-01

    Full Text Available Primary aromatic amines (PAAs) are food contaminants which may be present in packaged food. Polyurethane (PU) adhesives used in flexible packaging are the main source of PAAs: unreacted diisocyanates migrate into the foodstuff and then hydrolyze to PAAs. These PAAs include toluenediamines (TDAs) and methylenedianilines (MDAs), and the selected PAAs were 2,4-TDA, 2,6-TDA, 4,4′-MDA, 2,4′-MDA, and 2,2′-MDA. PAAs have genotoxic, carcinogenic, and allergenic effects. In this study, extraction methods were applied to 3% acetic acid as a food simulant spiked with the PAAs under study. The extraction methods were liquid-liquid extraction (LLE), dispersive liquid-liquid microextraction (DLLME), and solid-phase extraction (SPE) with C18 ec (octadecyl), HR-P (styrene/divinylbenzene), and SCX (strong cation exchange) cartridges. Extracted samples were detected and analyzed by HPLC-UV. In the comparison between methods, the SCX cartridge showed the best adsorption, with recoveries of up to 91% for the polar PAAs (TDAs and MDAs). The PAAs of interest are polar and relatively water-soluble, so a cartridge with cation-exchange properties gives the best adsorption and consequently the best recoveries.

  15. Gold-plated mode of CP-violation in decays of B{sub c} meson from QCD sum rules

    Energy Technology Data Exchange (ETDEWEB)

    Kiselev, V V [Russian State Research Center, ' Institute for High Energy Physics' , Protvino, Moscow Region, 142281 (Russian Federation)

    2004-10-01

    A model-independent method based on the triangle ideology is implemented to extract the CKM-matrix angle {gamma} in the decays of the doubly heavy long-lived meson B{sub c}. We analyse a colour structure of diagrams and conditions to reconstruct two reference triangles by tagging the flavour and CP eigenstates of D{sup 0} - D{sup 0}bar mesons in the fixed exclusive channels. The characteristic branching ratios are evaluated in the framework of QCD sum rules.

  16. Prediction of methylmercury accumulation in rice grains by chemical extraction methods

    International Nuclear Information System (INIS)

    Zhu, Dai-Wen; Zhong, Huan; Zeng, Qi-Long; Yin, Ying

    2015-01-01

    To explore the possibility of using chemical extraction methods to predict the phytoavailability/bioaccumulation of soil-bound MeHg, MeHg extractions by three widely used extractants (CaCl2, DTPA, and (NH4)2S2O3) were compared with MeHg accumulation in rice grains. Despite variations in the characteristics of the different soils, MeHg extracted by (NH4)2S2O3 (which has a high affinity for MeHg) correlated well with grain MeHg levels. Thus (NH4)2S2O3 extraction, which solubilizes not only weakly bound but also strongly bound MeHg, may provide a measure of the ‘phytoavailable MeHg pool’ for rice plants. A better prediction of grain MeHg levels was obtained when the growing conditions of the rice plants were also considered. However, MeHg extracted by CaCl2 or DTPA, which possibly quantifies the ‘exchangeable MeHg pool’ or ‘weakly complexed MeHg pool’ in soils, may not indicate phytoavailable MeHg or predict grain MeHg levels. Our results show the possibility of predicting MeHg phytoavailability/bioaccumulation by (NH4)2S2O3 extraction, which could be useful in screening soils for rice cultivation in contaminated areas. - Highlights: • MeHg extraction by (NH4)2S2O3 correlates well with its accumulation in rice grains. • MeHg extraction by (NH4)2S2O3 provides a measure of phytoavailable MeHg in soils. • Some strongly bound MeHg could be desorbed from soils and become available to rice plants. • MeHg extraction by CaCl2 or DTPA could not predict grain MeHg levels. - Methylmercury extraction from soils by (NH4)2S2O3 could possibly be used for predicting methylmercury phytoavailability and its bioaccumulation in rice grains.
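
    The kind of relationship reported here can be checked with a simple linear regression of grain MeHg against extractant-recovered MeHg; the short sketch below uses invented placeholder numbers, not data from the study.

      # Illustrative regression of grain MeHg on (NH4)2S2O3-extractable MeHg (placeholder data).
      import numpy as np

      extractable = np.array([0.8, 1.5, 2.3, 3.1, 4.0])   # MeHg extracted from soil (e.g., ng/g)
      grain = np.array([1.1, 2.0, 2.9, 4.2, 5.1])         # MeHg in rice grain (e.g., ng/g)

      slope, intercept = np.polyfit(extractable, grain, 1)
      predicted = slope * extractable + intercept
      r2 = 1 - np.sum((grain - predicted) ** 2) / np.sum((grain - grain.mean()) ** 2)
      print(f"grain ~ {slope:.2f} * extractable + {intercept:.2f}, r2 = {r2:.3f}")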

  17. A Rapid and Cost-Effective Method for DNA Extraction from Archival Herbarium Specimens.

    Science.gov (United States)

    Krinitsina, A A; Sizova, T V; Zaika, M A; Speranskaya, A S; Sukhorukov, A P

    2015-11-01

    Here we report a rapid and cost-effective method for the extraction of total DNA from herbarium specimens up to 50-90 years old. The method takes about 2 h, uses AMPure XP magnetic beads diluted in a PEG-8000-containing buffer, and does not require traditional volatile components such as chloroform, phenol, and liquid nitrogen. It yields up to 4 µg of total nucleic acid with high purity from about 30 mg of dry material. The quality of the extracted DNA was tested by PCR amplification of the 5S rRNA and rbcL genes (nuclear and chloroplast DNA markers) and compared against the traditional chloroform/isoamyl alcohol method. Our results demonstrate that the use of magnetic beads is crucial for extracting DNA suitable for subsequent PCR from herbarium samples, as it decreases inhibitor concentrations, reduces the proportion of short fragments of degraded DNA, and increases median DNA fragment sizes.

  18. Evaluating the efficacy of a centrifugation-flotation method for extracting Ascaris ova from soil

    DEFF Research Database (Denmark)

    Cranston, Imogen; Teoh, Penelope J.; baker, Sarah M.

    2016-01-01

    method to extract STH ova from soil makes it challenging to examine whether the use of latrines may or may not have an effect on environmental contamination with ova. The present study evaluated the recovery rate of a method developed to extract STH ova from soil. Methods: The adapted centrifugation...... with increasing soil moisture content, particle size and organic matter content. The association between recovery rate and organic matter content was statistically significant. Conclusions: The present study identified a low recovery rate for an adapted centrifugation-flotation method, although this was similar...

  19. The effects of three different grinding methods in DNA extraction of ...

    African Journals Online (AJOL)

    uwerhiavwe

    2013-04-17

    Apr 17, 2013 ... The effects of three different grinding methods in DNA extraction of cowpea .... 100 mg of the leaf tissues were weighed in an electronic balance. CTAB method .... The primers were synthesized by Life Technologies (AB & Invitrogen). .... This work was supported by 'Shuang-Zhi Plan' of Sichuan. Agricultural ...

  20. Development and Validation of a HPLC-UV Method for Extraction Optimization and Biological Evaluation of Hot-Water and Ethanolic Extracts of Dendropanax morbifera Leaves

    Directory of Open Access Journals (Sweden)

    Hyung-Jae Choi

    2018-03-01

    Full Text Available Dendropanax morbifera Leveille (Araliaceae) has been used in traditional oriental remedies for cancer, inflammation, diabetes, and thrombosis. However, a validated analytical method, standardization, and optimization of extraction conditions with respect to biological activity have not been reported. In this study, a simple and validated HPLC method for identifying and quantifying active substances in D. morbifera was developed. Hot-water and ethanolic D. morbifera leaf extracts from different production regions were prepared and evaluated with regard to their chemical compositions and biological activities. The contents of active compounds such as rutin and chlorogenic acid were determined in four samples collected from different regions. The 80% ethanolic extract showed the best antioxidant activity, phenolic content, reducing power, and xanthine oxidase (XO) inhibitory activity. The validated HPLC method confirmed the presence of chlorogenic acid and rutin in D. morbifera leaf extracts. The antioxidant and XO inhibitory activity of D. morbifera extract could be attributed to the marker compounds. Collectively, these results suggest that D. morbifera leaves could be beneficial for the treatment or prevention of hyperuricemia-related disease, and the validated HPLC method could be a useful tool for the quality control of food or drug formulations containing D. morbifera.

  1. An improved facile method for extraction and determination of steroidal saponins in Tribulus terrestris by focused microwave-assisted extraction coupled with GC-MS.

    Science.gov (United States)

    Li, Tianlin; Zhang, Zhuomin; Zhang, Lan; Huang, Xinjian; Lin, Junwei; Chen, Guonan

    2009-12-01

    An improved, fast method for the extraction of steroidal saponins from Tribulus terrestris based on focused microwave-assisted extraction (FMAE) is proposed. Under optimized conditions, four steroidal saponins were extracted from Tribulus terrestris and identified by GC-MS: tigogenin (TG), gitogenin (GG), hecogenin (HG) and neohecogenin (NG). One of the most important steroidal saponins, TG, was then quantified. The recovery of TG was in the range of 86.7-91.9% with low RSD values when the method was applied to Tribulus terrestris from different areas of occurrence. The difference in the chromatographic characteristics of the steroidal saponins was shown to be related to the different areas of occurrence. The results showed that FMAE-GC-MS is a simple, rapid, solvent-saving method for the extraction and determination of steroidal saponins in Tribulus terrestris.

  2. Ultrasound assisted methods for enhanced extraction of phycobiliproteins from marine macro-algae, Gelidium pusillum (Rhodophyta).

    Science.gov (United States)

    Mittal, Rochak; Tavanandi, Hrishikesh A; Mantri, Vaibhav A; Raghavarao, K S M S

    2017-09-01

    Extraction of phycobiliproteins (R-phycoerythrin, R-PE, and R-phycocyanin, R-PC) from macro-algae is difficult due to the large polysaccharides (agar, cellulose, etc.) present in the cell wall, which strongly hinder cell disruption. The present study is aimed at developing the most suitable methodology for the primary extraction of R-PE and R-PC from the marine macro-alga Gelidium pusillum (Stackhouse) Le Jolis. Such extraction of phycobiliproteins using ultrasonication and other conventional methods such as maceration, maceration in the presence of liquid nitrogen, homogenization, and freezing and thawing (alone and in combinations) is reported for the first time. Ultrasonication was standardized for different parameters such as ultrasonication amplitude (60, 90 and 120 µm) and ultrasonication time (1, 2, 4, 6, 8 and 10 min) at different temperatures (30, 35 and 40 °C). Kinetic parameters for the extraction of phycobiliproteins by ultrasonication were estimated based on second-order mass transfer kinetics. Based on calorimetric measurements, the power, ultrasound intensity and acoustic power density were estimated to be 41.97 W, 14.81 W/cm2 and 0.419 W/cm3, respectively. A synergistic effect of ultrasonication was observed when it was employed in combination with other conventional primary extraction methods. Homogenization in combination with ultrasonication resulted in an enhancement in efficiency of 9.3% over homogenization alone. Similarly, maceration in combination with ultrasonication resulted in an enhancement in efficiency of 31% over maceration alone. Among all the methods employed, maceration in combination with ultrasonication resulted in the highest extraction efficiencies of 77 and 93% for R-PE and R-PC, respectively, followed by homogenization in combination with ultrasonication (69.6% for R-PE and 74.1% for R-PC). HPLC analysis was carried out in order to ensure that R-PE was present in the extract and remained intact even after processing
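
    The calorimetric quantities quoted above follow from the measured heating rate and the probe/vessel geometry; the sketch below shows the usual calculations with assumed inputs chosen only to land near the reported values, not the authors' actual measurements.

      # Calorimetric estimates: P = m * Cp * dT/dt, intensity = P / tip area,
      # acoustic power density = P / treated volume. All inputs are assumptions.
      import math

      def acoustic_parameters(mass_kg, cp_j_per_kg_k, dT_dt_k_per_s, tip_diameter_cm, volume_cm3):
          power_w = mass_kg * cp_j_per_kg_k * dT_dt_k_per_s
          tip_area_cm2 = math.pi * (tip_diameter_cm / 2.0) ** 2
          return power_w, power_w / tip_area_cm2, power_w / volume_cm3

      P, I, apd = acoustic_parameters(0.1, 4186.0, 0.1003, 1.9, 100.0)
      print(f"power {P:.2f} W, intensity {I:.2f} W/cm2, power density {apd:.3f} W/cm3")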

  3. A bayesian approach to QCD sum rules

    International Nuclear Information System (INIS)

    Gubler, Philipp; Oka, Makoto

    2010-01-01

    QCD sum rules are analyzed with the help of the Maximum Entropy Method. We develop a new technique based on Bayesian inference theory, which allows us to directly obtain the spectral function of a given correlator from the results of the operator product expansion given in the deep Euclidean 4-momentum region. The most important advantage of this approach is that one does not have to make any a priori assumptions about the functional form of the spectral function, such as the 'pole + continuum' ansatz that has been widely used in QCD sum rule studies, but only needs to specify the asymptotic values of the spectral function at high and low energies as an input. As a first test of the applicability of this method, we have analyzed the sum rules of the ρ-meson, a case where the sum rules are known to work well. Our results show a clear peak structure in the region of the experimental mass of the ρ-meson. We thus demonstrate that the Maximum Entropy Method can be successfully applied and that it is an efficient tool in the analysis of QCD sum rules. (author)
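
    For reference, a schematic form of the entropy-regularized functional that the Maximum Entropy Method maximizes (written here from the standard MEM literature, not copied from this record) is, in LaTeX notation:

      Q_\alpha[\rho] = \alpha S[\rho] - L[\rho],
      S[\rho] = \int_0^\infty d\omega \left[ \rho(\omega) - m(\omega) - \rho(\omega)\,\ln\frac{\rho(\omega)}{m(\omega)} \right],
      L[\rho] = \frac{1}{2} \sum_{i,j} \big( G_{\mathrm{OPE}}(x_i) - G_\rho(x_i) \big)\, C^{-1}_{ij}\, \big( G_{\mathrm{OPE}}(x_j) - G_\rho(x_j) \big),

    where m(\omega) is the default model carrying the assumed high- and low-energy asymptotics, G_{OPE} is the correlator from the operator product expansion, G_\rho is the correlator reconstructed from the trial spectral function \rho, C is the covariance of the OPE input, and the regularization parameter \alpha is averaged over in the Bayesian treatment.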

  4. A New Method for Weak Fault Feature Extraction Based on Improved MED

    Directory of Open Access Journals (Sweden)

    Junlin Li

    2018-01-01

    Full Text Available Because the signal is weak and the noise is strong, fault feature extraction from low-speed vibration signals has been a difficult and actively studied problem in equipment fault diagnosis. The traditional minimum entropy deconvolution (MED) method has been shown to be useful for detecting such fault signals. MED designs the filter coefficients through an objective function, and an appropriate threshold value must be set during the calculation to achieve the optimal iteration effect. An improper threshold setting causes the objective function to be recalculated, and under strong background noise the resulting error ultimately distorts the objective function. This paper presents an improved MED-based method for extracting fault features from rolling bearing vibration signals acquired in high-noise environments. The method uses the shuffled frog leaping algorithm (SFLA) to find the optimal set of filter coefficients, thereby avoiding the error introduced by manually selecting the threshold parameter. Faulty bearings at two rotating speeds, 60 rpm and 70 rpm, are selected for verification, with a typical low-speed faulty bearing as the research object; the results show that SFLA-MED extracts more distinct fault features and achieves a higher signal-to-noise ratio than the prior MED method.
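
    As an illustration of the MED-style objective only (the paper's improved objective and the SFLA optimizer are not reproduced), the sketch below scores a candidate FIR deconvolution filter by the kurtosis of its output, the usual surrogate for "minimum entropy", and uses a plain random search as a stand-in for SFLA; the signal, filter length and iteration count are assumptions.

      # Kurtosis-maximizing deconvolution sketch; a population-based optimizer such
      # as SFLA would replace the random search over the filter coefficients h.
      import numpy as np

      def med_objective(h, x):
          y = np.convolve(x, h, mode="valid")
          y = y - y.mean()
          return np.mean(y ** 4) / (np.mean(y ** 2) ** 2 + 1e-12)   # kurtosis

      rng = np.random.default_rng(1)
      x = rng.normal(0.0, 1.0, 4096)     # toy vibration signal: strong noise ...
      x[::256] += 8.0                    # ... plus periodic fault impulses

      best_h, best_score = None, -np.inf
      for _ in range(2000):              # stand-in for SFLA's search loop
          h = rng.normal(size=16)
          score = med_objective(h, x)
          if score > best_score:
              best_h, best_score = h, score
      print(f"best kurtosis found: {best_score:.2f}")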

  5. Methods of Information Subjects and Objects Interaction Rules Formalization in the Electronic Trading Platform System

    Directory of Open Access Journals (Sweden)

    Emma Emanuilova Yandybaeva

    2015-03-01

    Full Text Available Methods for formalizing the rules of interaction between information subjects and objects in an electronic trading platform system have been developed. They are based on a mathematical model of mandatory role-based access control. As a result of this work, a set of user roles has been defined and a role hierarchy constructed. Restrictions have been imposed on the role hierarchy to ensure the security of the information system.
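
    A toy sketch (hypothetical roles, permissions and labels, not the paper's model) of how a role hierarchy combined with a mandatory-style label check can be expressed in code:

      # Hypothetical role hierarchy and access check for a trading platform.
      ROLE_PARENTS = {"viewer": [], "trader": ["viewer"], "admin": ["trader"]}
      ROLE_PERMISSIONS = {"viewer": {"read_quotes"}, "trader": {"place_order"}, "admin": {"manage_users"}}

      def effective_permissions(role):
          perms = set(ROLE_PERMISSIONS.get(role, set()))
          for parent in ROLE_PARENTS.get(role, []):
              perms |= effective_permissions(parent)      # inherit along the hierarchy
          return perms

      def allowed(role, permission, subject_clearance, object_label):
          # mandatory constraint: the subject's clearance must dominate the object's label
          return subject_clearance >= object_label and permission in effective_permissions(role)

      print(allowed("trader", "read_quotes", subject_clearance=2, object_label=1))   # True
      print(allowed("viewer", "place_order", subject_clearance=2, object_label=1))   # False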

  6. Short-term optimal operation of Three-gorge and Gezhouba cascade hydropower stations in non-flood season with operation rules from data mining

    International Nuclear Information System (INIS)

    Ma Chao; Lian Jijian; Wang Junna

    2013-01-01

    Highlights: ► Short-term optimal operation of the Three Gorges and Gezhouba hydropower stations was studied. ► A key state variable and exact constraints were proposed to improve the numerical model. ► The proposed operation rules were applied in the population initiation step for faster optimization. ► A cultural algorithm with differential evolution was selected as the optimization method. ► The proposed model and method were verified by a case study with feasible operation solutions. - Abstract: Information hidden in the characteristic and relationship data of cascade hydropower stations can be extracted by data-mining approaches in the form of operation rules and optimization support information. In this paper, taking the Three Gorges and Gezhouba cascade hydropower stations as an example, two operation rules are proposed, motivated by the different operating efficiencies of the water turbines and by the tight water-volume and hydraulic relationship between the two stations. The rules are applied to improve the optimization model with more exact decision variables, state variables and constraints. They are also used in the population initiation step to generate better individuals, with a cultural algorithm with differential evolution as the optimization method. In the case study, a fully feasible population and the best solution based on an initial population seeded with an operation rule can be obtained in a shorter computation time than with a purely randomly initiated population. The amount of electricity generated in a dispatch period with an operation rule also increases, at an average rate of 0.025%. For a fixed water discharge process of the Three Gorges hydropower station, there is a better rule for deciding the operation plan of the Gezhouba hydropower station, in which the total hydraulic head for electricity generation is optimized and distributed with inner-plant economic operation taken into account.

  7. Comparison of extraction methods for the analysis of natural dyes in historical textiles by high-performance liquid chromatography.

    Science.gov (United States)

    Valianou, Lemonia; Karapanagiotis, Ioannis; Chryssoulakis, Yannis

    2009-12-01

    Different methods for the extraction of Dactylopius coccus Costa, Rubia tinctorum L., Isatis tinctoria L., Reseda luteola L., Curcuma longa L. and Cotinus coggygria Scop. from wool fibres are investigated using high-performance liquid chromatography with diode array detector (HPLC-DAD). The efficiencies of five extraction methods which include the use of HCl (widely used extraction method), citric acid, oxalic acid, TFA and a combination of HCOOH and EDTA are compared on the basis of the (a) number, (b) relative quantities, measured as HPLC peak areas and (c) signal-to-noise ratios (S/N) of the compounds extracted from the wool substrates. Flavonoid glycosides and curcuminoids contained in R. luteola L. and C. longa L., respectively, according to liquid chromatography with mass spectrometry (LC-MS) identifications, are not detected after treating the fibres with HCl. All the other milder methods are successful in extracting these compounds. Experiments are performed using HPLC-DAD to compare the HPLC peak areas and the S/N of the following extracted compounds: indigotin, indirubin, curcumin, demethoxycurcumin, bisdemethoxycurcumin, fisetin, sulfuretin, luteolin, luteolin-7-O-glucoside, apigenin, carminic acid, alizarin, puruprin and rubiadin. It is shown that the TFA method provides overall the best results as it gives elevated extraction yields except for fisetin, luteolin, apigenin and luteolin-7-O-glucoside and highest S/N except for fisetin and luteolin-7-O-glucoside. It is noteworthy that treatment of the fibres with the typical HCl extraction method results overall in very low S/N. The TFA method is selected for further studies, as follows. First, it is applied on silk dyed samples and compared with the HCl method. The same relative differences of the TFA and HCl methods observed for the wool dyed samples are reported for the silk dyed samples too, except for rubiadin, luteolin and apigenin. Thus, in most cases, the nature of the substrate (wool or silk

  8. Comparison of nine DNA extraction methods for the diagnosis of bovine tuberculosis by real time PCR

    Directory of Open Access Journals (Sweden)

    André Moura

    2016-07-01

    Full Text Available ABSTRACT: Bovine tuberculosis is an infectious disease with a high impact on the cattle industry, particularly in developing countries. PCR is a very sensitive method for detection of infectious agents, but the sensitivity of molecular diagnosis is largely dependent on the efficiency of the DNA extraction methods. The objective of this study was to evaluate DNA extraction methods for direct detection of Mycobacterium bovis in bovine tissue. Nine commercial kits for DNA extraction were evaluated when combined with two real time PCRs. The DNeasy Blood & Tissue Kit from QIAGEN showed better performance and sensitivity followed by the DNA Mini Kit RBC and FTA Elute Micro Card. Results suggested that, even when the analytical sensitivity of the qPCR is very high, the extraction method can influence the diagnostic sensitivity.

  9. Comparative efficiency of different methods of gluten extraction in indigenous varieties of wheat.

    Science.gov (United States)

    Imran, Samra; Hussain, Zaib; Ghafoor, Farkhanda; Nagra, Saeedahmad; Ziai, Naheeda Ashbeal

    2013-06-01

    The present study investigated six varieties of locally grown wheat (Lasani, Sehar, Miraj-08, Chakwal-50, Faisalabad-08 and Inqlab), procured from the Punjab Seed Corporation, Lahore, Pakistan, for their proximate contents. On the basis of protein content and ready availability, Faisalabad-08 (FD-08) was selected for assessing the comparative efficiency of various methods used for gluten extraction. Three methods, mechanical, chemical and microbiological, were used for the extraction of gluten from FD-08. Each method was carried out under ambient conditions using a drying temperature of 55 degrees C. The mechanical method utilized four different processes, viz. the dough process, dough batter process, batter process and ethanol washing process, using a standard 150 mesh. The starch thus obtained was analyzed for its proximate contents. The dough batter process proved to be the most efficient mechanical method and was further investigated using 200 and 300 mesh. Gluten content was determined using a sandwich omega-gliadin enzyme-linked immunosorbent assay (ELISA). The results of the dough batter process using 200 mesh indicated a starch product with a gluten content of 678 ppm. The chemical method gave a high gluten content of more than 5000 ppm, and the microbiological method reduced the gluten content from 2500 ppm to 398 ppm. From the results it was observed that no gluten extraction method is viable for producing starch that can fulfill the criteria for a gluten-free product (20 ppm).

  10. A method for the solvent extraction of low-boiling-point plant volatiles.

    Science.gov (United States)

    Xu, Ning; Gruber, Margaret; Westcott, Neil; Soroka, Julie; Parkin, Isobel; Hegedus, Dwayne

    2005-01-01

    A new method has been developed for the extraction of volatiles from plant materials and tested on seedling tissue and mature leaves of Arabidopsis thaliana, pine needles and commercial mixtures of plant volatiles. Volatiles were extracted with n-pentane and then subjected to quick distillation at a moderate temperature. Under these conditions, compounds such as pigments, waxes and non-volatile compounds remained undistilled, while short-chain volatile compounds were distilled into a receiving flask using a high-efficiency condenser. Removal of the n-pentane and concentration of the volatiles in the receiving flask was carried out using a Vigreux column condenser prior to GC-MS. The method is ideal for the rapid extraction of low-boiling-point volatiles from small amounts of plant material, such as is required when conducting metabolic profiling or defining biological properties of volatile components from large numbers of mutant lines.

  11. A simple digestion method with a Lefort aqua regia solution for diatom extraction.

    Science.gov (United States)

    Wang, Huipin; Liu, Yan; Zhao, Jian; Hu, Sunlin; Wang, Yuzhong; Liu, Chao; Zhang, Yanji

    2015-01-01

    The presence of diatoms in tissues has been considered a significant sign of drowning. However, the current extraction methods have limitations. We developed a new digestion method using the Lefort aqua regia solution (3:1 nitric acid to hydrochloric acid) for diatom extraction and evaluated the digestive capability, diatom destruction, and diatom recovery of this new method. Rabbit kidney tissues mixed with diatom-rich water were treated with the Lefort aqua regia digestion method (n = 10) and the conventional acid digestion method (n = 10). The results showed that the digestive capability of the Lefort aqua regia digestion method was superior to that of the conventional acid digestion method (p 0.05). The Lefort aqua regia reagent is an improvement over conventional acid digestion for the recovery of diatoms from tissue samples. © 2014 American Academy of Forensic Sciences.

  12. 2D automatic body-fitted structured mesh generation using advancing extraction method

    Science.gov (United States)

    Zhang, Yaoxin; Jia, Yafei

    2018-01-01

    This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluids Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like topography with extrusion-like structures (i.e., branches or tributaries) and intrusion-like structures (i.e., peninsula or dikes). With the AEM, the hierarchical levels of sub-domains can be identified, and the block boundary of each sub-domain in convex polygon shape in each level can be extracted in an advancing scheme. In this paper, several examples were used to illustrate the effectiveness and applicability of the proposed algorithm for automatic structured mesh generation, and the implementation of the method.

  13. Optimization and Comparison of Ultrasound-Assisted Extraction of Estragole from Tarragon Leaves with Hydro-Distillation Method

    Directory of Open Access Journals (Sweden)

    Mohammad Bagher Gholivand

    2014-12-01

    Full Text Available A comparative study of ultrasound-assisted extraction (UAE) and hydro-distillation was performed for fast extraction of estragole from dried tarragon (Artemisia dracunculus L.) leaves. Several influential parameters of the UAE procedure for the extraction of estragole (type of solvent, extraction cycles, solvent to material ratio, irradiation time and particle size) were investigated and optimized. It was found that UAE offers a more rapid extraction of estragole than hydrodistillation. The optimum parameters were a solvent to material ratio of 8:1 v/m, 96% (w/w) ethanol in water as the extraction solvent, a particle size of 1.18 mm, an irradiation time of 5 min, an output power of 63 W, 9 pulses, and an ultrasonic frequency of 20 kHz. The recovery of estragole by UAE under optimal conditions was 44.4% based on dry extract. The benefit of ultrasound was to decrease the extraction time (5 min) relative to the classical hydrodistillation method (3 h). The experimental results also indicated that ultrasound-assisted extraction is a simple, rapid and effective method for extraction of the volatile oil components of tarragon.

  14. Evaluation of simplified dna extraction methods for EMM typing of group a streptococci

    Directory of Open Access Journals (Sweden)

    Jose JJM

    2006-01-01

    Full Text Available Simplified methods of DNA extraction for amplification and sequencing for emm typing of group A streptococci (GAS) can save valuable time and cost in resource-constrained situations. To evaluate this, we compared two methods of DNA extraction directly from colonies with the standard CDC cell lysate method for emm typing of 50 GAS strains isolated from children with pharyngitis and impetigo. For this, GAS colonies were transferred into two sets of PCR tubes. One set was preheated at 94°C for two minutes in the thermal cycler and cooled, while the other set was frozen overnight at -20°C and then thawed before adding the PCR mix. For the cell lysate method, cells were treated with mutanolysin and hyaluronidase before heating at 100°C for 10 minutes and cooling immediately, as recommended in the CDC method. All 50 strains could be typed by sequencing the hypervariable region of the emm gene after amplification. The quality of the sequences and the emm types identified were also identical. Our study shows that the two simplified DNA extraction methods directly from colonies can conveniently be used for typing a large number of GAS strains easily in a relatively short time.

  15. Rule based deterioration identification and management system

    International Nuclear Information System (INIS)

    Kataoka, S.; Pavinich, W.; Lapides, M.

    1993-01-01

    Under the sponsorship of IHI and EPRI, a rule-based screening system has been developed that can be used by utility engineers to determine which deterioration mechanisms are acting on specific LWR components, and to evaluate the efficacy of an age-related deterioration management program. The screening system was developed using the rule-based shell NEXPERT, which provides traceability to the data sources used in the logic development. The system addresses all the deterioration mechanisms of specific metals encountered in either BWRs or PWRs. Deterioration mechanisms are listed with reasons why they may occur during the design life of LWRs, considering the plant environment, manufacturing process, service history, material chemical composition, etc. of components in a specific location of an LWR. To quickly eliminate inactive deterioration mechanisms from the evaluation, a tier structure is applied to the rules. The reasons why deterioration will occur are extracted automatically by backward chaining. To reduce the amount of user input, plant environmental data are stored in files as default environmental data. (author)
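
    As a toy illustration of the backward-chaining idea described above (invented rules and facts, not the NEXPERT knowledge base), the sketch below starts from a deterioration mechanism as the goal and collects the reasons supporting it by recursively proving rule premises:

      # Toy backward chaining over invented deterioration rules.
      RULES = {
          "igscc_possible": [["material_sensitized", "tensile_stress", "oxidizing_water"]],
          "material_sensitized": [["stainless_steel", "weld_heat_affected_zone"]],
      }
      FACTS = {"stainless_steel", "weld_heat_affected_zone", "tensile_stress", "oxidizing_water"}

      def prove(goal, trace):
          if goal in FACTS:
              return True
          for premises in RULES.get(goal, []):
              if all(prove(p, trace) for p in premises):
                  trace.append((goal, premises))   # record why the goal holds
                  return True
          return False

      trace = []
      if prove("igscc_possible", trace):
          for goal, premises in trace:
              print(f"{goal} because {', '.join(premises)}")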

  16. Optimization of ciguatoxin extraction method from blood for Pacific ciguatoxin (P-CTX-1).

    Science.gov (United States)

    Bottein Dechraoui, Marie-Yasmine; Wang, Zhihong; Ramsdell, John S

    2007-01-01

    Ciguatera diagnosis relies on clinical observations associated with a recent consumption of fish. Although needed, direct confirmation of exposure in subjects showing ciguatera disease symptoms is currently unavailable. We previously reported that ciguatoxins were measurable in the blood of mice exposed to extracts of Pacific ciguatoxins isolated from Gambierdiscus polynesiensis, and of Indian Ocean or Caribbean Sea ciguatoxins isolated from fish. Although highly efficient for extracting spiked purified Caribbean-CTX-1, the methanolic extraction method previously described is found here to yield only 6% recovery of spiked Pacific-CTX-1 (P-CTX-1). In this short communication we report a substantially modified method for ciguatoxin extraction from both dried and fresh blood. With this method, toxin measurement is accomplished directly in acetonitrile-deproteinated whole fresh blood or phosphate buffer solution (PBS)-eluted dried blood using the N2A cell-based assay. Spike studies using increasing concentrations of purified ciguatoxins reveal linear responses (r2 above 0.87 for all toxins) and overall efficient toxin recoveries (62%, 96%, and 96% from fresh blood and 75%, 90%, and 74% from dried blood, for C-CTX-1, P-CTX-3C, and P-CTX-1, respectively). Comparative blood matrix analysis for P-CTX-1 recovery shows greater recovery of ciguatoxin activity from whole fresh blood than from dried blood, by 20% in P-CTX-1-spiked mouse blood and by over 85% in the blood of P-CTX-1-exposed mice. In conclusion, both Caribbean and Pacific ciguatoxins can be readily extracted from blood using this modified method; however, in the case of P-CTX-1 we find that fresh blood is optimal.

  17. Analytical methods and problems for the diamides type of extractants

    International Nuclear Information System (INIS)

    Cuillerdier, C.; Nigond, L.; Musikas, C.; Vitart, H.; Hoel, P.

    1989-01-01

    Diamides of carboxylic acids, and especially malonamides, are able to extract alpha emitters (including trivalent ions such as Am and Cm) contained in the waste solutions of the nuclear industry. As they are completely incinerable and easy to purify, they could be an alternative to the CMPO-TBP mixture used in the TRUEX process. A large oxyalkyl radical enhances the distribution coefficients of americium in nitric acid sufficiently to permit the decontamination of waste solutions in a classical mixer-settler battery. Research is now being pursued with the aim of optimizing the extractant formula; the influence of the extractant structure on its basicity and on its stability under radiolysis and hydrolysis is being investigated. Analytical methods (potentiometry and 13C NMR) have been developed for solvent titration, to evaluate the percentage of degradation and to identify some of the degradation products.

  18. Microalgae based biorefinery: evaluation of oil extraction methods in terms of efficiency, costs, toxicity and energy in lab-scale

    Directory of Open Access Journals (Sweden)

    Ángel Darío González-Delgado

    2013-06-01

    Full Text Available Several alternatives for the extraction and transformation of microalgal metabolites are being studied to achieve the total utilization of this energy crop of great interest worldwide. Microalgae oil extraction is a key stage in microalgal biodiesel production chains, and its efficiency significantly affects the overall process efficiency. In this study, five oil extraction methods were compared at lab scale, taking as additional parameters, besides extraction efficiency, the cost of performing each method, its energy requirements, and the toxicity of the solvents used, in order to assess their suitability for incorporation into a microalgae-based biorefinery topology. The methods analyzed were solvent extraction assisted with high-speed homogenization (SHE), continuous reflux solvent extraction (CSE), hexane-based extraction (HBE), cyclohexane-based extraction (CBE) and ethanol-hexane extraction (EHE). The evaluation used the microalgae strains Nannochloropsis sp., Guinardia sp., Closterium sp., Amphiprora sp. and Navicula sp., obtained from a Colombian microalgae bioprospecting effort. In addition, the morphological response of the strains to the oil extraction methods was evaluated by optical microscopy. The results show that although no single oil extraction method excels in all the parameters evaluated, CSE, SHE and HBE appear as promising alternatives, while the HBE method is the most convenient for lab-scale use and is potentially scalable for implementation in a microalgae-based biorefinery

  19. Automated detection of pain from facial expressions: a rule-based approach using AAM

    Science.gov (United States)

    Chen, Zhanli; Ansari, Rashid; Wilkie, Diana J.

    2012-02-01

    In this paper, we examine the problem of using video analysis to assess pain, an important problem especially for critically ill, non-communicative patients and people with dementia. We propose and evaluate an automated method to detect the presence of pain manifested in patient videos, using a unique and large collection of cancer patient videos captured in patient homes. The method is based on detecting pain-related facial action units defined in the Facial Action Coding System (FACS), which is widely used for objective assessment in pain analysis. In our research, a person-specific Active Appearance Model (AAM) based on the project-out inverse compositional method is trained individually for each patient. A flexible representation of the shape model is used in a rule-based method that is better suited than the more commonly used classifier-based methods to the cancer patient videos, in which pain-related facial actions occur infrequently and more subtly. The rule-based method relies on feature points that provide facial action cues, extracted from the AAM shape vertices, which have a natural correspondence to facial muscle movement. In this paper, we investigate the detection of a commonly used set of pain-related action units in both the upper and lower face. Our detection results show good agreement with the results obtained by three trained FACS coders who independently reviewed and scored the action units in the cancer patient videos.
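
    As a hedged illustration of the rule-based step (the landmark indices, cues and thresholds below are invented, not the authors' rules), simple geometric cues can be computed from AAM shape vertices and thresholded to flag pain-related action units:

      # Illustrative thresholded cues from AAM landmarks; indices and thresholds are invented.
      import numpy as np

      def distance(points, i, j):
          return float(np.linalg.norm(points[i] - points[j]))

      def detect_action_units(points, neutral_points, thresholds):
          cues = {
              # relative shortening of assumed brow-to-eye and upper-to-lower-eyelid distances
              "AU4_brow_lower": 1.0 - distance(points, 0, 1) / distance(neutral_points, 0, 1),
              "AU43_eye_closure": 1.0 - distance(points, 2, 3) / distance(neutral_points, 2, 3),
          }
          return {au: cue > thresholds[au] for au, cue in cues.items()}, cues

      neutral = np.array([[0.0, 0.0], [0.0, 10.0], [5.0, 0.0], [5.0, 4.0]])
      frame = np.array([[0.0, 0.0], [0.0, 8.0], [5.0, 0.0], [5.0, 1.0]])
      flags, cues = detect_action_units(frame, neutral,
                                        {"AU4_brow_lower": 0.15, "AU43_eye_closure": 0.5})
      print(flags, cues)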

  20. The uranium waste fluid processing examination by a liquid-liquid extraction method using the emulsion flow method

    International Nuclear Information System (INIS)

    Kanda, Nobuhiro; Daiten, Masaki; Endo, Yuji; Yoshida, Hideaki; Mita, Yutaka; Naganawa, Hirochika; Nagano, Tetsushi; Yanase, Nobuyuki

    2015-03-01

    Spent centrifuges that had been used for the development of uranium enrichment technology are stored in the uranium enrichment facility at the Ningyo-toge Environmental Center, Japan Atomic Energy Agency (JAEA). In the centrifuge processing technology, radioactive material adhering to the surfaces of the inner parts of the centrifuges is separated by a wet decontamination method using an ultrasonic bath filled with dilute sulfuric acid and water, and processing of the resulting radioactive waste fluid generates a neutralization sediment (sludge). JAEA has been considering streamlining and reducing the sludge processing by lowering the radioactive concentration of the sludge through removal of uranium from the radioactive waste fluid. As part of these considerations, JAEA has been promoting technological development of uranium extraction and separation using the emulsion flow extraction method (a concept proposed by the JAEA Nuclear Science and Engineering Center), in close coordination and cooperation between the JAEA Nuclear Science and Engineering Center and the Ningyo-toge Environmental Center since fiscal year 2007. This report outlines an application test, using actual waste fluid of dilute sulfuric acid and water, of an examination system that introduces the emulsion flow extraction method. (author)

  1. Automatic extraction of property norm-like data from large text corpora.

    Science.gov (United States)

    Kelly, Colin; Devereux, Barry; Korhonen, Anna

    2014-01-01

    Traditional methods for deriving property-based representations of concepts from text have focused on either extracting only a subset of possible relation types, such as hyponymy/hypernymy (e.g., car is-a vehicle) or meronymy/metonymy (e.g., car has wheels), or unspecified relations (e.g., car--petrol). We propose a system for the challenging task of automatic, large-scale acquisition of unconstrained, human-like property norms from large text corpora, and discuss the theoretical implications of such a system. We employ syntactic, semantic, and encyclopedic information to guide our extraction, yielding concept-relation-feature triples (e.g., car be fast, car require petrol, car cause pollution), which approximate property-based conceptual representations. Our novel method extracts candidate triples from parsed corpora (Wikipedia and the British National Corpus) using syntactically and grammatically motivated rules, then reweights triples with a linear combination of their frequency and four statistical metrics. We assess our system output in three ways: lexical comparison with norms derived from human-generated property norm data, direct evaluation by four human judges, and a semantic distance comparison with both WordNet similarity data and human-judged concept similarity ratings. Our system offers a viable and performant method of plausible triple extraction: Our lexical comparison shows comparable performance to the current state-of-the-art, while subsequent evaluations exhibit the human-like character of our generated properties.
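
    As an illustration of the reweighting step, the minimal sketch below combines a triple's corpus frequency with other association metrics via a weighted linear combination and ranks the candidates. The metric names, weights and example values are illustrative assumptions, not the system's actual scoring configuration.

        # Minimal sketch of triple reweighting: combine raw corpus frequency with
        # other association metrics via a linear combination, then rank candidates.
        # Metric names, weights and scores below are illustrative assumptions.
        from math import log

        def rescore(triples, weights):
            """triples: dict mapping (concept, relation, feature) -> dict of metric values.
            weights: dict mapping metric name -> linear-combination weight."""
            scored = {}
            for triple, metrics in triples.items():
                scored[triple] = sum(weights[m] * metrics[m] for m in weights)
            return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

        triples = {
            ("car", "require", "petrol"): {"log_freq": log(1 + 42), "pmi": 3.1},
            ("car", "be", "fast"):        {"log_freq": log(1 + 17), "pmi": 2.4},
        }
        ranked = rescore(triples, weights={"log_freq": 0.6, "pmi": 0.4})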

  2. A window-based time series feature extraction method.

    Science.gov (United States)

    Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife

    2017-10-01

    This study proposes a robust similarity score-based time series feature extraction method termed Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and has significantly low computational complexity, which makes it useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and to three precordial leads from a publicly available electrocardiogram (ECG) dataset. WTC is then compared, in terms of predictive accuracy and computational complexity, with the shapelet transform and the fast shapelet transform (an accelerated variant of the shapelet transform). The results indicate that WTC achieves slightly higher classification performance with significantly lower execution time than its shapelet-based alternatives. With respect to its interpretable features, WTC has the potential to enable medical experts to explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.
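
    A minimal sketch of the window-based idea follows: each series is mapped to similarity scores between that series and a small set of candidate windows. The distance measure, window choice and toy data are assumptions for illustration, not the exact WTC formulation.

        # Minimal sketch of window-based feature extraction: slide a candidate window
        # over each series and use the best similarity score between the window and
        # the series as one feature. Distance measure and windows are assumptions.
        import numpy as np

        def best_match_score(series, window):
            """Smallest mean squared distance between `window` and any equal-length
            subsequence of `series` (lower means more similar)."""
            w = len(window)
            dists = [np.mean((series[i:i + w] - window) ** 2)
                     for i in range(len(series) - w + 1)]
            return min(dists)

        def transform(dataset, windows):
            """Map each series to a feature vector of similarity scores, one per window."""
            return np.array([[best_match_score(s, w) for w in windows] for s in dataset])

        rng = np.random.default_rng(0)
        dataset = [rng.standard_normal(200) for _ in range(10)]   # toy series
        windows = [dataset[0][20:50], dataset[1][100:140]]        # candidate windows
        features = transform(dataset, windows)                    # shape (10, 2)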

  3. Fast beam cut-off method in RF-knockout extraction for spot-scanning

    CERN Document Server

    Furukawa, T

    2002-01-01

    An irradiation method with magnetic scanning has been developed in order to provide accurate irradiation even for an irregular target shape. The scanning method strongly requires a low ripple of the beam spill and a fast response to beam-on/off in slow extraction from a synchrotron ring. At HIMAC, RF-knockout extraction has utilized a bunched beam to reduce the beam-spill ripple. Therefore, particles near the resonance can be spilled out from the separatrices by synchrotron oscillation as well as by a transverse RF field. From this point of view, a fast beam cut-off method has been proposed and verified by both simulations and experiments. The maximum delay from the beam cut-off signal to beam-off has been improved to around 60 μs, from 700 μs with the usual method. The unwanted dose has been reduced considerably, by around a factor of 10 compared with the usual method.

  4. A Review of Extraction and Analytical Methods for the Determination of Tartrazine (E 102) in Foodstuffs.

    Science.gov (United States)

    Rovina, Kobun; Siddiquee, Shafiquzzaman; Shaarani, Sharifudin Md

    2017-07-04

    Tartrazine is an azo food dye, orange-colored and water soluble. It is commonly used in foods, pharmaceuticals, cosmetics, and textiles. Tartrazine has the potential to cause adverse health effects in humans, such as hyperactivity in children, allergy, and asthma. The Joint FAO/WHO Expert Committee on Food Additives and the EU Scientific Committee for Food have standardized the acceptable daily intake for tartrazine at 7.5 mg kg⁻¹ body weight. Many researchers have detected the presence of tartrazine for monitoring the quality and safety of food products. In this review paper, we highlight various tartrazine detection and extraction methods. Several analytical methods are available, such as high-performance liquid chromatography, electrochemical sensors, thin-layer chromatography, spectrophotometry, capillary electrophoresis, and liquid chromatography-tandem mass spectrometry. We also discuss the following extraction steps: liquid-liquid extraction, solid-phase extraction, membrane filtration, cloud point extraction, and other extraction methods. In addition, a brief overview is presented of the synthesis process and metabolism of tartrazine and the maximum permitted levels in different countries. This review gives an insight into the different extraction and analytical methods for the determination of tartrazine in foodstuffs, which will attract the attention of the public toward food safety and quality, as well as the interest of the food industry and government bodies.

  5. Methods for extraction and determination of phenolic acids in medicinal plants: a review.

    Science.gov (United States)

    Arceusz, Agnieszka; Wesolowski, Marek; Konieczynski, Pawel

    2013-12-01

    Phenolic acids constitute a group of potentially immunostimulating compounds. They occur in all medicinal plants and are widely used in phytotherapy and foods of plant origin. In recent years, phenolic acids have attracted much interest owing to their biological functions. This paper reviews the extraction and determination methods of phenolic acids in medicinal plants over the last 10 years. Although Soxhlet extraction and ultrasonic assisted extraction (UAE) are commonly used for the extraction of phenolic acids from plant materials, alternative techniques such as supercritical fluid extraction (SFE), and accelerated solvent extraction (ASE) can also be used. After extraction, phenolic acids are determined usually by liquid chromatography (LC) owing to the recent developments in this technique, especially when it is coupled with mass spectrometry (MS). Also detection systems are discussed, including UV-Vis, diode array, electrochemical and fluorimetric. Other popular techniques for the analysis of this group of secondary metabolites are gas chromatography coupled with mass spectrometry (GC-MS) and capillary electrophoresis (CE).

  6. Physicochemical and Antioxidant Properties of Rice Bran Oils Produced from Colored Rice Using Different Extraction Methods.

    Science.gov (United States)

    Mingyai, Sukanya; Kettawan, Aikkarach; Srikaeo, Khongsak; Singanusong, Riantong

    2017-06-01

    This study investigated the physicochemical and antioxidant properties of rice bran oil (RBO) produced from the bran of three rice varieties, Khao Dawk Mali 105 (white rice), Red Jasmine rice (red rice) and Hom-nin rice (black rice), using three extraction methods: cold-press extraction (CPE), solvent extraction (SE) and supercritical CO2 extraction (SC-CO2). Yield, color, acid value (AV), free fatty acid (FFA), peroxide value (PV), iodine value (IV), total phenolic compounds (TPC), γ-oryzanol, α-tocopherol and fatty acid profile were analyzed. The yields obtained from the SE, SC-CO2 and CPE extractions were 17.35-20.19%, 14.76-18.16% and 3.22-6.22%, respectively. The RBO from the bran of the red and black rice samples exhibited high antioxidant activities and also contained higher amounts of γ-oryzanol and α-tocopherol than that of the white rice sample. In terms of extraction methods, SC-CO2 provided better RBO quality, as evidenced by the physicochemical and antioxidant properties. This study found that RBO produced from the bran of the black rice sample using the SC-CO2 extraction method showed the best physicochemical and antioxidant properties.

  7. AN ITERATIVE SEGMENTATION METHOD FOR REGION OF INTEREST EXTRACTION

    Directory of Open Access Journals (Sweden)

    Volkan CETIN

    2013-01-01

    Full Text Available In this paper, a method is presented for applications that include mammographic image segmentation and region of interest extraction. Segmentation is a very critical and difficult stage to accomplish in computer aided detection systems. Although the presented segmentation method was developed for mammographic images, it can be used for any medical image that exhibits statistical characteristics similar to mammograms. Fundamentally, the method consists of iterative automatic thresholding and masking operations, which are applied to the original or enhanced mammograms. The effect of image enhancement on the segmentation process was also observed; a version of histogram equalization was applied to the images for enhancement. Finally, the results show that the enhanced version of the proposed segmentation method is preferable because of its better success rate.
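
    The sketch below illustrates one way to realize the iterative automatic thresholding and masking step described above, using an ISODATA-style update; the stopping tolerance and the toy image are illustrative assumptions rather than the authors' exact procedure.

        # Minimal sketch of iterative automatic thresholding followed by masking.
        # Parameters, tolerance and the toy image are illustrative assumptions.
        import numpy as np

        def iterative_threshold(image, tol=0.5):
            """Refine a global threshold until it stabilizes: split pixels into two
            groups and move the threshold to the mean of the two group means."""
            t = image.mean()
            while True:
                low, high = image[image <= t], image[image > t]
                t_new = 0.5 * (low.mean() + high.mean())
                if abs(t_new - t) < tol:
                    return t_new
                t = t_new

        def extract_roi(image):
            """Mask the image with the converged threshold to keep candidate ROI pixels."""
            t = iterative_threshold(image)
            mask = image > t
            return image * mask, mask

        img = np.random.randint(0, 255, size=(256, 256)).astype(float)  # toy image
        roi, mask = extract_roi(img)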

  8. Calculation of radon concentration in water by toluene extraction method

    Energy Technology Data Exchange (ETDEWEB)

    Saito, Masaaki [Tokyo Metropolitan Isotope Research Center (Japan)]

    1997-02-01

    The Noguchi and Horiuchi methods have been used to calculate the radon concentration in water. In their original form, both methods suffer from two problems: the calculated concentration changes with the extraction temperature because of incorrect solubility data, and the calculated concentrations are smaller than the correct values because the radon calculation equation is not consistent with gas-liquid equilibrium theory. Both problems are solved by improving the radon equation. The Noguchi-Saito equation and the constant B of the Horiuchi-Saito equation are presented. Results calculated with the improved method showed an error of about 10%. (S.Y.)
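
    For orientation, a minimal worked relation is given below. It assumes only the standard two-phase partition equilibrium of radon between toluene and water, and ignores the temperature corrections and method-specific constants discussed above; the symbols are illustrative.

        % Fraction of radon transferred to a toluene volume V_t shaken with a water
        % sample of volume V_w, for a distribution coefficient K = C_t / C_w:
        f = \frac{K V_t}{K V_t + V_w}
        % so the original radon concentration in the water sample follows from the
        % activity A_t measured in the toluene phase:
        C_{w,0} = \frac{A_t}{f\, V_w} = \frac{A_t \,(K V_t + V_w)}{K V_t\, V_w}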

  9. Method Development for Extraction of Butyrylcholin- esterase using Protein-G Agarose Spin Columns

    Directory of Open Access Journals (Sweden)

    Amruta S. Indapurkar

    2015-01-01

    Full Text Available Butyrylcholinesterase (BuChE) is a biomarker of organophosphate (OP) poisoning and can be used as a diagnostic marker to measure exposure to OP compounds. The purpose of this study was to develop a method to extract BuChE from human plasma. BuChE was extracted from plasma using the NAb protein-G Agarose Spin Kit. Factors affecting extraction, such as incubation time, plasma volume and cross-linking of antibodies to agarose beads, were evaluated. All samples were analyzed for BuChE activity using the Ellman's assay. The incubation times of plasma and anti-BuChE antibodies marginally affected the extraction efficiency of BuChE, whereas a decrease in plasma volume increased the extraction efficiency. Cross-linking of anti-BuChE antibodies on agarose increased the extraction efficiency. The NAb protein-G Spin Kit can be used successfully to extract BuChE from human plasma. This extraction technique may be coupled to downstream analytical analyses for diagnosing exposure to OP compounds.

  10. A simple and rapid infrared-assisted self enzymolysis extraction method for total flavonoid aglycones extraction from Scutellariae Radix and mechanism exploration.

    Science.gov (United States)

    Wang, Liping; Duan, Haotian; Jiang, Jiebing; Long, Jiakun; Yu, Yingjia; Chen, Guiliang; Duan, Gengli

    2017-09-01

    A new, simple, and fast infrared-assisted self-enzymolysis extraction (IRASEE) approach is presented for the extraction of total flavonoid aglycones (TFA), mainly baicalein, wogonin, and oroxylin A, from Scutellariae Radix, with the aim of enhancing the extraction yield. The factors affecting the IRASEE procedure (enzymolysis temperature, enzymolysis liquid-to-solid ratio, enzymolysis pH, enzymolysis time and infrared power) were investigated in a newly designed, temperature-controlled infrared-assisted extraction (TC-IRAE) system to establish the optimum conditions. The results illustrated that IRASEE possesses great advantages in terms of efficiency and time compared with other conventional extraction techniques. Furthermore, the mechanism of IRASEE was preliminarily explored by observing the microscopic changes of the sample surface structures, studying the changes in the main chemical compositions of the samples before and after extraction, and investigating the kinetics and thermodynamics at three temperature levels during the IRASEE process. These findings revealed that IRASEE can destroy the surface microstructures to accelerate mass transfer and reduce the activation energy to intensify the chemical process. This integrative study presents a simple, rapid, efficient, and environmentally friendly IRASEE method for TFA extraction, which has promising prospects for other similar herbal medicines.

  11. Comparative evaluation of the anti-diabetic activity of Pterocarpus marsupium Roxb. heartwood in alloxan induced diabetic rats using extracts obtained by optimized conventional and non conventional extraction methods.

    Science.gov (United States)

    Devgan, Manish; Nanda, Arun; Ansari, Shahid Husain

    2013-09-01

    The aim of the present study was to assess the anti-diabetic activity of Pterocarpus marsupium Roxb. heartwood in alloxan induced diabetic rats using extracts obtained by optimized conventional and non conventional extraction methods. Aqueous and ethanol extracts of Pterocarpus marsupium heartwood were prepared by conventional methods (infusion, decoction, maceration and percolation) and non conventional methods, such as ultrasound-assisted extraction (UAE) and microwave-assisted extraction (MAE). The crude aqueous extracts were administered orally to both normal and alloxan induced male albino rats (Sprague-Dawley strain). The experimental set up consisted of 48 male albino rats divided into 6 groups: Normal control, diabetic control (sterile normal saline, 1 ml/100 g body weight), standard (gliclazide, 25 mg/1000g of body weight), groups 4-6 (crude aqueous percolation, optimized UAE and MAE extract, 250 mg/1000g of body weight). In acute treatment, the reduction of blood glucose level was statistically significant with the oral administration of UAE and percolation aqueous extracts to the hyperglycemic rats. In sub-acute treatment, the UAE aqueous extract led to consistent and statistically significant (p<0.001) reduction in the blood glucose levels. There was no abnormal change in body weight of the hyperglycemic animals after 10 days of administration of plant extracts and gliclazide. This study justifies the traditional claim and provides a rationale for the use of Pterocarpus marsupium to treat diabetes mellitus. The antidiabetic activity of Pterocarpus marsupium can be enhanced by extracting the heartwood by non conventional method of UAE.

  12. Simple extraction methods that prevent the artifactual conversion of chlorophyll to chlorophyllide during pigment isolation from leaf samples.

    Science.gov (United States)

    Hu, Xueyun; Tanaka, Ayumi; Tanaka, Ryouichi

    2013-06-19

    When conducting plant research, the measurement of photosynthetic pigments can provide basic information on the physiological status of a plant. High-pressure liquid chromatography (HPLC) is becoming widely used for this purpose because it provides an accurate determination of a variety of photosynthetic pigments simultaneously. This technique has a drawback compared with conventional spectroscopic techniques, however, in that it is more prone to structural modification of pigments during extraction, thus potentially generating erroneous results. During pigment extraction procedures with acetone or alcohol, the phytol side chain of chlorophyll is sometimes removed, forming chlorophyllide, which affects chlorophyll measurement using HPLC. We evaluated the artifactual chlorophyllide production during chlorophyll extraction by comparing different extraction methods with wild-type and mutant Arabidopsis leaves that lack the major isoform of chlorophyllase. Several extraction methods were compared to provide alternatives to researchers who utilize HPLC for the analysis of chlorophyll levels. As a result, the following three methods are recommended. In the first method, leaves are briefly boiled prior to extraction. In the second method, grinding and homogenization of leaves are performed at sub-zero temperatures. In the third method, N, N'-dimethylformamide (DMF) is used for the extraction of pigments. When compared, the first two methods eliminated almost all chlorophyllide-forming activity in Arabidopsis thaliana, Glebionis coronaria, Pisum sativum L. and Prunus sargentii Rehd. However, DMF effectively suppressed the activity of chlorophyllase only in Arabidopsis leaves. Chlorophyllide production in leaf extracts is predominantly an artifact. All three methods evaluated in this study reduce the artifactual production of chlorophyllide and are thus suitable for pigment extraction for HPLC analysis. The boiling method would be a practical choice when leaves are not too

  13. Extraction-spectrophotometric method for silicon determination in high-purity substances. 2. Silicon determination in cadmium

    Energy Technology Data Exchange (ETDEWEB)

    Yudelevich, I G; Shaburova, V P; Shamrina, L V [AN SSSR, Novosibirsk (USSR). Inst. Neorganicheskoj Khimii]

    1989-01-01

    Cadmium extraction by tributyl phosphate and trialkylbenzylammonium chloride (TABAC) is investigated as a function of the acid (HCl, HI), extracting-agent concentration, volume of the aqueous and organic phases, and number of extraction steps. On the basis of the results obtained, a spectrophotometric method is developed for silicon determination in cadmium and CdCl2 using malachite green, with preliminary extraction of the matrix by TABAC from HCl solutions. The detection limit of the method is 3.9x10^-4 % Si with respect to an initial cadmium sample of 100 mg and 7.8x10^-5 % with respect to 0.5 g of CdCl2. The relative standard deviation is S_r = 0.07-0.13.

  14. AN EFFICIENT METHOD FOR AUTOMATIC ROAD EXTRACTION BASED ON MULTIPLE FEATURES FROM LiDAR DATA

    Directory of Open Access Journals (Sweden)

    Y. Li

    2016-06-01

    Full Text Available Road extraction in urban areas is a difficult task due to the complicated patterns and many contextual objects. LiDAR data directly provides three dimensional (3D) points with fewer occlusions and smaller shadows. The elevation information and surface roughness are distinguishing features for separating roads. However, LiDAR data also has some characteristics that are not beneficial to object extraction, such as the irregular distribution of point clouds and the lack of clear road edges. To address these problems, this paper proposes an automatic road centerline extraction method with three major steps: (1) road center point detection based on multiple-feature spatial clustering for separating road points from ground points, (2) local principal component analysis with least squares fitting for extracting the primitives of road centerlines, and (3) hierarchical grouping for connecting primitives into a complete road network. Compared with MTH (consisting of the Mean shift algorithm, Tensor voting, and the Hough transform) proposed in our previous article, this method greatly reduces the computational cost. To evaluate the proposed method, the Vaihingen data set, a benchmark provided by ISPRS for the "Urban Classification and 3D Building Reconstruction" project, was selected. The experimental results show that our method achieves the same performance in less time for road extraction using LiDAR data.
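
    As an illustration of step (2), the minimal sketch below fits a local line primitive to a cluster of candidate road centre points using principal component analysis, which amounts to an orthogonal least-squares line fit; the synthetic point cluster is an assumption for demonstration.

        # Minimal sketch of fitting a local line primitive to road centre points
        # with PCA (equivalent to a total-least-squares line fit). Inputs are toy data.
        import numpy as np

        def fit_line_primitive(points):
            """points: (N, 2) array of road centre candidates in one local neighbourhood.
            Returns the centroid and the unit direction of the dominant axis."""
            centroid = points.mean(axis=0)
            centered = points - centroid
            # The eigenvector of the covariance matrix with the largest eigenvalue
            # gives the orthogonal least-squares line direction.
            cov = centered.T @ centered / len(points)
            eigvals, eigvecs = np.linalg.eigh(cov)
            direction = eigvecs[:, np.argmax(eigvals)]
            return centroid, direction

        rng = np.random.default_rng(1)
        t = rng.uniform(-10, 10, size=200)
        pts = np.c_[t, 0.3 * t] + rng.normal(scale=0.2, size=(200, 2))  # noisy straight road
        centroid, direction = fit_line_primitive(pts)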

  15. An Efficient Method for Automatic Road Extraction Based on Multiple Features from LiDAR Data

    Science.gov (United States)

    Li, Y.; Hu, X.; Guan, H.; Liu, P.

    2016-06-01

    Road extraction in urban areas is a difficult task due to the complicated patterns and many contextual objects. LiDAR data directly provides three dimensional (3D) points with fewer occlusions and smaller shadows. The elevation information and surface roughness are distinguishing features for separating roads. However, LiDAR data also has some characteristics that are not beneficial to object extraction, such as the irregular distribution of point clouds and the lack of clear road edges. To address these problems, this paper proposes an automatic road centerline extraction method with three major steps: (1) road center point detection based on multiple-feature spatial clustering for separating road points from ground points, (2) local principal component analysis with least squares fitting for extracting the primitives of road centerlines, and (3) hierarchical grouping for connecting primitives into a complete road network. Compared with MTH (consisting of the Mean shift algorithm, Tensor voting, and the Hough transform) proposed in our previous article, this method greatly reduces the computational cost. To evaluate the proposed method, the Vaihingen data set, a benchmark provided by ISPRS for the "Urban Classification and 3D Building Reconstruction" project, was selected. The experimental results show that our method achieves the same performance in less time for road extraction using LiDAR data.

  16. A Multi-stage Method to Extract Road from High Resolution Satellite Image

    International Nuclear Information System (INIS)

    Zhijian, Huang; Zhang, Jinfang; Xu, Fanjiang

    2014-01-01

    Extracting road information from high-resolution satellite images is complex and can hardly be achieved by exploiting only one or two modules. This paper presents a multi-stage method consisting of automatic information extraction and semi-automatic post-processing. The Multi-scale Enhancement algorithm enlarges the contrast between human-made structures and the background. Statistical Region Merging segments the images into regions, whose skeletons are extracted and pruned according to geometric shape information. Given the start and end skeleton points, the shortest skeleton path is constructed as a road centre line. The Bidirectional Adaptive Smoothing technique smooths the road centre line and adjusts it to the correct position. With the smoothed line and its average width, a buffer algorithm reconstructs the road region easily. As the final results show, the proposed method eliminates redundant non-road regions, repairs incomplete occlusions, jumps over complete occlusions, and preserves accurate road centre lines and neat road regions. During the whole process, only a few interactions are needed
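
    The sketch below illustrates the shortest-skeleton-path step in isolation: given a binary skeleton and user-selected start and end pixels, a breadth-first search over 8-connected skeleton pixels returns the shortest path, which can serve as a road centre line. The toy skeleton is an illustrative assumption.

        # Minimal sketch of the shortest-skeleton-path step: BFS over 8-connected
        # skeleton pixels between user-selected start and end points. Toy input only.
        from collections import deque

        def shortest_skeleton_path(skeleton, start, end):
            """skeleton: 2D grid of 0/1 values; start, end: (row, col) skeleton pixels."""
            rows, cols = len(skeleton), len(skeleton[0])
            prev = {start: None}
            queue = deque([start])
            while queue:
                r, c = queue.popleft()
                if (r, c) == end:
                    break
                for dr in (-1, 0, 1):
                    for dc in (-1, 0, 1):
                        nr, nc = r + dr, c + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and skeleton[nr][nc] and (nr, nc) not in prev):
                            prev[(nr, nc)] = (r, c)
                            queue.append((nr, nc))
            if end not in prev:
                return None                     # start and end are not connected
            path, node = [], end
            while node is not None:             # walk predecessors back to the start
                path.append(node)
                node = prev[node]
            return path[::-1]

        skel = [[0, 1, 0], [0, 1, 0], [0, 1, 1]]
        path = shortest_skeleton_path(skel, (0, 1), (2, 2))   # [(0, 1), (1, 1), (2, 2)]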

  17. Effect of DNA extraction methods and sampling techniques on the apparent structure of cow and sheep rumen microbial communities.

    Directory of Open Access Journals (Sweden)

    Gemma Henderson

    Full Text Available Molecular microbial ecology techniques are widely used to study the composition of the rumen microbiota and to increase understanding of the roles they play. Therefore, sampling and DNA extraction methods that result in adequate yields of microbial DNA that also accurately represents the microbial community are crucial. Fifteen different methods were used to extract DNA from cow and sheep rumen samples. The DNA yield and quality, and its suitability for downstream PCR amplifications varied considerably, depending on the DNA extraction method used. DNA extracts from nine extraction methods that passed these first quality criteria were evaluated further by quantitative PCR enumeration of microbial marker loci. Absolute microbial numbers, determined on the same rumen samples, differed by more than 100-fold, depending on the DNA extraction method used. The apparent compositions of the archaeal, bacterial, ciliate protozoal, and fungal communities in identical rumen samples were assessed using 454 Titanium pyrosequencing. Significant differences in microbial community composition were observed between extraction methods, for example in the relative abundances of members of the phyla Bacteroidetes and Firmicutes. Microbial communities in parallel samples collected from cows by oral stomach-tubing or through a rumen fistula, and in liquid and solid rumen digesta fractions, were compared using one of the DNA extraction methods. Community representations were generally similar, regardless of the rumen sampling technique used, but significant differences in the abundances of some microbial taxa such as the Clostridiales and the Methanobrevibacter ruminantium clade were observed. The apparent microbial community composition differed between rumen sample fractions, and Prevotellaceae were most abundant in the liquid fraction. DNA extraction methods that involved phenol-chloroform extraction and mechanical lysis steps tended to be more comparable. However

  18. Development, validation and evaluation of an analytical method for the determination of monomeric and oligomeric procyanidins in apple extracts.

    Science.gov (United States)

    Hollands, Wendy J; Voorspoels, Stefan; Jacobs, Griet; Aaby, Kjersti; Meisland, Ane; Garcia-Villalba, Rocio; Tomas-Barberan, Francisco; Piskula, Mariusz K; Mawson, Deborah; Vovk, Irena; Needs, Paul W; Kroon, Paul A

    2017-04-28

    There is a lack of data for individual oligomeric procyanidins in apples and apple extracts. Our aim was to develop, validate and evaluate an analytical method for the separation, identification and quantification of monomeric and oligomeric flavanols in apple extracts. To achieve this, we prepared two types of flavanol extracts from freeze-dried apples: one was an epicatechin-rich extract containing ∼30% (w/w) monomeric (-)-epicatechin, which also contained oligomeric procyanidins (Extract A); the second was an oligomeric procyanidin-rich extract depleted of epicatechin (Extract B). The parameters considered for method optimisation were HPLC columns and conditions, sample heating, mass of extract and dilution volumes. The performance characteristics considered for method validation included standard linearity, method sensitivity, precision and trueness. Eight laboratories participated in the method evaluation. Chromatographic separation of the analytes was best achieved using a HILIC column with a binary mobile phase consisting of acidic acetonitrile and acidic aqueous methanol. The final method showed linearity for epicatechin in the range 5-100 μg/mL with a correlation coefficient >0.999. Intra-day and inter-day precision of the analytes ranged from 2 to 6% and 2 to 13%, respectively. Up to dp3, trueness of the method was >95% but decreased with increasing dp. Within-laboratory precision showed RSD values <5 and 10% for monomers and oligomers, respectively. Between-laboratory precision was 4 and 15% (Extract A) and 7 and 30% (Extract B) for monomers and oligomers, respectively. An analytical method for the separation, identification and quantification of procyanidins in an apple extract was developed, validated and assessed. The results of the inter-laboratory evaluation indicate that the method is reliable and reproducible. Copyright © 2017. Published by Elsevier B.V.
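
    For readers unfamiliar with the validation terms used above, the short sketch below shows how calibration linearity (correlation coefficient) and precision (relative standard deviation, RSD) are typically computed; the numbers are toy values, not the study's data.

        # Minimal sketch of the validation arithmetic: calibration linearity
        # (correlation coefficient) and precision as RSD (%). Toy values only.
        import numpy as np

        def correlation_coefficient(conc, response):
            """Pearson r for a calibration line (linearity check)."""
            return np.corrcoef(conc, response)[0, 1]

        def rsd_percent(replicates):
            """Precision: sample standard deviation as a percentage of the mean."""
            replicates = np.asarray(replicates, dtype=float)
            return 100.0 * replicates.std(ddof=1) / replicates.mean()

        conc = np.array([5, 10, 25, 50, 100])                 # epicatechin, ug/mL
        resp = np.array([0.9, 2.1, 5.0, 10.2, 19.8])          # detector response (toy)
        r = correlation_coefficient(conc, resp)               # acceptance criterion: >0.999
        intra_day = rsd_percent([98.2, 101.5, 99.8, 100.9])   # replicate results (toy)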

  19. Chitin Extraction from Crustacean Shells Using Biological Methods – A Review

    Directory of Open Access Journals (Sweden)

    Wassila Arbia

    2013-01-01

    Full Text Available After cellulose, chitin is the most widespread biopolymer in nature. Chitin and its derivatives have great economic value because of their biological activities and their industrial and biomedical applications. It can be extracted from three sources, namely crustaceans, insects and microorganisms. However, the main commercial sources of chitin are shells of crustaceans such as shrimps, crabs, lobsters and krill that are supplied in large quantities by the shellfish processing industries. Extraction of chitin involves two steps, demineralisation and deproteinisation, which can be conducted by two methods, chemical or biological. The chemical method requires the use of acids and bases, while the biological method involves microorganisms. Although lactic acid bacteria are mainly applied, other microbial species including proteolytic bacteria have also been successfully implemented, as well as mixed cultures involving lactic acid-producing bacteria and proteolytic microorganisms. The produced lactic acid allows shell demineralisation, since lactic acid reacts with calcium carbonate, the main mineral component, to form calcium lactate.

  20. [Comparative study of different extraction methods and assays of tannins in some pteridophytes].

    Science.gov (United States)

    Laurent, S

    1975-10-01

    Various processes for the extraction and quantitative analysis of a condensed tannin in a plant extract that also contains some chlorogenic acids have been examined. 60% methanol at 50 degrees C proved the most efficient extraction solvent. Several methods of analysis have been tried. Measuring the colour intensity produced by the action of sulphuric vanillin on flavanols cannot be used because it depends on the degree of condensation of the tannin. It is impossible to separate the tannin from the chlorogenic acids using adsorption on skin (hide) or nylon powders, or precipitation by polyvinylpyrrolidone. Only paper chromatography, followed by separate elution of the various phenolic compounds, allows the tannin to be evaluated by subtraction; but owing to the variability of the results, many more experiments are necessary. Some other processes are being studied.