WorldWideScience

Sample records for machine translation approach

  1. A MOOC on Approaches to Machine Translation

    Science.gov (United States)

    Costa-jussà, Marta R.; Formiga, Lluís; Torrillas, Oriol; Petit, Jordi; Fonollosa, José A. R.

    2015-01-01

    This paper describes the design, development, and analysis of a MOOC entitled "Approaches to Machine Translation: Rule-based, statistical and hybrid", and provides lessons learned and conclusions to be taken into account in the future. The course was developed within the Canvas platform, used by recognized European universities. It…

  2. Machine Translation

    Indian Academy of Sciences (India)

    Research MT System Example: The 'Janus' Translating Phone Project. The Janus ... based on laptops, and simultaneous translation of two speakers in a dialogue. For more ..... The current focus in MT research is on using machine learning.

  3. English to Sanskrit Machine Translation Using Transfer Based approach

    Science.gov (United States)

    Pathak, Ganesh R.; Godse, Sachin P.

    2010-11-01

    Translation is one of the needs of a global society for communicating the thoughts and ideas of one country to another. Translation is the process of interpreting the meaning of a text and producing an equivalent text that communicates the same message in another language. In this paper we give detailed information on how to convert source-language text into target-language text using the Transfer Based Approach for machine translation, and we implement an English to Sanskrit machine translator using this approach. English is a global language used for business and communication, but a large part of the population in India does not use or understand English. Sanskrit is an ancient language of India, and most Indian languages are derived from it, so Sanskrit can act as an intermediate language for multilingual translation.
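
    A minimal sketch of the transfer-based pipeline described above (analysis of the source sentence, structural transfer, then generation in the target language). The toy lexicon, the assumption of a simple Subject-Verb-Object input, and the single SVO-to-SOV reordering rule are illustrative assumptions, not the authors' actual resources.

        # Minimal transfer-based translation sketch (illustrative only).
        # Assumes a toy English-to-Sanskrit lexicon (romanized) and a single
        # reordering rule (English SVO -> Sanskrit SOV); the real system uses
        # full morphological analysis and generation.

        LEXICON = {
            "rama": "ramah",
            "reads": "pathati",
            "book": "pustakam",
        }

        def analyze(sentence):
            """Analysis: tokenize and tag a simple Subject-Verb-Object sentence."""
            tokens = [t for t in sentence.lower().split() if t not in ("a", "the")]
            subject, verb, obj = tokens          # naive: assumes exactly S V O
            return {"S": subject, "V": verb, "O": obj}

        def transfer(parse):
            """Transfer: map lexical items and reorder SVO -> SOV."""
            return [LEXICON[parse["S"]], LEXICON[parse["O"]], LEXICON[parse["V"]]]

        def generate(words):
            """Generation: realize the target-language surface string."""
            return " ".join(words)

        print(generate(transfer(analyze("Rama reads a book"))))   # ramah pustakam pathati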

  4. Machine translation

    Energy Technology Data Exchange (ETDEWEB)

    Nagao, M

    1982-04-01

    Each language has its own structure. In translating one language into another one, language attributes and grammatical interpretation must be defined in an unambiguous form. In order to parse a sentence, it is necessary to recognize its structure. A so-called context-free grammar can help in this respect for machine translation and machine-aided translation. Problems to be solved in studying machine translation are taken up in the paper, which discusses subjects for semantics and for syntactic analysis and translation software. 14 references.

  5. A Character Level Based and Word Level Based Approach for Chinese-Vietnamese Machine Translation

    Directory of Open Access Journals (Sweden)

    Phuoc Tran

    2016-01-01

    Chinese and Vietnamese are both isolating languages; that is, words are not delimited by spaces. In machine translation, word segmentation is often done first when translating from Chinese or Vietnamese into other languages (typically English) and vice versa. However, it is a matter for consideration whether words should be segmented when translating between two languages in which spaces are not used between words, such as Chinese and Vietnamese. Since Chinese-Vietnamese is a low-resource language pair, the sparse data problem is evident in the translation system of this language pair. Therefore, whether or not to segment becomes even more important. In this paper, we propose a new method for translating Chinese to Vietnamese based on a combination of the advantages of character level and word level translation. In addition, a hybrid approach that combines statistics and rules is used at the word level, while a statistical approach is used at the character level. The experimental results showed that our method improved the performance of machine translation over that of character or word level translation.

  6. A Character Level Based and Word Level Based Approach for Chinese-Vietnamese Machine Translation.

    Science.gov (United States)

    Tran, Phuoc; Dinh, Dien; Nguyen, Hien T

    2016-01-01

    Chinese and Vietnamese are both isolating languages; that is, words are not delimited by spaces. In machine translation, word segmentation is often done first when translating from Chinese or Vietnamese into other languages (typically English) and vice versa. However, it is a matter for consideration whether words should be segmented when translating between two languages in which spaces are not used between words, such as Chinese and Vietnamese. Since Chinese-Vietnamese is a low-resource language pair, the sparse data problem is evident in the translation system of this language pair. Therefore, whether or not to segment becomes even more important. In this paper, we propose a new method for translating Chinese to Vietnamese based on a combination of the advantages of character level and word level translation. In addition, a hybrid approach that combines statistics and rules is used at the word level, while a statistical approach is used at the character level. The experimental results showed that our method improved the performance of machine translation over that of character or word level translation.
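
    A minimal sketch of the two input granularities the paper combines: the same Chinese sentence prepared for character-level translation (one token per Han character, no segmenter needed) and for word-level translation (tokens follow a segmenter's output). The example sentence and its segmentation are assumed for illustration.

        # Prepare a Chinese sentence at character level and at word level.
        sentence = "我喜欢学习越南语"                          # "I like studying Vietnamese"
        word_segmentation = ["我", "喜欢", "学习", "越南语"]    # assumed segmenter output

        def char_level(text):
            """Character-level tokens: one token per character."""
            return list(text)

        def word_level(segments):
            """Word-level tokens: rely on an external word segmenter."""
            return list(segments)

        print(" ".join(char_level(sentence)))           # 我 喜 欢 学 习 越 南 语
        print(" ".join(word_level(word_segmentation)))  # 我 喜欢 学习 越南语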

  7. Machine Translation and Other Translation Technologies.

    Science.gov (United States)

    Melby, Alan

    1996-01-01

    Examines the application of linguistic theory to machine translation and translator tools, discusses the use of machine translation and translator tools in the real world of translation, and addresses the impact of translation technology on conceptions of language and other issues. Findings indicate that the human mind is flexible and linguistic…

  8. Machine Translation from Text

    Science.gov (United States)

    Habash, Nizar; Olive, Joseph; Christianson, Caitlin; McCary, John

    Machine translation (MT) from text, the topic of this chapter, is perhaps the heart of the GALE project. Beyond being a well defined application that stands on its own, MT from text is the link between the automatic speech recognition component and the distillation component. The focus of MT in GALE is on translating from Arabic or Chinese to English. The three languages represent a wide range of linguistic diversity and make the GALE MT task rather challenging and exciting.

  9. Machine Translation Effect on Communication

    DEFF Research Database (Denmark)

    Jensen, Mika Yasuoka; Bjørn, Pernille

    2011-01-01

    Intercultural collaboration facilitated by machine translation has gradually spread in various settings. Still, little is known about the practice of machine-translation mediated communication. This paper investigates how machine translation affects intercultural communication in practice. Based...... on communication in which a multilingual communication system is applied, we identify four communication types and their influences on stakeholders' communication processes, especially focusing on the establishment and maintenance of common ground. Contrary to our expectation that the quality of machine translation results...

  10. Automatic Evaluation of Machine Translation

    DEFF Research Database (Denmark)

    Martinez, Mercedes Garcia; Koglin, Arlene; Mesa-Lao, Bartolomé

    2015-01-01

    The availability of systems capable of producing fairly accurate translations has increased the popularity of machine translation (MT). The translation industry is steadily incorporating MT in their workflows engaging the human translator to post-edit the raw MT output in order to comply with a s...
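
    Automatic metrics of the kind referred to here score MT output against reference translations. The sketch below computes a simplified sentence-level BLEU (modified n-gram precision up to 4-grams with a brevity penalty and add-one smoothing); it is a didactic approximation, not the exact metric implementation used in such studies.

        import math
        from collections import Counter

        def ngrams(tokens, n):
            return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

        def sentence_bleu(candidate, reference, max_n=4):
            """Simplified BLEU: geometric mean of modified n-gram precisions
            times a brevity penalty (single reference, add-one smoothing)."""
            cand, ref = candidate.split(), reference.split()
            log_precisions = []
            for n in range(1, max_n + 1):
                cand_counts, ref_counts = Counter(ngrams(cand, n)), Counter(ngrams(ref, n))
                overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
                total = max(len(cand) - n + 1, 0)
                log_precisions.append(math.log((overlap + 1) / (total + 1)))
            brevity = min(1.0, math.exp(1 - len(ref) / max(len(cand), 1)))
            return brevity * math.exp(sum(log_precisions) / max_n)

        print(round(sentence_bleu("the cat sat on the mat", "the cat is on the mat"), 3))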

  11. Machine Translation for Academic Purposes

    Science.gov (United States)

    Lin, Grace Hui-chin; Chien, Paul Shih Chieh

    2009-01-01

    Due to the globalization trend and the knowledge boost in the second millennium, multi-lingual translation has become a noteworthy issue. For the purpose of learning knowledge in academic fields, Machine Translation (MT) should be noticed not only academically but also practically. MT should be introduced to translation learners because it is a…

  12. Treatment of Markup in Statistical Machine Translation

    OpenAIRE

    Müller, Mathias

    2017-01-01

    We present work on handling XML markup in Statistical Machine Translation (SMT). The methods we propose can be used to effectively preserve markup (for instance inline formatting or structure) and to place markup correctly in a machine-translated segment. We evaluate our approaches with parallel data that naturally contains markup or where markup was inserted to create synthetic examples. In our experiments, hybrid reinsertion has proven the most accurate method to handle markup, while alignm...
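
    One of the simpler strategies compared in work like this can be sketched directly: inline tags are removed before translation and reinserted afterwards around the target tokens that a word alignment maps them to. The toy sentence, the German translation, and the alignment table below are assumptions for illustration, not the thesis's exact hybrid reinsertion method.

        import re

        def strip_markup(source):
            """Remove inline tags, remembering which token index each tag wrapped."""
            tags, plain = {}, []
            for tok in source.split():
                m = re.fullmatch(r"<(\w+)>(.+)</\1>", tok)
                if m:
                    tags[len(plain)] = m.group(1)   # tag attaches to this token index
                    plain.append(m.group(2))
                else:
                    plain.append(tok)
            return plain, tags

        def reinsert_markup(target_tokens, tags, alignment):
            """Wrap each aligned target token with the tag of its source token."""
            out = list(target_tokens)
            for src_idx, tag in tags.items():
                tgt_idx = alignment.get(src_idx)
                if tgt_idx is not None:
                    out[tgt_idx] = "<{0}>{1}</{0}>".format(tag, out[tgt_idx])
            return " ".join(out)

        plain, tags = strip_markup("press the <b>red</b> button")
        translated = ["drücken", "Sie", "den", "roten", "Knopf"]   # assumed MT output
        alignment = {0: 0, 1: 2, 2: 3, 3: 4}                       # source index -> target index
        print(reinsert_markup(translated, tags, alignment))        # drücken Sie den <b>roten</b> Knopf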

  13. Machine Translation - A Gentle Introduction

    Indian Academy of Sciences (India)

    Durgesh D Rao. General Article, Resonance – Journal of Science Education, Volume 3, Issue 7, July 1998, pp. 61-70. Permanent link: https://www.ias.ac.in/article/fulltext/reso/003/07/0061-0070

  14. Parsing statistical machine translation output

    NARCIS (Netherlands)

    Carter, S.; Monz, C.; Vetulani, Z.

    2009-01-01

    Despite increasing research into the use of syntax during statistical machine translation, the incorporation of syntax into language models has seen limited success. We present a study of the discriminative abilities of generative syntax-based language models, over and above standard n-gram models,

  15. MSD Recombination Method in Statistical Machine Translation

    Science.gov (United States)

    Gros, Jerneja Žganec

    2008-11-01

    Freely available tools and language resources were used to build the VoiceTRAN statistical machine translation (SMT) system. Various configuration variations of the system are presented and evaluated. The VoiceTRAN SMT system outperformed the baseline conventional rule-based MT system in all English-Slovenian in-domain test setups. To further increase the generalization capability of the translation model for lower-coverage out-of-domain test sentences, an "MSD-recombination" approach was proposed. This approach not only allows a better exploitation of conventional translation models, but also performs well in the more demanding translation direction; that is, into a highly inflectional language. Using this approach in the out-of-domain setup of the English-Slovenian JRC-ACQUIS task, we have achieved significant improvements in translation quality.

  16. Rule-based machine translation for Aymara

    NARCIS (Netherlands)

    Coler, Matthew; Homola, Petr; Jones, Mari

    2014-01-01

    This paper presents the ongoing result of an approach developed by the collaboration of a computational linguist with a field linguist that addresses one of the oft-overlooked keys to language maintenance: the development of modern language-learning tools. Although machine translation isn’t commonly

  17. Typologically robust statistical machine translation : Understanding and exploiting differences and similarities between languages in machine translation

    NARCIS (Netherlands)

    Daiber, J.

    2018-01-01

    Machine translation systems often incorporate modeling assumptions motivated by properties of the language pairs they initially target. When such systems are applied to language families with considerably different properties, translation quality can deteriorate. Phrase-based machine translation

  18. Machine Translation in Post-Contemporary Era

    Science.gov (United States)

    Lin, Grace Hui Chin

    2010-01-01

    This article, which focuses on translation techniques via personal computer or laptop, reports on artificial intelligence progress up to 2010. Based on interpretations and information about the field of MT [Machine Translation] from Yorick Wilks' book, "Machine Translation, Its Scope and Limits," this paper presents understandable theoretical frameworks…

  19. Machine Translation Tools - Tools of The Translator's Trade

    DEFF Research Database (Denmark)

    Kastberg, Peter

    2012-01-01

    In this article three of the more common types of translation tools are presented, discussed and critically evaluated. The types of translation tools dealt with in this article are: Fully Automated Machine Translation (or FAMT), Human Aided Machine Translation (or HAMT) and Machine Aided Human...... Translation (or MAHT). The strengths and weaknesses of the different types of tools are discussed and evaluated by means of a number of examples. The article aims at two things: at presenting a sort of state of the art of what is commonly referred to as “machine translation” as well as at providing the reader...... with a sound basis for considering what translation tool (if any) is the most appropriate in order to meet his or her specific translation needs....

  20. Quantum neural network based machine translator for Hindi to English.

    Science.gov (United States)

    Narayan, Ravi; Singh, V P; Chakraverty, S

    2014-01-01

    This paper presents a machine learning based machine translation system for Hindi to English, which learns from a semantically correct corpus. A quantum neural pattern recognizer is used to recognize and learn the patterns of the corpus, using part-of-speech information for each word, much as a human would. The system performs machine translation using the knowledge gained during learning from pairs of Devnagri-Hindi and English sentences. To analyze the effectiveness of the proposed approach, 2600 sentences were evaluated during simulation and evaluation. The accuracy achieved is 0.7502 on the BLEU score, 6.5773 on the NIST score, 0.9233 on the ROUGE-L score, and 0.5456 on the METEOR score, which is significantly higher in comparison with Google Translate and Bing Translator for Hindi to English machine translation.

  1. Machine vs. human translation of SNOMED CT terms.

    Science.gov (United States)

    Schulz, Stefan; Bernhardt-Melischnig, Johannes; Kreuzthaler, Markus; Daumke, Philipp; Boeker, Martin

    2013-01-01

    In the context of past and current SNOMED CT translation projects we compare three kinds of SNOMED CT translations from English to German by: (t1) professional medical translators; (t2) a free Web-based machine translation service; (t3) medical students. 500 SNOMED CT fully specified names from the (English) International release were randomly selected. Based on this, German translations t1, t2, and t3 were generated. A German and an Austrian physician rated the translations for linguistic correctness and content fidelity. Kappa for inter-rater reliability was 0.4 for linguistic correctness and 0.23 for content fidelity. Average ratings of linguistic correctness did not differ significantly between human translation scenarios. Content fidelity was rated slightly better for student translators compared to professional translators. Comparing machine to human translation, the linguistic correctness differed about 0.5 scale units in favour of the human translation and about 0.25 regarding content fidelity, equally in favour of the human translation. The results demonstrate that low-cost translation solutions of medical terms may produce surprisingly good results. Although we would not recommend low-cost translation for producing standardized preferred terms, this approach can be useful for creating additional language-specific entry terms. This may serve several important use cases. We also recommend testing this method to bootstrap a crowdsourcing process, by which term translations are gathered, improved, maintained, and rated by the user community.

  2. Dictionary Based Machine Translation from Kannada to Telugu

    Science.gov (United States)

    Sindhu, D. V.; Sagar, B. M.

    2017-08-01

    Machine translation is the task of translating from one language to another. For languages with limited linguistic resources, such as Kannada and Telugu, a dictionary-based approach is the most suitable. This paper focuses on dictionary-based machine translation from Kannada to Telugu. The proposed methodology uses a dictionary to translate word by word, without much correlation of semantics between them. The dictionary-based machine translation process has the following sub-processes: morphological analyzer, dictionary, transliteration, transfer grammar, and morphological generator. As a part of this work, a bilingual dictionary with 8000 entries was developed and a suffix mapping table at the tag level was built. The system was tested on children's stories. In the near future, the system can be further improved by defining transfer grammar rules.
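
    A minimal sketch of the word-by-word, dictionary-based pipeline described above: each source word is looked up in a bilingual dictionary and, when no entry exists, passed to a transliteration fallback. The romanized Kannada-Telugu entries are placeholders, not the 8000-entry dictionary built for the actual system, and the morphological analysis and generation steps are omitted.

        # Dictionary-based, word-by-word translation with a transliteration fallback.
        BILINGUAL_DICT = {          # hypothetical romanized Kannada -> Telugu entries
            "pustaka": "pustakam",  # book
            "mane": "illu",         # house
            "neeru": "neellu",      # water
        }

        def transliterate(word):
            """Fallback for out-of-vocabulary words (names, borrowings);
            passed through unchanged in this sketch."""
            return word

        def translate(sentence):
            return " ".join(BILINGUAL_DICT.get(w, transliterate(w))
                            for w in sentence.lower().split())

        print(translate("pustaka mane Bangalore"))   # pustakam illu bangalore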

  3. Translation Analysis on Civil Engineering Text Produced by Machine Translator

    Directory of Open Access Journals (Sweden)

    Sutopo Anam

    2018-01-01

    Translation is much needed in communication, since people have serious problems with the languages used. Translation is usually done by a person in charge of translating the material, but it can also be done by a machine. This is called machine translation, reflected in programs developed by programmers; one of them is Transtool. Many people have used Transtool to help them solve problems related to translation activities. This paper discusses how important the Transtool program is, how effective it is, and what its function is for human business. This study applies qualitative research. The sources of data were documents and informants. The study used documentation and in-depth interviewing as the techniques for collecting data, and the collected data were analyzed using interactive analysis. The results of the study show that, first, the Transtool program is helpful for people in translating civil engineering texts and functions as an aid or helper; second, the working of the Transtool software program is effective enough; and third, the translations produced by Transtool are good for short and simple sentences but not readable, not understandable and not accurate for long sentences (compound, complex and compound-complex), though the results are informative. The translated material must be edited by a professional translator.

  4. Translation Analysis on Civil Engineering Text Produced by Machine Translator

    Science.gov (United States)

    Sutopo, Anam

    2018-02-01

    Translation is much needed in communication, since people have serious problems with the languages used. Translation is usually done by a person in charge of translating the material, but it can also be done by a machine. This is called machine translation, reflected in programs developed by programmers; one of them is Transtool. Many people have used Transtool to help them solve problems related to translation activities. This paper discusses how important the Transtool program is, how effective it is, and what its function is for human business. This study applies qualitative research. The sources of data were documents and informants. The study used documentation and in-depth interviewing as the techniques for collecting data, and the collected data were analyzed using interactive analysis. The results of the study show that, first, the Transtool program is helpful for people in translating civil engineering texts and functions as an aid or helper; second, the working of the Transtool software program is effective enough; and third, the translations produced by Transtool are good for short and simple sentences but not readable, not understandable and not accurate for long sentences (compound, complex and compound-complex), though the results are informative. The translated material must be edited by a professional translator.

  5. Using the TED Talks to Evaluate Spoken Post-editing of Machine Translation

    DEFF Research Database (Denmark)

    Liyanapathirana, Jeevanthi; Popescu-Belis, Andrei

    2016-01-01

    This paper presents a solution to evaluate spoken post-editing of imperfect machine translation output by a human translator. We compare two approaches to the combination of machine translation (MT) and automatic speech recognition (ASR): a heuristic algorithm and a machine learning method...

  6. Translating DVD Subtitles English-German, English-Japanese, Using Example-based Machine Translation

    DEFF Research Database (Denmark)

    Armstrong, Stephen; Caffrey, Colm; Flanagan, Marian

    2006-01-01

    Due to limited budgets and an ever-diminishing time-frame for the production of subtitles for movies released in cinema and DVD, there is a compelling case for a technology-based translation solution for subtitles. In this paper we describe how an Example-Based Machine Translation (EBMT) approach...... to the translation of English DVD subtitles into German and Japanese can aid the subtitler. Our research focuses on an EBMT tool that produces fully automated translations, which in turn can be edited if required. To our knowledge this is the first time that any EBMT approach has been used with DVD subtitle...

  7. Grammatical Metaphor, Controlled Language and Machine Translation

    DEFF Research Database (Denmark)

    Møller, Margrethe

    2003-01-01

    It is a general assumption that 1) the readability and clarity of LSP texts written in a controlled language are better than uncontrolled texts and 2) that controlled languages produce better results with machine translation than uncontrolled languages. Controlled languages impose lexical...

  8. Using Linguistic Knowledge in Statistical Machine Translation

    Science.gov (United States)

    2010-09-01

    reproduced in (Belnap and Haeri, 1997), a sociolinguistic phenomenon where the literary standard differs considerably from the vernacular varieties... Machine Translation Summit (MT-Summit). N. Haeri. 2000. Form and ideology: Arabic sociolinguistics and beyond. Annual Review of Anthropology, 29. D. Hakkani

  9. A Survey of Statistical Machine Translation

    Science.gov (United States)

    2007-04-01

    methods are notoriously sensitive to domain differences, however, so the move to informal text is likely to present many interesting challenges ... Och, Christoph Tillman, and Hermann Ney. Improved alignment models for statistical machine translation. In Proc. of EMNLP-VLC, pages 20–28, Jun 1999

  10. An analysis of machine translation and speech synthesis in speech-to-speech translation system

    OpenAIRE

    Hashimoto, K.; Yamagishi, J.; Byrne, W.; King, S.; Tokuda, K.

    2011-01-01

    This paper provides an analysis of the impacts of machine translation and speech synthesis on speech-to-speech translation systems. The speech-to-speech translation system consists of three components: speech recognition, machine translation and speech synthesis. Many techniques for integration of speech recognition and machine translation have been proposed. However, speech synthesis has not yet been considered. Therefore, in this paper, we focus on machine translation and speech synthesis, ...

  11. Findings of the 2010 Joint Workshop on Statistical Machine Translation and Metrics for Machine Translation

    NARCIS (Netherlands)

    Callison-Burch, C.; Koehn, P.; Monz, C.; Peterson, K.; Przybocki, M.; Zaidan, O.F.

    2010-01-01

    This paper presents the results of the WMT10 and MetricsMATR10 shared tasks, which included a translation task, a system combination task, and an evaluation task. We conducted a large-scale manual evaluation of 104 machine translation systems and 41 system combination entries. We used the ranking of

  12. Latent domain models for statistical machine translation

    NARCIS (Netherlands)

    Hoàng, C.

    2017-01-01

    A data-driven approach to model translation suffers from the data mismatch problem and demands domain adaptation techniques. Given parallel training data originating from a specific domain, training an MT system on the data would result in a rather suboptimal translation for other domains. But does

  13. The Impact of Machine Translation and Computer-aided Translation on Translators

    Science.gov (United States)

    Peng, Hao

    2018-03-01

    Under the context of globalization, communication between countries and cultures is becoming increasingly frequent, which makes it imperative to use techniques that help with translation. This paper explores the influence of computer-aided translation (CAT) and machine translation (MT) on translators. Following an introduction to the development of machine and computer-aided translation, it describes the technologies available to translators, analyzes the demands placed on the design of computer-aided translation in translation practice, considers how the design of CAT techniques can be optimized, and analyzes their operability in translation. The findings underline the advantages and disadvantages of MT and CAT tools, their serviceability, and the future development of MT and CAT technologies. Finally, the paper probes into the impact of these new technologies on translators, in the hope that more translators and translation researchers will learn to use such tools to improve their productivity.

  14. Transliteration normalization for Information Extraction and Machine Translation

    Directory of Open Access Journals (Sweden)

    Yuval Marton

    2014-12-01

    Foreign name transliterations typically include multiple spelling variants. These variants cause data sparseness and inconsistency problems, increase the Out-of-Vocabulary (OOV) rate, and present challenges for Machine Translation, Information Extraction and other natural language processing (NLP) tasks. This work aims to identify and cluster name spelling variants using a Statistical Machine Translation method: word alignment. The variants are identified by being aligned to the same "pivot" name in another language (the source language in Machine Translation settings). Based on word-to-word translation and transliteration probabilities, as well as the string edit distance metric, names with similar spellings in the target language are clustered and then normalized to a canonical form. With this approach, tens of thousands of high-precision name transliteration spelling variants are extracted from sentence-aligned bilingual corpora in Arabic and English (in both languages). When these normalized name spelling variants are applied to Information Extraction tasks, improvements over strong baseline systems are observed. When applied to Machine Translation tasks, a large improvement potential is shown.
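
    The clustering step described above can be sketched as follows: target-language spellings aligned to the same pivot name are grouped, filtered by string edit distance, and normalized to the most frequent spelling. The alignment counts below are invented for illustration.

        from collections import Counter

        def edit_distance(a, b):
            """Plain Levenshtein distance."""
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, 1):
                cur = [i]
                for j, cb in enumerate(b, 1):
                    cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
                prev = cur
            return prev[-1]

        def normalize_variants(aligned_spellings, max_dist=2):
            """aligned_spellings: pivot name -> Counter of target-language spellings
            (e.g. obtained from word alignment). Returns variant -> canonical form."""
            mapping = {}
            for pivot, counts in aligned_spellings.items():
                canonical = counts.most_common(1)[0][0]          # most frequent spelling
                for variant in counts:
                    if edit_distance(variant, canonical) <= max_dist:
                        mapping[variant] = canonical
            return mapping

        aligned = {"قذافي": Counter({"Gaddafi": 40, "Qaddafi": 12, "Gadhafi": 7})}   # invented counts
        print(normalize_variants(aligned))
        # {'Gaddafi': 'Gaddafi', 'Qaddafi': 'Gaddafi', 'Gadhafi': 'Gaddafi'}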

  15. Findings of the 2014 Workshop on Statistical Machine Translation

    NARCIS (Netherlands)

    Bojar, O.; Buck, C.; Federmann, C.; Haddow, B.; Koehn, P.; Leveling, J.; Monz, C.; Pecina, P.; Post, M.; Saint-Amand, H.; Soricut, R.; Specia, L.; Tamchyna, A.

    2014-01-01

    This paper presents the results of the WMT14 shared tasks, which included a standard news translation task, a separate medical translation task, a task for run-time estimation of machine translation quality, and a metrics task. This year, 143 machine translation systems from 23 institutions were

  16. An Overall Perspective of Machine Translation with its Shortcomings

    Directory of Open Access Journals (Sweden)

    Alireza Akbari

    2014-01-01

    The demand for language translation has increased strikingly in recent years due to cross-cultural communication and the exchange of information. In order to communicate well, text should be translated correctly and completely in every field, such as legal documents, technical texts, scientific texts, publicity leaflets, and instructional materials. In this connection, machine translation is of great importance. The term "Machine Translation" was first proposed by George Artsrouni and Smirnov Troyanski (1933), who designed a storage device on paper tape. This paper investigates an overall perspective of Machine Translation models and their metrics in detail. Finally, it scrutinizes the shortcomings of Machine Translation.

  17. Using example-based machine translation to translate DVD subtitles

    DEFF Research Database (Denmark)

    Flanagan, Marian

    between Swedish and Danish and Swedish and Norwegian subtitles, with the company already reporting a successful return on their investment. The hybrid EBMT/SMT system used in the current research, on the other hand, remains within the confines of academic research, and the real potential of the system...... allotted to produce the subtitles have both decreased. Therefore, this market is recognised as a potential real-world application of MT. Recent publications have introduced Corpus-Based MT approaches to translate subtitles. An SMT system has been implemented in a Swedish subtitling company to translate...

  18. Findings of the 2011 workshop on statistical machine translation

    NARCIS (Netherlands)

    Callison-Burch, C.; Koehn, P.; Monz, C.; Zaidan, O.F.

    2011-01-01

    This paper presents the results of the WMT11 shared tasks, which included a translation task, a system combination task, and a task for machine translation evaluation metrics. We conducted a large-scale manual evaluation of 148 machine translation systems and 41 system combination entries. We used

  19. Evaluation of Hindi to Punjabi Machine Translation System

    OpenAIRE

    Goyal, Vishal; Lehal, Gurpreet Singh

    2009-01-01

    Machine Translation in India is relatively young. The earliest efforts date from the late 80s and early 90s. The success of every system is judged from its experimental evaluation results. A number of machine translation systems have been under development, but to the best of the authors' knowledge, no high quality system has been completed that can be used in real applications. Recently, Punjabi University, Patiala, India has developed a Punjabi to Hindi machine translation system with high accur...

  20. Convolutional over Recurrent Encoder for Neural Machine Translation

    Directory of Open Access Journals (Sweden)

    Dakwale Praveen

    2017-06-01

    Neural machine translation is a recently proposed approach which has shown results competitive with traditional MT approaches. Standard neural MT is an end-to-end neural network where the source sentence is encoded by a recurrent neural network (RNN) called the encoder and the target words are predicted using another RNN known as the decoder. Recently, various models have been proposed which replace the RNN encoder with a convolutional neural network (CNN). In this paper, we propose to augment the standard RNN encoder in NMT with additional convolutional layers in order to capture wider context in the encoder output. Experiments on English to German translation demonstrate that our approach can achieve significant improvements over a standard RNN-based baseline.
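
    A minimal PyTorch sketch of the idea described above: a recurrent encoder whose outputs are passed through an additional convolutional layer so that each encoder state sees a wider window of neighbouring states. The layer sizes, kernel width, and residual combination are arbitrary illustrative choices, not the authors' exact architecture.

        import torch
        import torch.nn as nn

        class ConvOverRecurrentEncoder(nn.Module):
            """RNN encoder followed by a convolution over its output states."""
            def __init__(self, vocab_size, emb_dim=64, hidden=128, kernel=3):
                super().__init__()
                self.embed = nn.Embedding(vocab_size, emb_dim)
                self.rnn = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
                self.conv = nn.Conv1d(2 * hidden, 2 * hidden, kernel, padding=kernel // 2)

            def forward(self, token_ids):                    # (batch, seq_len)
                states, _ = self.rnn(self.embed(token_ids))  # (batch, seq_len, 2*hidden)
                conv_in = states.transpose(1, 2)             # Conv1d expects (batch, channels, seq)
                wide = torch.relu(self.conv(conv_in)).transpose(1, 2)
                return states + wide                         # residual combination (an assumption)

        encoder = ConvOverRecurrentEncoder(vocab_size=1000)
        dummy = torch.randint(0, 1000, (2, 7))               # batch of 2 sentences, 7 tokens each
        print(encoder(dummy).shape)                          # torch.Size([2, 7, 256])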

  1. Morphological Analysis for Statistical Machine Translation

    National Research Council Canada - National Science Library

    Lee, Young-Suk

    2004-01-01

    .... The technique improves Arabic-to-English translation qualities significantly when applied to IBM Model 1 and Phrase Translation Models trained on the training corpus size ranging from 3,500 to 3.3 million sentence pairs.

  2. ADAPTING HYBRID MACHINE TRANSLATION TECHNIQUES FOR CROSS-LANGUAGE TEXT RETRIEVAL SYSTEM

    Directory of Open Access Journals (Sweden)

    P. ISWARYA

    2017-03-01

    This research work aims at developing a Tamil to English cross-language text retrieval system using a hybrid machine translation approach. The hybrid machine translation system is a combination of rule-based and statistical approaches. In an existing word-by-word translation system there are a number of issues, among them ambiguity, out-of-vocabulary words, word inflections, and improper sentence structure. To handle these issues, the proposed architecture is designed in such a way that it contains an improved part-of-speech tagger, a machine learning based morphological analyser, a collocation based word sense disambiguation procedure, a semantic dictionary, tense markers with gerund ending rules, and a two-pass transliteration algorithm. From the experimental results it is clear that the proposed Tamil query based translation system achieves significantly better translation quality than the existing system, and reaches 95.88% of monolingual performance.

  3. Bean Soup Translation: Flexible, Linguistically-Motivated Syntax for Machine Translation

    Science.gov (United States)

    Mehay, Dennis Nolan

    2012-01-01

    Machine translation (MT) systems attempt to translate texts from one language into another by translating words from a "source language" and rearranging them into fluent utterances in a "target language." When the two languages organize concepts in very different ways, knowledge of their general sentence structure, or…

  4. An Evaluation of Output Quality of Machine Translation (Padideh Software vs. Google Translate)

    Science.gov (United States)

    Azer, Haniyeh Sadeghi; Aghayi, Mohammad Bagher

    2015-01-01

    This study aims to evaluate the translation quality of two machine translation systems in translating six different text-types, from English to Persian. The evaluation was based on criteria proposed by Van Slype (1979). The proposed model for evaluation is a black-box type, comparative and adequacy-oriented evaluation. To conduct the evaluation, a…

  5. Telemedicine as a special case of machine translation.

    Science.gov (United States)

    Wołk, Krzysztof; Marasek, Krzysztof; Glinkowski, Wojciech

    2015-12-01

    Machine translation is evolving quite rapidly in terms of quality. Nowadays, there are several machine translation systems available on the web which provide reasonable translations. However, these systems are not perfect, and their quality may decrease in some specific domains. This paper examines the effects of different training methods on a Polish-English Statistical Machine Translation system used for medical data. Numerous elements of the EMEA parallel text corpora and the unrelated OPUS OpenSubtitles project were used as the basis for creating phrase tables and different language models, including the development, tuning and testing of these translation systems. The BLEU, NIST, METEOR, and TER metrics have been used in order to evaluate the results of the various systems. Our experiments deal with systems that include POS tagging, factored phrase models, hierarchical models, syntactic taggers, and other alignment methods. We also executed a deep analysis of the Polish data as preparatory work before automated data processing steps such as true casing or punctuation normalization. Normalized metrics were used to compare results. Scores lower than 15% mean that the machine translation engine is unable to provide satisfying quality, scores greater than 30% mean that translations should be understandable without problems, and scores over 50% reflect adequate translations. The average results of Polish to English translation scores for BLEU, NIST, METEOR, and TER were relatively high, ranging from 70.58 to 82.72. The lowest score was 64.38. The average result ranges for English to Polish translations were a little lower (67.58-78.97). The real-life implementation of the presented high quality machine translation systems is anticipated in general medical practice and telemedicine. Copyright © 2015. Published by Elsevier Ltd.

  6. What does Attention in Neural Machine Translation Pay Attention to?

    NARCIS (Netherlands)

    Ghader, H.; Monz, C.; Kondrak, G.; Watanabe, T.

    2017-01-01

    Attention in neural machine translation provides the possibility to encode relevant parts of the source sentence at each translation step. As a result, attention is considered to be an alignment model as well. However, there is no work that specifically studies attention and provides analysis of

  7. Findings of the 2009 Workshop on Statistical Machine Translation

    NARCIS (Netherlands)

    Callison-Burch, C.; Koehn, P.; Monz, C.; Schroeder, J.; Callison-Burch, C.; Koehn, P.; Monz, C.; Schroeder, J.

    2009-01-01

    This paper presents the results of the WMT09 shared tasks, which included a translation task, a system combination task, and an evaluation task. We conducted a large-scale manual evaluation of 87 machine translation systems and 22 system combination entries. We used the ranking of these systems to

  8. Integrating Automatic Speech Recognition and Machine Translation for Better Translation Outputs

    DEFF Research Database (Denmark)

    Liyanapathirana, Jeevanthi

    translations, combining machine translation with computer assisted translation has drawn attention in current research. This combines two prospects: the opportunity of ensuring high quality translation along with a significant performance gain. Automatic Speech Recognition (ASR) is another important area......, which caters to important functionalities in language processing and natural language understanding tasks. In this work we integrate automatic speech recognition and machine translation in parallel. We aim to avoid the manual typing of possible translations, as dictating the translation would take less time...... to the n-best list rescoring, we also use word graphs with the expectation of arriving at a tighter integration of ASR and MT models. Integration methods include constraining ASR models using the language and translation models of MT, and vice versa. We currently develop and experiment with different methods...
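
    One simple way to couple ASR and MT models, as discussed above, is log-linear rescoring of an n-best list: each hypothesis receives a weighted sum of the scores the two models assign to it, and the list is re-ranked. The hypotheses, scores, and weights below are made up for illustration.

        def rescore_nbest(hypotheses, weights=(0.5, 0.5)):
            """hypotheses: list of (text, asr_logprob, mt_logprob).
            Returns the texts re-ranked by a weighted log-linear combination."""
            w_asr, w_mt = weights
            scored = [(w_asr * asr + w_mt * mt, text) for text, asr, mt in hypotheses]
            return [text for _, text in sorted(scored, reverse=True)]

        # hypothetical n-best list for one dictated target-language sentence
        nbest = [
            ("the contract was signed yesterday", -4.1, -6.0),
            ("the contract was sighed yesterday", -3.9, -9.5),   # ASR likes it, MT does not
            ("a contract was signed yesterday",   -5.0, -6.2),
        ]
        print(rescore_nbest(nbest)[0])   # the contract was signed yesterday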

  9. The evolution and practical application of machine translation system (1)

    Science.gov (United States)

    Tominaga, Isao; Sato, Masayuki

    This paper describes the development, practical application, and problems of machine translation systems, the evaluation of practical systems, and development trends in machine translation. Most recent systems face the following four problems: 1) the vagueness of a text, 2) differences in the definition of terminology between different languages, 3) the preparation of a large-scale translation dictionary, and 4) the development of software for logical inference. Machine translation systems are already used practically in many industrial fields. However, many problems remain unsolved, and the implementation of an ideal system is still some 15 years away. This paper also describes seven evaluation items in detail. This English abstract was made by the Mu system.

  10. Precise machine translation of computer science study

    CSIR Research Space (South Africa)

    Marais, L

    2015-07-01

    ... mobile (Android) application for translating discrete mathematics definitions between English and Afrikaans. The main component of the system is a Grammatical Framework (GF) application grammar which produces syntactically and semantically accurate...

  11. Approaches to translational plant science

    DEFF Research Database (Denmark)

    Dresbøll, Dorte Bodin; Christensen, Brian; Thorup-Kristensen, Kristian

    2015-01-01

    is lessened. In our opinion, implementation of translational plant science is a necessity in order to solve the agricultural challenges of producing food and materials in the future. We suggest an approach to translational plant science forcing scientists to think beyond their own area and to consider higher......Translational science deals with the dilemma between basic research and the practical application of scientific results. In translational plant science, focus is on the relationship between agricultural crop production and basic science in various research fields, but primarily in the basic plant...... science. Scientific and technological developments have allowed great progress in our understanding of plant genetics and molecular physiology, with potentials for improving agricultural production. However, this development has led to a separation of the laboratory-based research from the crop production...

  12. Machine Translation Using Constraint-Based Synchronous Grammar

    Institute of Scientific and Technical Information of China (English)

    WONG Fai; DONG Mingchui; HU Dongcheng

    2006-01-01

    A synchronous grammar based on the formalism of context-free grammar was developed by generalizing the first component of the production that models the source text. Unlike other synchronous grammars, this grammar allows multiple target productions to be associated with a single production rule, which can be used to guide a parser to infer different possible translational equivalences for a recognized input string according to the feature constraints of the symbols in the pattern. An extended generalized LR algorithm was adapted to parse the proposed formalism and analyze the syntactic structure of a language. The grammar was used as the basis for building a machine translation system for Portuguese to Chinese translation. The empirical results show that the grammar is more expressive when modeling the translational equivalences of parallel texts for machine translation and grammar rewriting applications.

  13. Machine translation with minimal reliance on parallel resources

    CERN Document Server

    Tambouratzis, George; Sofianopoulos, Sokratis

    2017-01-01

    This book provides a unified view of a new methodology for Machine Translation (MT). This methodology extracts information from widely available resources (extensive monolingual corpora) while only assuming the existence of a very limited parallel corpus, and thus has a unique starting point compared to Statistical Machine Translation (SMT). In this book, a detailed presentation of the methodology principles and system architecture is followed by a series of experiments, where the proposed system is compared to other MT systems using a set of established metrics including BLEU, NIST, Meteor and TER. Additionally, free-to-use code is available that allows the creation of new MT systems. The volume is addressed to both language professionals and researchers. Prerequisites for the readers are very limited and include a basic understanding of machine translation as well as of the basic tools of natural language processing.

  14. Neural Machine Translation with Recurrent Attention Modeling

    OpenAIRE

    Yang, Zichao; Hu, Zhiting; Deng, Yuntian; Dyer, Chris; Smola, Alex

    2016-01-01

    Knowing which words have been attended to in previous time steps while generating a translation is a rich source of information for predicting what words will be attended to in the future. We improve upon the attention model of Bahdanau et al. (2014) by explicitly modeling the relationship between previous and subsequent attention levels for each word using one recurrent network per input word. This architecture easily captures informative features, such as fertility and regularities in relat...

  15. A translator and simulator for the Burroughs D machine

    Science.gov (United States)

    Roberts, J.

    1972-01-01

    The D Machine is described as a small user microprogrammable computer designed to be a versatile building block for such diverse functions as: disk file controllers, I/O controllers, and emulators. TRANSLANG is an ALGOL-like language, which allows D Machine users to write microprograms in an English-like format as opposed to creating binary bit pattern maps. The TRANSLANG translator parses TRANSLANG programs into D Machine microinstruction bit patterns which can be executed on the D Machine simulator. In addition to simulation and translation, the two programs also offer several debugging tools, such as: a full set of diagnostic error messages, register dumps, simulated memory dumps, traces on instructions and groups of instructions, and breakpoints.

  16. Foreign Developments in Information Processing and Machine Translation, No. 1

    Science.gov (United States)

    1960-09-29

    technicians] (Sestier (A.) -- La Traduction automatique des textes écrits scientifiques et techniques d'un langage dans un ... are more and more comprehensible to others than machine translation technicians will result. Sketch of a program. This outline of work which will

  17. A GRAMMATICAL ADJUSTMENT ANALYSIS OF STATISTICAL MACHINE TRANSLATION METHOD USED BY GOOGLE TRANSLATE COMPARED TO HUMAN TRANSLATION IN TRANSLATING ENGLISH TEXT TO INDONESIAN

    Directory of Open Access Journals (Sweden)

    Eko Pujianto

    2017-04-01

    Google Translate is a program which provides a fast, free and effortless translating service. This service uses a unique method to translate, called "Statistical Machine Translation", the newest method in automatic translation. Machine translation (MT) is an area drawing on many different subjects of study and techniques from linguistics, computer science, artificial intelligence (AI), translation theory, and statistics. SMT works by using statistical methods and mathematics to process the training data. The training data is corpus-based: a compilation of sentences and words of the languages (SL and TL) from translations done by humans. By using this method, Google lets its machines discover the rules for themselves. They do this by analyzing millions of documents that have already been translated by human translators and then generating the result based on the corpus/training data. However, questions arise when the results of the automatic translation prove to be unreliable to some extent. This paper questions the dependability of Google Translate in comparison with the grammatical adjustment that naturally characterizes human translators' specific advantage. The attempt is manifested through the analysis of the TL of some texts translated by the SMT. It is expected that by using samples of TL produced by SMT we can learn the potential flaws of the translation. If such exist, the partial or more substantial undependability of SMT may open more windows to the debate over whether this service may suffice the users' needs.
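
    The corpus-based idea sketched in this abstract, deriving translation choices from statistics over human-translated sentence pairs rather than hand-written rules, can be illustrated with a toy association count. This is a deliberately naive stand-in for real SMT training (which uses EM-based word alignment); the four-sentence English-Indonesian corpus is made up.

        from collections import Counter, defaultdict

        parallel = [                              # toy parallel "training data"
            ("i eat rice",      "saya makan nasi"),
            ("i eat fish",      "saya makan ikan"),
            ("you drink water", "kamu minum air"),
            ("you eat rice",    "kamu makan nasi"),
        ]

        cooc = defaultdict(Counter)               # co-occurrence counts within sentence pairs
        src_count, tgt_count = Counter(), Counter()
        for src, tgt in parallel:
            s_words, t_words = src.split(), tgt.split()
            src_count.update(s_words)
            tgt_count.update(t_words)
            for s in s_words:
                cooc[s].update(t_words)

        def best_translation(word):
            """Pick the target word with the highest Dice association score."""
            scores = {t: 2 * c / (src_count[word] + tgt_count[t])
                      for t, c in cooc[word].items()}
            return max(scores, key=scores.get)

        print({w: best_translation(w) for w in ["i", "eat", "rice"]})
        # {'i': 'saya', 'eat': 'makan', 'rice': 'nasi'}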

  18. Local health department translation processes: potential of machine translation technologies to help meet needs.

    Science.gov (United States)

    Turner, Anne M; Mandel, Hannah; Capurro, Daniel

    2013-01-01

    Limited English proficiency (LEP), defined as a limited ability to read, speak, write, or understand English, is associated with health disparities. Despite federal and state requirements to translate health information, the vast majority of health materials are solely available in English. This project investigates barriers to translation of health information and explores new technologies to improve access to multilingual public health materials. We surveyed all 77 local health departments (LHDs) in the Northwest about translation needs, practices, barriers and attitudes towards machine translation (MT). We received 67 responses from 45 LHDs. Translation of health materials is the principle strategy used by LHDs to reach LEP populations. Cost and access to qualified translators are principle barriers to producing multilingual materials. Thirteen LHDs have used online MT tools. Many respondents expressed concerns about the accuracy of MT. Overall, respondents were positive about its potential use, if low costs and quality could be assured.

  19. Efficient Embedded Decoding of Neural Network Language Models in a Machine Translation System.

    Science.gov (United States)

    Zamora-Martinez, Francisco; Castro-Bleda, Maria Jose

    2018-02-22

    Neural Network Language Models (NNLMs) are a successful approach to Natural Language Processing tasks, such as Machine Translation. We introduce in this work a Statistical Machine Translation (SMT) system which fully integrates NNLMs in the decoding stage, breaking the traditional approach based on n-best list rescoring. The neural net models (both language models (LMs) and translation models) are fully coupled in the decoding stage, allowing them to more strongly influence the translation quality. Computational issues were solved by using a novel idea based on memorization and smoothing of the softmax constants to avoid their computation, which introduces a trade-off between LM quality and computational cost. These ideas were studied in a machine translation task with different combinations of neural networks used both as translation models and as target LMs, comparing phrase-based and n-gram-based systems, showing that the integrated approach seems more promising for n-gram-based systems, even with non-full-quality NNLMs.
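
    The computational trick mentioned above, avoiding repeated computation of the softmax normalization constant, can be pictured with a small sketch that memoizes the constant per context. The smoothing step used in the paper is omitted, and the toy vocabulary and scoring function are assumptions.

        import math
        from functools import lru_cache

        VOCAB = ["the", "cat", "sat", "mat", "on", "</s>"]

        def score(context, word):
            """Hypothetical unnormalized log-score; a real NNLM output layer goes here."""
            return 0.1 * len(word) + 0.01 * len(context)

        @lru_cache(maxsize=100_000)
        def log_normalizer(context):
            """Memoized softmax constant log Z(context); the paper additionally
            smooths these constants so similar contexts can share them."""
            return math.log(sum(math.exp(score(context, w)) for w in VOCAB))

        def log_prob(context, word):
            return score(context, word) - log_normalizer(context)

        ctx = ("the", "cat")
        print(round(log_prob(ctx, "sat"), 3), round(log_prob(ctx, "mat"), 3))
        print(log_normalizer.cache_info())   # the constant for ctx was computed only once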

  20. Empirical Investigation of Optimization Algorithms in Neural Machine Translation

    Directory of Open Access Journals (Sweden)

    Bahar Parnia

    2017-06-01

    Training neural networks is a non-convex and high-dimensional optimization problem. In this paper, we provide a comparative study of the most popular stochastic optimization techniques used to train neural networks. We evaluate the methods in terms of convergence speed, translation quality, and training stability. In addition, we investigate combinations that seek to improve optimization in terms of these aspects. We train state-of-the-art attention-based models and apply them to perform neural machine translation. We demonstrate our results on two tasks: WMT 2016 En→Ro and WMT 2015 De→En.
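
    As a toy illustration of the kind of comparison described above, the sketch below runs plain SGD and Adam on a one-dimensional quadratic objective and reports where each ends up; the objective, step sizes, and iteration count are arbitrary choices, not the paper's experimental setup.

        import math

        def grad(x):                      # gradient of f(x) = (x - 3)^2
            return 2.0 * (x - 3.0)

        def sgd(x=0.0, lr=0.1, steps=50):
            for _ in range(steps):
                x -= lr * grad(x)
            return x

        def adam(x=0.0, lr=0.1, steps=50, b1=0.9, b2=0.999, eps=1e-8):
            m = v = 0.0
            for t in range(1, steps + 1):
                g = grad(x)
                m = b1 * m + (1 - b1) * g
                v = b2 * v + (1 - b2) * g * g
                m_hat = m / (1 - b1 ** t)          # bias-corrected first moment
                v_hat = v / (1 - b2 ** t)          # bias-corrected second moment
                x -= lr * m_hat / (math.sqrt(v_hat) + eps)
            return x

        print("SGD:  x =", round(sgd(), 4))    # both should approach the minimum at x = 3
        print("Adam: x =", round(adam(), 4))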

  1. INTEGRATING MACHINE TRANSLATION AND SPEECH SYNTHESIS COMPONENT FOR ENGLISH TO DRAVIDIAN LANGUAGE SPEECH TO SPEECH TRANSLATION SYSTEM

    Directory of Open Access Journals (Sweden)

    J. SANGEETHA

    2015-02-01

    This paper presents an interface between the machine translation and speech synthesis components for converting English speech to Tamil text in an English to Tamil speech-to-speech translation system. The speech translation system consists of three modules: automatic speech recognition, machine translation and text-to-speech synthesis. Many procedures for the integration of speech recognition and machine translation have been proposed, but the speech synthesis component has not yet been considered. In this paper, we therefore focus on the integration of machine translation and speech synthesis, and report a subjective evaluation that investigates the impact of the speech synthesis and machine translation components and of their integration. We implement a hybrid machine translation (a combination of rule-based and statistical machine translation) and a concatenative syllable-based speech synthesis technique. In order to retain the naturalness and intelligibility of the synthesized speech, Auto Associative Neural Network (AANN) prosody prediction is used in this work. The results of this investigation demonstrate that the naturalness and intelligibility of the synthesized speech are strongly influenced by the fluency and correctness of the translated text.

  2. Word Transition Entropy as an Indicator for Expected Machine Translation Quality

    DEFF Research Database (Denmark)

    Carl, Michael; Schaeffer, Moritz

    2014-01-01

    While most machine translation evaluation techniques (BLEU, NIST, TER, METEOR) assess translation quality based on a set of reference translations, we suggest to evaluate the literality of a set of (human or machine generated) translations to infer their potential quality. We provide evidence whi...
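
    The literality measure referred to above can be illustrated directly: for each source word, compute the entropy of the target-word choices observed across a set of translations of the same text; low entropy means translators (or systems) agree, high entropy signals freer or harder choices. The choices below are invented.

        import math
        from collections import Counter

        def word_translation_entropy(choices):
            """Shannon entropy (bits) of the target-word choices observed for one
            source word across several translations of the same text."""
            counts = Counter(choices)
            total = sum(counts.values())
            return sum(-(c / total) * math.log2(c / total) for c in counts.values())

        # hypothetical choices made by six translators for two source words
        print(round(word_translation_entropy(["house"] * 6), 3))                  # 0.0
        print(round(word_translation_entropy(["home", "house", "house",
                                              "dwelling", "home", "house"]), 3))  # ≈ 1.459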

  3. Comparison of Three English-to-Dutch Machine Translations of SNOMED CT Procedures

    NARCIS (Netherlands)

    Cornet, Ronald; Hill, Carly; de Keizer, Nicolette

    2017-01-01

    Dutch interface terminologies are needed to use SNOMED CT in the Netherlands. Machine translation may support their creation. The aim of our study is to compare different machine translations of procedures in SNOMED CT. Procedures were translated using Google Translate, Matecat, and Thot. Google

  4. Machine Translation as a Model for Overcoming Some Common Errors in English-into-Arabic Translation among EFL University Freshmen

    Science.gov (United States)

    El-Banna, Adel I.; Naeem, Marwa A.

    2016-01-01

    This research work aimed at making use of Machine Translation to help students avoid some syntactic, semantic and pragmatic common errors in translation from English into Arabic. Participants were a hundred and five freshmen who studied the "Translation Common Errors Remedial Program" prepared by the researchers. A testing kit that…

  5. The Hermeneutical Approach in Translation Studies

    Directory of Open Access Journals (Sweden)

    Bernd Stefanink

    2017-09-01

    Our aim is to convince the reader of the validity of the hermeneutical approach in translation studies. In the first part, we show that this validity rests on the fact that the hermeneutical approach integrates factors like subjectivity, intuition, corporeality and creativity in its theoretical reflection, and is thus close to the reality of the translation process. In the second part, we situate this approach in the context of the development of modern translation studies since the 1950s, and show that this development was characterized by a dominating tendency that led from an atomistic to a more and more holistic view of the translation unit, legitimating the holistic approach, which is fundamental in translational hermeneutics. The third part relates the history of philosophical hermeneutics as the legitimate foundation of translational hermeneutics. In the fourth part, devoted to the "outcoming perspectives", we try to reinforce the legitimacy of the hermeneutical approach by showing how it is supported by recent results of research in cognitive science. In order to foster further research in translational hermeneutics we also offer a methodology based on hermeneutic principles for studying the translation process. Finally, we give an example of the legitimation of creative problem-solving based on a hermeneutical approach to a translation problem, which finds its validation in the results of cognitive research.

  6. Analysis of MultiWord Expression Translation Errors in Statistical Machine Translation

    DEFF Research Database (Denmark)

    Klyueva, Natalia; Liyanapathirana, Jeevanthi

    2015-01-01

    In this paper, we analyse the usage of multiword expressions (MWE) in Statistical Machine Translation (SMT). We exploit the Moses SMT toolkit to train models for the French-English and Czech-Russian language pairs. For each language pair, two models were built: a baseline model without additional MWE...... data and a model enhanced with information on MWEs. For the French-English pair, we tried three methods of introducing the MWE data. For the Czech-Russian pair, we used just one method – adding automatically extracted data as a parallel corpus....

  7. Machine Learning Approaches for Clinical Psychology and Psychiatry.

    Science.gov (United States)

    Dwyer, Dominic B; Falkai, Peter; Koutsouleris, Nikolaos

    2018-05-07

    Machine learning approaches for clinical psychology and psychiatry explicitly focus on learning statistical functions from multidimensional data sets to make generalizable predictions about individuals. The goal of this review is to provide an accessible understanding of why this approach is important for future practice given its potential to augment decisions associated with the diagnosis, prognosis, and treatment of people suffering from mental illness using clinical and biological data. To this end, the limitations of current statistical paradigms in mental health research are critiqued, and an introduction is provided to critical machine learning methods used in clinical studies. A selective literature review is then presented aiming to reinforce the usefulness of machine learning methods and provide evidence of their potential. In the context of promising initial results, the current limitations of machine learning approaches are addressed, and considerations for future clinical translation are outlined.

  8. Machine Translation as a complex system, and the phenomenon of Esperanto

    NARCIS (Netherlands)

    Gobbo, F.

    2015-01-01

    The history of machine translation and the history of Esperanto have long been connected, as they are two different ways to deal with the same problem: the problem of communication across language barriers. Language can be considered a Complex Adaptive System (CAS), and machine translation too. In

  9. Domain Adaptation for Machine Translation with Instance Selection

    Directory of Open Access Journals (Sweden)

    Biçici Ergun

    2015-04-01

    Full Text Available Domain adaptation for machine translation (MT) can be achieved by selecting training instances close to the test set from a larger set of instances. We consider 7 different domain adaptation strategies and answer 7 research questions, which give us a recipe for domain adaptation in MT. We perform English to German statistical MT (SMT) experiments in a setting where test and training sentences can come from different corpora, and one of our goals is to learn the parameters of the sampling process. Domain adaptation with training instance selection can obtain a 22% increase in target 2-gram recall and can gain up to 3.55 BLEU points compared with random selection. Domain adaptation with the feature decay algorithm (FDA) not only achieves the highest target 2-gram recall and BLEU performance but also perfectly learns the test sample distribution parameter with a correlation of 0.99. Moses SMT systems built with 10K FDA-selected training sentences are able to obtain F1 results as good as the baselines that use up to 2M sentences, and systems built with 50K FDA-selected training sentences obtain better F1 results than those baselines.
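
    As a rough illustration of instance selection for domain adaptation, the sketch below scores candidate training sentences by their normalised 2-gram overlap with the test set and keeps the top k. It is not the FDA algorithm itself, which additionally decays the value of features as they get covered by already-selected sentences; the data and parameters here are hypothetical.

```python
# Sketch of test-set-driven training instance selection for domain adaptation.
# This is a simplified stand-in for feature-decay-style selection, not FDA itself.

def ngrams(tokens, n=2):
    return zip(*(tokens[i:] for i in range(n)))

def select_instances(train_sents, test_sents, k=10000, n=2):
    """Keep the k training sentences sharing the most n-grams with the test set."""
    test_feats = set()
    for sent in test_sents:
        test_feats.update(ngrams(sent.split(), n))

    def score(sent):
        toks = sent.split()
        overlap = sum(1 for g in ngrams(toks, n) if g in test_feats)
        return overlap / max(len(toks), 1)   # length-normalised overlap

    return sorted(train_sents, key=score, reverse=True)[:k]

# selected = select_instances(out_of_domain_corpus, test_source_sentences, k=10000)
```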

  10. Machine learning approaches in medical image analysis

    DEFF Research Database (Denmark)

    de Bruijne, Marleen

    2016-01-01

    Machine learning approaches are increasingly successful in image-based diagnosis, disease prognosis, and risk assessment. This paper highlights new research directions and discusses three main challenges related to machine learning in medical imaging: coping with variation in imaging protocols......, learning from weak labels, and interpretation and evaluation of results....

  11. Machine learning an artificial intelligence approach

    CERN Document Server

    Banerjee, R; Bradshaw, Gary; Carbonell, Jaime Guillermo; Mitchell, Tom Michael; Michalski, Ryszard Spencer

    1983-01-01

    Machine Learning: An Artificial Intelligence Approach contains tutorial overviews and research papers representative of trends in the area of machine learning as viewed from an artificial intelligence perspective. The book is organized into six parts. Part I provides an overview of machine learning and explains why machines should learn. Part II covers important issues affecting the design of learning programs-particularly programs that learn from examples. It also describes inductive learning systems. Part III deals with learning by analogy, by experimentation, and from experience. Parts IV a

  12. Modeling and prediction of human word search behavior in interactive machine translation

    Science.gov (United States)

    Ji, Duo; Yu, Bai; Ma, Bin; Ye, Na

    2017-12-01

    As a kind of computer-aided translation method, interactive machine translation technology reduces the repetitive and mechanical operations of manual translation through a variety of methods, improving translation efficiency, and it plays an important role in the practical application of translation work. In this paper, we regard users' frequent searching for words during the translation process as the research object and transform this behavior into a translation selection problem under the current translation. The paper presents a prediction model that makes comprehensive use of an alignment model, a translation model and a language model of the word-searching behavior. It achieves highly accurate prediction of word-searching behavior and reduces the switching between mouse and keyboard operations in the users' translation process.
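
    The kind of prediction model described above can be read as ranking candidate target words by a combination of alignment, translation-model, and language-model scores. The following is a hedged sketch of such a log-linear combination; the component probability functions and weights are hypothetical placeholders, not the authors' model.

```python
import math

def rank_candidates(candidates, src_word, history,
                    align_prob, trans_prob, lm_prob, weights=(1.0, 1.0, 1.0)):
    """Rank target-word candidates by a weighted log-linear combination of
    alignment, translation and language model probabilities (a sketch)."""
    w_a, w_t, w_l = weights
    def score(cand):
        return (w_a * math.log(align_prob(src_word, cand) + 1e-12)
                + w_t * math.log(trans_prob(src_word, cand) + 1e-12)
                + w_l * math.log(lm_prob(history, cand) + 1e-12))
    return sorted(candidates, key=score, reverse=True)

# The top-ranked candidate would be offered to the translator, saving a dictionary lookup.
```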

  13. An Evaluation of Online Machine Translation of Arabic into English News Headlines: Implications on Students' Learning Purposes

    Science.gov (United States)

    Kadhim, Kais A.; Habeeb, Luwaytha S.; Sapar, Ahmad Arifin; Hussin, Zaharah; Abdullah, Muhammad Ridhuan Tony Lim

    2013-01-01

    Nowadays, online Machine Translation (MT) is used widely with translation software, such as Google and Babylon, being easily available and downloadable. This study aims to test the translation quality of these two machine systems in translating Arabic news headlines into English. 40 Arabic news headlines were selected from three online sources,…

  14. A Conjoint Analysis Framework for Evaluating User Preferences in Machine Translation.

    Science.gov (United States)

    Kirchhoff, Katrin; Capurro, Daniel; Turner, Anne M

    2014-03-01

    Despite much research on machine translation (MT) evaluation, there is surprisingly little work that directly measures users' intuitive or emotional preferences regarding different types of MT errors. However, the elicitation and modeling of user preferences is an important prerequisite for research on user adaptation and customization of MT engines. In this paper we explore the use of conjoint analysis as a formal quantitative framework to assess users' relative preferences for different types of translation errors. We apply our approach to the analysis of MT output from translating public health documents from English into Spanish. Our results indicate that word order errors are clearly the most dispreferred error type, followed by word sense, morphological, and function word errors. The conjoint analysis-based model is able to predict user preferences more accurately than a baseline model that chooses the translation with the fewest errors overall. Additionally we analyze the effect of using a crowd-sourced respondent population versus a sample of domain experts and observe that main preference effects are remarkably stable across the two samples.
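
    At its core, conjoint analysis of this kind estimates a part-worth utility for each error attribute from users' ratings or choices over candidate translations. A minimal sketch using ordinary least squares on error-count predictors is shown below; the data are hypothetical, and the paper's actual design (choice-based and adjusted for respondents) is richer.

```python
import numpy as np

# Each row: counts of word-order, word-sense, morphological and function-word errors
# in a candidate translation shown to a respondent (hypothetical data).
X = np.array([[2, 0, 1, 0],
              [0, 1, 0, 2],
              [1, 1, 1, 1],
              [0, 0, 2, 1]], dtype=float)
y = np.array([2.0, 4.0, 3.0, 4.5])   # the respondent's preference ratings

# Add an intercept and estimate part-worth utilities by least squares.
A = np.hstack([np.ones((X.shape[0], 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
partworths = dict(zip(["intercept", "word_order", "word_sense",
                       "morphology", "function_word"], coef))
print(partworths)   # more negative utilities indicate more strongly dispreferred errors
```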

  15. The Dostoevsky Machine in Georgetown: scientific translation in the Cold War.

    Science.gov (United States)

    Gordin, Michael D

    2016-04-01

    Machine Translation (MT) is now ubiquitous in discussions of translation. The roots of this phenomenon - first publicly unveiled in the so-called 'Georgetown-IBM Experiment' on 9 January 1954 - not only displayed the technological utopianism still associated with dreams of a universal computer translator, but were also deeply enmeshed in the political pressures of the Cold War and in a dominating conception of scientific writing as both the goal of machine translation and its method. Machine translation was created, in part, as a solution to a perceived crisis sparked by the massive expansion of Soviet science. Scientific prose was also perceived as linguistically simpler, and so served as the model for how to turn a language into a series of algorithms. This paper follows the rise of the Georgetown program - the largest single program in the world - from 1954 to the (as it turns out, temporary) collapse of MT in 1964.

  16. An Overall Perspective of Machine Translation with Its Shortcomings

    Science.gov (United States)

    Akbari, Alireza

    2014-01-01

    The petition for language translation has strikingly augmented recently due to cross-cultural communication and exchange of information. In order to communicate well, text should be translated correctly and completely in each field such as legal documents, technical texts, scientific texts, publicity leaflets, and instructional materials. In this…

  17. An Evaluative Study of Machine Translation in the EFL Scenario of Saudi Arabia

    Directory of Open Access Journals (Sweden)

    Raneem Khalid Al-Tuwayrish

    2016-02-01

    Full Text Available Artificial Intelligence, or AI as it is popularly known, and its corollary, Machine Translation (MT), have long engaged scientists, thinkers and linguists alike in the twenty-first century. However, the wider question that lies in the relation between technology and translation is: what does technology do to language? This is an important question in the current paradigm because new translation technologies, such as translation memories, data-based machine translation, and collaborative translation, far from being just additional tools, are changing the very nature of the translators’ cognitive activity, social relations, and professional standing. In fact, in some translation situations, such as when translating technical materials or subject matter that is not a specialization of human translators, one potentially needs technology. The purview of this paper, however, is limited to the role of MT in day-to-day situations where generic MT tools like Google Translate or Bing Translator are encouraged. Further, it endeavours to weigh and empirically demonstrate the pros and cons of MT with a view to recommending measures for better communication training in the EFL setup of Saudi Arabia. Keywords: AI, MT, translation, technology, necessity, communication

  18. Implementing Professional Approach within a Translation

    Directory of Open Access Journals (Sweden)

    Nagwa ElShafei

    2014-03-01

    Full Text Available The recent and fast development in various spheres of information and communication technology, global trade, and digital and social media has resulted in growth in excellent employment opportunities but has also influenced the labor market. For instance, some jobs have become obsolete, while others, related to information technology particularly, are in higher demand. As such, there are many scenarios in which translators find themselves unable to communicate with their clients due to cultural and language barriers, especially in the labor market environment. This clarifies the great need for translators to receive professional training which also takes into account the advancement in technology. Therefore, market demands should be taken into account when developing and planning university courses and curricula to meet the job market needs. Courses on translation and interpretation prepare professional translators as needed by the labor market. In other words, the role of academic professionals and curriculum planners should be to narrow the gap between what the labor market needs from the modern translator and the courses offered by training institutions, universities and colleges. This research study introduces a Professional Approach to educating translators within the faculty of arts in a manner that fits the requirements of the job market. As such, a unit was prepared and specified for the students, then taught by the researcher to the selected sample. The dependent t-test technique was employed to compare the means of the total scores of the experimental group on the proficiency pre-post administration of the tests. It was noted from the results that there is a notable difference between the mean scores of the two groups in favor of the experimental group.

  19. Investigating Connectivity and Consistency Criteria for Phrase Pair Extraction in Statistical Machine Translation

    NARCIS (Netherlands)

    Martzoukos, S.; Costa Florêncio, C.; Monz, C.; Kornai, A.; Kuhlmann, M.

    2013-01-01

    The consistency method has been established as the standard strategy for extracting high quality translation rules in statistical machine translation (SMT). However, no attention has been drawn to why this method is successful, other than empirical evidence. Using concepts from graph theory, we

  20. The Integration of Project-Based Methodology into Teaching in Machine Translation

    Science.gov (United States)

    Madkour, Magda

    2016-01-01

    This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…

  1. An Evaluative Study of Machine Translation in the EFL Scenario of Saudi Arabia

    Science.gov (United States)

    Al-Tuwayrish, Raneem Khalid

    2016-01-01

    Artificial Intelligence or AI as it is popularly known and its corollary, Machine Translation (MT) have long engaged scientists, thinkers and linguists alike in the twenty first century. However, the wider question that lies in the relation between technology and translation is, What does technology do to language? This is an important question in…

  2. Integrating source-language context into phrase-based statistical machine translation

    NARCIS (Netherlands)

    Haque, R.; Kumar Naskar, S.; Bosch, A.P.J. van den; Way, A.

    2011-01-01

    The translation features typically used in Phrase-Based Statistical Machine Translation (PB-SMT) model dependencies between the source and target phrases, but not among the phrases in the source language themselves. A swathe of research has demonstrated that integrating source context modelling

  3. Recycling Texts: Human evaluation of example-based machine translation subtitles for DVD

    DEFF Research Database (Denmark)

    Flanagan, Marian

    2009-01-01

    This project focuses on translation reusability in audiovisual contexts. Specifically, the project seeks to establish (1) whether target language subtitles produced by an Example-Based Machine Translation (EBMT) system are considered intelligible and acceptable by viewers of movies on DVD, and (2...

  4. Technology: English Learners and Machine Translation, Part 2

    Science.gov (United States)

    Van Horn, Royal

    2004-01-01

    In this article, the author touches on the ways that technology can come to the aid of teachers with students who don't speak English. He discusses different word processors that successfully translate foreign text.

  5. Designing Course An Initial Approach To Translation Teaching

    Directory of Open Access Journals (Sweden)

    Roswani Siregar

    2017-09-01

    Full Text Available Throughout human history, translation has been a sustainable communication tool among cultures, preserving knowledge from generation to generation. Undoubtedly, translation plays a very important role in an increasingly globalized world, and translators have prominent roles in the development of countries. Many translators really enjoy their work but hesitate to teach a course due to their lack of pedagogical knowledge, believing that translation skill is gained through personal experience and talent. Thus this paper attempts to promote the teaching of translation in the classroom by setting out a preliminary approach to teaching translation. The sequence of the teaching design is described by proposing brief definitions of the nature of translation, the importance of translation teaching, translator competence, and the design of a translation course. This paper is a preliminary approach to translation teaching for beginners in a university setting.

  6. Functional approaches in translation studies in Germany Functional approaches in translation studies in Germany

    Directory of Open Access Journals (Sweden)

    Paul Kussmaul

    2008-04-01

    Full Text Available In the early phase of translation studies in Germany, contrastive linguistics played a major role. I shall briefly describe this approach so that the functional approach will become clearer by contrast. Influenced by the representatives of stylistique comparée, Vinay/Darbelnet (1968), Wolfram Wilss, for instance, in his early work (1971, 1977) makes frequent use of the notion transposition (German “Ausdrucksverschiebung“, cf. also Catford’s (1965) term shift). As a whole, of course, Wilss’ work has a much broader scope. More recently, he has investigated the role of cognition (1988) and the various factors in translator behaviour (1996). Nevertheless, transposition is still a very important and useful notion in describing the translation process. The need for transpositions arises when there is no possibility of formal one-to-one correspondence between source and target-language structures. The basic idea is that whenever there is a need for transposition, we are faced with a translation problem.

  7. Thomas Mofolo's sentence design in Chaka approached in translation

    African Journals Online (AJOL)

    Thomas Mofolo's sentence design in Chaka approached in translation. ... by responding to several compelling questions, ranging from how five translators of the work approached it in their respective languages ...

  8. Modeling workflow to design machine translation applications for public health practice.

    Science.gov (United States)

    Turner, Anne M; Brownstein, Megumu K; Cole, Kate; Karasz, Hilary; Kirchhoff, Katrin

    2015-02-01

    Provide a detailed understanding of the information workflow processes related to translating health promotion materials for limited English proficiency individuals in order to inform the design of context-driven machine translation (MT) tools for public health (PH). We applied a cognitive work analysis framework to investigate the translation information workflow processes of two large health departments in Washington State. Researchers conducted interviews, performed a task analysis, and validated results with PH professionals to model translation workflow and identify functional requirements for a translation system for PH. The study resulted in a detailed description of work related to translation of PH materials, an information workflow diagram, and a description of attitudes towards MT technology. We identified a number of themes that hold design implications for incorporating MT in PH translation practice. A PH translation tool prototype was designed based on these findings. This study underscores the importance of understanding the work context and information workflow for which systems will be designed. Based on themes and translation information workflow processes, we identified key design guidelines for incorporating MT into PH translation work. Primary amongst these is that MT should be followed by human review for translations to be of high quality and for the technology to be adopted into practice. The time and costs of creating multilingual health promotion materials are barriers to translation. PH personnel were interested in MT's potential to improve access to low-cost translated PH materials, but expressed concerns about ensuring quality. We outline design considerations and a potential machine translation tool to best fit MT systems into PH practice. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Predicting post-translational lysine acetylation using support vector machines

    DEFF Research Database (Denmark)

    Gnad, Florian; Ren, Shubin; Choudhary, Chunaram

    2010-01-01

    spectrometry to identify 3600 lysine acetylation sites on 1750 human proteins covering most of the previously annotated sites and providing the most comprehensive acetylome so far. This dataset should provide an excellent source to train support vector machines (SVMs) allowing the high accuracy in silico...
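
    The general recipe behind such SVM-based site prediction can be sketched as follows: encode a fixed-length sequence window around each candidate lysine as one-hot amino-acid features and fit a support vector classifier. The snippet below uses scikit-learn purely as an illustration; the window size, sequences and labels are hypothetical, not the study's data.

```python
import numpy as np
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {a: i for i, a in enumerate(AA)}

def encode_window(window):
    """One-hot encode a short sequence window centred on a candidate lysine (K)."""
    vec = np.zeros(len(window) * len(AA))
    for pos, aa in enumerate(window):
        if aa in AA_INDEX:
            vec[pos * len(AA) + AA_INDEX[aa]] = 1.0
    return vec

# Hypothetical 7-residue windows and acetylation labels (1 = acetylated).
windows = ["AAKKLMS", "GGKRDES", "PLKAAAV", "TTKEEED"]
labels = [1, 1, 0, 0]

X = np.vstack([encode_window(w) for w in windows])
clf = SVC(kernel="rbf", C=1.0).fit(X, labels)
print(clf.predict([encode_window("AAKKLMT")]))
```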

  10. Bombsights and Adding Machines: Translating Wartime Technology into Peacetime Sales

    Science.gov (United States)

    Tremblay, Michael

    2010-01-01

    On 10 February 1947, A.C. Buehler, the president of the Victor Adding Machine Company presented Norden Bombsight #4120 to the Smithsonian Institute. This sight was in service on board the Enola Gay when it dropped the first atomic bomb on Hiroshima. Through this public presentation, Buehler forever linked his company to the Norden Bombsight, the…

  11. Data extraction from machine-translated versus original language randomized trial reports: a comparative study.

    Science.gov (United States)

    Balk, Ethan M; Chung, Mei; Chen, Minghua L; Chang, Lina Kong Win; Trikalinos, Thomas A

    2013-11-07

    Google Translate offers free Web-based translation, but it is unknown whether its translation accuracy is sufficient to use in systematic reviews to mitigate concerns about language bias. We compared data extraction from non-English language studies with extraction from translations by Google Translate of 10 studies in each of five languages (Chinese, French, German, Japanese and Spanish). Fluent speakers double-extracted original-language articles. Researchers who did not speak the given language double-extracted translated articles along with 10 additional English language trials. Using the original language extractions as a gold standard, we estimated the probability and odds ratio of correctly extracting items from translated articles compared with English, adjusting for reviewer and language. Translation required about 30 minutes per article and extraction of translated articles required additional extraction time. The likelihood of correct extractions was greater for study design and intervention domain items than for outcome descriptions and, particularly, study results. Translated Spanish articles yielded the highest percentage of items (93%) that were correctly extracted more than half the time (followed by German and Japanese 89%, French 85%, and Chinese 78%) but Chinese articles yielded the highest percentage of items (41%) that were correctly extracted >98% of the time (followed by Spanish 30%, French 26%, German 22%, and Japanese 19%). In general, extractors' confidence in translations was not associated with their accuracy. Translation by Google Translate generally required few resources. Based on our analysis of translations from five languages, using machine translation has the potential to reduce language bias in systematic reviews; however, pending additional empirical data, reviewers should be cautious about using translated data. There remains a trade-off between completeness of systematic reviews (including all available studies) and risk of
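
    The headline comparison in such a study reduces to the odds of correct extraction from translated versus original-language articles. The sketch below shows only the unadjusted calculation (the study adjusts for reviewer and language); the counts are hypothetical.

```python
def odds_ratio(correct_a, total_a, correct_b, total_b):
    """Unadjusted odds ratio of correct extraction in group A versus group B."""
    odds_a = correct_a / (total_a - correct_a)
    odds_b = correct_b / (total_b - correct_b)
    return odds_a / odds_b

# Hypothetical counts: items correctly extracted from translated vs. English articles.
print(odds_ratio(correct_a=78, total_a=100, correct_b=93, total_b=100))  # < 1 means harder
```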

  12. Syntactic discriminative language model rerankers for statistical machine translation

    NARCIS (Netherlands)

    Carter, S.; Monz, C.

    2011-01-01

    This article describes a method that successfully exploits syntactic features for n-best translation candidate reranking using perceptrons. We motivate the utility of syntax by demonstrating the superior performance of parsers over n-gram language models in differentiating between Statistical

  13. Some Problems in German to English Machine Translation

    Science.gov (United States)

    1974-12-01

    from semantics is a slippery business, especially when I have just claimed to subscribe to the idea that the structure of an utterance is intimately...from the English translation on page 15, the example paragraph can be divided into sections. These divisions can be characterized as

  14. A user-based usability assessment of raw machine translated technical instructions

    OpenAIRE

    Doherty, Stephen; O'Brien, Sharon

    2012-01-01

    Despite the growth of statistical machine translation (SMT) research and development in recent years, it remains somewhat out of reach for the translation community where programming expertise and knowledge of statistics tend not to be commonplace. While the concept of SMT is relatively straightforward, its implementation in functioning systems remains difficult for most, regardless of expertise. More recently, however, developments such as SmartMATE have emerged which aim to assist users in ...

  15. Percussive drilling application of translational motion permanent magnet machine

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Shujun

    2012-07-01

    It is clear that percussive drills are very promising since they can increase the rate of penetration in hard rock formations. Any small improvement to percussive drills can make a big contribution to lowering drilling costs, since drilling a well for the oil and gas industry is very costly. This thesis presents a percussive drilling system mainly driven by a tubular reciprocating translational motion permanent magnet synchronous motor (RTPMSM), which efficiently converts electric energy to kinetic energy for crushing the hard rock since there is no mechanical medium in between. The thesis starts from the state of the art of percussive drilling techniques, reciprocating translational motion motors, and self-sensing control of electric motors and its implementation issues. The following chapters present modeling of the hard rock, modeling of the drill, the design issues of the drill, and the RTPMSM and its control. A single-phase RTPMSM prototype is tested for hard rock drilling. The presented variable voltage variable frequency control is also validated on it. Space vector control and self-sensing control are also explored on a three-phase RTPMSM prototype. The results show that the percussive drill can be applied to hard rock drilling. A detailed summarisation of contributions and future work is presented at the end of the thesis. (Author)

  16. Machine Learning Approaches in Cardiovascular Imaging.

    Science.gov (United States)

    Henglin, Mir; Stein, Gillian; Hushcha, Pavel V; Snoek, Jasper; Wiltschko, Alexander B; Cheng, Susan

    2017-10-01

    Cardiovascular imaging technologies continue to increase in their capacity to capture and store large quantities of data. Modern computational methods, developed in the field of machine learning, offer new approaches to leveraging the growing volume of imaging data available for analyses. Machine learning methods can now address data-related problems ranging from simple analytic queries of existing measurement data to the more complex challenges involved in analyzing raw images. To date, machine learning has been used in 2 broad and highly interconnected areas: automation of tasks that might otherwise be performed by a human and generation of clinically important new knowledge. Most cardiovascular imaging studies have focused on task-oriented problems, but more studies involving algorithms aimed at generating new clinical insights are emerging. Continued expansion in the size and dimensionality of cardiovascular imaging databases is driving strong interest in applying powerful deep learning methods, in particular, to analyze these data. Overall, the most effective approaches will require an investment in the resources needed to appropriately prepare such large data sets for analyses. Notwithstanding current technical and logistical challenges, machine learning and especially deep learning methods have much to offer and will substantially impact the future practice and science of cardiovascular imaging. © 2017 American Heart Association, Inc.

  17. Personalized translational epilepsy research - Novel approaches and future perspectives: Part II: Experimental and translational approaches.

    Science.gov (United States)

    Bauer, Sebastian; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Rosenow, Felix

    2017-11-01

    Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics, and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. This Part II includes the experimental and translational approaches and a discussion of the future perspectives, while the diagnostic methods, EEG network analysis, biomarkers, and personalized treatment approaches were addressed in Part I [1]. Copyright © 2017

  18. Distinguishing Asthma Phenotypes Using Machine Learning Approaches.

    Science.gov (United States)

    Howard, Rebecca; Rattray, Magnus; Prosperi, Mattia; Custovic, Adnan

    2015-07-01

    Asthma is not a single disease, but an umbrella term for a number of distinct diseases, each of which is caused by a distinct underlying pathophysiological mechanism. These discrete disease entities are often labelled as 'asthma endotypes'. The discovery of different asthma subtypes has moved from subjective approaches, in which putative phenotypes are assigned by experts, to data-driven ones which incorporate machine learning. This review focuses on the methodological developments of one such machine learning technique, latent class analysis, and how it has contributed to distinguishing asthma and wheezing subtypes in childhood. It also gives a clinical perspective, presenting the findings of studies from the past 5 years that used this approach. The identification of true asthma endotypes may be a crucial step towards understanding their distinct pathophysiological mechanisms, which could ultimately lead to more precise prevention strategies, identification of novel therapeutic targets and the development of effective personalized therapies.

  19. Finding Translation Examples for Under-Resourced Language Pairs or for Narrow Domains; the Case for Machine Translation

    Directory of Open Access Journals (Sweden)

    Dan Tufis

    2012-07-01

    Full Text Available Cyberspace is populated with valuable information sources, expressed in about 1500 different languages and dialects. Yet, for the vast majority of web surfers this wealth of information is practically inaccessible or meaningless. Recent advancements in cross-lingual information retrieval, multilingual summarization, cross-lingual question answering and machine translation promise to narrow the linguistic gaps and lower the communication barriers between humans and/or software agents. Most of these language technologies are based on statistical machine learning techniques which require large volumes of cross-lingual data. The most adequate type of cross-lingual data is represented by parallel corpora, collections of reciprocal translations. However, it is not easy to find enough parallel data for any language pair that might be of interest. When the required parallel data refers to specialized (narrow) domains, the scarcity of data becomes even more acute. Intelligent information extraction techniques from comparable corpora provide one of the possible answers to this lack of translation data.

  20. Improving the quality of automated DVD subtitles via example-based machine translation

    DEFF Research Database (Denmark)

    Armstrong, Stephen; Caffrey, Colm; Flanagan, Marian

    Denoual (2005) discovered that, contrary to popular belief, an Example-Based Machine Translation system trained on heterogeneous data produced significantly better results than a system trained on homogeneous data. Using similar evaluation metrics and a few additional ones, in this paper we show...

  1. Crawl and crowd to bring machine translation to under-resourced languages

    NARCIS (Netherlands)

    Toral Ruiz, Antonio

    2017-01-01

    We present a widely applicable methodology to bring machine translation (MT) to under-resourced languages in a cost-effective and rapid manner. Our proposal relies on web crawling to automatically acquire parallel data to train statistical MT systems if any such data can be found for the language

  2. Language Model Adaptation Using Machine-Translated Text for Resource-Deficient Languages

    Directory of Open Access Journals (Sweden)

    Sadaoki Furui

    2009-01-01

    Full Text Available Text corpus size is an important issue when building a language model (LM). This is a particularly important issue for languages where little data is available. This paper introduces an LM adaptation technique to improve an LM built using a small amount of task-dependent text with the help of a machine-translated text corpus. Icelandic speech recognition experiments were performed using data machine translated (MT) from English to Icelandic on a word-by-word and sentence-by-sentence basis. LM interpolation using the baseline LM and an LM built from either word-by-word or sentence-by-sentence translated text reduced the word error rate significantly when manually obtained utterances used as a baseline were very sparse.
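
    The adaptation step described above is a linear interpolation of two language models. A minimal sketch of mixing per-word probabilities from a small task-dependent LM and an LM built on machine-translated text is given below; the component models are hypothetical stand-ins, and the interpolation weight would normally be tuned to minimise perplexity on held-out data.

```python
def interpolate_lm(p_task, p_mt, lam=0.7):
    """Return a language model mixing a task-dependent LM with an LM estimated
    on machine-translated text: p(w|h) = lam*p_task(w|h) + (1-lam)*p_mt(w|h)."""
    def p(word, history):
        return lam * p_task(word, history) + (1.0 - lam) * p_mt(word, history)
    return p

# Usage (hypothetical components):
# p_adapted = interpolate_lm(p_small_icelandic_lm, p_mt_text_lm, lam=0.6)
```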

  3. An Image Processing Approach to Linguistic Translation

    Science.gov (United States)

    Kubatur, Shruthi; Sreehari, Suhas; Hegde, Rajeshwari

    2011-12-01

    The art of translation is as old as written literature. Developments since the Industrial Revolution have influenced the practice of translation, nurturing schools, professional associations, and standards. In this paper, we propose a method of translation of typed Kannada text (taken as an image) into its equivalent English text. The National Instruments (NI) Vision Assistant (version 8.5) has been used for Optical Character Recognition (OCR). We developed a new way of transliteration (which we call NIV transliteration) to simplify the training of characters. Also, we built a special type of dictionary for the purpose of translation.

  4. Translation: Aids, Robots, and Automation.

    Science.gov (United States)

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  5. TRANSLATING AS A PURPOSEFUL ACTIVITY:A PROSPECTIVE APPROACH

    Directory of Open Access Journals (Sweden)

    Christiane Nord

    2006-01-01

    Full Text Available Taking a prospective approach to translation, translators choose their translation strategies according to the purpose or function the translated text is intended to fulfill for the target audience. Since communicative purposes need certain conditions in order to work, it is the translator's task to analyze the conditions of the target culture and to decide whether, and how, the source-text purposes can work for the target audience according to the specifications of the translation brief. If the target-culture conditions differ from those of the source culture, there are usually two basic options: either to transform the text in such a way that it can work under target-culture conditions (= instrumental translation), or to replace the source-text functions by their respective meta-functions (= documentary translation).

  6. Translation in language learning: a ‘what for’ approach

    Directory of Open Access Journals (Sweden)

    Paolo E. Balboni

    2017-12-01

    Full Text Available Literature about translation in language learning and teaching shows the prominence of the ‘for and against’ approach, while a ‘what for’ approach would be more profitable. In order to prevent the latter approach from becoming a random list of the potential benefits of the use of translation in language teaching, this essay suggests the use of a formal model of communicative competence, to see which of its components can profit from translation activities. The result is a map of the effects of translation across the wide range of competences and abilities which constitute language learning.

  7. CloudLM: a Cloud-based Language Model for Machine Translation

    Directory of Open Access Journals (Sweden)

    Ferrández-Tordera Jorge

    2016-04-01

    Full Text Available Language models (LMs) are an essential element in statistical approaches to natural language processing for tasks such as speech recognition and machine translation (MT). The advent of big data leads to the availability of massive amounts of data to build LMs, and in fact, for the most prominent languages, using current techniques and hardware, it is not feasible to train LMs with all the data available nowadays. At the same time, it has been shown that the more data is used for an LM, the better the performance, e.g. for MT, without any indication yet of reaching a plateau. This paper presents CloudLM, an open-source cloud-based LM intended for MT, which allows distributed LMs to be queried. CloudLM relies on Apache Solr and provides the functionality of state-of-the-art language modelling (it builds upon KenLM), while allowing massive LMs to be queried (as the use of local memory is drastically reduced), at the expense of slower decoding speed.

  8. Identity approach in translation : sociocultural implications

    Directory of Open Access Journals (Sweden)

    Alicja Żuchelkowska

    2012-01-01

    Full Text Available The objective of this text consists in presenting how it is necessary for contemporary translators and interpreters (both literary and specialised) to acquire and develop the ability to recognize elements of identity discourse in translated texts. Nowadays, the need for inter-cultural exchange is inevitably connected with the necessity of establishing harmonious co-existence for numerous cultures and identities. Therefore, it is crucial to educate translators in a way that enables them to pay special attention to identity and cultural perturbations present in translated texts (culture and language hybridisation, multiple identity, cultural dislocation, presence in linguistic and political discourse of minority cultures), regardless of their genre or form. Such a strong emphasis on identity problems in translation is especially relevant in the European context, where the attention of researchers and politicians directed at identity problems stemming from ethnic and cultural issues sets the framework for a new cultural paradigm that determines the future development of the EU. Becoming acquainted with this paradigm, which emphasises fluency, identity unmarkedness and the new model of European collectivity, is indispensable for a translator aspiring to become a true cultural mediator.

  9. Machine translation (MT): qualità, produttività, customer satisfaction

    OpenAIRE

    Fellet, Anna

    2010-01-01

    The aim of the present research is to examine the impact of recent technological developments in machine translation (MT) in the language industry. The objectives are: 1. To define the value of MT in terms of suitability and convenience in meeting expressed requirements in those cases where MT is demanded; 2. To examine the potential increase in productivity through a conscious use of the tool; 3. To analyse those activities aimed at satisfying the customer’s explicit and impli...

  10. Handbook of natural language processing and machine translation DARPA global autonomous language exploitation

    CERN Document Server

    Olive, Joseph P; McCary, John

    2011-01-01

    This comprehensive handbook, written by leading experts in the field, details the groundbreaking research conducted under the breakthrough GALE program - The Global Autonomous Language Exploitation within the Defense Advanced Research Projects Agency (DARPA), while placing it in the context of previous research in the fields of natural language and signal processing, artificial intelligence and machine translation. The most fundamental contrast between GALE and its predecessor programs was its holistic integration of previously separate or sequential processes. In earlier language research pro

  11. Adaptation of machine translation for multilingual information retrieval in the medical domain.

    Science.gov (United States)

    Pecina, Pavel; Dušek, Ondřej; Goeuriot, Lorraine; Hajič, Jan; Hlaváčová, Jaroslava; Jones, Gareth J F; Kelly, Liadh; Leveling, Johannes; Mareček, David; Novák, Michal; Popel, Martin; Rosa, Rudolf; Tamchyna, Aleš; Urešová, Zdeňka

    2014-07-01

    We investigate machine translation (MT) of user search queries in the context of cross-lingual information retrieval (IR) in the medical domain. The main focus is on techniques to adapt MT to increase translation quality; however, we also explore MT adaptation to improve effectiveness of cross-lingual IR. Our MT system is Moses, a state-of-the-art phrase-based statistical machine translation system. The IR system is based on the BM25 retrieval model implemented in the Lucene search engine. The MT techniques employed in this work include in-domain training and tuning, intelligent training data selection, optimization of phrase table configuration, compound splitting, and exploiting synonyms as translation variants. The IR methods include morphological normalization and using multiple translation variants for query expansion. The experiments are performed and thoroughly evaluated on three language pairs: Czech-English, German-English, and French-English. MT quality is evaluated on data sets created within the Khresmoi project and IR effectiveness is tested on the CLEF eHealth 2013 data sets. The search query translation results achieved in our experiments are outstanding - our systems outperform not only our strong baselines, but also Google Translate and Microsoft Bing Translator in direct comparison carried out on all the language pairs. The baseline BLEU scores increased from 26.59 to 41.45 for Czech-English, from 23.03 to 40.82 for German-English, and from 32.67 to 40.82 for French-English. This is a 55% improvement on average. In terms of the IR performance on this particular test collection, a significant improvement over the baseline is achieved only for French-English. For Czech-English and German-English, the increased MT quality does not lead to better IR results. Most of the MT techniques employed in our experiments improve MT of medical search queries. Especially the intelligent training data selection proves to be very successful for domain adaptation of
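
    One of the IR-side techniques mentioned, using multiple translation variants for query expansion, can be illustrated by pooling the terms of several MT variants of a query and scoring documents with a BM25-style weighting. The scoring function below is hand-rolled for illustration (the reported system relies on Lucene's BM25 implementation), and the example data are hypothetical.

```python
import math
from collections import Counter

def bm25_score(query_terms, doc_tokens, docs, k1=1.2, b=0.75):
    """Score one document against query terms with a simple BM25 weighting."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    tf = Counter(doc_tokens)
    score = 0.0
    for t in set(query_terms):
        df = sum(1 for d in docs if t in d)
        if df == 0:
            continue
        idf = math.log(1 + (N - df + 0.5) / (df + 0.5))
        denom = tf[t] + k1 * (1 - b + b * len(doc_tokens) / avgdl)
        score += idf * tf[t] * (k1 + 1) / denom
    return score

# Query expansion: pool the terms of several MT variants of the source query.
variants = ["heart attack symptoms", "symptoms of myocardial infarction"]
expanded_query = [t for v in variants for t in v.split()]
docs = [d.split() for d in ["myocardial infarction presents with chest pain",
                            "influenza symptoms include fever and cough"]]
ranked = sorted(docs, key=lambda d: bm25_score(expanded_query, d, docs), reverse=True)
print(ranked[0])
```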

  12. Translational approach for gene therapy in epilepsy

    DEFF Research Database (Denmark)

    Ledri, Litsa Nikitidou; Melin, Esbjörn; Christiansen, Søren H.

    2016-01-01

    clinical trial for gene therapy of temporal lobe epilepsy was explored: We investigated (i) whether the post intrahippocampal kainate-induced status epilepticus (SE) model of chronic epilepsy in rats could be clinically relevant; and (ii) whether a translationally designed neuropeptide Y (NPY)/Y2 receptor...

  13. Statistical and machine learning approaches for network analysis

    CERN Document Server

    Dehmer, Matthias

    2012-01-01

    Explore the multidisciplinary nature of complex networks through machine learning techniques Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks. Comprised of chapters written by internation

  14. Understanding the organization of cognitive approaches to translation

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Cognitive approaches to translation studies are driven by three interrelated aims: to understand the structure and organization of the capacities of cognitive agents involved in processes of translation, to build better theories and models of translation, and to develop more efficient methods...... theory, it is more descriptively adequate and more fruitful to understand it as a family of projects based on multiple theories that are relevant for studying different aspects of the translation process. This perspective allows us to extract the erotetic structure of these programs which are organized...

  15. Modularity Design Approach for Preventive Machine Maintenance

    Science.gov (United States)

    Ernawati, D.; Pudji, E.; Ngatilah, Y.; Handoyo, R.

    2018-01-01

    In a company, the machine maintenance system is very influential in production process activity. The company should have a scheduled machine maintenance system that does not require high costs when repairing and replacing machine parts. The Modularity Design method is able to provide solutions to the machine maintenance scheduling system and can prevent fatal damage to machine components, minimizing the cost of repair and replacement of those components. The paper provides a solution to machine maintenance problems and is completed with a case study of milling machines. That case study gives a real description of the impact of implementing Modularity Design to prevent fatal damage to components and to minimize the cost of repair and replacement of machine components.

  16. The cognitive approach to conscious machines

    CERN Document Server

    Haikonen, Pentti O

    2003-01-01

    Could a machine have an immaterial mind? The author argues that true conscious machines can be built, but rejects artificial intelligence and classical neural networks in favour of the emulation of the cognitive processes of the brain-the flow of inner speech, inner imagery and emotions. This results in a non-numeric meaning-processing machine with distributed information representation and system reactions. It is argued that this machine would be conscious; it would be aware of its own existence and its mental content and perceive this as immaterial. Novel views on consciousness and the mind-

  17. Our Policies, Their Text: German Language Students' Strategies with and Beliefs about Web-Based Machine Translation

    Science.gov (United States)

    White, Kelsey D.; Heidrich, Emily

    2013-01-01

    Most educators are aware that some students utilize web-based machine translators for foreign language assignments, however, little research has been done to determine how and why students utilize these programs, or what the implications are for language learning and teaching. In this mixed-methods study we utilized surveys, a translation task,…

  18. Addictions Neuroclinical Assessment: A reverse translational approach.

    Science.gov (United States)

    Kwako, Laura E; Momenan, Reza; Grodin, Erica N; Litten, Raye Z; Koob, George F; Goldman, David

    2017-08-01

    Incentive salience, negative emotionality, and executive function are functional domains that are etiologic in the initiation and progression of addictive disorders, having been implicated in humans with addictive disorders and in animal models of addictions. Measures of these three neuroscience-based functional domains can capture much of the effects of inheritance and early exposures that lead to trait vulnerability shared across different addictive disorders. For specific addictive disorders, these measures can be supplemented by agent specific measures such as those that access pharmacodynamic and pharmacokinetic variation attributable to agent-specific gatekeeper molecules including receptors and drug-metabolizing enzymes. Herein, we focus on the translation and reverse translation of knowledge derived from animal models of addiction to the human condition via measures of neurobiological processes that are orthologous in animals and humans, and that are shared in addictions to different agents. Based on preclinical data and human studies, measures of these domains in a general framework of an Addictions Neuroclinical Assessment (ANA) can transform the assessment and nosology of addictive disorders, and can be informative for staging disease progression. We consider next steps and challenges for implementation of ANA in clinical care and research. This article is part of the Special Issue entitled "Alcoholism". Published by Elsevier Ltd.

  19. Intelligent Machine Learning Approaches for Aerospace Applications

    Science.gov (United States)

    Sathyan, Anoop

    Machine Learning is a type of artificial intelligence that provides machines or networks with the ability to learn from data without the need to explicitly program them. There are different kinds of machine learning techniques. This thesis discusses the applications of two of these approaches: Genetic Fuzzy Logic and Convolutional Neural Networks (CNN). A Fuzzy Logic System (FLS) is a powerful tool that can be used for a wide variety of applications. An FLS is a universal approximator that reduces the need for complex mathematics and replaces it with expert knowledge of the system to produce an input-output mapping using If-Then rules. The expert knowledge of a system can help in obtaining the parameters for small-scale FLSs, but for larger networks we will need to use sophisticated approaches that can automatically train the network to meet the design requirements. This is where Genetic Algorithms (GA) and EVE come into the picture. Both GA and EVE can tune the FLS parameters to minimize a cost function that is designed to meet the requirements of the specific problem. EVE is an artificial intelligence developed by Psibernetix that is trained to tune large-scale FLSs. The parameters of an FLS can include the membership functions and rulebase of the inherent Fuzzy Inference Systems (FISs). The main issue with using the GFS is that the number of parameters in an FIS increases exponentially with the number of inputs, thus making the parameters increasingly harder to tune. To reduce this issue, the FLSs discussed in this thesis consist of 2-input-1-output FISs in cascade (Chapter 4) or as a layer of parallel FISs (Chapter 7). We have obtained extremely good results using GFS for different applications at a reduced computational cost compared to other algorithms that are commonly used to solve the corresponding problems. In this thesis, GFSs have been designed for controlling an inverted double pendulum, a task allocation problem of clustering targets amongst a set of UAVs, a fire
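
    The cascaded 2-input-1-output FISs mentioned above can be illustrated with a tiny zero-order Sugeno-style inference unit; in the work described, a genetic algorithm (or EVE) would tune the membership functions and rule outputs, whereas everything below is hand-set and purely hypothetical.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fis2(x1, x2, rules):
    """2-input-1-output zero-order Sugeno FIS: weighted average of rule outputs."""
    num = den = 0.0
    for mf1, mf2, out in rules:
        w = min(mf1(x1), mf2(x2))            # rule firing strength (min t-norm)
        num += w * out
        den += w
    return num / den if den else 0.0

low = lambda x: tri(x, -1.0, 0.0, 1.0)
high = lambda x: tri(x, 0.0, 1.0, 2.0)
rules = [(low, low, 0.0), (low, high, 0.5), (high, low, 0.5), (high, high, 1.0)]

# Cascade: the output of the first FIS feeds one input of the second, which keeps
# the number of parameters roughly linear in the number of inputs.
y1 = fis2(0.3, 0.8, rules)
y2 = fis2(y1, 0.2, rules)
print(y1, y2)
```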

  20. Preliminary study of online machine translation use of nursing literature: quality evaluation and perceived usability

    Directory of Open Access Journals (Sweden)

    Anazawa Ryoko

    2012-11-01

    Full Text Available Abstract Background Japanese nurses are increasingly required to read published international research in clinical, educational, and research settings. Language barriers are a significant obstacle, and online machine translation (MT) is a tool that can be used to address this issue. We examined the quality of Google Translate® (English to Japanese and Korean to Japanese), which is a representative online MT, using a previously verified evaluation method. We also examined the perceived usability and current use of online MT among Japanese nurses. Findings Randomly selected nursing abstracts were translated and then evaluated for intelligibility and usability by 28 participants, including assistants and research associates from nursing universities throughout Japan. They answered a questionnaire about their online MT use. From a simple comparison of mean scores between the two language pairs, translation quality was significantly better, with respect to both intelligibility and usability, for Korean-Japanese than for English-Japanese. Most respondents perceived a language barrier. Online MT had been used by 61% of the respondents and was perceived as not useful enough. Conclusion Nursing articles translated from Korean into Japanese by an online MT system could be read at an acceptable level of comprehension, but the same could not be said for English-Japanese translations. Respondents with experience using online MT used it largely to grasp the overall meanings of the original text. Enrichment in technical terms appeared to be the key to better usability. Users will be better able to use MT outputs if they improve their foreign language proficiency as much as possible. Further research is being conducted with a larger sample size and detailed analysis.

  1. Breaking the language barrier: machine assisted diagnosis using the medical speech translator.

    Science.gov (United States)

    Starlander, Marianne; Bouillon, Pierrette; Rayner, Manny; Chatzichrisafis, Nikos; Hockey, Beth Ann; Isahara, Hitoshi; Kanzaki, Kyoko; Nakao, Yukie; Santaholma, Marianne

    2005-01-01

    In this paper, we describe and evaluate an Open Source medical speech translation system (MedSLT) intended for safety-critical applications. The aim of this system is to eliminate language barriers in emergency situations. It translates spoken questions from English into French, Japanese and Finnish in three medical subdomains (headache, chest pain and abdominal pain), using a vocabulary of about 250-400 words per sub-domain. The architecture is a compromise between fixed-phrase translation on one hand and complex linguistically-based systems on the other. Recognition is guided by a Context Free Grammar Language Model compiled from a general unification grammar, automatically specialised for the domain. We present an evaluation of this initial prototype that shows the advantages of this grammar-based approach for this particular translation task in terms of both reliability and use.

  2. Extracting Date/Time Expressions in Super-Function Based Japanese-English Machine Translation

    Science.gov (United States)

    Sasayama, Manabu; Kuroiwa, Shingo; Ren, Fuji

    Super-Function Based Machine Translation (SFBMT), which is a type of Example-Based Machine Translation, has a feature which makes it possible to expand the coverage of examples by changing nouns into variables; however, there were problems extracting entire date/time expressions containing parts-of-speech other than nouns, because only nouns/numbers were changed into variables. We describe a method for extracting date/time expressions for SFBMT. SFBMT uses noun determination rules to extract nouns and a bilingual dictionary to obtain the correspondence of the extracted nouns between the source and the target languages. In this method, we add a rule to extract date/time expressions and then extract date/time expressions from a Japanese-English bilingual corpus. The evaluation results show that the precision of this method for Japanese sentences is 96.7%, with a recall of 98.2%, and the precision for English sentences is 94.7%, with a recall of 92.7%.
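
    Two small pieces the described method needs are a rule that matches date/time expressions so they can be replaced by variables, and the precision/recall computation used in the evaluation. The sketch below is a heavily simplified, hypothetical version of both; the actual extraction rules in the paper are richer.

```python
import re

# Simplified date/time pattern covering a few Japanese- and English-style expressions.
DATETIME = re.compile(
    r"(\d{1,4}年\d{1,2}月\d{1,2}日|\d{1,2}時\d{1,2}分"
    r"|\d{1,2}:\d{2}\s?(?:am|pm)?|\w+ \d{1,2}, \d{4})", re.IGNORECASE)

def extract_datetimes(sentence):
    """Return date/time spans that would be turned into Super-Function variables."""
    return DATETIME.findall(sentence)

def precision_recall(predicted, gold):
    true_positives = len(set(predicted) & set(gold))
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    return precision, recall

print(extract_datetimes("The meeting is on March 5, 2024 at 10:30 am."))
print(precision_recall(["March 5, 2024", "10:30 am"], ["March 5, 2024", "10:30 am"]))
```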

  3. Translational research-the need of a new bioethics approach.

    Science.gov (United States)

    Hostiuc, Sorin; Moldoveanu, Alin; Dascălu, Maria-Iuliana; Unnthorsson, Runar; Jóhannesson, Ómar I; Marcus, Ioan

    2016-01-15

    Translational research tries to apply findings from basic science to enhance human health and well-being. Many phases of translational research may include non-medical tasks (information technology, engineering, nanotechnology, biochemistry, animal research, economics, sociology, psychology, politics, and so on). Applying common bioethics principles to these areas might sometimes not be feasible, or even impossible. However, the whole process must respect some fundamental moral principles. The purpose of this paper is to argue the need for a different approach to morality in translational bioethics, and to suggest some directions that might be followed when constructing such a bioethics. We will show that a new approach is needed and present a few ethical issues that are specific to translational research.

  4. Aligning qualitative and quantitative approaches in professional translation quality assessment

    OpenAIRE

    Martínez Mateo, Roberto

    2016-01-01

    Translation Quality Assessment in professional translation is a long-debated issue that is still unsettled today, partly, due to the wide range of possible approaches. Given the elusive nature of the quality concept, first, it must be defined from a multifaceted and all-embracing viewpoint. Simultaneously and from a textual perspective, the quality notion must be defined as a notion of relative (and not absolute) adequacy with respect to a framework previously agreed by parties at...

  5. Technology for Large-Scale Translation of Clinical Practice Guidelines: A Pilot Study of the Performance of a Hybrid Human and Computer-Assisted Approach.

    Science.gov (United States)

    Van de Velde, Stijn; Macken, Lieve; Vanneste, Koen; Goossens, Martine; Vanschoenbeek, Jan; Aertgeerts, Bert; Vanopstal, Klaar; Vander Stichele, Robert; Buysschaert, Joost

    2015-10-09

    The construction of EBMPracticeNet, a national electronic point-of-care information platform in Belgium, began in 2011 to optimize quality of care by promoting evidence-based decision making. The project involved, among other tasks, the translation of 940 EBM Guidelines of Duodecim Medical Publications from English into Dutch and French. Considering the scale of the translation process, it was decided to make use of computer-aided translation performed by certificated translators with limited expertise in medical translation. Our consortium used a hybrid approach, involving a human translator supported by a translation memory (using SDL Trados Studio), terminology recognition (using SDL MultiTerm terminology databases) from medical terminology databases, and support from online machine translation. This resulted in a validated translation memory, which is now in use for the translation of new and updated guidelines. The objective of this experiment was to evaluate the performance of the hybrid human and computer-assisted approach in comparison with translation unsupported by translation memory and terminology recognition. A comparison was also made with the translation efficiency of an expert medical translator. We conducted a pilot study in which two sets of 30 new and 30 updated guidelines were randomized to one of three groups. Comparable guidelines were translated (1) by certificated junior translators without medical specialization using the hybrid method, (2) by an experienced medical translator without this support, and (3) by the same junior translators without the support of the validated translation memory. A medical proofreader who was blinded for the translation procedure, evaluated the translated guidelines for acceptability and adequacy. Translation speed was measured by recording translation and post-editing time. The human translation edit rate was calculated as a metric to evaluate the quality of the translation. A further evaluation was made of

  6. Integrated Features by Administering the Support Vector Machine (SVM) of Translational Initiation Sites in Alternative Polymorphic Context

    Directory of Open Access Journals (Sweden)

    Nurul Arneida Husin

    2012-04-01

    Full Text Available Many algorithms and methods have been proposed for classification problems in bioinformatics. In this study, the discriminative approach, in particular support vector machines (SVM), is employed to recognize the studied TIS patterns. The applied discriminative approach is used to learn discriminant functions from samples that have been labelled as positive or negative. After learning, the discriminant functions are employed to decide whether a new sample is true or false. In this study, support vector machines (SVM) are employed to recognize the patterns of the studied translational initiation sites in an alternative weak context. The method has been optimized with the best parameters selected: c=100, E=10^-6 and ex=2 for the nonlinear kernel function. Results show that with the top 5 features and a nonlinear kernel, the best prediction accuracy achieved is 95.8%. The J48 algorithm is applied for comparison with SVM using the top 15 features, and the results show a good prediction accuracy of 95.8%. This indicates that the top 5 features selected by the IGR method and used by the SVM are sufficient for the prediction of TIS in weak contexts.
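
    For readers unfamiliar with the setup, the following sketch shows how a discriminative SVM of the kind described above can be trained on labelled sequence windows. The one-hot window encoding and the toy data are assumptions; the degree-2 polynomial kernel and C=100 merely echo the parameter values quoted in the abstract.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    BASES = "ACGT"

    def one_hot_window(seq: str) -> np.ndarray:
        """Encode a fixed-length nucleotide window as a flat one-hot vector (assumed encoding)."""
        vec = np.zeros((len(seq), len(BASES)))
        for i, base in enumerate(seq.upper()):
            if base in BASES:
                vec[i, BASES.index(base)] = 1.0
        return vec.ravel()

    # Toy labelled windows around candidate start codons (1 = true TIS, 0 = false).
    windows = ["GCCACCATGG", "TTTTTTATGT", "GCCGCCATGC", "ATATATATGA"]
    labels = [1, 0, 1, 0]

    X = np.array([one_hot_window(w) for w in windows])
    y = np.array(labels)

    # Degree-2 polynomial kernel and C=100 echo the abstract; the tolerance is an assumed value.
    clf = SVC(kernel="poly", degree=2, C=100, tol=1e-6)
    clf.fit(X, y)
    print(clf.predict(X))
    ```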

  7. a Perturbation Approach to Translational Gravity

    Science.gov (United States)

    Julve, J.; Tiemblo, A.

    2013-05-01

    Within a gauge formulation of 3+1 gravity relying on a nonlinear realization of the group of isometries of space-time, a natural expansion of the metric tensor arises and a simple choice of the gravity dynamical variables is possible. We show that the expansion parameter can be identified with the gravitational constant and that the first-order term depends only on a diagonal matrix in the ensuing perturbation approach. The explicit first-order solution is calculated in the static isotropic case, and its general structure is worked out in the harmonic gauge.

  8. Machine learning: novel bioinformatics approaches for combating antimicrobial resistance.

    Science.gov (United States)

    Macesic, Nenad; Polubriaginof, Fernanda; Tatonetti, Nicholas P

    2017-12-01

    Antimicrobial resistance (AMR) is a threat to global health and new approaches to combating AMR are needed. Use of machine learning in addressing AMR is in its infancy but has made promising steps. We reviewed the current literature on the use of machine learning for studying bacterial AMR. The advent of large-scale data sets provided by next-generation sequencing and electronic health records makes applying machine learning to the study and treatment of AMR possible. To date, it has been used for antimicrobial susceptibility genotype/phenotype prediction, development of AMR clinical decision rules, novel antimicrobial agent discovery and antimicrobial therapy optimization. Application of machine learning to studying AMR is feasible but remains limited. Implementation of machine learning in clinical settings faces barriers to uptake with concerns regarding model interpretability and data quality. Future applications of machine learning to AMR are likely to be laboratory-based, such as antimicrobial susceptibility phenotype prediction.

  9. Predicting Post-Translational Modifications from Local Sequence Fragments Using Machine Learning Algorithms: Overview and Best Practices.

    Science.gov (United States)

    Tatjewski, Marcin; Kierczak, Marcin; Plewczynski, Dariusz

    2017-01-01

    Here, we present two perspectives on the task of predicting post-translational modifications (PTMs) from local sequence fragments using machine learning algorithms. The first is a description of the fundamental steps required to construct a PTM predictor from the very beginning. These steps include data gathering, feature extraction, and machine-learning classifier selection. The second part of our work contains a detailed discussion of the more advanced problems encountered in the PTM prediction task. Probably the most challenging issues covered here are: (1) how to address the training data class imbalance problem (we also present statistics describing the problem); (2) how to properly set up cross-validation folds with an approach that takes into account the homology of protein data records; to address this problem we present our folds-over-clusters algorithm; and (3) how to efficiently reach for new sources of learning features. The presented techniques and notes resulted from intensive studies in the field, performed by our group and others, and can be useful both for researchers beginning in the field of PTM prediction and for those who want to extend the repertoire of their research techniques.
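
    The homology-aware fold construction mentioned in point (2) can be approximated with off-the-shelf tools: if each fragment carries a cluster identifier from sequence clustering, grouping folds by cluster keeps homologous records out of the training and test sets of the same fold. The sketch below uses scikit-learn's GroupKFold as an analogous illustration; it is not the authors' folds-over-clusters algorithm, and the data are synthetic assumptions.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GroupKFold

    rng = np.random.default_rng(0)

    # Toy data: 100 sequence fragments, 20 features each, a binary PTM label,
    # and a cluster id per fragment produced by some homology clustering (assumed).
    X = rng.normal(size=(100, 20))
    y = rng.integers(0, 2, size=100)
    clusters = rng.integers(0, 10, size=100)

    cv = GroupKFold(n_splits=5)
    for fold, (train_idx, test_idx) in enumerate(cv.split(X, y, groups=clusters)):
        # class_weight="balanced" also nods at the class-imbalance issue in point (1).
        clf = RandomForestClassifier(n_estimators=100, class_weight="balanced", random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        acc = clf.score(X[test_idx], y[test_idx])
        # No cluster id appears in both the training and the test split of a fold.
        print(f"fold {fold}: accuracy {acc:.2f}")
    ```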

  10. Proposal and Evaluation of Sequencing Words in Chat Conversation between Japanese and Chinese using Machine Translation

    OpenAIRE

    李, 芬慧; 由井薗, 隆也

    2010-01-01

    We propose chat communication between Japanese and Chinese speakers based on conversations composed of sequenced words, using machine translation. For comparison, an evaluation experiment with ordinary sentence-based chat was also carried out. The results for Japanese-Chinese translation chat show that (1) word chat can be used as effectively as sentence chat in terms of conversation speed and understanding of conversation content, (2) users tend to prefer sentence chat over word chat, and (3) there may be cultural differences between Japanese and Chinese users in how translated conversations are understood. Applications of word chat will be examined in future work.

  11. Report on Approaches to Database Translation. Final Report.

    Science.gov (United States)

    Gallagher, Leonard; Salazar, Sandra

    This report describes approaches to database translation (i.e., transferring data and data definitions from a source, either a database management system (DBMS) or a batch file, to a target DBMS), and recommends a method for representing the data structures of newly-proposed network and relational data models in a form suitable for database…

  12. Let the Game Begin: Ergodic as an Approach for Video Game Translation

    OpenAIRE

    SF. Lukfianka Sanjaya Purnama; SF. Luthfie Arguby Purnomo; Dyah Nugrahani

    2016-01-01

    This paper attempts to propose ergodic as an approach for video game translation. The word approach here refers to an approach for translation products and to an approach for the translation process. The steps to formulate ergodic as an approach are: first, Aarseth’s ergodic literature is reviewed to elicit a basis for comprehension toward its relationship with video games and video game translation. Secondly, taking the translation of Electronic Arts’ Need for Speed: Own the City, Midway’s Morta...

  13. Machine learning and computer vision approaches for phenotypic profiling.

    Science.gov (United States)

    Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J

    2017-01-02

    With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. © 2017 Grys et al.

  14. Remote leak localization approach for fusion machines

    International Nuclear Information System (INIS)

    Durocher, Au.; Bruno, V.; Chantant, M.; Gargiulo, L.; Gherman, T.; Hatchressian, J.-C.; Houry, M.; Le, R.; Mouyon, D.

    2013-01-01

    Highlights: ► Description of leaks issue. ► Selection of leak localization concepts. ► Qualification of leak localization concepts. -- Abstract: Fusion machine operation requires high-vacuum conditions and does not tolerate water or gas leaks in the vacuum vessels, even if they are micrometric. Tore Supra, as a fully actively cooled tokamak, has a large leak management experience; 34 water leaks have occurred since the beginning of its operation in 1988. To handle this issue, after preliminary machine protection phases, the current process for leak localization is based on water or helium pressurization, network by network. It generally allows the identification of a set of components where the leaking element is located. However, the unique background of the CEA-IRFM laboratory points out the need for accuracy and promptness in the leak localization process. Moreover, in-vessel interventions have to be performed while trying to minimize time and risks for personnel. These risks are linked to access conditions, radioactivity, high-pressure tracer gas and vessel conditioning. Remote operation will be one of the ways to improve these points on future fusion machines. In this case, leak sensors would have to be lightweight devices in order to be integrated on a carrier or to be located outside with a sniffing process set up. A leak localization program is ongoing at the CEA-IRFM laboratory with the first goal of identifying and characterizing relevant concepts to localize helium or water leaks on ITER. At the same time, CEA has developed a robotic carrier for effective in-vessel intervention in a hostile environment. Three major test campaigns with the goal of identifying leak sensors have been carried out on several CEA test-beds since 2010. Very promising results have been obtained: a relevant scenario of leak localization performed, concepts tested in a high-volume test-bed called TITAN, and, in several conditions of pressure and temperature (ultrahigh vacuum to atmospheric pressure and 20

  15. Remote leak localization approach for fusion machines

    Energy Technology Data Exchange (ETDEWEB)

    Durocher, Au., E-mail: aurelien.durocher@cea.fr [CEA-IRFM, F-13108 Saint Paul-Lez-Durance (France); Bruno, V.; Chantant, M.; Gargiulo, L. [CEA-IRFM, F-13108 Saint Paul-Lez-Durance (France); Gherman, T. [Floralis UJF Filiale, F-38610 Gières (France); Hatchressian, J.-C.; Houry, M.; Le, R.; Mouyon, D. [CEA-IRFM, F-13108 Saint Paul-Lez-Durance (France)

    2013-10-15

    Highlights: ► Description of leaks issue. ► Selection of leak localization concepts. ► Qualification of leak localization concepts. -- Abstract: Fusion machine operation requires high-vacuum conditions and does not tolerate water or gas leaks in the vacuum vessels, even if they are micrometric. Tore Supra, as a fully actively cooled tokamak, has a large leak management experience; 34 water leaks have occurred since the beginning of its operation in 1988. To handle this issue, after preliminary machine protection phases, the current process for leak localization is based on water or helium pressurization, network by network. It generally allows the identification of a set of components where the leaking element is located. However, the unique background of the CEA-IRFM laboratory points out the need for accuracy and promptness in the leak localization process. Moreover, in-vessel interventions have to be performed while trying to minimize time and risks for personnel. These risks are linked to access conditions, radioactivity, high-pressure tracer gas and vessel conditioning. Remote operation will be one of the ways to improve these points on future fusion machines. In this case, leak sensors would have to be lightweight devices in order to be integrated on a carrier or to be located outside with a sniffing process set up. A leak localization program is ongoing at the CEA-IRFM laboratory with the first goal of identifying and characterizing relevant concepts to localize helium or water leaks on ITER. At the same time, CEA has developed a robotic carrier for effective in-vessel intervention in a hostile environment. Three major test campaigns with the goal of identifying leak sensors have been carried out on several CEA test-beds since 2010. Very promising results have been obtained: a relevant scenario of leak localization performed, concepts tested in a high-volume test-bed called TITAN, and, in several conditions of pressure and temperature (ultrahigh vacuum to atmospheric pressure and 20

  16. A META-COMPOSITE SOFTWARE DEVELOPMENT APPROACH FOR TRANSLATIONAL RESEARCH

    Science.gov (United States)

    Sadasivam, Rajani S.; Tanik, Murat M.

    2013-01-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users’ needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements. PMID:23504436

  17. A meta-composite software development approach for translational research.

    Science.gov (United States)

    Sadasivam, Rajani S; Tanik, Murat M

    2013-06-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users' needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements.

  18. Man-machine analysis of translation and work tasks of Skylab films

    Science.gov (United States)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.

  19. Use of Online Machine Translation for Nursing Literature: A Questionnaire-Based Survey

    Science.gov (United States)

    Anazawa, Ryoko; Ishikawa, Hirono; Takahiro, Kiuchi

    2013-01-01

    Background: The language barrier is a significant obstacle for nurses who are not native English speakers to obtain information from international journals. Freely accessible online machine translation (MT) offers a possible solution to this problem. Aim: To explore how Japanese nursing professionals use online MT and perceive its usability in reading English articles and to discuss what should be considered for better utilisation of online MT lessening the language barrier. Method: In total, 250 randomly selected assistants and research associates at nursing colleges across Japan answered a questionnaire examining the current use of online MT and perceived usability among Japanese nurses, along with the number of articles read in English and the perceived language barrier. The items were rated on Likert scales, and t-test, ANOVA, chi-square test, and Spearman’s correlation were used for analyses. Results: Of the participants, 73.8% had used online MT. More than half of them felt it was usable. The language barrier was strongly felt, and academic degrees and English proficiency level were associated factors. The perceived language barrier was related to the frequency of online MT use. No associated factor was found for the perceived usability of online MT. Conclusion: Language proficiency is an important factor for optimum utilisation of MT. A need for education in the English language, reading scientific papers, and online MT training was indicated. Cooperation with developers and providers of MT for the improvement of their systems is required. PMID:23459140

  20. An Approach for Implementing State Machines with Online Testability

    Directory of Open Access Journals (Sweden)

    P. K. Lala

    2010-01-01

    Full Text Available During the last two decades, a significant amount of research has been performed to simplify the detection of transient or soft errors in VLSI-based digital systems. This paper proposes an approach for implementing state machines that uses a 2-hot code for state encoding. State machines designed using this approach allow online detection of soft errors in registers and output logic. The 2-hot code considerably reduces the number of required flip-flops and leads to a relatively straightforward implementation of next-state and output logic. A new way of designing output logic for online fault detection is also presented.
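
    As a software analogy to the encoding idea above, the hypothetical sketch below represents each state with a 2-hot code (exactly two bits set) and flags a soft error whenever the state register violates that invariant. The state names, bit width, and transition table are assumptions for illustration only.

    ```python
    # States encoded with a 2-hot code over 4 bits: every legal code has exactly two bits set,
    # so any single bit flip (soft error) breaks the invariant and can be detected online.
    STATE_CODES = {
        "IDLE": 0b0011,
        "LOAD": 0b0101,
        "RUN":  0b1001,
        "DONE": 0b0110,
    }
    CODE_TO_STATE = {code: name for name, code in STATE_CODES.items()}

    def is_valid(code: int) -> bool:
        """Online check: a legal 2-hot code must have exactly two bits set."""
        return bin(code).count("1") == 2 and code in CODE_TO_STATE

    def next_state(code: int, event: str) -> int:
        if not is_valid(code):
            raise RuntimeError(f"soft error detected in state register: {code:04b}")
        transitions = {  # assumed transition table
            ("IDLE", "start"): "LOAD",
            ("LOAD", "ready"): "RUN",
            ("RUN", "finish"): "DONE",
            ("DONE", "reset"): "IDLE",
        }
        return STATE_CODES[transitions[(CODE_TO_STATE[code], event)]]

    state = STATE_CODES["IDLE"]
    state = next_state(state, "start")           # normal transition
    corrupted = state ^ 0b0001                   # simulate a single-bit soft error
    print(is_valid(state), is_valid(corrupted))  # True False
    ```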

  1. Indirect Tire Monitoring System - Machine Learning Approach

    Science.gov (United States)

    Svensson, O.; Thelin, S.; Byttner, S.; Fan, Y.

    2017-10-01

    The heavy vehicle industry today has no legal requirement to provide a tire pressure monitoring system. This has created issues surrounding unknown tire pressure and tread depth during active service. There is also no standardization for these kinds of systems, which means that different manufacturers and third-party solutions work according to their own principles, and it can be hard to know what works for a given vehicle type. The objective is to create an indirect tire monitoring system that can generalize a method that detects both incorrect tire pressure and tread depth for different types of vehicles within a fleet, without the need for additional physical sensors or vehicle-specific parameters. The existing connected sensors communicate through CAN and are interpreted by the Drivec Bridge hardware that exists in the fleet. Using supervised machine learning, a classifier was created for each axle, with the main focus on the front axle, which had the most issues. The classifier classifies the vehicle's tire condition and is implemented in Drivec's cloud service, where it receives its data. The resulting classifier is a random forest implemented in Python. For the front axle, with a data set consisting of 9767 samples of buses with correct tire condition and 1909 samples of buses with incorrect tire condition, it has an accuracy of 90.54% (0.96%). The data sets are created from 34 unique measurements from buses between January and May 2017. This classifier has been exported and is used inside a Node.js module created for Drivec's cloud service, which is the result of the whole implementation. The developed solution is called the Indirect Tire Monitoring System (ITMS) and is seen as a process. This process predicts bad classes in the cloud, which leads to warnings. The warnings are defined as incidents. They contain only the information needed, and the bandwidth of the incidents is also controlled so that incidents are created within an
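
    The abstract states that the front-axle classifier is a random forest implemented in Python; the snippet below is a minimal sketch of that kind of setup with scikit-learn, using invented features and synthetic data rather than the CAN-bus signals collected through the Drivec Bridge hardware.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)

    # Synthetic stand-ins for per-bus features derived from CAN signals
    # (wheel-speed ratios, load, temperature, ...); the real feature engineering is not shown.
    n_ok, n_bad = 9767, 1909
    X_ok = rng.normal(loc=0.0, scale=1.0, size=(n_ok, 8))
    X_bad = rng.normal(loc=0.7, scale=1.2, size=(n_bad, 8))
    X = np.vstack([X_ok, X_bad])
    y = np.concatenate([np.zeros(n_ok), np.ones(n_bad)])  # 1 = incorrect tire condition

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=42
    )

    clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=42)
    clf.fit(X_train, y_train)
    print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
    ```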

  2. ProLanGO: Protein Function Prediction Using Neural Machine Translation Based on a Recurrent Neural Network.

    Science.gov (United States)

    Cao, Renzhi; Freitas, Colton; Chan, Leong; Sun, Miao; Jiang, Haiqing; Chen, Zhangxin

    2017-10-17

    With the development of next-generation sequencing techniques, it is fast and cheap to determine protein sequences but relatively slow and expensive to extract useful information from them because of the limitations of traditional biological experimental techniques. Protein function prediction has been a long-standing challenge to fill the gap between the huge amount of protein sequences and the known functions. In this paper, we propose a novel method that converts the protein function prediction problem into a language translation problem between the newly proposed protein sequence language "ProLan" and the protein function language "GOLan", and we build a neural machine translation model based on recurrent neural networks to translate the "ProLan" language into the "GOLan" language. We blindly tested our method by participating in the latest third Critical Assessment of Function Annotation (CAFA 3) in 2016, and we also evaluated the performance of our method on selected proteins whose function was released after the CAFA competition. The good performance on the training and testing datasets demonstrates that our newly proposed method is a promising direction for protein function prediction. In summary, we propose for the first time a method that converts the protein function prediction problem into a language translation problem and applies a neural machine translation model for protein function prediction.
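
    One piece of the pipeline that is easy to show concretely is the conversion of a raw protein sequence into the "sentence" fed to the translation model. The sketch below tokenizes a sequence into overlapping k-mer "words" and pairs it with GO identifiers as the target sentence; the k-mer length and the example GO terms are assumptions, and the actual ProLan/GOLan vocabularies and the recurrent translation model are not reproduced here.

    ```python
    def to_kmer_sentence(sequence: str, k: int = 3) -> list[str]:
        """Turn a protein sequence into overlapping k-mer 'words' (assumed k=3)."""
        sequence = sequence.upper()
        return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

    # Source 'sentence' in the protein-sequence language.
    protein = "MKTAYIAKQR"
    source_tokens = to_kmer_sentence(protein)
    print(source_tokens)  # ['MKT', 'KTA', 'TAY', 'AYI', 'YIA', 'IAK', 'AKQ', 'KQR']

    # Target 'sentence' in the function language: GO identifiers treated as words
    # (example annotations, not taken from the paper).
    target_tokens = ["GO:0003824", "GO:0008152"]

    # A sequence-to-sequence model (e.g., a GRU encoder-decoder) would then be trained
    # on many such (source_tokens, target_tokens) pairs to 'translate' sequence to function.
    ```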

  3. System approach to machine building enterprise innovative activity management

    Directory of Open Access Journals (Sweden)

    І.V. Levytska

    2016-12-01

    Full Text Available A company which operates in a challenging competitive environment should focus on new products and provide innovative services that enhance its innovativeness in order to maintain its market position. The article deals with the peculiarities of such an activity in the company. The authors analyze the various approaches used in management and present the advantages and disadvantages of each. It is determined that the most suitable approach among them is the system approach. Definitions of the concepts "a system" and "a systematic approach to innovative activity management" are suggested. The article works out the system of machine building enterprise innovative activity management: the organization of machine building enterprise innovative activity; the planning of machine building enterprise innovative activity; the control in the system of machine building enterprise innovative activity management; the elements of the control subsystem. The properties typical of the system of innovative management are supplied. Managers engaged in enterprise innovative activity management must perform a number of the suggested tasks, which affect the efficiency of the enterprise as a whole. These tasks are performed using the systematic approach, providing the enterprise with competitive operation and quick adaptation to changes in the external environment.

  4. A Comparison of Machine Learning Approaches for Corn Yield Estimation

    Science.gov (United States)

    Kim, N.; Lee, Y. W.

    2017-12-01

    Machine learning is an efficient empirical method for classification and prediction, and it is another approach to crop yield estimation. The objective of this study is to estimate corn yield in the Midwestern United States by employing machine learning approaches such as the support vector machine (SVM), random forest (RF), and deep neural networks (DNN), and to perform a comprehensive comparison of their results. We constructed the database using satellite images from MODIS, the climate data of the PRISM climate group, and GLDAS soil moisture data. In addition, to examine the seasonal sensitivities of corn yields, two period groups were set up: May to September (MJJAS) and July and August (JA). Overall, the DNN showed the highest accuracy in terms of the correlation coefficient for the two period groups. The differences between our predictions and USDA yield statistics were about 10-11%.
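
    A minimal version of such a comparison can be set up as follows: the same feature matrix is fed to an SVM, a random forest, and a small neural network, and the models are ranked by the correlation between predicted and reported yields. The features and data below are synthetic placeholders, not the MODIS, PRISM, or GLDAS variables used in the study.

    ```python
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.svm import SVR
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 6))                                  # placeholder predictors per county-year
    y = X @ rng.normal(size=6) + rng.normal(scale=0.5, size=500)   # placeholder yields

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

    models = {
        "SVM": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10)),
        "RF":  RandomForestRegressor(n_estimators=300, random_state=1),
        "DNN": make_pipeline(StandardScaler(),
                             MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=1)),
    }

    for name, model in models.items():
        model.fit(X_tr, y_tr)
        r, _ = pearsonr(y_te, model.predict(X_te))
        print(f"{name}: correlation coefficient {r:.3f}")
    ```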

  5. Edu-mining: A Machine Learning Approach

    Science.gov (United States)

    Srimani, P. K.; Patil, Malini M.

    2011-12-01

    Mining educational data is an emerging interdisciplinary research area that mainly deals with the development of methods to explore the data stored in educational institutions. The educational data is referred to as Edu-DATA. Queries related to Edu-DATA are of practical interest, as the SQL approach is insufficient and the problem needs to be approached in a different way. The paper aims at developing a technique called Edu-MINING which converts raw data coming from educational institutions into useful information using data mining techniques. The discovered knowledge will have a great impact on educational research and practices. Edu-MINING explores Edu-DATA, discovers new knowledge and suggests useful methods to improve the quality of education with regard to the teaching-learning process. This is illustrated through a case study.

  6. The translational study of apathy – an ecological approach

    Directory of Open Access Journals (Sweden)

    Flurin eCathomas

    2015-09-01

    Full Text Available Apathy, a quantitative reduction in goal-directed behavior, is a prevalent symptom dimension with a negative impact on functional outcome in various neuropsychiatric disorders including schizophrenia and depression. The aim of this review is to show that interview-based assessment of apathy in humans and observation of spontaneous rodent behavior in an ecological setting can serve as an important complementary approach to already existing task-based assessment, to study and understand the neurobiological bases of apathy. We first discuss the paucity of current translational approaches regarding animal equivalents of psychopathological assessment of apathy. We then present the existing evaluation scales for the assessment of apathy in humans and propose five sub-domains of apathy, namely self-care, social interaction, exploration, work/education and recreation. Each of the items in apathy evaluation scales can be assigned to one of these sub-domains. We then show that corresponding, well-validated behavioral readouts exist for rodents and that, indeed, three of the five human apathy sub-domains have a rodent equivalent. In conclusion, the translational ecological study of apathy in humans and mice is possible and will constitute an important approach to increase the understanding of the neurobiological bases of apathy and the development of novel treatments.

  7. Comparative Human and Automatic Evaluation of Glass-Box and Black-Box Approaches to Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Torregrosa Daniel

    2017-06-01

    Full Text Available Interactive translation prediction (ITP) is a modality of computer-aided translation that assists professional translators by offering context-based computer-generated continuation suggestions as they type. While most state-of-the-art ITP systems follow a glass-box approach, meaning that they are tightly coupled to an adapted machine translation system, a black-box approach which does not need access to the inner workings of the bilingual resources used to generate the suggestions has been recently proposed in the literature: this new approach allows new sources of bilingual information to be included almost seamlessly. In this paper, we compare for the first time the glass-box and the black-box approaches by means of an automatic evaluation of translation tasks between related languages such as English–Spanish and unrelated ones such as Arabic–English and English–Chinese, showing that, with our setup, 20%–50% of keystrokes could be saved using either method and that the black-box approach outperformed the glass-box one in five out of six scenarios operating under similar conditions. We also performed a preliminary human evaluation of English to Spanish translation for both approaches. On average, the evaluators saved 10% keystrokes and were 4% faster with the black-box approach, and saved 15% keystrokes and were 12% slower with the glass-box one; but they could have saved 51% and 69% keystrokes respectively if they had used all the compatible suggestions. Users felt the suggestions helped them to translate faster and easier. All the tools used to perform the evaluation are available as free/open-source software.

  8. Translational ethics: an analytical framework of translational movements between theory and practice and a sketch of a comprehensive approach.

    Science.gov (United States)

    Bærøe, Kristine

    2014-09-30

    , carefully designed, overall approaches combining justified, self-reflexive philosophical and practical efforts according to the suggested distinctions could be expected to realise - or at least improve a facilitation of - translation of ethics across the theory-practice gap.

  9. Partnering with patients in translational oncology research: ethical approach.

    Science.gov (United States)

    Mamzer, Marie-France; Duchange, Nathalie; Darquy, Sylviane; Marvanne, Patrice; Rambaud, Claude; Marsico, Giovanna; Cerisey, Catherine; Scotté, Florian; Burgun, Anita; Badoual, Cécile; Laurent-Puig, Pierre; Hervé, Christian

    2017-04-08

    The research program CARPEM (cancer research and personalized medicine) brings together the expertise of researchers and hospital-based oncologists to develop translational research in the context of personalized or "precision" medicine for cancer. There is recognition that patient involvement can help to take into account their needs and priorities in the development of this emerging practice but there is currently no consensus about how this can be achieved. In this study, we developed an empirical ethical research action aiming to improve patient representatives' involvement in the development of the translational research program together with health professionals. The aim is to promote common understanding and sharing of knowledge between all parties and to establish a long-term partnership integrating patient's expectations. Two distinct committees were settled in CARPEM: an "Expert Committee", gathering healthcare and research professionals, and a "Patient Committee", gathering patients and patient representatives. A multidisciplinary team trained in medical ethics research ensured communication between the two committees as well as analysis of discussions, minutes and outputs from all stakeholders. The results highlight the efficiency of the transfer of knowledge between interested parties. Patient representatives and professionals were able to identify new ethical challenges and co-elaborate new procedures to gather information and consent forms for adapting to practices and recommendations developed during the process. Moreover, included patient representatives became full partners and participated in the transfer of knowledge to the public via conferences and publications. Empirical ethical research based on a patient-centered approach could help in establishing a fair model for coordination and support actions during cancer research, striking a balance between the regulatory framework, researcher needs and patient expectations. Our approach addresses

  10. A machine learning approach to the accurate prediction of monitor units for a compact proton machine.

    Science.gov (United States)

    Sun, Baozhou; Lam, Dao; Yang, Deshan; Grantham, Kevin; Zhang, Tiezhi; Mutic, Sasa; Zhao, Tianyu

    2018-05-01

    Clinical treatment planning systems for proton therapy currently do not calculate monitor units (MUs) in passive scatter proton therapy due to the complexity of the beam delivery systems. Physical phantom measurements are commonly employed to determine the field-specific output factors (OFs) but are often subject to limited machine time, measurement uncertainties and intensive labor. In this study, a machine learning-based approach was developed to predict output (cGy/MU) and derive MUs, incorporating the dependencies on gantry angle and field size for a single-room proton therapy system. The goal of this study was to develop a secondary check tool for OF measurements and eventually eliminate patient-specific OF measurements. The OFs of 1754 fields previously measured in a water phantom with calibrated ionization chambers and electrometers for patient-specific fields with various range and modulation width combinations for 23 options were included in this study. The training data sets for machine learning models in three different methods (Random Forest, XGBoost and Cubist) included 1431 (~81%) OFs. Ten-fold cross-validation was used to prevent "overfitting" and to validate each model. The remaining 323 (~19%) OFs were used to test the trained models. The difference between the measured and predicted values from machine learning models was analyzed. Model prediction accuracy was also compared with that of the semi-empirical model developed by Kooy (Phys. Med. Biol. 50, 2005). Additionally, gantry angle dependence of OFs was measured for three groups of options categorized on the selection of the second scatters. Field size dependence of OFs was investigated for the measurements with and without patient-specific apertures. All three machine learning methods showed higher accuracy than the semi-empirical model which shows considerably large discrepancy of up to 7.7% for the treatment fields with full range and full modulation width. The Cubist-based solution
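
    In outline, the modelling task above is a regression from delivery parameters to output (cGy/MU). The sketch below illustrates one of the three methods named in the abstract, a random-forest regressor evaluated with ten-fold cross-validation; the feature set follows the abstract (range, modulation width, gantry angle, field size), but the data and hyperparameters are assumptions.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score, KFold

    rng = np.random.default_rng(7)

    # Placeholder training fields: [range (cm), modulation width (cm), gantry angle (deg), field size (cm^2)]
    X = np.column_stack([
        rng.uniform(5, 30, 1431),
        rng.uniform(2, 20, 1431),
        rng.uniform(0, 360, 1431),
        rng.uniform(10, 300, 1431),
    ])
    # Placeholder measured output factors (cGy/MU); real values come from water-phantom measurements.
    y = 1.0 + 0.01 * X[:, 0] - 0.005 * X[:, 1] + rng.normal(scale=0.01, size=1431)

    model = RandomForestRegressor(n_estimators=500, random_state=7)
    cv = KFold(n_splits=10, shuffle=True, random_state=7)          # ten-fold cross-validation
    scores = cross_val_score(model, X, y, cv=cv, scoring="neg_mean_absolute_error")
    print(f"mean absolute error: {-scores.mean():.4f} cGy/MU")

    model.fit(X, y)                  # train on all measured fields
    print(model.predict(X[:3]))      # predicted output (cGy/MU) for three fields
    ```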

  11. A control approach for plasma density in tokamak machines

    Energy Technology Data Exchange (ETDEWEB)

    Boncagni, Luca, E-mail: luca.boncagni@enea.it [EURATOM – ENEA Fusion Association, Frascati Research Center, Division of Fusion Physics, Rome, Frascati (Italy); Pucci, Daniele; Piesco, F.; Zarfati, Emanuele [Dipartimento di Ingegneria Informatica, Automatica e Gestionale "Antonio Ruberti", Sapienza Università di Roma (Italy); Mazzitelli, G. [EURATOM – ENEA Fusion Association, Frascati Research Center, Division of Fusion Physics, Rome, Frascati (Italy); Monaco, S. [Dipartimento di Ingegneria Informatica, Automatica e Gestionale "Antonio Ruberti", Sapienza Università di Roma (Italy)

    2013-10-15

    Highlights: •We show a control approach for line plasma density in tokamak. •We show a control approach for pressure in a tokamak chamber. •We show experimental results using one valve. -- Abstract: In tokamak machines, chamber pre-fill is crucial to attain plasma breakdown, while plasma density control is instrumental for several tasks such as machine protection and achievement of desired plasma performances. This paper sets the principles of a new control strategy for attaining both chamber pre-fill and plasma density regulation. Assuming that the actuation mean is a piezoelectric valve driven by a varying voltage, the proposed control laws ensure convergence to reference values of chamber pressure during pre-fill, and of plasma density during plasma discharge. Experimental results at FTU are presented to discuss weaknesses and strengths of the proposed control strategy. The whole system has been implemented by using the MARTe framework [1].

  12. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    Science.gov (United States)

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  13. Interdisciplinarity in translation teaching: competence-based education, translation task-based approach, context-based text typology

    Directory of Open Access Journals (Sweden)

    Edelweiss Vitol Gysel

    2017-05-01

    Full Text Available In the context of competence-based teaching, this paper draws upon the model of Translation Competence (TC) put forward by the PACTE group (2003) to establish a dialogue between cognitive-constructivist paradigms for translation teaching and the model of the Context-based Text Typology (MATTHIESSEN et al., 2007). In this theoretical environment, it proposes a model for the design of a Teaching Unit (TU) for the development of the bilingual competence in would-be translators. To this end, it explores translation as a cognitive, communicative and textual activity (HURTADO ALBIR, 2011) and considers its teaching from the translation task-based approach (HURTADO ALBIR, 1999). This approach is illustrated through the practical example of the design of a TU elaborated for the subject ‘Introduction to Specialized Translation’, part of the curricular grid of the program ‘Secretariado Executivo’ at Universidade Federal de Santa Catarina. Aspects such as the establishment of learning objectives and their alignment with the translation tasks composing the TU are addressed for this specific pedagogical situation. We argue for the development of textual competences by means of the acquisition of strategies derived from the Context-based Text Typology to solve problems arising from the translation of different text types and contextual configurations.

  14. Translation in Light of Bilingual Mental Lexicon: A Psycholinguistic Approach

    Directory of Open Access Journals (Sweden)

    Congmin Zhao

    2018-05-01

    Full Text Available This paper gives insight into the translating process of second language learners in language use in light of the mechanism of bilingual mental lexicon. Structure and development of second language mental lexicon explains the existence of first language items and translation equivalents. Conversely translation can promote the construction of second language mental lexicon and ultimately second language acquisition.

  15. Ausdruckskraft und Regelmaessigkeit: Was Esperanto fuer automatische Uebersetzung geeignet macht (Expressiveness and Formal Regularity: What Makes Esperanto Suitable for Machine Translation).

    Science.gov (United States)

    Schubert, Klaus

    1988-01-01

    Describes DLT, the multilingual machine translation system that uses Esperanto as an intermediate language in which substantial portions of the translation subprocesses are carried out. The criteria for choosing an intermediate language and the reasons for preferring Esperanto over other languages are explained. (Author/DJD)

  16. Review: Current Approaches to Business and Institutional Translation. Proceedings of the International Conference on Economic, Business, Financial and Institutional Translation

    Directory of Open Access Journals (Sweden)

    Miguel Tolosa Igualada

    2016-08-01

    Full Text Available Daniel Gallego-Hernández (ed.). Current Approaches to Business and Institutional Translation. Proceedings of the International Conference on Economic, Business, Financial and Institutional Translation / Enfoques actuales en traducción económica e institucional. Actas del Congreso Internacional de Traducción Económica, Comercial, Financiera e Institucional. Switzerland: Peter Lang, 2015, 254 pages. ISBN 978-3-0343-1656-9.

  17. Machine learning approaches: from theory to application in schizophrenia.

    Science.gov (United States)

    Veronese, Elisa; Castellani, Umberto; Peruzzo, Denis; Bellani, Marcella; Brambilla, Paolo

    2013-01-01

    In recent years, machine learning approaches have been successfully applied for analysis of neuroimaging data, to help in the context of disease diagnosis. We provide, in this paper, an overview of recent support vector machine-based methods developed and applied in psychiatric neuroimaging for the investigation of schizophrenia. In particular, we focus on the algorithms implemented by our group, which have been applied to classify subjects affected by schizophrenia and healthy controls, comparing them in terms of accuracy results with other recently published studies. First we give a description of the basic terminology used in pattern recognition and machine learning. Then we separately summarize and explain each study, highlighting the main features that characterize each method. Finally, as an outcome of the comparison of the results obtained applying the described different techniques, conclusions are drawn in order to understand how much automatic classification approaches can be considered a useful tool in understanding the biological underpinnings of schizophrenia. We then conclude by discussing the main implications achievable by the application of these methods into clinical practice.

  18. Machine Learning Approaches: From Theory to Application in Schizophrenia

    Directory of Open Access Journals (Sweden)

    Elisa Veronese

    2013-01-01

    Full Text Available In recent years, machine learning approaches have been successfully applied for analysis of neuroimaging data, to help in the context of disease diagnosis. We provide, in this paper, an overview of recent support vector machine-based methods developed and applied in psychiatric neuroimaging for the investigation of schizophrenia. In particular, we focus on the algorithms implemented by our group, which have been applied to classify subjects affected by schizophrenia and healthy controls, comparing them in terms of accuracy results with other recently published studies. First we give a description of the basic terminology used in pattern recognition and machine learning. Then we separately summarize and explain each study, highlighting the main features that characterize each method. Finally, as an outcome of the comparison of the results obtained applying the described different techniques, conclusions are drawn in order to understand how much automatic classification approaches can be considered a useful tool in understanding the biological underpinnings of schizophrenia. We then conclude by discussing the main implications achievable by the application of these methods into clinical practice.

  19. ASPECTS REGARDING THE METHOD OF REALIZING THE TECHNICAL EXPERTISE FOR REPAIRING THE TRANSLATION MECHANISM OF A M4A COAL-MINING MACHINE

    Directory of Open Access Journals (Sweden)

    Marius Liviu CÎRȚÎNĂ

    2018-05-01

    Full Text Available This paper presents the technical state of the translation mechanism of an M4A coal-mining machine after the technical expertise. The rehabilitation of the translation mechanism will be carried out by performing intervention works that bring both the structural and the functional parts back within normal operating parameters. The paper presents the solutions proposed for repair after verification of the translation mechanism and the way of repairing the mechanism.

  20. Amp: A modular approach to machine learning in atomistic simulations

    Science.gov (United States)

    Khorshidi, Alireza; Peterson, Andrew A.

    2016-10-01

    Electronic structure calculations, such as those employing Kohn-Sham density functional theory or ab initio wavefunction theories, have allowed for atomistic-level understandings of a wide variety of phenomena and properties of matter at small scales. However, the computational cost of electronic structure methods drastically increases with length and time scales, which makes these methods difficult for long time-scale molecular dynamics simulations or large-sized systems. Machine-learning techniques can provide accurate potentials that can match the quality of electronic structure calculations, provided sufficient training data. These potentials can then be used to rapidly simulate large and long time-scale phenomena at similar quality to the parent electronic structure approach. Machine-learning potentials usually take a bias-free mathematical form and can be readily developed for a wide variety of systems. Electronic structure calculations have favorable properties-namely that they are noiseless and targeted training data can be produced on-demand-that make them particularly well-suited for machine learning. This paper discusses our modular approach to atomistic machine learning through the development of the open-source Atomistic Machine-learning Package (Amp), which allows for representations of both the total and atom-centered potential energy surface, in both periodic and non-periodic systems. Potentials developed through the atom-centered approach are simultaneously applicable for systems with various sizes. Interpolation can be enhanced by introducing custom descriptors of the local environment. We demonstrate this in the current work for Gaussian-type, bispectrum, and Zernike-type descriptors. Amp has an intuitive and modular structure with an interface through the python scripting language yet has parallelizable fortran components for demanding tasks; it is designed to integrate closely with the widely used Atomic Simulation Environment (ASE), which
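
    To make the atom-centered descriptor idea concrete, the sketch below computes simple Gaussian-type radial symmetry functions for each atom from its neighbour distances, the kind of fingerprint a machine-learning potential regresses against energies. The cutoff, Gaussian widths, and toy geometry are assumptions, and this is not Amp's actual implementation.

    ```python
    import numpy as np

    def radial_symmetry_functions(positions, etas, r_cut=6.5):
        """Gaussian-type radial fingerprints G_i = sum_j exp(-eta * r_ij^2) * f_cut(r_ij)
        for every atom i, one value per eta (a simplified atom-centered descriptor)."""
        n = len(positions)
        fingerprints = np.zeros((n, len(etas)))
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                r = np.linalg.norm(positions[i] - positions[j])
                if r < r_cut:
                    f_cut = 0.5 * (np.cos(np.pi * r / r_cut) + 1.0)  # smooth cutoff
                    fingerprints[i] += np.exp(-np.asarray(etas) * r**2) * f_cut
        return fingerprints

    # Toy 4-atom cluster (positions in angstroms) and assumed Gaussian widths.
    positions = np.array([[0.0, 0.0, 0.0],
                          [1.5, 0.0, 0.0],
                          [0.0, 1.5, 0.0],
                          [0.0, 0.0, 1.5]])
    etas = [0.05, 0.5, 2.0]

    print(radial_symmetry_functions(positions, etas))
    # Each row is an atom-centered fingerprint that a neural network could map to an atomic energy.
    ```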

  1. A Systems Medicine Approach: Translating Emerging Science into Individualized Wellness

    Directory of Open Access Journals (Sweden)

    J. S. Bland

    2017-01-01

    Full Text Available In today’s aging society, more people are living with lifestyle-related noncommunicable diseases (NCDs) such as cardiovascular disease, type 2 diabetes, obesity, and cancer. Numerous opinion-leader organizations recommend lifestyle medicine as the first-line approach in NCD prevention and treatment. However, there is a strong need for a personalized approach as “one-size-fits-all” public health recommendations have been insufficient in addressing the interindividual differences in the diverse populations. Advancement in systems biology and the “omics” technologies has allowed comprehensive analysis of how complex biological systems are impacted by external perturbations (e.g., nutrition and exercise), and therefore is gradually pushing personalized lifestyle medicine toward reality. Clinicians and healthcare practitioners have a unique opportunity in advocating lifestyle medicine because patients see them as a reliable source of advice. However, there are still numerous technical and logistic challenges to overcome before personal “big data” can be translated into actionable and clinically relevant solutions. Clinicians are also facing various issues prior to bringing personalized lifestyle medicine to their practice. Nevertheless, emerging ground-breaking research projects have given us a glimpse of how systems thinking and computational methods may lead to personalized health advice. It is important that all stakeholders work together to create the needed paradigm shift in healthcare before the rising epidemic of NCDs overwhelms society, the economy, and the dated health system.

  2. Personalized translational epilepsy research - Novel approaches and future perspectives: Part I: Clinical and network analysis approaches.

    Science.gov (United States)

    Rosenow, Felix; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Bauer, Sebastian

    2017-11-01

    Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. Part I includes the clinical phenotyping and diagnostic methods, EEG network-analysis, biomarkers, and personalized treatment approaches. In Part II, experimental and translational approaches will be discussed (Bauer et al., 2017) [1]. Copyright © 2017 Elsevier Inc

  3. Let the Game Begin: Ergodic as an Approach for Video Game Translation

    Directory of Open Access Journals (Sweden)

    Sf. Lukfianka Sanjaya Purnama, Sf. Luthfie Arguby Purnomo, Dyah Nugrahani

    2017-01-01

    Full Text Available This paper attempts to propose ergodic as an approach for video game translation. The word approach here refers to an approach for translation products and to an approach for the translation process. The steps to formulate ergodic as an approach are: first, Aarseth’s ergodic literature is reviewed to elicit a basis for comprehension toward its relationship with video games and video game translation. Secondly, taking the translation of Electronic Arts’ Need for Speed: Own the City, Midway’s Mortal Kombat: Unchained, and Konami’s Metal Gear Solid, an ergodic-based approach for video game translation is formulated. The formulation signifies that ergodic, as an approach for video game translation, revolves around the treatment of video games as a cybertext from which scriptons, textons, and traversal functions as the configurative mechanism influence the selection of translation strategies and the transferability of variables and traversal function, game aesthetics, and ludus and narrative of the games. The challenges encountered when treating video games as a cybertext are the necessities for the translators to convey anamorphosis, the mechanical and narrative hidden meaning of the analyzed frame, to consider the textonomy of the games, and at the same time to attend to GILT (Globalization, Internationalization, Localization, and Translation).

  4. Geminivirus data warehouse: a database enriched with machine learning approaches.

    Science.gov (United States)

    Silva, Jose Cleydson F; Carvalho, Thales F M; Basso, Marcos F; Deguchi, Michihito; Pereira, Welison A; Sobrinho, Roberto R; Vidigal, Pedro M P; Brustolini, Otávio J B; Silva, Fabyano F; Dal-Bianco, Maximiller; Fontes, Renildes L F; Santos, Anésia A; Zerbini, Francisco Murilo; Cerqueira, Fabio R; Fontes, Elizabeth P B

    2017-05-05

    The Geminiviridae family encompasses a group of single-stranded DNA viruses with twinned and quasi-isometric virions, which infect a wide range of dicotyledonous and monocotyledonous plants and are responsible for significant economic losses worldwide. Geminiviruses are divided into nine genera, according to their insect vector, host range, genome organization, and phylogeny reconstruction. Using rolling-circle amplification approaches along with high-throughput sequencing technologies, thousands of full-length geminivirus and satellite genome sequences were amplified and have become available in public databases. As a consequence, many important challenges have emerged, namely, how to classify, store, and analyze massive datasets as well as how to extract information or new knowledge. Data mining approaches, mainly supported by machine learning (ML) techniques, are a natural means for high-throughput data analysis in the context of genomics, transcriptomics, proteomics, and metabolomics. Here, we describe the development of a data warehouse enriched with ML approaches, designated geminivirus.org. We implemented search modules, bioinformatics tools, and ML methods to retrieve high precision information, demarcate species, and create classifiers for genera and open reading frames (ORFs) of geminivirus genomes. The use of data mining techniques such as ETL (Extract, Transform, Load) to feed our database, as well as algorithms based on machine learning for knowledge extraction, allowed us to obtain a database with quality data and suitable tools for bioinformatics analysis. The Geminivirus Data Warehouse (geminivirus.org) offers a simple and user-friendly environment for information retrieval and knowledge discovery related to geminiviruses.
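
    As a concrete illustration of the kind of classifier such a warehouse can expose, the sketch below trains a genus-level classifier from k-mer frequencies of genome sequences. The k-mer features, the Random Forest and all names are assumptions made for this example; it is not the geminivirus.org implementation.

        # Hypothetical sketch: genus classification of geminivirus genomes from
        # k-mer frequencies with a Random Forest (not the geminivirus.org pipeline).
        from itertools import product
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        K = 4
        KMERS = ["".join(p) for p in product("ACGT", repeat=K)]

        def kmer_vector(seq):
            """Relative k-mer frequencies of one DNA sequence."""
            counts = dict.fromkeys(KMERS, 0)
            for i in range(len(seq) - K + 1):
                kmer = seq[i:i + K]
                if kmer in counts:
                    counts[kmer] += 1
            total = max(sum(counts.values()), 1)
            return [counts[k] / total for k in KMERS]

        def train_genus_classifier(genomes):
            """genomes: list of (sequence, genus_label) pairs, e.g. parsed from FASTA."""
            X = [kmer_vector(seq) for seq, _ in genomes]
            y = [genus for _, genus in genomes]
            clf = RandomForestClassifier(n_estimators=200, random_state=0)
            print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
            return clf.fit(X, y)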

  5. Space Weather in the Machine Learning Era: A Multidisciplinary Approach

    Science.gov (United States)

    Camporeale, E.; Wing, S.; Johnson, J.; Jackman, C. M.; McGranaghan, R.

    2018-01-01

    The workshop entitled Space Weather: A Multidisciplinary Approach took place at the Lorentz Center, University of Leiden, Netherlands, on 25-29 September 2017. The aim of this workshop was to bring together members of the Space Weather, Mathematics, Statistics, and Computer Science communities to address the use of advanced techniques such as Machine Learning, Information Theory, and Deep Learning, to better understand the Sun-Earth system and to improve space weather forecasting. Although individual efforts have been made toward this goal, the community consensus is that establishing interdisciplinary collaborations is the most promising strategy for fully utilizing the potential of these advanced techniques in solving Space Weather-related problems.

  6. A machine learning approach for the classification of metallic glasses

    Science.gov (United States)

    Gossett, Eric; Perim, Eric; Toher, Cormac; Lee, Dongwoo; Zhang, Haitao; Liu, Jingbei; Zhao, Shaofan; Schroers, Jan; Vlassak, Joost; Curtarolo, Stefano

    Metallic glasses possess an extensive set of mechanical properties along with plastic-like processability. As a result, they are a promising material in many industrial applications. However, the successful synthesis of novel metallic glasses requires trial and error, costing both time and resources. Therefore, we propose a high-throughput approach that combines an extensive set of experimental measurements with advanced machine learning techniques. This allows us to classify metallic glasses and predict the full phase diagrams for a given alloy system. Thus this method provides a means to identify potential glass-formers and opens up the possibility for accelerating and reducing the cost of the design of new metallic glasses.

  7. Towards a Participatory Approach to Bible Translation (PABT) 1

    African Journals Online (AJOL)

    It is generally acknowledged that the participation of the receptor community may enhance the community's ownership and acceptability of the translation. In spite of this acknowledgement, individuals and organisations engaged in mother tongue translations of the Bible often involve the members of the receptor community ...

  8. Gene prediction in metagenomic fragments: A large scale machine learning approach

    Directory of Open Access Journals (Sweden)

    Morgenstern Burkhard

    2008-04-01

    Full Text Available Abstract Background Metagenomics is an approach to the characterization of microbial genomes via the direct isolation of genomic sequences from the environment without prior cultivation. The amount of metagenomic sequence data is growing fast while computational methods for metagenome analysis are still in their infancy. In contrast to genomic sequences of single species, which can usually be assembled and analyzed by many available methods, a large proportion of metagenome data remains as unassembled anonymous sequencing reads. One of the aims of all metagenomic sequencing projects is the identification of novel genes. Short length, for example, Sanger sequencing yields on average 700 bp fragments, and unknown phylogenetic origin of most fragments require approaches to gene prediction that are different from the currently available methods for genomes of single species. In particular, the large size of metagenomic samples requires fast and accurate methods with small numbers of false positive predictions. Results We introduce a novel gene prediction algorithm for metagenomic fragments based on a two-stage machine learning approach. In the first stage, we use linear discriminants for monocodon usage, dicodon usage and translation initiation sites to extract features from DNA sequences. In the second stage, an artificial neural network combines these features with open reading frame length and fragment GC-content to compute the probability that this open reading frame encodes a protein. This probability is used for the classification and scoring of gene candidates. With large scale training, our method provides fast single fragment predictions with good sensitivity and specificity on artificially fragmented genomic DNA. Additionally, this method is able to predict translation initiation sites accurately and distinguishes complete from incomplete genes with high reliability. Conclusion Large scale machine learning methods are well-suited for gene
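
    The two-stage design described above can be pictured with a small sketch, assuming pre-extracted ORF strings with binary labels: a linear discriminant is trained on codon-usage features in the first stage, and a small neural network combines its score with ORF length and GC content in the second. This is a simplified illustration of the idea, not the published algorithm, which also uses dicodon usage and translation initiation site discriminants.

        # Simplified two-stage gene prediction sketch (illustrative only).
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.neural_network import MLPClassifier

        CODONS = [a + b + c for a in "ACGT" for b in "ACGT" for c in "ACGT"]

        def codon_usage(orf):
            """Frequency vector over the 64 codons of an ORF (hypothetical feature)."""
            counts = dict.fromkeys(CODONS, 0)
            for i in range(0, len(orf) - 2, 3):
                codon = orf[i:i + 3]
                if codon in counts:
                    counts[codon] += 1
            total = max(sum(counts.values()), 1)
            return np.array([counts[c] / total for c in CODONS])

        def gc_content(seq):
            return (seq.count("G") + seq.count("C")) / max(len(seq), 1)

        def train_two_stage(orfs, codes_protein):
            """orfs: list of ORF strings; codes_protein: 1 = coding, 0 = spurious."""
            usage = np.array([codon_usage(o) for o in orfs])
            stage1 = LinearDiscriminantAnalysis().fit(usage, codes_protein)
            scores = stage1.decision_function(usage).reshape(-1, 1)
            extra = np.array([[len(o), gc_content(o)] for o in orfs])
            stage2 = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                   random_state=0)
            stage2.fit(np.hstack([scores, extra]), codes_protein)
            return stage1, stage2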

  9. A Simple and General Approach to Determination of Self and Mutual Inductances for AC machines

    DEFF Research Database (Denmark)

    Lu, Kaiyuan; Rasmussen, Peter Omand; Ritchie, Ewen

    2011-01-01

    Modelling of AC electrical machines plays an important role in electrical engineering education related to electrical machine design and control. One of the fundamental requirements in AC machine modelling is to derive the self and mutual inductances, which could be position dependant. Theories...... developed so far for inductance determination are often associated with complicated machine magnetic field analysis, which exhibits a difficulty for most students. This paper describes a simple and general approach to the determination of self and mutual inductances of different types of AC machines. A new...... determination are given for a 3-phase, salient-pole synchronous machine, and an induction machine....

  10. Reading Strategies in a L2: A Study on Machine Translation

    Science.gov (United States)

    Karnal, Adriana Riess; Pereira, Vera Vanmacher

    2015-01-01

    This article aims at understanding cognitive strategies which are involved in reading academic texts in English as a L2/FL. Specifically, we focus on reading comprehension when a text is read either using Google translator or not. From this perspective we must consider the reading process in its complexity not only as a decoding process. We follow…

  11. Predicting DPP-IV inhibitors with machine learning approaches

    Science.gov (United States)

    Cai, Jie; Li, Chanjuan; Liu, Zhihong; Du, Jiewen; Ye, Jiming; Gu, Qiong; Xu, Jun

    2017-04-01

    Dipeptidyl peptidase IV (DPP-IV) is a promising Type 2 diabetes mellitus (T2DM) drug target. DPP-IV inhibitors prolong the action of glucagon-like peptide-1 (GLP-1) and gastric inhibitory peptide (GIP) and improve glucose homeostasis without weight gain, edema, or hypoglycemia. However, the marketed DPP-IV inhibitors have adverse effects such as nasopharyngitis, headache, nausea, hypersensitivity, skin reactions and pancreatitis. Therefore, novel DPP-IV inhibitors with minimal adverse effects are still needed. The scaffolds of existing DPP-IV inhibitors are structurally diversified. This makes it difficult to build virtual screening models based upon the known DPP-IV inhibitor libraries using conventional QSAR approaches. In this paper, we report a new strategy to predict DPP-IV inhibitors with machine learning approaches involving naïve Bayesian (NB) and recursive partitioning (RP) methods. We built 247 machine learning models based on 1307 known DPP-IV inhibitors with optimized molecular properties and topological fingerprints as descriptors. The overall predictive accuracies of the optimized models were greater than 80%. An external test set, composed of 65 recently reported compounds, was employed to validate the optimized models. The results demonstrated that both NB and RP models have good predictive ability based on different combinations of descriptors. Twenty "good" and twenty "bad" structural fragments for DPP-IV inhibitors can also be derived from these models to inspire the design of new DPP-IV inhibitor scaffolds.
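
    A minimal sketch of the modelling strategy, with the recursive partitioning step stood in for by a single decision tree and the topological fingerprints assumed to be precomputed 0/1 bit vectors, could look as follows; it is not the authors' 247-model pipeline.

        # Illustrative NB and "recursive partitioning" (decision tree) models over
        # binary fingerprint descriptors; fingerprints are assumed precomputed.
        import numpy as np
        from sklearn.naive_bayes import BernoulliNB
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        def build_models(fingerprints, is_inhibitor):
            """fingerprints: (n_compounds, n_bits) 0/1 array; is_inhibitor: 0/1 labels."""
            X_tr, X_te, y_tr, y_te = train_test_split(
                np.asarray(fingerprints), np.asarray(is_inhibitor),
                test_size=0.2, random_state=0)
            models = {"NB": BernoulliNB(), "RP": DecisionTreeClassifier(max_depth=8)}
            for name, model in models.items():
                model.fit(X_tr, y_tr)
                print(name, "accuracy:", accuracy_score(y_te, model.predict(X_te)))
            return models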

  12. Use of machine learning approaches for novel drug discovery.

    Science.gov (United States)

    Lima, Angélica Nakagawa; Philot, Eric Allison; Trossini, Gustavo Henrique Goulart; Scott, Luis Paulo Barbour; Maltarollo, Vinícius Gonçalves; Honorio, Kathia Maria

    2016-01-01

    The use of computational tools in the early stages of drug development has increased in recent decades. Machine learning (ML) approaches have been of special interest, since they can be applied in several steps of the drug discovery methodology, such as prediction of target structure, prediction of biological activity of new ligands through model construction, discovery or optimization of hits, and construction of models that predict the pharmacokinetic and toxicological (ADMET) profile of compounds. This article presents an overview on some applications of ML techniques in drug design. These techniques can be employed in ligand-based drug design (LBDD) and structure-based drug design (SBDD) studies, such as similarity searches, construction of classification and/or prediction models of biological activity, prediction of secondary structures and binding sites docking and virtual screening. Successful cases have been reported in the literature, demonstrating the efficiency of ML techniques combined with traditional approaches to study medicinal chemistry problems. Some ML techniques used in drug design are: support vector machine, random forest, decision trees and artificial neural networks. Currently, an important application of ML techniques is related to the calculation of scoring functions used in docking and virtual screening assays from a consensus, combining traditional and ML techniques in order to improve the prediction of binding sites and docking solutions.

  13. Prediction of skin sensitization potency using machine learning approaches.

    Science.gov (United States)

    Zang, Qingda; Paris, Michael; Lehmann, David M; Bell, Shannon; Kleinstreuer, Nicole; Allen, David; Matheson, Joanna; Jacobs, Abigail; Casey, Warren; Strickland, Judy

    2017-07-01

    The replacement of animal use in testing for regulatory classification of skin sensitizers is a priority for US federal agencies that use data from such testing. Machine learning models that classify substances as sensitizers or non-sensitizers without using animal data have been developed and evaluated. Because some regulatory agencies require that sensitizers be further classified into potency categories, we developed statistical models to predict skin sensitization potency for murine local lymph node assay (LLNA) and human outcomes. Input variables for our models included six physicochemical properties and data from three non-animal test methods: direct peptide reactivity assay; human cell line activation test; and KeratinoSens™ assay. Models were built to predict three potency categories using four machine learning approaches and were validated using external test sets and leave-one-out cross-validation. A one-tiered strategy modeled all three categories of response together while a two-tiered strategy modeled sensitizer/non-sensitizer responses and then classified the sensitizers as strong or weak sensitizers. The two-tiered model using the support vector machine with all assay and physicochemical data inputs provided the best performance, yielding accuracy of 88% for prediction of LLNA outcomes (120 substances) and 81% for prediction of human test outcomes (87 substances). The best one-tiered model predicted LLNA outcomes with 78% accuracy and human outcomes with 75% accuracy. By comparison, the LLNA predicts human potency categories with 69% accuracy (60 of 87 substances correctly categorized). These results suggest that computational models using non-animal methods may provide valuable information for assessing skin sensitization potency. Copyright © 2017 John Wiley & Sons, Ltd.
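
    The two-tiered strategy lends itself to a compact sketch: one SVM separates sensitizers from non-sensitizers, and a second SVM grades the predicted sensitizers as strong or weak. The feature matrix is assumed to hold the physicochemical properties and non-animal assay readouts; this is an illustration, not the published models.

        # Two-tiered potency prediction sketch (illustrative only).
        import numpy as np
        from sklearn.svm import SVC

        def fit_two_tier(X, potency):
            """X: feature matrix; potency: array with values 'non', 'weak', 'strong'."""
            X, potency = np.asarray(X), np.asarray(potency)
            tier1 = SVC(kernel="rbf").fit(X, potency != "non")       # sensitizer yes/no
            sens = potency != "non"
            tier2 = SVC(kernel="rbf").fit(X[sens], potency[sens])    # strong vs weak
            return tier1, tier2

        def predict_two_tier(tier1, tier2, X):
            X = np.asarray(X)
            out = np.full(len(X), "non", dtype=object)
            is_sens = tier1.predict(X).astype(bool)
            if is_sens.any():
                out[is_sens] = tier2.predict(X[is_sens])
            return out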

  14. CULTURAL TRANSFER IN TRAVEL GUIDE TRANSLATION: DISCOURSE APPROACH

    Directory of Open Access Journals (Sweden)

    Novikova Elina Yuryevna

    2014-09-01

    Full Text Available Intercultural communication and dialogue between various social and political structures under globalized conditions directly drive the development of the tourism and services market in this area, including translation services. The study of the linguocultural characteristics of a travel guide from the standpoint of pragmatically adequate translation is a fruitful way to analyze how modern planes of interaction develop and function, because the communicative characteristics of mass tourism participants are determined, on the one hand, by the universal, global, economic, social and cultural programmes of mass tourism and, on the other hand, by the local and national peculiarities of tourism discourse in general. The choice of linguistic means in travel guides is determined by their communicative, pragmatic and ethno-cultural characteristics, which form the main discourse-oriented translation programme. Translating travel guide texts into German entails significant differences at out-text and in-text levels in order to achieve maximum compliance with the potential recipients' expectations. The analysis of two German translations of a Russian-language travel guide, one made by a native German speaker and one by a non-native German speaker, made it possible to identify the so-called sharp edges in the cultural transfer of discourse-relevant information. The travel guide is thus characterized by the specific features of tourism discourse, on the one hand, and offers instructive translation experience, on the other.

  15. Quantifying complexity in translational research: an integrated approach.

    Science.gov (United States)

    Munoz, David A; Nembhard, Harriet Black; Kraschnewski, Jennifer L

    2014-01-01

    The purpose of this paper is to quantify complexity in translational research. The impact of major operational steps and technical requirements is calculated with respect to their ability to accelerate the movement of new discoveries into clinical practice. A three-phase integrated quality function deployment (QFD) and analytic hierarchy process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to demonstrate usability. Generally, the evidence generated was valuable for understanding various components in translational research. In particular, the authors found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Current conceptual models in translational research provide little or no guidance for assessing complexity. The proposed method aims to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research.
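
    The consistency ratio mentioned above comes from the AHP part of the framework: priority weights are taken from the principal eigenvector of a pairwise comparison matrix, and the ratio flags overly inconsistent (hence possibly biased) judgments. The sketch below shows that calculation with a made-up comparison matrix; it is not the authors' full QFD-AHP model.

        # AHP priority weights and consistency ratio (example judgments are invented).
        import numpy as np

        RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

        def ahp_weights(A):
            A = np.asarray(A, dtype=float)
            n = A.shape[0]
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()                               # priority weights
            ci = (eigvals[k].real - n) / (n - 1)       # consistency index
            cr = ci / RI[n] if RI[n] else 0.0          # consistency ratio (< 0.1 is acceptable)
            return w, cr

        # Hypothetical pairwise importance of collaboration networks, team capacity,
        # and community engagement.
        A = [[1, 3, 5],
             [1 / 3, 1, 2],
             [1 / 5, 1 / 2, 1]]
        w, cr = ahp_weights(A)
        print("weights:", w.round(3), "consistency ratio:", round(cr, 3))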

  16. MT-ComparEval: Graphical evaluation interface for Machine Translation development

    Directory of Open Access Journals (Sweden)

    Klejch Ondřej

    2015-10-01

    Full Text Available The tool described in this article has been designed to help MT developers by implementing a web-based graphical user interface that allows users to systematically compare and evaluate various MT engines/experiments using comparative analysis via automatic measures and statistics. The evaluation panel provides graphs, tests for statistical significance and n-gram statistics. We also present a demo server http://wmt.ufal.cz with WMT14 and WMT15 translations.
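
    One of the significance tests such a panel can offer is paired bootstrap resampling over sentence-level scores of two systems on the same test set. The sketch below illustrates that test in a generic form; it is not MT-ComparEval's own implementation, and the metric scores are assumed to be precomputed per sentence.

        # Paired bootstrap resampling for MT system comparison (generic illustration).
        import random

        def paired_bootstrap(scores_a, scores_b, n_resamples=1000, seed=0):
            """scores_a/scores_b: per-sentence metric scores of systems A and B."""
            rng = random.Random(seed)
            n, wins_a = len(scores_a), 0
            for _ in range(n_resamples):
                idx = [rng.randrange(n) for _ in range(n)]
                if sum(scores_a[i] for i in idx) > sum(scores_b[i] for i in idx):
                    wins_a += 1
            return wins_a / n_resamples   # fraction of resamples in which A wins

        # System A is significantly better at the 95% level if the returned
        # fraction exceeds 0.95.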

  17. A support vector machine approach for detection of microcalcifications.

    Science.gov (United States)

    El-Naqa, Issam; Yang, Yongyi; Wernick, Miles N; Galatsanos, Nikolas P; Nishikawa, Robert M

    2002-12-01

    In this paper, we investigate an approach based on support vector machines (SVMs) for detection of microcalcification (MC) clusters in digital mammograms, and propose a successive enhancement learning scheme for improved performance. SVM is a machine-learning method, based on the principle of structural risk minimization, which performs well when applied to data outside the training set. We formulate MC detection as a supervised-learning problem and apply SVM to develop the detection algorithm. We use the SVM to detect at each location in the image whether an MC is present or not. We tested the proposed method using a database of 76 clinical mammograms containing 1120 MCs. We use free-response receiver operating characteristic curves to evaluate detection performance, and compare the proposed algorithm with several existing methods. In our experiments, the proposed SVM framework outperformed all the other methods tested. In particular, a sensitivity as high as 94% was achieved by the SVM method at an error rate of one false-positive cluster per image. The ability of SVM to outperform several well-known methods developed for the widely studied problem of MC detection suggests that SVM is a promising technique for object detection in a medical imaging application.
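
    The per-location detection idea can be sketched as patch classification: an SVM is trained on small labeled patches and then applied at every pixel position of a new image. Patch size, kernel settings and the sliding-window scan below are assumptions for the illustration, not the authors' framework.

        # Patch-based SVM detection sketch (illustrative only).
        import numpy as np
        from sklearn.svm import SVC

        PATCH = 9  # patch side length in pixels (assumed)

        def train_patch_svm(patches, labels):
            """patches: (n, PATCH, PATCH) array; labels: 1 = MC present, 0 = background."""
            X = np.asarray(patches).reshape(len(patches), -1)
            return SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, labels)

        def detect(svm, image, stride=1):
            """Return a binary map marking patch centres classified as containing an MC."""
            h, w = image.shape
            out = np.zeros((h, w), dtype=np.uint8)
            half = PATCH // 2
            for r in range(half, h - half, stride):
                for c in range(half, w - half, stride):
                    patch = image[r - half:r + half + 1, c - half:c + half + 1].ravel()
                    out[r, c] = svm.predict(patch[None, :])[0]
            return out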

  18. A Cooperative Approach to Virtual Machine Based Fault Injection

    Energy Technology Data Exchange (ETDEWEB)

    Naughton III, Thomas J [ORNL; Engelmann, Christian [ORNL; Vallee, Geoffroy R [ORNL; Aderholdt, William Ferrol [ORNL; Scott, Stephen L [Tennessee Technological University (TTU)

    2017-01-01

    Resilience investigations often employ fault injection (FI) tools to study the effects of simulated errors on a target system. It is important to keep the target system under test (SUT) isolated from the controlling environment in order to maintain control of the experiment. Virtual machines (VMs) have been used to aid these investigations due to the strong isolation properties of system-level virtualization. A key challenge in fault injection tools is to gain proper insight and context about the SUT. In VM-based FI tools, this challenge of target context is increased due to the separation between host and guest (VM). We discuss an approach to VM-based FI that leverages virtual machine introspection (VMI) methods to gain insight into the target's context running within the VM. The key to this environment is the ability to provide basic information to the FI system that can be used to create a map of the target environment. We describe a proof-of-concept implementation and a demonstration of its use to introduce simulated soft errors into an iterative solver benchmark running in user-space of a guest VM.

  19. Advanced methods in NDE using machine learning approaches

    Science.gov (United States)

    Wunderlich, Christian; Tschöpe, Constanze; Duckhorn, Frank

    2018-04-01

    Machine learning (ML) methods and algorithms have recently been applied with great success in quality control and predictive maintenance. Their goal of building new algorithms and/or leveraging existing ones to learn from training data and give accurate predictions, or to find patterns, particularly in new and unseen but similar data, fits Non-Destructive Evaluation perfectly. The advantages of ML in NDE are obvious in such tasks as pattern recognition in acoustic signals or automated processing of images from X-ray, ultrasonic or optical methods. Fraunhofer IKTS uses machine learning algorithms in acoustic signal analysis, and the approach has been applied to a variety of quality assessment tasks. The principal approach is based on acoustic signal processing with a primary and a secondary analysis step, followed by a cognitive system to create model data. Already in the secondary analysis step, unsupervised learning algorithms such as principal component analysis are used to simplify data structures. In the cognitive part of the software, further unsupervised and supervised learning algorithms are trained. The sensor signals from unknown samples can then be recognized and classified automatically by the previously trained algorithms. Recently, the IKTS team was able to transfer the software for signal processing and pattern recognition to a small printed circuit board (PCB). Algorithms are still trained on an ordinary PC; however, the trained algorithms run on the digital signal processor and the FPGA chip. The identical approach will be used for pattern recognition in image analysis of OCT pictures. Some key requirements have to be fulfilled, however: a sufficiently large set of training data, a high signal-to-noise ratio, and an optimized and exact fixation of components are required. The automated testing can then be done by the machine. By integrating the test data of many components along the value chain, further optimization including lifetime and durability

  20. Extracting meaning from audio signals - a machine learning approach

    DEFF Research Database (Denmark)

    Larsen, Jan

    2007-01-01

    * Machine learning framework for sound search * Genre classification * Music and audio separation * Wind noise suppression

  1. COMPARISION OF FUZZY PERT APPROACHES IN MACHINE PRODUCTION PROCESS

    Directory of Open Access Journals (Sweden)

    İRFAN ERTUĞRUL

    2013-06-01

    Full Text Available In traditional PERT (Program Evaluation and Review Technique), activity durations are represented as crisp numbers and are assumed to be drawn from a beta distribution. However, in real life the durations of activities are usually difficult to estimate precisely. In order to overcome this difficulty, there are studies in the literature that combine fuzzy set theory with the PERT method. In this study, two fuzzy PERT approaches proposed by different authors are employed to find the degree of criticality of each path in the network, and a comparison of the two methods is given. Furthermore, with the help of these methods, the criticality of the activities in the marble machine production process of a company that manufactures machinery is determined and the results are compared.

  2. Machine Learning Approaches to Increasing Value of Spaceflight Omics Databases

    Science.gov (United States)

    Gentry, Diana

    2017-01-01

    The number of spaceflight bioscience mission opportunities is too small to allow all relevant biological and environmental parameters to be experimentally identified. Simulated spaceflight experiments in ground-based facilities (GBFs), such as clinostats, are each suitable only for particular investigations -- a rotating-wall vessel may be 'simulated microgravity' for cell differentiation (hours), but not DNA repair (seconds) -- and introduce confounding stimuli, such as motor vibration and fluid shear effects. This uncertainty over which biological mechanisms respond to a given form of simulated space radiation or gravity, as well as its side effects, limits our ability to baseline spaceflight data and validate mission science. Machine learning techniques autonomously identify relevant and interdependent factors in a data set given the set of desired metrics to be evaluated: to automatically identify related studies, compare data from related studies, or determine linkages between types of data in the same study. System-of-systems (SoS) machine learning models have the ability to deal with both sparse and heterogeneous data, such as that provided by the small and diverse number of space biosciences flight missions; however, they require appropriate user-defined metrics for any given data set. Although machine learning in bioinformatics is rapidly expanding, the need to combine spaceflight/GBF mission parameters with omics data is unique. This work characterizes the basic requirements for implementing the SoS approach through the System Map (SM) technique, a composite of a dynamic Bayesian network and Gaussian mixture model, in real-world repositories such as the GeneLab Data System and Life Sciences Data Archive. The three primary steps are metadata management for experimental description using open-source ontologies, defining similarity and consistency metrics, and generating testing and validation data sets. Such approaches to spaceflight and GBF omics data may

  3. Thomas Mofolo's sentence design in Chaka approached in translation

    African Journals Online (AJOL)

    considered, his main focus had been on the leaking of the information about Nandi's impregnation by ... Folgado, V. L. “Literary Translation as a Cognitive Activity.” ...

  4. The Temple Translator's Workstation Project

    National Research Council Canada - National Science Library

    Vanni, Michelle; Zajac, Remi

    1996-01-01

    .... The Temple Translator's Workstation is incorporated into a Tipster document management architecture and it allows both translator/analysts and monolingual analysts to use the machine-translation...

  5. DESIGN ANALYSIS OF ELECTRICAL MACHINES THROUGH INTEGRATED NUMERICAL APPROACH

    Directory of Open Access Journals (Sweden)

    ARAVIND C.V.

    2016-02-01

    Full Text Available An integrated design platform for newer types of machines is presented in this work. The machine parameters are evaluated using the developed modelling tool. With these parameters, the machine is modelled using a computer-aided tool. The designed machine is then brought into a simulation tool to perform electromagnetic and electromechanical analysis. In the simulation, the conditions are set up to define the materials, meshes, rotational speed and the excitation circuit. Electromagnetic analysis is carried out to predict the behavior of the machine based on the movement of flux in the machine. In addition, electromechanical analysis is carried out to analyse the speed-torque, current-torque and phase angle-torque characteristics. After the results are analysed, the designed machine is used to generate an S-function block compatible with the MATLAB/SIMULINK tool for studying the dynamic operational characteristics. This allows the integration of an existing drive system with the new machines designed in the modelling tool. An example machine design is presented to validate the usage of such a tool.

  6. On feature augmentation for semantic argument classification of the Quran English translation using support vector machine

    Science.gov (United States)

    Khaira Batubara, Dina; Arif Bijaksana, Moch; Adiwijaya

    2018-03-01

    Research on semantic argument classification requires semantically labeled data in large quantities, called a corpus. Because building a corpus is costly and time-consuming, many recent studies have used an existing corpus as the training data to conduct semantic argument classification research on a new domain. However, previous studies have shown that there is a significant decrease in performance when the semantic arguments being classified come from a domain different from that of the training data. The main problem arises when an argument found in the testing data is not found in the training data. This research carries out semantic argument classification on a new domain, the Quran English Translation, utilizing the Propbank corpus as the training data. To recognize arguments that are not present in the training data, this research proposes four new features for extending the argument features of the training data. Using a linear SVM, the experiments show that augmenting the baseline system with certain combinations of the proposed features improves the performance of semantic argument classification on the Quran data with the Propbank corpus as training data.

  7. The development of a classification schema for arts-based approaches to knowledge translation.

    Science.gov (United States)

    Archibald, Mandy M; Caine, Vera; Scott, Shannon D

    2014-10-01

    Arts-based approaches to knowledge translation are emerging as powerful interprofessional strategies with potential to facilitate evidence uptake, communication, knowledge, attitude, and behavior change across healthcare provider and consumer groups. These strategies are in the early stages of development. To date, no classification system for arts-based knowledge translation exists, which limits development and understandings of effectiveness in evidence syntheses. We developed a classification schema of arts-based knowledge translation strategies based on two mechanisms by which these approaches function: (a) the degree of precision in key message delivery, and (b) the degree of end-user participation. We demonstrate how this classification is necessary to explore how context, time, and location shape arts-based knowledge translation strategies. Classifying arts-based knowledge translation strategies according to their core attributes extends understandings of the appropriateness of these approaches for various healthcare settings and provider groups. The classification schema developed may enhance understanding of how, where, and for whom arts-based knowledge translation approaches are effective, and enable theorizing of essential knowledge translation constructs, such as the influence of context, time, and location on utilization strategies. The classification schema developed may encourage systematic inquiry into the effectiveness of these approaches in diverse interprofessional contexts. © 2014 Sigma Theta Tau International.

  8. PRISMA database machine: A distributed, main-memory approach

    NARCIS (Netherlands)

    Schmidt, J.W.; Apers, Peter M.G.; Ceri, S.; Kersten, Martin L.; Oerlemans, Hans C.M.; Missikoff, M.

    1988-01-01

    The PRISMA project is a large-scale research effort in the design and implementation of a highly parallel machine for data and knowledge processing. The PRISMA database machine is a distributed, main-memory database management system implemented in an object-oriented language that runs on top of a

  9. Machine Learning Technologies Translates Vigilant Surveillance Satellite Big Data into Predictive Alerts for Environmental Stressors

    Science.gov (United States)

    Johnson, S. P.; Rohrer, M. E.

    2017-12-01

    The application of scientific research pertaining to satellite imaging and data processing has facilitated the development of dynamic methodologies and tools that utilize nanosatellites and analytical platforms to address the increasing scope, scale, and intensity of emerging environmental threats to national security. While the use of remotely sensed data to monitor the environment at local and global scales is not a novel proposition, the application of advances in nanosatellites and analytical platforms are capable of overcoming the data availability and accessibility barriers that have historically impeded the timely detection, identification, and monitoring of these stressors. Commercial and university-based applications of these technologies were used to identify and evaluate their capacity as security-motivated environmental monitoring tools. Presently, nanosatellites can provide consumers with 1-meter resolution imaging, frequent revisits, and customizable tasking, allowing users to define an appropriate temporal scale for high resolution data collection that meets their operational needs. Analytical platforms are capable of ingesting increasingly large and diverse volumes of data, delivering complex analyses in the form of interpretation-ready data products and solutions. The synchronous advancement of these technologies creates the capability of analytical platforms to deliver interpretable products from persistently collected high-resolution data that meet varying temporal and geographic scale requirements. In terms of emerging environmental threats, these advances translate into customizable and flexible tools that can respond to and accommodate the evolving nature of environmental stressors. This presentation will demonstrate the capability of nanosatellites and analytical platforms to provide timely, relevant, and actionable information that enables environmental analysts and stakeholders to make informed decisions regarding the prevention

  10. A Formal Model of Ambiguity and its Applications in Machine Translation

    Science.gov (United States)

    2010-01-01

    structure indicates linguistically implausible segmentation that might be generated using dictionary-driven approaches... derivation. As was done in the monolingual case, the functions LHS, RHSi, RHSo and υ can be extended to a derivation δ. D(q), where q ∈ V, denotes the... monolingual parses. My algorithm runs more efficiently than O(n^6) with many grammars (including those that required using heuristic search with other parsers

  11. Exploration of Machine Learning Approaches to Predict Pavement Performance

    Science.gov (United States)

    2018-03-23

    Machine learning (ML) techniques were used to model and predict pavement condition index (PCI) for various pavement types using a variety of input variables. The primary objective of this research was to develop and assess PCI predictive models for t...

  12. Machine learning for adaptive many-core machines a practical approach

    CERN Document Server

    Lopes, Noel

    2015-01-01

    The overwhelming amount of data produced every day and the increasing performance and cost requirements of applications are transversal to a wide range of activities in society, from science to industry. In particular, the magnitude and complexity of the tasks that Machine Learning (ML) algorithms have to solve are driving the need to devise adaptive many-core machines that scale well with the volume of data, or in other words, can handle Big Data. This book gives a concise view on how to extend the applicability of well-known ML algorithms on Graphics Processing Units (GPUs) with data scalability in mind.

  13. Dual Numbers Approach in Multiaxis Machines Error Modeling

    Directory of Open Access Journals (Sweden)

    Jaroslav Hrdina

    2014-01-01

    Full Text Available Multiaxis machine error modeling is set in the context of modern differential geometry and linear algebra. We apply special classes of matrices over dual numbers and propose a generalization of this concept by means of general Weil algebras. We show that the classification of the geometric errors follows directly from the algebraic properties of matrices over dual numbers, and thus the calculus of dual numbers is the proper tool for the methodology of multiaxis machine error modeling.
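
    The algebra the paper builds on can be shown in a few lines: a dual number a + b*eps with eps squared equal to zero carries a first-order (error) term that propagates automatically through sums and products. The class below is a minimal sketch of that arithmetic, not the paper's matrix formalism over Weil algebras.

        # Minimal dual-number arithmetic: (a + b*eps) with eps**2 = 0.
        class Dual:
            def __init__(self, real, eps=0.0):
                self.real, self.eps = real, eps

            def __add__(self, other):
                return Dual(self.real + other.real, self.eps + other.eps)

            def __mul__(self, other):
                # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, because eps**2 = 0
                return Dual(self.real * other.real,
                            self.real * other.eps + self.eps * other.real)

            def __repr__(self):
                return f"{self.real} + {self.eps}e"

        # A nominal length of 10 with a small geometric error 0.01, scaled by 2.5:
        print(Dual(10.0, 0.01) * Dual(2.5))   # 25.0 + 0.025e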

  14. Crack identification for rotating machines based on a nonlinear approach

    Science.gov (United States)

    Cavalini, A. A., Jr.; Sanches, L.; Bachschmid, N.; Steffen, V., Jr.

    2016-10-01

    In a previous contribution, a crack identification methodology based on a nonlinear approach was proposed. The technique uses external applied diagnostic forces at certain frequencies attaining combinational resonances, together with a pseudo-random optimization code, known as Differential Evolution, in order to characterize the signatures of the crack in the spectral responses of the flexible rotor. The conditions under which combinational resonances appear were determined by using the method of multiple scales. In real conditions, the breathing phenomenon arises from the stress and strain distribution on the cross-sectional area of the crack. This mechanism behavior follows the static and dynamic loads acting on the rotor. Therefore, the breathing crack can be simulated according to the Mayes' model, in which the crack transition from fully opened to fully closed is described by a cosine function. However, many contributions try to represent the crack behavior by machining a small notch on the shaft instead of the fatigue process. In this paper, the open and breathing crack models are compared regarding their dynamic behavior and the efficiency of the proposed identification technique. The additional flexibility introduced by the crack is calculated by using the linear fracture mechanics theory (LFM). The open crack model is based on LFM and the breathing crack model corresponds to the Mayes' model, which combines LFM with a given breathing mechanism. For illustration purposes, a rotor composed by a horizontal flexible shaft, two rigid discs, and two self-aligning ball bearings is used to compose a finite element model of the system. Then, numerical simulation is performed to determine the dynamic behavior of the rotor. Finally, the results of the inverse problem conveyed show that the methodology is a reliable tool that is able to estimate satisfactorily the location and depth of the crack.
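
    The breathing mechanism described above reduces, in the Mayes-type description, to a cosine steering function that takes the crack from fully closed to fully open once per shaft revolution. The sketch below illustrates only that steering of a time-varying stiffness; the stiffness values are hypothetical and the full finite element and Differential Evolution machinery of the paper is not reproduced.

        # Cosine "breathing" of the crack-induced stiffness reduction (illustrative).
        import numpy as np

        def breathing_stiffness(t, omega, k0=1.0e6, delta_k=1.0e5):
            """Time-varying stiffness: crack closed when cos = 1, fully open when cos = -1."""
            steer = 0.5 * (1.0 - np.cos(omega * t))   # 0 (closed) ... 1 (fully open)
            return k0 - delta_k * steer

        t = np.linspace(0.0, 0.04, 5)
        print(breathing_stiffness(t, omega=2 * np.pi * 25))   # 25 Hz shaft rotation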

  15. Detecting false positive sequence homology: a machine learning approach.

    Science.gov (United States)

    Fujimoto, M Stanley; Suvorov, Anton; Jensen, Nicholas O; Clement, Mark J; Bybee, Seth M

    2016-02-24

    Accurate detection of homologous relationships of biological sequences (DNA or amino acid) amongst organisms is an important and often difficult task that is essential to various evolutionary studies, ranging from building phylogenies to predicting functional gene annotations. There are many existing heuristic tools, most commonly based on bidirectional BLAST searches that are used to identify homologous genes and combine them into two fundamentally distinct classes: orthologs and paralogs. Due to only using heuristic filtering based on significance score cutoffs and having no cluster post-processing tools available, these methods can often produce multiple clusters constituting unrelated (non-homologous) sequences. Therefore sequencing data extracted from incomplete genome/transcriptome assemblies originated from low coverage sequencing or produced by de novo processes without a reference genome are susceptible to high false positive rates of homology detection. In this paper we develop biologically informative features that can be extracted from multiple sequence alignments of putative homologous genes (orthologs and paralogs) and further utilized in context of guided experimentation to verify false positive outcomes. We demonstrate that our machine learning method trained on both known homology clusters obtained from OrthoDB and randomly generated sequence alignments (non-homologs), successfully determines apparent false positives inferred by heuristic algorithms especially among proteomes recovered from low-coverage RNA-seq data. Almost ~42 % and ~25 % of predicted putative homologies by InParanoid and HaMStR respectively were classified as false positives on experimental data set. Our process increases the quality of output from other clustering algorithms by providing a novel post-processing method that is both fast and efficient at removing low quality clusters of putative homologous genes recovered by heuristic-based approaches.

  16. Sustainable malaria control: transdisciplinary approaches for translational applications

    Science.gov (United States)

    2012-01-01

    With the adoption of the Global Malaria Action Plan, several countries are moving from malaria control towards elimination and eradication. However, the sustainability of some of the approaches taken may be questionable. Here, an overview of malaria control and elimination strategies is provided and the sustainability of each in context of vector- and parasite control is assessed. From this, it can be concluded that transdisciplinary approaches are essential for sustained malaria control and elimination in malaria-endemic communities. PMID:23268712

  17. Sustainable malaria control: transdisciplinary approaches for translational applications

    Directory of Open Access Journals (Sweden)

    Birkholtz Lyn-Marie

    2012-12-01

    Full Text Available Abstract With the adoption of the Global Malaria Action Plan, several countries are moving from malaria control towards elimination and eradication. However, the sustainability of some of the approaches taken may be questionable. Here, an overview of malaria control and elimination strategies is provided and the sustainability of each in context of vector- and parasite control is assessed. From this, it can be concluded that transdisciplinary approaches are essential for sustained malaria control and elimination in malaria-endemic communities.

  18. A machine learning approach to understand business processes

    NARCIS (Netherlands)

    Maruster, L.

    2003-01-01

    Business processes (industries, administration, hospitals, etc.) become nowadays more and more complex and it is difficult to have a complete understanding of them. The goal of the thesis is to show that machine learning techniques can be used successfully for understanding a process on the basis of

  19. New approach for virtual machines consolidation in heterogeneous computing systems

    Czech Academy of Sciences Publication Activity Database

    Fesl, Jan; Cehák, J.; Doležalová, Marie; Janeček, J.

    2016-01-01

    Roč. 9, č. 12 (2016), s. 321-332 ISSN 1738-9968 Institutional support: RVO:60077344 Keywords : consolidation * virtual machine * distributed Subject RIV: JD - Computer Applications, Robotics http://www.sersc.org/journals/IJHIT/vol9_no12_2016/29.pdf

  20. Automated classification of tropical shrub species: a hybrid of leaf shape and machine learning approach.

    Science.gov (United States)

    Murat, Miraemiliana; Chang, Siow-Wee; Abu, Arpah; Yap, Hwa Jen; Yong, Kien-Thai

    2017-01-01

    Plants play a crucial role in foodstuff, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, it is a difficult task to identify plant species because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable since it can help specialists as well as the public in identifying plant species easily. Shape descriptors were applied on the myDAUN dataset that contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on literature review, this is the first study in the development of tropical shrub species image dataset and classification using a hybrid of leaf shape and machine learning approach. Four types of shape descriptors were used in this study namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptor, as well as the combination of hybrid descriptors were tested and compared. The tropical shrub species are classified using six different classifiers, which are artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three types of feature selection methods were tested in the myDAUN dataset, Relief, Correlation-based feature selection (CFS) and Pearson's coefficient correlation (PCC). The well-known Flavia dataset and Swedish Leaf dataset were used as the validation dataset on the proposed methods. The results showed that the hybrid of all descriptors of ANN outperformed the other classifiers with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia dataset and 99
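
    One descriptor/classifier pairing from the study can be sketched as follows, assuming pre-segmented grayscale leaf images resized to a common shape: HOG features are extracted with scikit-image and evaluated with two of the listed classifiers. This is an illustration of the pipeline shape, not the authors' full hybrid of four descriptors and six classifiers.

        # HOG features plus ANN/SVM evaluation sketch (illustrative only).
        import numpy as np
        from skimage.feature import hog
        from sklearn.neural_network import MLPClassifier
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        def hog_features(images):
            """images: iterable of equally sized 2-D grayscale leaf images."""
            return np.array([hog(img, orientations=9, pixels_per_cell=(16, 16),
                                 cells_per_block=(2, 2)) for img in images])

        def evaluate(images, species_labels):
            X = hog_features(images)
            for name, clf in [("ANN", MLPClassifier(hidden_layer_sizes=(100,), max_iter=1000)),
                              ("SVM", SVC(kernel="rbf"))]:
                acc = cross_val_score(clf, X, species_labels, cv=5).mean()
                print(f"{name}: {acc:.3f}")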

  1. Metabolomics, a promising approach to translational research in cardiology

    Directory of Open Access Journals (Sweden)

    Martino Deidda

    2015-12-01

    In this article, we will provide a description of metabolomics in comparison with other, better known “omics” disciplines such as genomics and proteomics. In addition, we will review the current rationale for the implementation of metabolomics in cardiology, its basic methodology and the available data from human studies in this discipline. The topics covered will delineate the importance of being able to use the metabolomic information to understand the mechanisms of diseases from the perspective of systems biology, and as a non-invasive approach to the diagnosis, grading and treatment of cardiovascular diseases.

  2. Compositional translation

    NARCIS (Netherlands)

    Appelo, Lisette; Janssen, Theo; Jong, de F.M.G.; Landsbergen, S.P.J.

    1994-01-01

    This book provides an in-depth review of machine translation by discussing in detail a particular method, called compositional translation, and a particular system, Rosetta, which is based on this method. The Rosetta project is a unique combination of fundamental research and large-scale

  3. Asynchronous machine rotor speed estimation using a tabulated numerical approach

    Science.gov (United States)

    Nguyen, Huu Phuc; De Miras, Jérôme; Charara, Ali; Eltabach, Mario; Bonnet, Stéphane

    2017-12-01

    This paper proposes a new method to estimate the rotor speed of the asynchronous machine by looking at the estimation problem as a nonlinear optimal control problem. The behavior of the nonlinear plant model is approximated off-line as a prediction map using a numerical one-step time discretization obtained from simulations. At each time-step, the speed of the induction machine is selected satisfying the dynamic fitting problem between the plant output and the predicted output, leading the system to adopt its dynamical behavior. Thanks to the limitation of the prediction horizon to a single time-step, the execution time of the algorithm can be completely bounded. It can thus easily be implemented and embedded into a real-time system to observe the speed of the real induction motor. Simulation results show the performance and robustness of the proposed estimator.

  4. Time-series prediction and applications a machine intelligence approach

    CERN Document Server

    Konar, Amit

    2017-01-01

    This book presents machine learning and type-2 fuzzy sets for the prediction of time-series with a particular focus on business forecasting applications. It also proposes new uncertainty management techniques in an economic time-series using type-2 fuzzy sets for prediction of the time-series at a given time point from its preceding value in fluctuating business environments. It employs machine learning to determine repetitively occurring similar structural patterns in the time-series and uses stochastic automaton to predict the most probabilistic structure at a given partition of the time-series. Such predictions help in determining probabilistic moves in a stock index time-series Primarily written for graduate students and researchers in computer science, the book is equally useful for researchers/professionals in business intelligence and stock index prediction. A background of undergraduate level mathematics is presumed, although not mandatory, for most of the sections. Exercises with tips are provided at...

  5. Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting

    OpenAIRE

    Shi, Xingjian; Chen, Zhourong; Wang, Hao; Yeung, Dit-Yan; Wong, Wai-kin; Woo, Wang-chun

    2015-01-01

    The goal of precipitation nowcasting is to predict the future rainfall intensity in a local region over a relatively short period of time. Very few previous studies have examined this crucial and challenging weather forecasting problem from the machine learning perspective. In this paper, we formulate precipitation nowcasting as a spatiotemporal sequence forecasting problem in which both the input and the prediction target are spatiotemporal sequences. By extending the fully connected LSTM (F...

  6. Machine learning approach for single molecule localisation microscopy.

    Science.gov (United States)

    Colabrese, Silvia; Castello, Marco; Vicidomini, Giuseppe; Del Bue, Alessio

    2018-04-01

    Single molecule localisation (SML) microscopy is a fundamental tool for biological discoveries; it provides sub-diffraction spatial resolution images by detecting and localizing "all" the fluorescent molecules labeling the structure of interest. For this reason, the effective resolution of SML microscopy strictly depends on the algorithm used to detect and localize the single molecules from the series of microscopy frames. To adapt to the different imaging conditions that can occur in a SML experiment, all current localisation algorithms request, from the microscopy users, the choice of different parameters. This choice is not always easy and their wrong selection can lead to poor performance. Here we overcome this weakness with the use of machine learning. We propose a parameter-free pipeline for SML learning based on support vector machine (SVM). This strategy requires a short supervised training that consists in selecting by the user few fluorescent molecules (∼ 10-20) from the frames under analysis. The algorithm has been extensively tested on both synthetic and real acquisitions. Results are qualitatively and quantitatively consistent with the state of the art in SML microscopy and demonstrate that the introduction of machine learning can lead to a new class of algorithms competitive and conceived from the user point of view.

  7. The Maximum Cross-Correlation approach to detecting translational motions from sequential remote-sensing images

    Science.gov (United States)

    Gao, J.; Lythe, M. B.

    1996-06-01

    This paper presents the principle of the Maximum Cross-Correlation (MCC) approach in detecting translational motions within dynamic fields from time-sequential remotely sensed images. A C program implementing the approach is presented and illustrated in a flowchart. The program is tested with a pair of sea-surface temperature images derived from Advanced Very High Resolution Radiometer (AVHRR) images near East Cape, New Zealand. Results show that the mean currents in the region have been detected satisfactorily with the approach.
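
    The MCC principle can be stated compactly: a template window from the first image is compared with every candidate offset inside a search window of the second image, and the offset with the maximum normalized cross-correlation is taken as the displacement; dividing by the time between images gives a velocity. The numpy sketch below illustrates this principle; it is not the article's C program, and the window sizes are arbitrary.

        # Maximum Cross-Correlation displacement estimate for one window (illustrative).
        import numpy as np

        def normalized_cc(a, b):
            a, b = a - a.mean(), b - b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            return (a * b).sum() / denom if denom else 0.0

        def mcc_displacement(img1, img2, row, col, tmpl=16, search=8):
            """Best (drow, dcol) shift of the tmpl x tmpl window at (row, col) of img1."""
            template = img1[row:row + tmpl, col:col + tmpl]
            best, best_shift = -2.0, (0, 0)
            for dr in range(-search, search + 1):
                for dc in range(-search, search + 1):
                    r, c = row + dr, col + dc
                    if r < 0 or c < 0:
                        continue
                    cand = img2[r:r + tmpl, c:c + tmpl]
                    if cand.shape != template.shape:
                        continue
                    cc = normalized_cc(template, cand)
                    if cc > best:
                        best, best_shift = cc, (dr, dc)
            return best_shift, best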

  8. A Machine Learning Approach to Test Data Generation

    DEFF Research Database (Denmark)

    Christiansen, Henning; Dahmcke, Christina Mackeprang

    2007-01-01

    been tested, and a more thorough statistical foundation is required. We propose to use logic-statistical modelling methods for machine-learning for analyzing existing and manually marked up data, integrated with the generation of new, artificial data. More specifically, we suggest to use the PRISM...... system developed by Sato and Kameya. Based on logic programming extended with random variables and parameter learning, PRISM appears as a powerful modelling environment, which subsumes HMMs and a wide range of other methods, all embedded in a declarative language. We illustrate these principles here...

  9. Structural transformation of the machine-building enterprises: management approaches

    OpenAIRE

    Obydiennova, T.

    2014-01-01

    The article considers the «classical» approach to the issues of enterprise management: process, system, situational, concrete, functional, resource, factorial approach. Outlines the definitions of each of the approaches to the management of the enterprise, as well as advantages and disadvantages. The author suggests the schematic image of the aspects of the system approach and its application; a scheme of contents resource and functional approaches. For research on the structural transformati...

  10. Translation Theory 'Translated'

    DEFF Research Database (Denmark)

    Wæraas, Arild; Nielsen, Jeppe

    2016-01-01

    Translation theory has proved to be a versatile analytical lens used by scholars working from different traditions. On the basis of a systematic literature review, this study adds to our understanding of the ‘translations’ of translation theory by identifying the distinguishing features of the most...... common theoretical approaches to translation within the organization and management discipline: actor-network theory, knowledge-based theory, and Scandinavian institutionalism. Although each of these approaches already has borne much fruit in research, the literature is diverse and somewhat fragmented......, but also overlapping. We discuss the ways in which the three versions of translation theory may be combined and enrich each other so as to inform future research, thereby offering a more complete understanding of translation in and across organizational settings....

  11. Personalized Physical Activity Coaching: A Machine Learning Approach

    Directory of Open Access Journals (Sweden)

    Talko B. Dijkhuis

    2018-02-01

    Full Text Available Living a sedentary lifestyle is one of the major causes of numerous health problems. To encourage employees to lead a less sedentary life, the Hanze University started a health promotion program. One of the interventions in the program was the use of an activity tracker to record participants' daily step count. The daily step count served as input for a fortnightly coaching session. In this paper, we investigate the possibility of automating part of the coaching procedure on physical activity by providing personalized feedback throughout the day on a participant's progress in achieving a personal step goal. The gathered step count data was used to train eight different machine learning algorithms to make hourly estimations of the probability of achieving a personalized, daily steps threshold. In 80% of the individual cases, the Random Forest algorithm was the best performing algorithm (mean accuracy = 0.93, range = 0.88–0.99, and mean F1-score = 0.90, range = 0.87–0.94. To demonstrate the practical usefulness of these models, we developed a proof-of-concept Web application that provides personalized feedback about whether a participant is expected to reach his or her daily threshold. We argue that the use of machine learning could become an invaluable asset in the process of automated personalized coaching. The individualized algorithms allow for predicting physical activity during the day and provides the possibility to intervene in time.
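
    The hourly estimation step can be sketched as follows: for a fixed hour of the day, a Random Forest is trained on the cumulative steps observed up to that hour, with the label indicating whether the personal daily goal was eventually met. The feature layout and the plain scikit-learn Random Forest are assumptions for the illustration, not the study's exact implementation.

        # Hourly "will the step goal be reached?" prediction sketch (illustrative).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def train_hourly_model(daily_step_matrix, goals, hour):
            """daily_step_matrix: (n_days, 24) hourly steps; goals: per-day step goal."""
            steps = np.asarray(daily_step_matrix)
            cumulative = steps[:, :hour + 1].cumsum(axis=1)    # steps seen up to this hour
            reached = steps.sum(axis=1) >= np.asarray(goals)   # label: goal met that day
            model = RandomForestClassifier(n_estimators=200, random_state=0)
            return model.fit(cumulative, reached)

        def goal_probability(model, todays_steps, hour):
            cum = np.cumsum(np.asarray(todays_steps)[:hour + 1])[None, :]
            return model.predict_proba(cum)[0, 1]   # estimated P(goal will be reached)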

  12. Effective and efficient optics inspection approach using machine learning algorithms

    International Nuclear Information System (INIS)

    Abdulla, G.; Kegelmeyer, L.; Liao, Z.; Carr, W.

    2010-01-01

    The Final Optics Damage Inspection (FODI) system automatically acquires images of the final optics at the National Ignition Facility (NIF), and the Optics Inspection (OI) system analyzes them. During each inspection cycle up to 1000 images acquired by FODI are examined by OI to identify and track damage sites on the optics. The process of tracking growing damage sites on the surface of an optic can be made more effective by identifying and removing signals associated with debris or reflections. The manual process to filter these false sites is daunting and time consuming. In this paper we discuss the use of machine learning tools and data mining techniques to help with this task. We describe the process to prepare a data set that can be used for training and identifying hardware reflections in the image data. In order to collect training data, the images are first automatically acquired and analyzed with existing software and then relevant features such as spatial, physical and luminosity measures are extracted for each site. A subset of these sites is 'truthed', or manually assigned a class, to create training data. A supervised classification algorithm is used to test if the features can predict the class membership of new sites. A suite of self-configuring machine learning tools called 'Avatar Tools' is applied to classify all sites. To verify, we used 10-fold cross-validation and found the accuracy was above 99%. This substantially reduces the number of false alarms that would otherwise be sent for more extensive investigation.
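
    The workflow above, extracting per-site features, truthing a subset, training a supervised classifier and verifying it with 10-fold cross-validation, can be sketched roughly as follows. The features and labels are synthetic placeholders, and a scikit-learn random forest stands in for the 'Avatar Tools' suite used in the study.

```python
# Sketch: classify inspection sites as real damage vs. reflection/debris.
# Features and labels are synthetic placeholders; a random forest stands in
# for the self-configuring "Avatar Tools" classifiers mentioned in the record.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_sites = 2000
# Hypothetical per-site features: area (px), peak luminosity, eccentricity.
X = np.column_stack([
    rng.gamma(2.0, 30.0, n_sites),      # area
    rng.normal(120.0, 25.0, n_sites),   # luminosity
    rng.uniform(0.0, 1.0, n_sites),     # eccentricity
])
# Hypothetical "truthed" labels: 1 = real damage site, 0 = hardware reflection.
y = (X[:, 0] * 0.02 + X[:, 1] * 0.01 + rng.normal(0, 1, n_sites) > 2.3).astype(int)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
# 10-fold cross-validation, mirroring the verification step described above.
scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
print("10-fold accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```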

  13. Process Approach for Modeling of Machine and Tractor Fleet Structure

    Science.gov (United States)

    Dokin, B. D.; Aletdinova, A. A.; Kravchenko, M. S.; Tsybina, Y. S.

    2018-05-01

    The existing software complexes for modelling the machine and tractor fleet structure are mostly aimed at solving the task of optimization. However, the creators, having chosen only one optimization criterion and incorporated it in their software, provide grounds for why it is the best without giving a decision maker the opportunity to choose a criterion for their enterprise. To analyze the “bottlenecks” of machine and tractor fleet modelling, the authors of this article created a process model, in which they included adjustment of the plan of machinery use based on searching through alternative technologies. As a result, the following recommendations for software complex development have been worked out: the introduction of a database of alternative technologies; the possibility for a user to change the timing of the operations even beyond the allowable limits, with the incurred loss calculated in that case; the possibility to rule out the solution of an optimization task and, if it is necessary, the possibility to choose an optimization criterion; and the introduction of a graphical display of an annual complex of works, which could be enough for the development and adjustment of a business strategy.

  14. National Heart, Lung, and Blood Institute and the translation of cardiovascular discoveries into therapeutic approaches.

    Science.gov (United States)

    Galis, Zorina S; Black, Jodi B; Skarlatos, Sonia I

    2013-04-26

    The molecular causes of ≈4000 medical conditions have been described, yet only 5% have associated therapies. For decades, the average time for drug development through approval has been 10 to 20 years. In recent years, the serious challenges that confront the private sector have made it difficult to capitalize on new opportunities presented by advances in genomics and cellular therapies. Current trends are disturbing. Pharmaceutical companies are reducing their investments in research, and biotechnology companies are struggling to obtain venture funds. To support early-stage translation of the discoveries in basic science, the National Institutes of Health and the National Heart, Lung, and Blood Institute have developed new approaches to facilitating the translation of basic discoveries into clinical applications and will continue to develop a variety of programs that create teams of academic investigators and industry partners. The goal of these programs is to maximize the public benefit of investment of taxpayer dollars in biomedical research and to lessen the risk required for industry partners to make substantial investments. This article highlights several examples of National Heart, Lung, and Blood Institute-initiated translational programs and National Institutes of Health translational resources designed to catalyze and enable the earliest stages of the biomedical product development process. The translation of the latest discoveries into therapeutic approaches depends on continued federal funding to enhance the early stages of the product development process and to stimulate and catalyze partnerships between academia, industry, and other sources of capital.

  15. Measurements of translation, rotation and strain: new approaches to seismic processing and inversion

    NARCIS (Netherlands)

    Bernauer, M.; Fichtner, A.; Igel, H.

    2012-01-01

    We propose a novel approach to seismic tomography based on the joint processing of translation, strain and rotation measurements. Our concept is based on the apparent S and P velocities, defined as the ratios of displacement velocity and rotation amplitude, and displacement velocity and

  16. A practical approach for translating climate change adaptation principles into forest management actions

    Science.gov (United States)

    Maria K. Janowiak; Christopher W. Swanston; Linda M. Nagel; Leslie A. Brandt; Patricia R. Butler; Stephen D. Handler; P. Danielle Shannon; Louis R. Iverson; Stephen N. Matthews; Anantha Prasad; Matthew P. Peters

    2014-01-01

    There is an ever-growing body of literature on forest management strategies for climate change adaptation; however, few frameworks have been presented for integrating these strategies with the real-world challenges of forest management. We have developed a structured approach for translating broad adaptation concepts into specific management actions and silvicultural...

  17. Analysis of the whole mitochondrial genome: translation of the Ion Torrent Personal Genome Machine system to the diagnostic bench?

    Science.gov (United States)

    Seneca, Sara; Vancampenhout, Kim; Van Coster, Rudy; Smet, Joél; Lissens, Willy; Vanlander, Arnaud; De Paepe, Boel; Jonckheere, An; Stouffs, Katrien; De Meirleir, Linda

    2015-01-01

    Next-generation sequencing (NGS), an innovative sequencing technology that enables the successful analysis of numerous gene sequences in a massive parallel sequencing approach, has revolutionized the field of molecular biology. Although NGS was introduced only in the recent past, the technology has already demonstrated its potential and effectiveness in many research projects, and is now on the verge of being introduced into the diagnostic setting of routine laboratories to delineate the molecular basis of genetic disease in undiagnosed patient samples. We tested a benchtop device on retrospective genomic DNA (gDNA) samples of controls and patients with a clinical suspicion of a mitochondrial DNA disorder. This Ion Torrent Personal Genome Machine platform is a high-throughput sequencer with a fast turnaround time and reasonable running costs. We challenged the chemistry and technology with the analysis and processing of a mutational spectrum composed of samples with single-nucleotide substitutions, indels (insertions and deletions) and large single or multiple deletions, occasionally in heteroplasmy. The output data were compared with previously obtained conventional dideoxy sequencing results and the mitochondrial revised Cambridge Reference Sequence (rCRS). We were able to identify the majority of all nucleotide alterations, but three false-negative results were also encountered in the data set. At the same time, the poor performance of the PGM instrument in regions associated with homopolymeric stretches generated many false-positive miscalls demanding additional manual curation of the data.

  18. A General and Intuitive Approach to Understand and Compare the Torque Production Capability of AC Machines

    DEFF Research Database (Denmark)

    Wang, Dong; Lu, Kaiyuan; Rasmussen, Peter Omand

    2014-01-01

    Electromagnetic torque analysis is one of the key issues in the analysis of electric machines. It plays an important role in machine design and control. The common method described in most of the textbooks is to calculate the torque in the machine variables and then transform them to the dq-frame through complicated mathematical manipulations. This is a more mathematical approach rather than an explanation of the physics behind torque production, which even brings a lot of difficulties to specialists. This paper introduces a general and intuitive approach to obtain the dq-frame torque equation of various AC machines. In this method, the torque equation can be obtained based on an intuitive physical understanding of the mechanism behind torque production. It is then proved to be applicable to the general case, including rotor saliency and various types of magnetomotive force sources. As an application...

  19. Application of Machine Learning Approaches for Protein-protein Interactions Prediction.

    Science.gov (United States)

    Zhang, Mengying; Su, Qiang; Lu, Yi; Zhao, Manman; Niu, Bing

    2017-01-01

    Proteomics endeavors to study the structures, functions and interactions of proteins. Information on protein-protein interactions (PPIs) helps to improve our knowledge of the functions and the 3D structures of proteins. Thus determining the PPIs is essential for the study of proteomics. In this review, in order to study the application of machine learning in predicting PPIs, some machine learning approaches such as support vector machines (SVM), artificial neural networks (ANNs) and random forests (RF) were selected, and examples of their applications in PPIs were listed. SVM and RF are two commonly used methods. Nowadays, more researchers predict PPIs by combining more than two methods. This review presents the application of machine learning approaches in predicting PPIs. Many examples of success in identification and prediction in the area of PPI prediction have been discussed, and PPI research is still in progress. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  20. Rule based systems for big data a machine learning approach

    CERN Document Server

    Liu, Han; Cocea, Mihaela

    2016-01-01

    The ideas introduced in this book explore the relationships among rule based systems, machine learning and big data. Rule based systems are seen as a special type of expert systems, which can be built by using expert knowledge or learning from real data. The book focuses on the development and evaluation of rule based systems in terms of accuracy, efficiency and interpretability. In particular, a unified framework for building rule based systems, which consists of the operations of rule generation, rule simplification and rule representation, is presented. Each of these operations is detailed using specific methods or techniques. In addition, this book also presents some ensemble learning frameworks for building ensemble rule based systems.

  1. Indonesian name matching using machine learning supervised approach

    Science.gov (United States)

    Alifikri, Mohamad; Arif Bijaksana, Moch.

    2018-03-01

    Most existing name matching methods are developed for the English language and so they cover the characteristics of this language. Up to this moment, no specific method has been designed and implemented for Indonesian names. The purpose of this thesis is to develop an Indonesian name matching dataset as a contribution to academic research and to propose a suitable feature set by utilizing a combination of the context of name strings and their permute-Winkler scores. Machine learning classification algorithms are used as the method for performing name matching. Based on the experiments, by using a tuned Random Forest algorithm and the proposed features, there is an improvement of matching performance by approximately 1.7%, and the approach is able to reduce up to 70% of the misclassification results of state-of-the-art methods. This improved performance makes the matching system more effective and reduces the risk of misclassified matches.

  2. Elucidating Host-Pathogen Interactions Based on Post-Translational Modifications Using Proteomics Approaches

    DEFF Research Database (Denmark)

    Ravikumar, Vaishnavi; Jers, Carsten; Mijakovic, Ivan

    2015-01-01

    … can be efficiently applied to gain an insight into the molecular mechanisms involved. The measurement of the proteome and post-translationally modified proteome dynamics using mass spectrometry results in a wide array of information, such as significant changes in protein expression, protein … display host specificity through a complex network of molecular interactions that aid their survival and propagation. Co-infection states further lead to complications by increasing the microbial burden and risk factors. Quantitative proteomics-based approaches and post-translational modification analysis … pathogen interactions.

  3. A machine learning approach to quantifying noise in medical images

    Science.gov (United States)

    Chowdhury, Aritra; Sevinsky, Christopher J.; Yener, Bülent; Aggour, Kareem S.; Gustafson, Steven M.

    2016-03-01

    As advances in medical imaging technology are resulting in significant growth of biomedical image data, new techniques are needed to automate the process of identifying images of low quality. Automation is needed because it is very time consuming for a domain expert such as a medical practitioner or a biologist to manually separate good images from bad ones. While there are plenty of de-noising algorithms in the literature, their focus is on designing filters which are necessary but not sufficient for determining how useful an image is to a domain expert. Thus a computational tool is needed to assign a score to each image based on its perceived quality. In this paper, we introduce a machine learning-based score and call it the Quality of Image (QoI) score. The QoI score is computed by combining the confidence values of two popular classification techniques—support vector machines (SVMs) and Naïve Bayes classifiers. We test our technique on clinical image data obtained from cancerous tissue samples. We used 747 tissue samples that are stained by four different markers (abbreviated as CK15, pck26, E_cad and Vimentin) leading to a total of 2,988 images. The results show that images can be classified as good (high QoI), bad (low QoI) or ugly (intermediate QoI) based on their QoI scores. Our automated labeling is in agreement with the domain experts with a bi-modal classification accuracy of 94%, on average. Furthermore, ugly images can be recovered and forwarded for further post-processing.
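
    One plausible reading of the combined score described above, averaging the class-probability outputs of an SVM and a Naive Bayes classifier and binning the result into good, ugly and bad, is sketched below. The synthetic features and the 0.4/0.7 bin edges are assumptions, not the paper's calibration.

```python
# Sketch: a Quality-of-Image style score from combined SVM and Naive Bayes
# confidences. Synthetic features; the 0.4/0.7 bin edges are illustrative.
from collections import Counter
import numpy as np
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 8))   # image-derived features (placeholder)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 1000) > 0).astype(int)  # 1 = good image

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
svm = SVC(probability=True, random_state=0).fit(X_tr, y_tr)
nb = GaussianNB().fit(X_tr, y_tr)

# QoI score = mean of the two classifiers' probability of the "good" class.
qoi = (svm.predict_proba(X_te)[:, 1] + nb.predict_proba(X_te)[:, 1]) / 2.0

def bin_qoi(score: float) -> str:
    # High scores -> good, intermediate -> ugly, low -> bad (assumed thresholds).
    return "good" if score >= 0.7 else "ugly" if score >= 0.4 else "bad"

print(Counter(bin_qoi(q) for q in qoi))
```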

  4. A knowledge translation project on community-centred approaches in public health.

    Science.gov (United States)

    Stansfield, J; South, J

    2018-03-01

    This article examines the development and impact of a national knowledge translation project aimed at improving access to evidence and learning on community-centred approaches for health and wellbeing. Structural changes in the English health system meant that knowledge on community engagement was becoming lost and a fragmented evidence base was seen to impact negatively on policy and practice. A partnership started between Public Health England, NHS England and Leeds Beckett University in 2014 to address these issues. Following a literature review and stakeholder consultation, evidence was published in a national guide to community-centred approaches. This was followed by a programme of work to translate the evidence into national strategy and local practice.The article outlines the key features of the knowledge translation framework developed. Results include positive impacts on local practice and national policy, for example adoption within National Institute for Health and Care Evidence (NICE) guidance and Local Authority public health plans and utilization as a tool for local audit of practice and commissioning. The framework was successful in its non-linear approach to knowledge translation across a range of inter-connected activity, built on national leadership, knowledge brokerage, coalition building and a strong collaboration between research institute and government agency.

  5. Theory of transformation groups I general properties of continuous transformation groups a contemporary approach and translation

    CERN Document Server

    2015-01-01

    This modern translation of Sophus Lie's and Friedrich Engel's “Theorie der Transformationsgruppen Band I” will allow readers to discover the striking conceptual clarity and remarkably systematic organizational thought of the original German text. Volume I presents a comprehensive introduction to the theory and is mainly directed towards the generalization of ideas drawn from the study of examples. The major part of the present volume offers an extremely clear translation of the lucid original. The first four chapters provide not only a translation, but also a contemporary approach, which will help present day readers to familiarize themselves with the concepts at the heart of the subject. The editor's main objective was to encourage a renewed interest in the detailed classification of Lie algebras in dimensions 1, 2 and 3, and to offer access to Sophus Lie's monumental Galois theory of continuous transformation groups, established at the end of the 19th Century. Lie groups are widespread in mathematics, p...

  6. A rule-based approach to model checking of UML state machines

    Science.gov (United States)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach makes it possible to increase the assurance that the implemented system meets the user-defined requirements.

  7. How can machine-learning methods assist in virtual screening for hyperuricemia? A healthcare machine-learning approach.

    Science.gov (United States)

    Ichikawa, Daisuke; Saito, Toki; Ujita, Waka; Oyama, Hiroshi

    2016-12-01

    Our purpose was to develop a new machine-learning approach (a virtual health check-up) toward identification of those at high risk of hyperuricemia. Applying the system to general health check-ups is expected to reduce medical costs compared with administering an additional test. Data were collected during annual health check-ups performed in Japan between 2011 and 2013 (inclusive). We prepared training and test datasets from the health check-up data to build prediction models; these were composed of 43,524 and 17,789 persons, respectively. Gradient-boosting decision tree (GBDT), random forest (RF), and logistic regression (LR) approaches were trained using the training dataset and were then used to predict hyperuricemia in the test dataset. Undersampling was applied to build the prediction models to deal with the imbalanced class dataset. The results showed that the RF and GBDT approaches afforded the best performances in terms of sensitivity and specificity, respectively. The area under the curve (AUC) values of the models, which reflected the total discriminative ability of the classification, were 0.796 [95% confidence interval (CI): 0.766-0.825] for the GBDT, 0.784 [95% CI: 0.752-0.815] for the RF, and 0.785 [95% CI: 0.752-0.819] for the LR approaches. No significant differences were observed between pairs of each approach. Small changes occurred in the AUCs after applying undersampling to build the models. We developed a virtual health check-up that predicted the development of hyperuricemia using machine-learning methods. The GBDT, RF, and LR methods had similar predictive capability. Undersampling did not remarkably improve predictive power. Copyright © 2016 Elsevier Inc. All rights reserved.
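
    The modelling step described above (undersampling the majority class, then training gradient boosting, random forest and logistic regression models and comparing them by AUC) can be sketched as follows on a synthetic, imbalanced stand-in for check-up data.

```python
# Sketch: imbalanced-class prediction with undersampling plus GBDT / RF / LR,
# compared by AUC. Synthetic stand-in for health check-up data, not the study's variables.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 20_000
X = rng.normal(size=(n, 10))
# Roughly 10-15% positive class, mirroring an imbalanced outcome.
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 2.2, n) > 2.8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Simple random undersampling of the majority class in the training set.
pos = np.flatnonzero(y_tr == 1)
neg = rng.choice(np.flatnonzero(y_tr == 0), size=len(pos), replace=False)
keep = np.concatenate([pos, neg])
X_bal, y_bal = X_tr[keep], y_tr[keep]

models = {
    "GBDT": GradientBoostingClassifier(random_state=0),
    "RF": RandomForestClassifier(n_estimators=300, random_state=0),
    "LR": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_bal, y_bal)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```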

  8. Prediction of Skin Sensitization Potency Using Machine Learning Approaches

    Science.gov (United States)

    Replacing animal tests currently used for regulatory hazard classification of skin sensitizers is one of ICCVAM’s top priorities. Accordingly, U.S. federal agency scientists are developing and evaluating computational approaches to classify substances as sensitizers or nons...

  9. Predicting Refractive Surgery Outcome: Machine Learning Approach With Big Data.

    Science.gov (United States)

    Achiron, Asaf; Gur, Zvi; Aviv, Uri; Hilely, Assaf; Mimouni, Michael; Karmona, Lily; Rokach, Lior; Kaiserman, Igor

    2017-09-01

    To develop a decision forest for prediction of laser refractive surgery outcome. Data from consecutive cases of patients who underwent LASIK or photorefractive surgeries during a 12-year period in a single center were assembled into a single dataset. Training of machine-learning classifiers and testing were performed with a statistical classifier algorithm. The decision forest was created from feature vectors extracted from 17,592 cases and 38 clinical parameters for each patient. A 10-fold cross-validation procedure was applied to estimate the predictive value of the decision forest when applied to new patients. Analysis included patients younger than 40 years who were not treated for monovision. Efficacy of 0.7 or greater and 0.8 or greater was achieved in 16,198 (92.0%) and 14,945 (84.9%) eyes, respectively. Efficacy of less than 0.4 and less than 0.5 was achieved in 322 (1.8%) and 506 (2.9%) eyes, respectively. Patients in the low efficacy group showed differences compared with the high efficacy group (≥ 0.8), yet were clinically similar (mean differences between groups of 0.7 years, of 0.43 mm in pupil size, of 0.11 D in cylinder, of 0.22 logMAR in preoperative CDVA, of 0.11 mm in optical zone size, of 1.03 D in actual sphere treatment, and of 0.64 D in actual cylinder treatment). The preoperative subjective CDVA had the highest gain (most important to the model). Correlation analysis revealed significantly decreased efficacy with increased age (r = -0.67, P < …). … big data from refractive surgeries may be of interest. [J Refract Surg. 2017;33(9):592-597.]. Copyright 2017, SLACK Incorporated.

  10. Animal to human translational paradigms relevant for approach avoidance conflict decision making.

    Science.gov (United States)

    Kirlic, Namik; Young, Jared; Aupperle, Robin L

    2017-09-01

    Avoidance behavior in clinical anxiety disorders is often a decision made in response to approach-avoidance conflict, resulting in a sacrifice of potential rewards to avoid potential negative affective consequences. Animal research has a long history of relying on paradigms related to approach-avoidance conflict to model anxiety-relevant behavior. This approach includes punishment-based conflict, exploratory, and social interaction tasks. There has been a recent surge of interest in the translation of paradigms from animal to human, in efforts to increase generalization of findings and support the development of more effective mental health treatments. This article briefly reviews animal tests related to approach-avoidance conflict and results from lesion and pharmacologic studies utilizing these tests. We then provide a description of translational human paradigms that have been developed to tap into related constructs, summarizing behavioral and neuroimaging findings. Similarities and differences in findings from analogous animal and human paradigms are discussed. Lastly, we highlight opportunities for future research and paradigm development that will support the clinical utility of this translational work. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Serbian translation of the 20-item toronto alexithymia scale: Psychometric properties and the new methodological approach in translating scales

    Directory of Open Access Journals (Sweden)

    Trajanović Nikola N.

    2013-01-01

    Full Text Available Introduction. Since the inception of the alexithymia construct in the 1970s, there has been a continuous effort to improve both its theoretical postulates and its clinical utility through development, standardization and validation of assessment scales. Objective. The aim of this study was to validate the Serbian translation of the 20-item Toronto Alexithymia Scale (TAS-20) and to propose a new method of translating scales with the property of temporal stability. Methods. The scale was expertly translated by bilingual medical professionals and a linguist, and given to a sample of bilingual participants from the general population who completed both the English and the Serbian version of the scale one week apart. Results. The findings showed that the Serbian version of the TAS-20 had good internal consistency reliability for the total scale (α=0.86) and acceptable reliability of the three factors (α=0.71–0.79). Conclusion. The analysis confirmed the validity and consistency of the Serbian translation of the scale, with an observed weakness of the factorial structure consistent with studies in other languages. The results also showed that the method of utilizing self-controlled bilingual subjects is a useful alternative to the back-translation method, particularly in cases of linguistically and structurally sensitive scales, or in cases where a larger sample is not available. This method, dubbed ‘forth-translation’, could be used to translate psychometric scales measuring properties which have temporal stability over a period of at least several weeks.

  12. A machine learning approach for predicting the relationship between energy resources and economic development

    Science.gov (United States)

    Cogoljević, Dušan; Alizamir, Meysam; Piljan, Ivan; Piljan, Tatjana; Prljić, Katarina; Zimonjić, Stefan

    2018-04-01

    The linkage between energy resources and economic development is a topic of great interest. Research in this area is also motivated by contemporary concerns about global climate change, carbon emissions, fluctuating crude oil prices, and the security of energy supply. The purpose of this research is to develop and apply a machine learning approach to predict gross domestic product (GDP) based on the mix of energy resources. Our results indicate that GDP predictive accuracy can be improved slightly by applying a machine learning approach.

  13. Machine Learning Approaches for Predicting Human Skin Sensitization Hazard

    Science.gov (United States)

    One of ICCVAM’s top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary for a substance to elicit a skin sensitization reaction suggests that no single in chemico, in vit...

  14. Automated classification of tropical shrub species: a hybrid of leaf shape and machine learning approach

    Directory of Open Access Journals (Sweden)

    Miraemiliana Murat

    2017-09-01

    Full Text Available Plants play a crucial role in foodstuff, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, it is a difficult task to identify plant species because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable since it can help specialists as well as the public in identifying plant species easily. Shape descriptors were applied on the myDAUN dataset that contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on literature review, this is the first study in the development of tropical shrub species image dataset and classification using a hybrid of leaf shape and machine learning approach. Four types of shape descriptors were used in this study, namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptors, as well as the combination of hybrid descriptors, were tested and compared. The tropical shrub species are classified using six different classifiers, which are artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three types of feature selection methods were tested on the myDAUN dataset: Relief, Correlation-based feature selection (CFS) and Pearson's coefficient correlation (PCC). The well-known Flavia dataset and Swedish Leaf dataset were used as the validation datasets for the proposed methods. The results showed that the hybrid of all descriptors with ANN outperformed the other classifiers with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia

  15. Protocol: developing a conceptual framework of patient mediated knowledge translation, systematic review using a realist approach

    OpenAIRE

    Wiljer David; Webster Fiona; Brouwers Melissa C; Légaré France; Gagliardi Anna R; Badley Elizabeth; Straus Sharon

    2011-01-01

    Abstract Background Patient involvement in healthcare represents the means by which to achieve a healthcare system that is responsive to patient needs and values. Characterization and evaluation of strategies for involving patients in their healthcare may benefit from a knowledge translation (KT) approach. The purpose of this knowledge synthesis is to develop a conceptual framework for patient-mediated KT interventions. Methods A preliminary conceptual framework for patient-mediated KT interv...

  16. MODELS OF LIVE MIGRATION WITH ITERATIVE APPROACH AND MOVE OF VIRTUAL MACHINES

    Directory of Open Access Journals (Sweden)

    S. M. Aleksankov

    2015-11-01

    Full Text Available Subject of Research. The processes of live migration without shared storage with the pre-copy approach and of move migration are researched. Migration of virtual machines is an important capability of virtualization technology. It enables applications to move transparently with their runtime environments between physical machines. Live migration has become a notable technology for efficient load balancing and for optimizing the deployment of virtual machines to physical hosts in data centres. Before the advent of live migration, only network migration (the so-called «Move») had been used, which entails stopping the virtual machine execution while copying it to another physical server and, consequently, unavailability of the service. Method. Algorithms of live migration without shared storage with the pre-copy approach and of move migration of virtual machines are reviewed from the perspective of migration time and unavailability of services during migration of virtual machines. Main Results. Analytical models are proposed that predict the migration time of virtual machines and the unavailability of services when migrating with such technologies as live migration with the pre-copy approach without shared storage and move migration. The latest works on assessing the time of unavailability of services and the migration time using live migration without shared storage describe experimental results that make it possible to draw general conclusions about changes in the time of unavailability of services and the migration time, but not to predict their values. Practical Significance. The proposed models can be used for predicting the migration time and the time of unavailability of services, for example, when implementing preventive and emergency works on the physical nodes in data centres.

  17. Translational Creativity

    DEFF Research Database (Denmark)

    Nielsen, Sandro

    2010-01-01

    A long-established approach to legal translation focuses on terminological equivalence, making translators strictly follow the words of source texts. Recent research suggests that there is room for some creativity, allowing translators to deviate from the source texts. However, little attention is given to genre conventions in source texts and the ways in which they can best be translated. I propose that translators of statutes with an informative function in expert-to-expert communication may be allowed limited translational creativity when translating specific types of genre convention. This creativity is a result of translators adopting either a source-language or a target-language oriented strategy and is limited by the pragmatic principle of co-operation. Examples of translation options are provided illustrating the different results in target texts. The use of a target-language oriented...

  18. Ultrasonic fluid quantity measurement in dynamic vehicular applications a support vector machine approach

    CERN Document Server

    Terzic, Jenny; Nagarajah, Romesh; Alamgir, Muhammad

    2013-01-01

    Accurate fluid level measurement in dynamic environments can be achieved using a Support Vector Machine (SVM) approach. SVM is a supervised learning model that analyzes and recognizes patterns. It is a signal classification technique which has far greater accuracy than conventional signal averaging methods. Ultrasonic Fluid Quantity Measurement in Dynamic Vehicular Applications: A Support Vector Machine Approach describes the research and development of a fluid level measurement system for dynamic environments. The measurement system is based on a single ultrasonic sensor. A Support Vector Machines (SVM) based signal characterization and processing system has been developed to compensate for the effects of slosh and temperature variation in fluid level measurement systems used in dynamic environments including automotive applications. It has been demonstrated that a simple ν-SVM model with Radial Basis Function (RBF) Kernel with the inclusion of a Moving Median filter could be used to achieve the high levels...
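
    A rough sketch of the signal-processing idea described above, median filtering a slosh-corrupted trace before feeding summary features to a nu-SVM with an RBF kernel, appears below. The signal model, window size and features are assumptions, not the book's experimental setup.

```python
# Sketch: fluid-level estimation from a sloshing ultrasonic trace using a
# moving median filter followed by a nu-SVM with RBF kernel.
# The signal model and window size are assumptions, not the book's setup.
import numpy as np
from scipy.signal import medfilt
from sklearn.svm import NuSVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(4)
n_samples, n_points = 500, 64

true_level = rng.uniform(5.0, 50.0, n_samples)            # litres (placeholder)
t = np.linspace(0, 1, n_points)
# Raw sensor traces: level-dependent baseline + slosh oscillation + noise.
raw = (true_level[:, None]
       + 3.0 * np.sin(2 * np.pi * 5 * t)[None, :] * rng.uniform(0.2, 1.0, (n_samples, 1))
       + rng.normal(0, 0.8, (n_samples, n_points)))

# Moving median filter suppresses slosh spikes before feature extraction.
filtered = np.array([medfilt(trace, kernel_size=9) for trace in raw])
X = np.column_stack([filtered.mean(axis=1), filtered.std(axis=1), filtered.min(axis=1)])

X_tr, X_te, y_tr, y_te = train_test_split(X, true_level, test_size=0.3, random_state=0)
model = NuSVR(nu=0.5, kernel="rbf", C=10.0).fit(X_tr, y_tr)
print("MAE (litres):", round(mean_absolute_error(y_te, model.predict(X_te)), 2))
```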

  19. Opinion Mining in Latvian Text Using Semantic Polarity Analysis and Machine Learning Approach

    Directory of Open Access Journals (Sweden)

    Gatis Špats

    2016-07-01

    Full Text Available In this paper we demonstrate approaches for opinion mining in Latvian text. The authors have applied, combined and extended the results of several previous studies and public resources to perform opinion mining in Latvian text using two approaches, namely, semantic polarity analysis and machine learning. One of the most significant constraints that make the application of opinion mining for written content classification in Latvian text challenging is the limited amount of publicly available text corpora for classifier training. We have joined several sources and created a publicly available extended lexicon. Our results are comparable to or outperform current achievements in opinion mining in Latvian. Experiments show that lexicon-based methods provide more accurate opinion mining than the application of a Naive Bayes machine learning classifier on Latvian tweets. Methods used during this study could be further extended using human annotators, unsupervised machine learning and bootstrapping to create larger corpora of classified text.
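
    The two approaches compared above, lexicon-based polarity scoring and a Naive Bayes classifier, might be contrasted roughly as in the sketch below. The mini-lexicon and example texts are invented English stand-ins, not the extended Latvian resources the authors published.

```python
# Sketch: lexicon-based polarity vs. Naive Bayes sentiment classification.
# The mini-lexicon and texts are illustrative placeholders (English stand-ins),
# not the extended Latvian resources described in the record.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

LEXICON = {"good": 1, "great": 1, "happy": 1, "bad": -1, "awful": -1, "sad": -1}

def lexicon_polarity(text: str) -> str:
    # Sum word polarities from the lexicon and map the sign to a label.
    score = sum(LEXICON.get(tok, 0) for tok in text.lower().split())
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

train_texts = ["great service and good prices", "awful delay, very sad",
               "happy with the result", "bad experience overall"]
train_labels = ["positive", "negative", "positive", "negative"]

vec = CountVectorizer()
nb = MultinomialNB().fit(vec.fit_transform(train_texts), train_labels)

for tweet in ["good morning, great news", "sad and awful weather"]:
    print(tweet,
          "| lexicon:", lexicon_polarity(tweet),
          "| naive bayes:", nb.predict(vec.transform([tweet]))[0])
```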

  20. Workforce Optimization for Bank Operation Centers: A Machine Learning Approach

    Directory of Open Access Journals (Sweden)

    Sefik Ilkin Serengil

    2017-12-01

    Full Text Available Online banking systems have evolved and improved in recent years with the use of mobile and online technologies; money transfer transactions can be performed on these channels without delay and human interaction. However, commercial customers still tend to transfer money at bank branches due to several concerns. Bank operation centers serve to reduce the operational workload of branches. Centralized management also offers personalized service by appointed expert employees in these centers. Inherently, the workload volume of money transfer transactions changes dramatically across hours. Therefore, the workforce should be planned instantly or in advance to save labor force and increase operational efficiency. This paper introduces a hybrid multi-stage approach for workforce planning in bank operation centers by the application of supervised and unsupervised learning algorithms. The expected workload is predicted with supervised learning, whereas employees are clustered into different skill groups with unsupervised learning to match transactions and proper employees. Finally, workforce optimization is analyzed for the proposed approach on production data.
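
    A loose sketch of the hybrid pipeline described above, supervised prediction of expected workload combined with unsupervised clustering of employees into skill groups, is given below. The features, models and group count are assumptions for illustration only.

```python
# Sketch: hybrid workforce planning, combining supervised workload forecasting
# with unsupervised clustering of employees into skill groups. All data is synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

# Supervised part: predict hourly transaction volume from (hour, weekday).
hours = rng.integers(8, 18, 2000)
weekdays = rng.integers(0, 5, 2000)
volume = 200 + 40 * np.sin(hours / 3.0) + 15 * weekdays + rng.normal(0, 10, 2000)
reg = GradientBoostingRegressor(random_state=0).fit(np.column_stack([hours, weekdays]), volume)
print("expected volume Tuesday 11:00:", round(reg.predict([[11, 1]])[0]))

# Unsupervised part: cluster employees by skill profile (speed, error rate, seniority).
employees = np.column_stack([
    rng.normal(30, 5, 120),    # transactions handled per hour
    rng.beta(2, 50, 120),      # error rate
    rng.integers(1, 20, 120),  # years of experience
])
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(employees)
print("employees per skill group:", np.bincount(groups))
```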

  1. Synthesizing Marketing, Community Engagement, and Systems Science Approaches for Advancing Translational Research.

    Science.gov (United States)

    Kneipp, Shawn M; Leeman, Jennifer; McCall, Pamela; Hassmiller-Lich, Kristen; Bobashev, Georgiy; Schwartz, Todd A; Gilmore, Robert; Riggan, Scott; Gil, Benjamin

    2015-01-01

    The adoption and implementation of evidence-based interventions (EBIs) are the goals of translational research; however, potential end-users' perceptions of an EBI's value have contributed to low rates of adoption. In this article, we describe our application of emerging dissemination and implementation science theoretical perspectives, community engagement, and systems science principles to develop a novel EBI dissemination approach. Using consumer-driven, graphics-rich simulation, the approach demonstrates predicted implementation effects on health and employment outcomes for socioeconomically disadvantaged women at the local level and is designed to increase the adoption interest of county program managers accountable for improving these outcomes in their communities.

  2. A New Approach to Spindle Radial Error Evaluation Using a Machine Vision System

    Directory of Open Access Journals (Sweden)

    Kavitha C.

    2017-03-01

    Full Text Available The spindle rotational accuracy is one of the important issues in a machine tool which affects the surface topography and dimensional accuracy of a workpiece. This paper presents a machine-vision-based approach to radial error measurement of a lathe spindle using a CMOS camera and a PC-based image processing system. In the present work, a precisely machined cylindrical master is mounted on the spindle as a datum surface and variations of its position are captured using the camera for evaluating runout of the spindle. The Circular Hough Transform (CHT is used to detect variations of the centre position of the master cylinder during spindle rotation at subpixel level from a sequence of images. Radial error values of the spindle are evaluated using the Fourier series analysis of the centre position of the master cylinder calculated with the least squares curve fitting technique. The experiments have been carried out on a lathe at different operating speeds and the spindle radial error estimation results are presented. The proposed method provides a simpler approach to on-machine estimation of the spindle radial error in machine tools.
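
    A rough sketch of the processing chain described above, detecting the master cylinder's centre in each frame with a circular Hough transform and then analysing the centre trajectory in the frequency domain, is shown below, assuming OpenCV is available. The synthetic frames and all parameters are placeholders, not the paper's calibration.

```python
# Sketch: spindle radial-error estimation from images, using Hough-circle centre
# detection per frame and Fourier analysis of the centre trajectory.
# Synthetic frames drawn with OpenCV stand in for camera images; all parameters
# are placeholders, not the paper's calibration.
import numpy as np
import cv2

rng = np.random.default_rng(6)
n_frames, size, radius = 180, 256, 60
centres = []

for k in range(n_frames):
    # Simulated runout: the centre wobbles around (128, 128) once per revolution.
    cx = 128 + 3.0 * np.cos(2 * np.pi * k / n_frames) + rng.normal(0, 0.2)
    cy = 128 + 3.0 * np.sin(2 * np.pi * k / n_frames) + rng.normal(0, 0.2)
    frame = np.zeros((size, size), dtype=np.uint8)
    cv2.circle(frame, (int(round(cx)), int(round(cy))), radius, 255, 2)

    circles = cv2.HoughCircles(frame, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                               param1=100, param2=20, minRadius=50, maxRadius=70)
    if circles is not None:
        centres.append(circles[0][0][:2])   # (x, y) of the strongest circle

centres = np.array(centres)
# Radial error motion: deviation of the detected centre from its mean position.
dev = centres - centres.mean(axis=0)
radial_error = np.hypot(dev[:, 0], dev[:, 1])
print("peak-to-valley radial error (px):", round(float(np.ptp(radial_error)), 2))

# Fourier analysis of the x-trajectory reveals the once-per-revolution component.
spectrum = np.abs(np.fft.rfft(dev[:, 0]))
print("dominant harmonic index:", int(np.argmax(spectrum[1:]) + 1))
```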

  3. Process planning optimization on turning machine tool using a hybrid genetic algorithm with local search approach

    Directory of Open Access Journals (Sweden)

    Yuliang Su

    2015-04-01

    Full Text Available A turning machine tool is a new type of machine tool that is equipped with more than one spindle and turret. The distinctive simultaneous and parallel processing abilities of a turning machine tool increase the complexity of process planning. The operations must not only be sequenced to satisfy precedence constraints, but should also be scheduled with multiple objectives such as minimizing machining cost, maximizing utilization of the turning machine tool, and so on. To solve this problem, a hybrid genetic algorithm was proposed to generate optimal process plans based on a mixed 0-1 integer programming model. An operation precedence graph is used to represent precedence constraints and help generate a feasible initial population for the hybrid genetic algorithm. An encoding strategy based on a data structure was developed to represent process plans digitally in order to form the solution space. In addition, a local search approach for optimizing the assignments of available turrets was added to incorporate scheduling with process planning. A real-world case is used to prove that the proposed approach can avoid infeasible solutions and effectively generate a globally optimal process plan.

  4. Machine learning approaches for the prediction of signal peptides and otherprotein sorting signals

    DEFF Research Database (Denmark)

    Nielsen, Henrik; Brunak, Søren; von Heijne, Gunnar

    1999-01-01

    Prediction of protein sorting signals from the sequence of amino acids has great importance in the field of proteomics today. Recently, the growth of protein databases, combined with machine learning approaches such as neural networks and hidden Markov models, has made it possible to achieve...

  5. New approaches in mathematical biology: Information theory and molecular machines

    International Nuclear Information System (INIS)

    Schneider, T.

    1995-01-01

    My research uses classical information theory to study genetic systems. Information theory was founded by Claude Shannon in the 1940s and has had an enormous impact on communications engineering and computer sciences. Shannon found a way to measure information. This measure can be used to precisely characterize the sequence conservation at nucleic-acid binding sites. The resulting methods, by completely replacing the use of 'consensus sequences', provide better models for molecular biologists. An excess of conservation led us to do experimental work on bacteriophage T7 promoters and the F plasmid IncD repeats. The wonderful fidelity of telephone communications and compact disk (CD) music can be traced directly to Shannon's channel capacity theorem. When rederived for molecular biology, this theorem explains the surprising precision of many molecular events. Through connections with the Second Law of Thermodynamics and Maxwell's Demon, this approach also has implications for the development of technology at the molecular level. Discussions of these topics are held on the internet news group bionet.info-theo. (author). (Abstract only)

  6. Facial Emotion Recognition System – A Machine Learning Approach

    Science.gov (United States)

    Ramalingam, V. V.; Pandian, A.; Jayakumar, Lavanya

    2018-04-01

    Facial expression is a medium for human communication and can be employed in multiple real-world systems. One crucial stage in facial expression recognition is to accurately select emotional features. This paper proposes a facial expression recognition scheme applying evolutionary Particle Swarm Optimization (PSO)-based feature selection. The system first employs a modified LVP descriptor, which compares horizontal and vertical neighbouring pixel contrasts, to obtain a discriminative initial facial representation. A PSO variant embedding the concept of a micro Genetic Algorithm (mGA), called mGA-embedded PSO, is then designed to perform feature selection. In this study, the technique incorporates a non-replaceable memory, a small-population secondary swarm, a new velocity updating strategy, and a sub-dimension-based in-depth local facial feature search. The cooperation of local exploitation and global exploration helps mitigate the premature convergence problem of conventional PSO. Multiple classifiers are used to recognize different facial expressions. Based on extensive evaluation of within-domain and cross-domain images from the extended Cohn-Kanade and MMI benchmark databases, respectively, the proposed approach outperforms state-of-the-art PSO variants, conventional PSO, the classical GA and other related facial expression recognition systems by a significant margin. We also extend our approach to a motion-based facial expression recognition application by combining patch-based Gabor features with temporal information across multiple frames.

  7. Peak Detection Method Evaluation for Ion Mobility Spectrometry by Using Machine Learning Approaches

    DEFF Research Database (Denmark)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna

    2013-01-01

    … machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods...

  8. A Machine Approach for Field Weakening of Permanent-Magnet Motors

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, J.S.

    2000-04-02

    Field weakening for permanent-magnet (PM) motors is commonly achieved by controlling the direct-axis current component through an inverter. Without using mechanical variation of the air gap, a new machine approach for field weakening of PM machines by direct control of air-gap fluxes is introduced. The demagnetization situation due to field weakening is not an issue with this new method. In fact, the PMs are strengthened at field weakening. The field-weakening ratio can reach 10:1 or higher. This technology is particularly useful for PM generators and electric vehicle drives.

  9. An ensemble machine learning approach to predict survival in breast cancer.

    Science.gov (United States)

    Djebbari, Amira; Liu, Ziying; Phan, Sieu; Famili, Fazel

    2008-01-01

    Current breast cancer predictive signatures are not unique. Can we use this fact to our advantage to improve prediction? From the machine learning perspective, it is well known that combining multiple classifiers can improve classification performance. We propose an ensemble machine learning approach which consists of choosing feature subsets and learning predictive models from them. We then combine models based on certain model fusion criteria and we also introduce a tuning parameter to control sensitivity. Our method significantly improves classification performance with a particular emphasis on sensitivity which is critical to avoid misclassifying poor prognosis patients as good prognosis.
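
    A minimal sketch of the ensemble idea in this record, learning models on different feature subsets, fusing their probability outputs and tuning the decision threshold to favour sensitivity, is given below with synthetic data and arbitrary subsets.

```python
# Sketch: ensemble over feature subsets with probability fusion and a
# sensitivity-oriented decision threshold. Data and subsets are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(7)
X = rng.normal(size=(600, 30))                                       # expression-like features
y = (X[:, :5].sum(axis=1) + rng.normal(0, 2, 600) > 0).astype(int)   # 1 = poor prognosis

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Each ensemble member sees a different random feature subset.
subsets = [rng.choice(X.shape[1], size=10, replace=False) for _ in range(5)]
members = [LogisticRegression(max_iter=1000).fit(X_tr[:, s], y_tr) for s in subsets]

# Fuse by averaging predicted probabilities of the poor-prognosis class.
fused = np.mean([m.predict_proba(X_te[:, s])[:, 1] for m, s in zip(members, subsets)], axis=0)

# A lower threshold trades precision for sensitivity (fewer missed poor-prognosis cases).
for threshold in (0.5, 0.3):
    pred = (fused >= threshold).astype(int)
    print(f"threshold {threshold}: sensitivity = {recall_score(y_te, pred):.3f}")
```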

  10. Visible Machine Learning for Biomedicine.

    Science.gov (United States)

    Yu, Michael K; Ma, Jianzhu; Fisher, Jasmin; Kreisberg, Jason F; Raphael, Benjamin J; Ideker, Trey

    2018-06-14

    A major ambition of artificial intelligence lies in translating patient data to successful therapies. Machine learning models face particular challenges in biomedicine, however, including handling of extreme data heterogeneity and lack of mechanistic insight into predictions. Here, we argue for "visible" approaches that guide model structure with experimental biology. Copyright © 2018. Published by Elsevier Inc.

  11. A Machine Learning Approach to Automated Gait Analysis for the Noldus Catwalk System.

    Science.gov (United States)

    Frohlich, Holger; Claes, Kasper; De Wolf, Catherine; Van Damme, Xavier; Michel, Anne

    2018-05-01

    Gait analysis of animal disease models can provide valuable insights into in vivo compound effects and thus help in preclinical drug development. The purpose of this paper is to establish a computational gait analysis approach for the Noldus Catwalk system, in which footprints are automatically captured and stored. We present a, to our knowledge, first machine learning based approach for the Catwalk system, which comprises a step decomposition, definition and extraction of meaningful features, multivariate step sequence alignment, feature selection, and training of different classifiers (gradient boosting machine, random forest, and elastic net). Using animal-wise leave-one-out cross-validation we demonstrate that with our method we can reliably separate movement patterns of a putative Parkinson's disease animal model and several control groups. Furthermore, we show that we can predict the time point after, and the type of, different brain lesions and can even forecast the brain region where the intervention was applied. We provide an in-depth analysis of the features involved in our classifiers via statistical techniques for model interpretation. A machine learning method for automated analysis of data from the Noldus Catwalk system was established. Our work shows the ability of machine learning to discriminate pharmacologically relevant animal groups based on their walking behavior in a multivariate manner. Further interesting aspects of the approach include the ability to learn from past experiments, to improve as more data arrive, and to make predictions for single animals in future studies.
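
    The animal-wise leave-one-out validation mentioned above can be expressed with scikit-learn's LeaveOneGroupOut, grouping per-run feature vectors by animal. The features below are random placeholders, not Catwalk gait descriptors.

```python
# Sketch: animal-wise leave-one-out cross-validation of a gait classifier.
# Features are random placeholders for per-run gait descriptors; groups identify animals.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(8)
n_runs, n_features, n_animals = 240, 20, 24

X = rng.normal(size=(n_runs, n_features))            # gait features per walking run
animals = np.repeat(np.arange(n_animals), n_runs // n_animals)
# One label per animal (e.g., lesion model vs. control), copied to all its runs.
animal_label = rng.integers(0, 2, n_animals)
y = animal_label[animals]
# Make the features weakly informative so the sketch is non-trivial.
X[:, 0] += 1.5 * y

logo = LeaveOneGroupOut()
scores = cross_val_score(GradientBoostingClassifier(random_state=0),
                         X, y, groups=animals, cv=logo)
print("animal-wise leave-one-out accuracy: %.3f" % scores.mean())
```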

  12. Comparative Analysis of Automatic Exudate Detection between Machine Learning and Traditional Approaches

    Science.gov (United States)

    Sopharak, Akara; Uyyanonvara, Bunyarit; Barman, Sarah; Williamson, Thomas

    To prevent blindness from diabetic retinopathy, periodic screening and early diagnosis are necessary. Due to the lack of expert ophthalmologists in rural areas, automated early exudate (one of the visible signs of diabetic retinopathy) detection could help to reduce the number of cases of blindness in diabetic patients. Traditional automatic exudate detection methods are based on specific parameter configurations, while the machine learning approaches, which seem more flexible, may be computationally costly. A comparative analysis of traditional and machine learning approaches to exudate detection, namely mathematical morphology, fuzzy c-means clustering, naive Bayesian classifier, Support Vector Machine and Nearest Neighbor classifier, is presented. Detected exudates are validated with expert ophthalmologists' hand-drawn ground-truths. The sensitivity, specificity, precision, accuracy and time complexity of each method are also compared.

  13. A fuzzy regression with support vector machine approach to the estimation of horizontal global solar radiation

    International Nuclear Information System (INIS)

    Baser, Furkan; Demirhan, Haydar

    2017-01-01

    Accurate estimation of the amount of horizontal global solar radiation for a particular field is an important input for decision processes in solar radiation investments. In this article, we focus on the estimation of yearly mean daily horizontal global solar radiation by using an approach that utilizes fuzzy regression functions with support vector machine (FRF-SVM). This approach is not seriously affected by outlier observations and does not suffer from the over-fitting problem. To demonstrate the utility of the FRF-SVM approach in the estimation of horizontal global solar radiation, we conduct an empirical study over a dataset collected in Turkey and applied the FRF-SVM approach with several kernel functions. Then, we compare the estimation accuracy of the FRF-SVM approach to an adaptive neuro-fuzzy system and a coplot supported-genetic programming approach. We observe that the FRF-SVM approach with a Gaussian kernel function is not affected by both outliers and over-fitting problem and gives the most accurate estimates of horizontal global solar radiation among the applied approaches. Consequently, the use of hybrid fuzzy functions and support vector machine approaches is found beneficial in long-term forecasting of horizontal global solar radiation over a region with complex climatic and terrestrial characteristics. - Highlights: • A fuzzy regression functions with support vector machines approach is proposed. • The approach is robust against outlier observations and over-fitting problem. • Estimation accuracy of the model is superior to several existent alternatives. • A new solar radiation estimation model is proposed for the region of Turkey. • The model is useful under complex terrestrial and climatic conditions.

  14. Machine Learning Based Classification of Microsatellite Variation: An Effective Approach for Phylogeographic Characterization of Olive Populations.

    Science.gov (United States)

    Torkzaban, Bahareh; Kayvanjoo, Amir Hossein; Ardalan, Arman; Mousavi, Soraya; Mariotti, Roberto; Baldoni, Luciana; Ebrahimie, Esmaeil; Ebrahimi, Mansour; Hosseini-Mazinani, Mehdi

    2015-01-01

    Finding efficient analytical techniques is overwhelmingly turning into a bottleneck for the effectiveness of large biological data. Machine learning offers a novel and powerful tool to advance classification and modeling solutions in molecular biology. However, these methods have been less frequently used with empirical population genetics data. In this study, we developed a new combined approach of data analysis using microsatellite marker data from our previous studies of olive populations using machine learning algorithms. Herein, 267 olive accessions of various origins including 21 reference cultivars, 132 local ecotypes, and 37 wild olive specimens from the Iranian plateau, together with 77 of the most represented Mediterranean varieties were investigated using a finely selected panel of 11 microsatellite markers. We organized data in two '4-targeted' and '16-targeted' experiments. A strategy of assaying different machine based analyses (i.e. data cleaning, feature selection, and machine learning classification) was devised to identify the most informative loci and the most diagnostic alleles to represent the population and the geography of each olive accession. These analyses revealed microsatellite markers with the highest differentiating capacity and proved efficiency for our method of clustering olive accessions to reflect upon their regions of origin. A distinguished highlight of this study was the discovery of the best combination of markers for better differentiating of populations via machine learning models, which can be exploited to distinguish among other biological populations.

  15. Translational research strategy: an essential approach to fight the spread of antimicrobial resistance.

    Science.gov (United States)

    Tacconelli, Evelina; Peschel, Andreas; Autenrieth, Ingo B

    2014-11-01

    Translation research strategy in infectious diseases, combining the results from basic research with patient-orientated research, aims to bridge the gap between laboratory findings and clinical infectious disease practice to improve disease management. In an era of increasing antimicrobial resistance, there are four main areas of clinical and scientific uncertainty that need to be urgently addressed by translational research: (i) early diagnosis of antibiotic-resistant infections and the appropriateness of empirical antibiotic therapy; (ii) the identification of reservoirs of antibiotic-resistant pathogens; (iii) the development of new antibiotics with lower propensities to evoke resistance; and (iv) the development of new non-antibiotic drugs to be used in the prevention of the spread of resistant bacterial strains. Strict European collaboration among major stakeholders is therefore essential. Appropriate educational tools to train a new generation of scientists with regard to a multifaceted approach to antimicrobial resistance research should be developed. Key areas include the support and implementation of European networks focused on translational research and related education activities, making potential therapeutics more attractive to investors and helping academic investigators to determine whether new molecules can be developed with clinical applicability. © The Author 2014. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. A comparison of the stochastic and machine learning approaches in hydrologic time series forecasting

    Science.gov (United States)

    Kim, T.; Joo, K.; Seo, J.; Heo, J. H.

    2016-12-01

    Hydrologic time series forecasting is an essential task in water resources management, and it becomes more difficult due to the complexity of the runoff process. Traditional stochastic models such as the ARIMA family have been used as a standard approach in time series modeling and forecasting of hydrological variables. Due to the nonlinearity in hydrologic time series data, machine learning approaches have also been studied for their ability to discover relevant features in nonlinear relations among variables. This study aims to compare the predictability of the traditional stochastic model and the machine learning approach. A seasonal ARIMA model was used as the traditional time series model, and a Random Forest model, an ensemble of decision trees built on multiple predictors, was applied as the machine learning approach. In the application, monthly inflow data from 1986 to 2015 of Chungju dam in South Korea were used for modeling and forecasting. In order to evaluate the performances of the models, both one-step-ahead and multi-step-ahead forecasting were applied. The root mean squared error and mean absolute error of the two models were compared.
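
    The comparison can be sketched roughly as below, assuming Python with statsmodels and scikit-learn; the synthetic monthly series stands in for the Chungju dam inflows, and the model orders are illustrative rather than those of the study.

        # Hedged sketch: seasonal ARIMA vs. a Random Forest built on lagged inflows.
        import numpy as np
        from statsmodels.tsa.statespace.sarimax import SARIMAX
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.metrics import mean_absolute_error, mean_squared_error

        rng = np.random.default_rng(1)
        n = 360                                   # 30 years of monthly values
        t = np.arange(n)
        y = 100 + 40 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 10, n)
        train, test = y[:-12], y[-12:]

        # Stochastic approach: SARIMA(1,0,1)(1,0,1,12), 12-month-ahead forecast.
        sarima = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12)).fit(disp=False)
        f_sarima = sarima.forecast(steps=12)

        # Machine learning approach: Random Forest on the previous 12 monthly lags.
        def lag_matrix(series, n_lags=12):
            X, Y = [], []
            for i in range(n_lags, len(series)):
                X.append(series[i - n_lags:i])
                Y.append(series[i])
            return np.array(X), np.array(Y)

        X_tr, Y_tr = lag_matrix(train)
        rf = RandomForestRegressor(n_estimators=300, random_state=1).fit(X_tr, Y_tr)
        history, f_rf = list(train), []
        for _ in range(12):                       # recursive multi-step forecast
            f_rf.append(rf.predict(np.array(history[-12:]).reshape(1, -1))[0])
            history.append(f_rf[-1])

        for name, f in [("SARIMA", np.asarray(f_sarima)), ("RandomForest", np.array(f_rf))]:
            rmse = np.sqrt(mean_squared_error(test, f))
            print(name, "RMSE %.2f MAE %.2f" % (rmse, mean_absolute_error(test, f)))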

  17. Classification of follicular lymphoma images: a holistic approach with symbol-based machine learning methods.

    Science.gov (United States)

    Zorman, Milan; Sánchez de la Rosa, José Luis; Dinevski, Dejan

    2011-12-01

    Symbol-based machine learning approaches are rarely used for image classification and recognition. In this paper we present such an approach, which we first applied to follicular lymphoma images. Lymphoma is a broad term encompassing a variety of cancers of the lymphatic system. Lymphoma is differentiated by the type of cell that multiplies and by how the cancer presents itself. It is very important to get an exact diagnosis of lymphoma and to determine the treatments that will be most effective for the patient's condition. Our work focused on the identification of lymphomas by finding follicles in microscopy images provided by the Laboratory of Pathology in the University Hospital of Tenerife, Spain. We divided our work into two stages: in the first stage we performed image pre-processing and feature extraction, and in the second stage we used different symbolic machine learning approaches for pixel classification. Symbolic machine learning approaches are often neglected when looking for image analysis tools; although they are known for a very appropriate knowledge representation, they are also claimed to lack computational power. The results we obtained are very promising and show that symbolic approaches can be successful in image analysis applications.
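
    A small illustration of the symbolic second stage, under the assumption that per-pixel features have already been extracted: a decision tree is trained and its learned rules are printed as readable IF-THEN conditions. The feature names and labels are invented for the sketch.

        # Sketch of a symbolic (rule-based) pixel classifier on pre-extracted features.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(2)
        feature_names = ["intensity", "hue", "local_texture", "dist_to_centroid"]
        X = rng.random((5000, len(feature_names)))        # stand-in per-pixel features
        y = (X[:, 0] + 0.5 * X[:, 2] > 0.9).astype(int)   # stand-in follicle/background label

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print(export_text(tree, feature_names=feature_names))  # symbolic IF-THEN rules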

  18. An Event-Triggered Machine Learning Approach for Accelerometer-Based Fall Detection.

    Science.gov (United States)

    Putra, I Putu Edy Suardiyana; Brusey, James; Gaura, Elena; Vesilo, Rein

    2017-12-22

    The fixed-size non-overlapping sliding window (FNSW) and fixed-size overlapping sliding window (FOSW) approaches are the most commonly used data-segmentation techniques in machine learning-based fall detection using accelerometer sensors. However, these techniques do not segment by fall stages (pre-impact, impact, and post-impact) and thus useful information is lost, which may reduce the detection rate of the classifier. Aligning the segment with the fall stage is difficult, as the segment size varies. We propose an event-triggered machine learning (EvenT-ML) approach that aligns each fall stage so that the characteristic features of the fall stages are more easily recognized. To evaluate our approach, two publicly accessible datasets were used. Classification and regression tree (CART), k-nearest neighbor (k-NN), logistic regression (LR), and the support vector machine (SVM) were used to train the classifiers. EvenT-ML gives classifier F-scores of 98% for a chest-worn sensor and 92% for a waist-worn sensor, and significantly reduces the computational cost compared with the FNSW- and FOSW-based approaches, with reductions of up to 8-fold and 78-fold, respectively. EvenT-ML achieves a significantly better F-score than existing fall detection approaches. These results indicate that aligning feature segments with fall stages significantly increases the detection rate and reduces the computational cost.
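
    The event-triggered idea can be sketched as follows: the impact is located as the peak of the acceleration magnitude, and pre-impact, impact, and post-impact windows are cut around it before computing per-stage features. The sampling rate and window lengths are assumptions for illustration, not the values used in the paper.

        # Minimal sketch of event-triggered segmentation of an accelerometer trace.
        import numpy as np

        FS = 50                                  # sampling rate in Hz (assumed)
        rng = np.random.default_rng(3)
        acc = rng.normal(0, 0.2, size=(FS * 10, 3))
        acc[FS * 5] += (0.0, 0.0, 6.0)           # inject a synthetic impact spike

        mag = np.linalg.norm(acc, axis=1)
        impact = int(np.argmax(mag))             # event trigger: largest magnitude

        segments = {
            "pre_impact":  acc[max(0, impact - 2 * FS): impact],
            "impact":      acc[impact: impact + FS // 2],
            "post_impact": acc[impact + FS // 2: impact + 3 * FS],
        }
        features = {name: [seg.mean(), seg.std(), np.abs(seg).max()]
                    for name, seg in segments.items()}
        print(features)                           # would feed a CART/k-NN/LR/SVM classifier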

  19. A cloud-based data network approach for translational cancer research.

    Science.gov (United States)

    Xing, Wei; Tsoumakos, Dimitrios; Ghanem, Moustafa

    2015-01-01

    We develop a new model and associated technology for constructing and managing self-organizing data to support translational cancer research studies. We employ a semantic content network approach to address the challenges of managing cancer research data. Such data is heterogeneous, large, decentralized, growing and continually being updated. Moreover, the data originates from different information sources that may be partially overlapping, creating redundancies as well as contradictions and inconsistencies. Building on the advantages of elasticity of cloud computing, we deploy the cancer data networks on top of the CELAR Cloud platform to enable more effective processing and analysis of Big cancer data.

  20. Machine learning approach for the outcome prediction of temporal lobe epilepsy surgery.

    Directory of Open Access Journals (Sweden)

    Rubén Armañanzas

    Full Text Available Epilepsy surgery is effective in reducing both the number and frequency of seizures, particularly in temporal lobe epilepsy (TLE). Nevertheless, a significant proportion of these patients continue suffering seizures after surgery. Here we used a machine learning approach to predict the outcome of epilepsy surgery based on supervised classification data mining taking into account not only the common clinical variables, but also pathological and neuropsychological evaluations. We have generated models capable of predicting whether a patient with TLE secondary to hippocampal sclerosis will fully recover from epilepsy or not. The machine learning analysis revealed that outcome could be predicted with an estimated accuracy of almost 90% using some clinical and neuropsychological features. Importantly, not all the features were needed to perform the prediction; some of them proved to be irrelevant to the prognosis. Personality style was found to be one of the key features to predict the outcome. Although we examined relatively few cases, findings were verified across all data, showing that the machine learning approach described in the present study may be a powerful method. Since neuropsychological assessment of epileptic patients is a standard protocol in the pre-surgical evaluation, we propose to include these specific psychological tests and machine learning tools to improve the selection of candidates for epilepsy surgery.
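
    A hedged sketch of this kind of workflow, not the authors' exact pipeline: univariate feature selection over clinical and neuropsychological variables followed by a supervised classifier, scored with cross-validation on synthetic stand-in data.

        # Sketch: feature selection + logistic regression for surgery-outcome prediction.
        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(4)
        X = rng.normal(size=(80, 20))            # stand-in patients x features
        y = (X[:, 0] + X[:, 3] > 0).astype(int)  # stand-in seizure-free outcome

        model = make_pipeline(StandardScaler(),
                              SelectKBest(f_classif, k=5),   # keep only relevant features
                              LogisticRegression(max_iter=1000))
        print("CV accuracy: %.2f" % cross_val_score(model, X, y, cv=5).mean())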

  1. Understanding factors associated with the translation of cardiovascular research: a multinational case study approach

    Science.gov (United States)

    2014-01-01

    Background: Funders of health research increasingly seek to understand how best to allocate resources in order to achieve maximum value from their funding. We built an international consortium and developed a multinational case study approach to assess benefits arising from health research. We used that to facilitate analysis of factors in the production of research that might be associated with translating research findings into wider impacts, and the complexities involved. Methods: We built on the Payback Framework and expanded its application through conducting co-ordinated case studies on the payback from cardiovascular and stroke research in Australia, Canada and the United Kingdom. We selected a stratified random sample of projects from leading medical research funders. We devised a series of innovative steps to: minimize the effect of researcher bias; rate the level of impacts identified in the case studies; and interrogate case study narratives to identify factors that correlated with achieving high or low levels of impact. Results: Twenty-nine detailed case studies produced many and diverse impacts. Over the 15 to 20 years examined, basic biomedical research had a greater impact than clinical research in terms of academic impacts such as knowledge production and research capacity building. Clinical research had greater levels of wider impact on health policies, practice, and generating health gains. There was no correlation between knowledge production and wider impacts. We identified various factors associated with high impact. Interaction between researchers and practitioners and the public is associated with achieving high academic impact and translation into wider impacts, as is basic research conducted with a clinical focus. Strategic thinking by clinical researchers, in terms of thinking through pathways by which research could potentially be translated into practice, is associated with high wider impact. Finally, we identified the complexity of

  2. A Hierarchical Approach Using Machine Learning Methods in Solar Photovoltaic Energy Production Forecasting

    OpenAIRE

    Zhaoxuan Li; SM Mahbobur Rahman; Rolando Vega; Bing Dong

    2016-01-01

    We evaluate and compare two common methods, artificial neural networks (ANN) and support vector regression (SVR), for predicting energy production from a solar photovoltaic (PV) system in Florida 15 min, 1 h and 24 h ahead of time. A hierarchical approach is proposed based on the machine learning algorithms tested. The production data used in this work correspond to 15-min averaged power measurements collected from 2014. The accuracy of the model is determined using computing error statisti...

  3. A Review of Machine Learning and Data Mining Approaches for Business Applications in Social Networks

    OpenAIRE

    Evis Trandafili; Marenglen Biba

    2013-01-01

    Social networks have an outstanding marketing value and developing data mining methods for viral marketing is a hot topic in the research community. However, most social networks remain impossible to analyze fully because of their prohibitive size and the inability of traditional machine learning and data mining approaches to deal with the new dimension in the learning process introduced by the large-scale environment in which the data are produced. On one hand, the birth and evolution...

  4. Machine learning approaches to analysing textual injury surveillance data: a systematic review.

    Science.gov (United States)

    Vallmuur, Kirsten

    2015-06-01

    To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Systematic review. The electronic databases searched included PubMed, Cinahl, Medline, Google Scholar, and Proquest. The bibliography of all relevant articles was examined and associated articles were identified using a snowballing technique. For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, and (c) used machine learning approaches to analyse textual data. The papers identified through the search were screened, resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strengths and limitations of different techniques, and the quality assurance approaches used. Due to heterogeneity between studies, meta-analysis was not performed. Occupational injuries were the focus of half of the machine learning studies, and the most common methods described were Bayesian probability or Bayesian network based methods used either to predict injury categories or to extract common injury scenarios. Models were evaluated through comparison with gold standard data, content expert evaluation, or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories, and was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalizability, source data quality, complexity of models and integration of content and technical knowledge were discussed. The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years. With advances in data mining techniques, increased capacity for analysis of large databases, and involvement of computer scientists in the injury prevention field, along with more comprehensive use and description of quality
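
    The most common setup described in the review, a Bayesian classifier over short injury narratives, can be illustrated as below; the narratives and category labels are invented examples.

        # Sketch: multinomial Naive Bayes assigning injury categories to narrative text.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        narratives = [
            "worker slipped on wet floor and fractured wrist",
            "fell from ladder while changing light fixture",
            "hand caught in conveyor belt during cleaning",
            "finger crushed between press plates",
        ]
        labels = ["fall", "fall", "machinery", "machinery"]

        model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
        model.fit(narratives, labels)
        print(model.predict(["operator fell down stairs carrying boxes"]))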

  5. A new approach for translating strategic healthcare objectives into operational indicators

    DEFF Research Database (Denmark)

    Traberg, Andreas; Jacobsen, Peter

    2009-01-01

    The purpose of this paper is to propose a new performance measurement approach which enables healthcare managers to design a performance management system tailored for their individual settings. The model is based on the strategic goal of the individual health care facility. It has been developed...... level, a detailed and well-defined performance measurement structure is connected to the overall strategic plan. The increasing complexity in modern healthcare requires new improved performance management systems for healthcare institutions (Landrum & Baker 2004). The process of translating strategic......). To be able to coordinate and manage these different requirements, a performance management system, encompassing performance indicators from all three stakeholder groups, is needed. Our approach was derived using the action research methodology (Coughlan & Coghlan 2002). The work is based on a two year...

  6. A Cognitive Approach to the Compilation of Test Materials for the Evaluation of Translator's Skills

    Directory of Open Access Journals (Sweden)

    Elena Berg

    2016-12-01

    Full Text Available A Cognitive Approach to the Compilation of Test Materials for the Evaluation of Translator's Skills This paper discusses the importance of a cognitive approach to the evaluation of translator’s skills. The authors set forth their recommendations for the compilation of test materials for the evaluation of translators’ cognitive ability.   A cognitive approach to the compilation of texts for assessing a translator's skills: the article addresses the importance of a cognitive approach to the evaluation of a translator's skills, and the authors present their recommendations for compiling test materials for the evaluation of a translator's cognitive abilities.

  7. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    Directory of Open Access Journals (Sweden)

    Huu-Tho Nguyen

    Full Text Available Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated by a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool that provides effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
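
    A crisp, simplified sketch of the two building blocks follows (the paper uses fuzzy linguistic variants of both): AHP weights are derived from a pairwise comparison matrix via its principal eigenvector, and COPRAS then ranks the machine-tool alternatives. All judgements and scores are illustrative.

        # Crisp AHP weights + COPRAS ranking of machine-tool alternatives (illustrative).
        import numpy as np

        # Pairwise comparison of 3 criteria (invented judgements).
        P = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        eigvals, eigvecs = np.linalg.eig(P)
        w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        w = w / w.sum()                                  # AHP criteria weights

        # Decision matrix: rows = machine tools, columns = criteria.
        X = np.array([[70.0, 8.0, 120.0],
                      [85.0, 6.0, 150.0],
                      [60.0, 9.0, 100.0]])
        benefit = np.array([True, True, False])          # last criterion is a cost (price)

        D = w * (X / X.sum(axis=0))                      # weighted normalised matrix
        S_plus = D[:, benefit].sum(axis=1)               # benefit-criteria sums
        S_minus = D[:, ~benefit].sum(axis=1)             # cost-criteria sums
        Q = S_plus + S_minus.sum() / (S_minus * (1.0 / S_minus).sum())
        utility = 100 * Q / Q.max()
        print("ranking (best first):", np.argsort(-utility))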

  8. Exploring prediction uncertainty of spatial data in geostatistical and machine learning Approaches

    Science.gov (United States)

    Klump, J. F.; Fouedjio, F.

    2017-12-01

    Geostatistical methods such as kriging with external drift as well as machine learning techniques such as quantile regression forest have been intensively used for modelling spatial data. In addition to providing predictions for target variables, both approaches are able to deliver a quantification of the uncertainty associated with the prediction at a target location. Geostatistical approaches are, by essence, adequate for providing such prediction uncertainties and their behaviour is well understood. However, they often require significant data pre-processing and rely on assumptions that are rarely met in practice. Machine learning algorithms such as random forest regression, on the other hand, require less data pre-processing and are non-parametric. This makes the application of machine learning algorithms to geostatistical problems an attractive proposition. The objective of this study is to compare kriging with external drift and quantile regression forest with respect to their ability to deliver reliable prediction uncertainties of spatial data. In our comparison we use both simulated and real world datasets. Apart from classical performance indicators, comparisons make use of accuracy plots, probability interval width plots, and the visual examinations of the uncertainty maps provided by the two approaches. By comparing random forest regression to kriging we found that both methods produced comparable maps of estimated values for our variables of interest. However, the measure of uncertainty provided by random forest seems to be quite different to the measure of uncertainty provided by kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. These preliminary results raise questions about assessing the risks associated with decisions based on the predictions from geostatistical and machine learning algorithms in a spatial context, e.g. mineral exploration.
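
    The machine learning side of such a comparison can be approximated as below, where the spread of per-tree random forest predictions is used as a rough prediction interval (a simplification of quantile regression forests, not the study's actual code); the locations, drift variable, and target are synthetic.

        # Approximate prediction intervals from the spread of per-tree forest predictions.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(5)
        coords = rng.uniform(0, 100, size=(500, 2))          # stand-in spatial locations
        covariate = rng.normal(size=500)                     # stand-in external drift
        target = 0.05 * coords[:, 0] + 2 * covariate + rng.normal(0, 0.5, 500)

        X = np.column_stack([coords, covariate])
        rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, target)

        X_new = np.array([[50.0, 50.0, 0.3]])
        per_tree = np.array([t.predict(X_new)[0] for t in rf.estimators_])
        lo, hi = np.percentile(per_tree, [5, 95])
        print("prediction %.2f, 90%% interval [%.2f, %.2f]" % (per_tree.mean(), lo, hi))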

  9. Machine learning approaches to the social determinants of health in the health and retirement study.

    Science.gov (United States)

    Seligman, Benjamin; Tuljapurkar, Shripad; Rehkopf, David

    2018-04-01

    Social and economic factors are important predictors of health and of recognized importance for health systems. However, machine learning, used elsewhere in the biomedical literature, has not been extensively applied to study relationships between society and health. We investigate how machine learning may add to our understanding of social determinants of health using data from the Health and Retirement Study. A linear regression of age and gender, and a parsimonious theory-based regression additionally incorporating income, wealth, and education, were used to predict systolic blood pressure, body mass index, waist circumference, and telomere length. Prediction, fit, and interpretability were compared across four machine learning methods: linear regression, penalized regressions, random forests, and neural networks. All models had poor out-of-sample prediction. Most machine learning models performed similarly to the simpler models. However, neural networks greatly outperformed the three other methods. Neural networks also had good fit to the data (R2 between 0.4 and 0.6, versus substantially lower values for the other models). Across the machine learning models, nine variables were frequently selected or highly weighted as predictors: dental visits, current smoking, self-rated health, serial-seven subtractions, probability of receiving an inheritance, probability of leaving an inheritance of at least $10,000, number of children ever born, African-American race, and gender. Some of the machine learning methods do not improve prediction or fit beyond simpler models; however, neural networks performed well. The predictors identified across models suggest underlying social factors that are important predictors of biological indicators of chronic disease, and that the non-linear and interactive relationships between variables fundamental to the neural network approach may be important to consider.

  10. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    Science.gov (United States)

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie
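
    The keyword-boosting idea can be sketched with a plain BM25 scorer in which selected query terms receive larger weights; the documents, query, and boost values below are invented, and the parameters k1 and b are the usual defaults rather than values from the challenge.

        # BM25 scoring with per-term boosts for important keywords of a verbose query.
        import math
        from collections import Counter

        docs = ["rna seq expression dataset human liver",
                "proteomics mass spectrometry mouse brain dataset",
                "microarray gene expression breast cancer cohort"]
        tokenised = [d.split() for d in docs]
        N = len(docs)
        avgdl = sum(len(d) for d in tokenised) / N
        df = Counter(t for d in tokenised for t in set(d))
        k1, b = 1.5, 0.75

        def bm25(query_weights, doc):
            tf = Counter(doc)
            score = 0.0
            for term, boost in query_weights.items():
                if term not in tf:
                    continue
                idf = math.log(1 + (N - df[term] + 0.5) / (df[term] + 0.5))
                score += boost * idf * tf[term] * (k1 + 1) / (
                    tf[term] + k1 * (1 - b + b * len(doc) / avgdl))
            return score

        query = {"gene": 1.0, "expression": 2.0, "human": 2.0, "dataset": 0.5}  # boosted keywords
        ranked = sorted(range(N), key=lambda i: bm25(query, tokenised[i]), reverse=True)
        print("ranking:", ranked)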

  11. Identifying Green Infrastructure from Social Media and Crowdsourcing- An Image Based Machine-Learning Approach.

    Science.gov (United States)

    Rai, A.; Minsker, B. S.

    2016-12-01

    In this work we introduce a novel dataset, GRID (GReen Infrastructure Detection Dataset), and a framework for identifying urban green storm water infrastructure (GI) designs (wetlands/ponds, urban trees, and rain gardens/bioswales) from social media and satellite aerial images using computer vision and machine learning methods. Along with the hydrologic benefits of GI, such as reducing runoff volumes and urban heat islands, GI also provides important socio-economic benefits such as stress recovery and community cohesion. However, GI is installed by many different parties and cities typically do not know where GI is located, making study of its impacts or siting new GI difficult. We use object recognition learning methods (template matching, sliding window approach, and Random Hough Forest method) and supervised machine learning algorithms (e.g., support vector machines) as initial screening approaches to detect potential GI sites, which can then be investigated in more detail using on-site surveys. Training data were collected from GPS locations of Flickr and Instagram image postings and Amazon Mechanical Turk identification of each GI type. The sliding window method outperformed the other methods and achieved an average F-measure (a combined metric of precision and recall) of 0.78.

  12. Application of Machine Learning Approaches for Classifying Sitting Posture Based on Force and Acceleration Sensors

    Directory of Open Access Journals (Sweden)

    Roland Zemp

    2016-01-01

    Full Text Available Occupational musculoskeletal disorders, particularly chronic low back pain (LBP), are ubiquitous due to prolonged static sitting or nonergonomic sitting positions. Therefore, the aim of this study was to develop an instrumented chair with force and acceleration sensors to determine the accuracy of automatically identifying the user’s sitting position by applying five different machine learning methods (Support Vector Machines, Multinomial Regression, Boosting, Neural Networks, and Random Forest). Forty-one subjects were requested to sit four times in seven different prescribed sitting positions (total 1148 samples). Sixteen force sensor values and the backrest angle were used as the explanatory variables (features) for the classification. The different classification methods were compared by means of a Leave-One-Out cross-validation approach. The best performance was achieved using the Random Forest classification algorithm, producing a mean classification accuracy of 90.9% for subjects with which the algorithm was not familiar. The classification accuracy varied between 81% and 98% for the seven different sitting positions. The present study showed the possibility of accurately classifying different sitting positions by means of the introduced instrumented office chair combined with machine learning analyses. The use of such novel approaches for the accurate assessment of chair usage could offer insights into the relationships between sitting position, sitting behaviour, and the occurrence of musculoskeletal disorders.

  13. Application of Machine Learning Approaches for Classifying Sitting Posture Based on Force and Acceleration Sensors.

    Science.gov (United States)

    Zemp, Roland; Tanadini, Matteo; Plüss, Stefan; Schnüriger, Karin; Singh, Navrag B; Taylor, William R; Lorenzetti, Silvio

    2016-01-01

    Occupational musculoskeletal disorders, particularly chronic low back pain (LBP), are ubiquitous due to prolonged static sitting or nonergonomic sitting positions. Therefore, the aim of this study was to develop an instrumented chair with force and acceleration sensors to determine the accuracy of automatically identifying the user's sitting position by applying five different machine learning methods (Support Vector Machines, Multinomial Regression, Boosting, Neural Networks, and Random Forest). Forty-one subjects were requested to sit four times in seven different prescribed sitting positions (total 1148 samples). Sixteen force sensor values and the backrest angle were used as the explanatory variables (features) for the classification. The different classification methods were compared by means of a Leave-One-Out cross-validation approach. The best performance was achieved using the Random Forest classification algorithm, producing a mean classification accuracy of 90.9% for subjects with which the algorithm was not familiar. The classification accuracy varied between 81% and 98% for the seven different sitting positions. The present study showed the possibility of accurately classifying different sitting positions by means of the introduced instrumented office chair combined with machine learning analyses. The use of such novel approaches for the accurate assessment of chair usage could offer insights into the relationships between sitting position, sitting behaviour, and the occurrence of musculoskeletal disorders.
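
    A compact sketch of the evaluation strategy described above, assuming scikit-learn: a Random Forest scored with leave-one-subject-out cross-validation, so that accuracy reflects subjects the model has never seen. The sensor values and labels are synthetic, so the printed accuracy is near chance and only the mechanics matter.

        # Random Forest with leave-one-subject-out cross-validation.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

        rng = np.random.default_rng(6)
        n_subjects, reps, n_features = 41, 28, 17        # 16 force sensors + backrest angle
        X = rng.normal(size=(n_subjects * reps, n_features))
        y = rng.integers(0, 7, size=n_subjects * reps)   # 7 sitting positions
        groups = np.repeat(np.arange(n_subjects), reps)  # one group per subject

        clf = RandomForestClassifier(n_estimators=300, random_state=0)
        scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
        print("mean accuracy on unseen subjects: %.3f" % scores.mean())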

  14. The stage of change approach for implementing ergonomics advice - Translating research into practice.

    Science.gov (United States)

    Rothmore, Paul; Aylward, Paul; Oakman, Jodi; Tappin, David; Gray, Jodi; Karnon, Jonathan

    2017-03-01

    The Stage of Change (SOC) approach has been proposed as a method to improve the implementation of ergonomics advice. However, despite evidence for its efficacy there is little evidence to suggest it has been adopted by ergonomics consultants. This paper investigates barriers and facilitators to the implementation, monitoring and effectiveness of ergonomics advice and the adoption of the SOC approach in a series of focus groups and a subsequent survey of members of the Human Factors Societies of Australia and New Zealand. A proposed SOC assessment tool developed for use by ergonomics practitioners is presented. Findings from this study suggest that the limited application of a SOC-based approach to work-related musculoskeletal injury prevention by ergonomics practitioners is due to the absence of a suitable tool in the ergonomists' repertoire, the need for training in this approach, and their limited access to relevant research findings. The final translation of the SOC assessment tool into professional ergonomics practice will require accessible demonstration of its real-world usability to practitioners and the training of ergonomics practitioners in its application.

  15. Big Data Meets Quantum Chemistry Approximations: The Δ-Machine Learning Approach.

    Science.gov (United States)

    Ramakrishnan, Raghunathan; Dral, Pavlo O; Rupp, Matthias; von Lilienfeld, O Anatole

    2015-05-12

    Chemically accurate and comprehensive studies of the virtual space of all possible molecules are severely limited by the computational cost of quantum chemistry. We introduce a composite strategy that adds machine learning corrections to computationally inexpensive approximate legacy quantum methods. After training, highly accurate predictions of enthalpies, free energies, entropies, and electron correlation energies are possible, for significantly larger molecular sets than used for training. For thermochemical properties of up to 16k isomers of C7H10O2 we present numerical evidence that chemical accuracy can be reached. We also predict electron correlation energy in post Hartree-Fock methods, at the computational cost of Hartree-Fock, and we establish a qualitative relationship between molecular entropy and electron correlation. The transferability of our approach is demonstrated, using semiempirical quantum chemistry and machine learning models trained on 1 and 10% of 134k organic molecules, to reproduce enthalpies of all remaining molecules at density functional theory level of accuracy.
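
    The Δ-learning loop itself is simple and can be sketched as below: a regressor is trained on the difference between an expensive reference property and a cheap approximate one, and new cheap calculations are then corrected with the predicted Δ. The descriptors and energies here are synthetic stand-ins for the quantum-chemistry quantities.

        # Schematic Δ-machine learning: learn reference-minus-cheap corrections.
        import numpy as np
        from sklearn.kernel_ridge import KernelRidge

        rng = np.random.default_rng(7)
        X = rng.normal(size=(2000, 30))                    # stand-in molecular descriptors
        e_cheap = X[:, 0] + 0.1 * X[:, 1]                  # "semiempirical" energies
        e_ref = e_cheap + 0.05 * np.sin(X[:, 2]) + 0.02 * X[:, 3] ** 2   # "reference" energies

        train = slice(0, 200)                              # train on a small subset only
        delta_model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.1)
        delta_model.fit(X[train], (e_ref - e_cheap)[train])

        e_pred = e_cheap + delta_model.predict(X)          # corrected cheap predictions
        mae = np.abs(e_pred[200:] - e_ref[200:]).mean()
        print("MAE of delta-corrected predictions: %.4f" % mae)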

  16. Two Approaches for the Management of Virtual Machines on Grid Infrastructures

    International Nuclear Information System (INIS)

    Tapiador, D.; Rubio-Montero, A. J.; Juedo, E.; Montero, R. S.; Llorente, I. M.

    2007-01-01

    Virtual machines are a promising technology to overcome some of the problems found in current Grid infrastructures, like heterogeneity, performance partitioning or application isolation. This work shows a comparison between two strategies to manage virtual machines in Globus Grids. The first alternative is a straightforward deployment that does not require additional middleware to be installed. It is only based on standard Grid services and is not bound to a given virtualization technology. Although this option is fully functional, it is only suitable for single process batch jobs. The second solution makes use of the Virtual Workspace Service which allows a remote client to securely negotiate and manage a virtual resource. This approach better exploits the potential benefits offered by the virtualization technology and provides a wider application range. (Author)

  17. Limitations Of The Current State Space Modelling Approach In Multistage Machining Processes Due To Operation Variations

    Science.gov (United States)

    Abellán-Nebot, J. V.; Liu, J.; Romero, F.

    2009-11-01

    The State Space modelling approach has been recently proposed as an engineering-driven technique for part quality prediction in Multistage Machining Processes (MMP). Current State Space models incorporate fixture and datum variations in the multi-stage variation propagation, without explicitly considering common operation variations such as machine-tool thermal distortions, cutting-tool wear, cutting-tool deflections, etc. This paper shows the limitations of the current State Space model through an experimental case study where the effects of spindle thermal expansion, cutting-tool flank wear and locator errors are introduced. The paper also discusses the extension of the current State Space model to include operation variations and its potential benefits.

  18. Translation and genetic criticism : genetic and editorial approaches to the 'untranslatable' in Joyce and Beckett

    OpenAIRE

    Hulle, Van, Dirk

    2015-01-01

    Genetics of translation may suggest a unidirectional link between two fields of research (genetic criticism applied to translation), but there are many ways in which translation and genetic criticism interact. This article's research hypothesis is that an exchange of ideas between translation studies and genetic criticism can be mutually beneficial in more than one way. The main function of this exchange is to enhance a form of textual awareness, and to realize this enhanced textual...

  19. Stakeholder Meeting: Integrated Knowledge Translation Approach to Address the Caregiver Support Gap.

    Science.gov (United States)

    Holroyd-Leduc, Jayna M; McMillan, Jacqueline; Jette, Nathalie; Brémault-Phillips, Suzette C; Duggleby, Wendy; Hanson, Heather M; Parmar, Jasneet

    2017-03-01

    Family caregivers are an integral and increasingly overburdened part of the health care system. There is a gap between what research evidence shows is beneficial to caregivers and what is actually provided. Using an integrated knowledge translation approach, a stakeholder meeting was held among researchers, family caregivers, caregiver associations, clinicians, health care administrators, and policy makers. The objectives of the meeting were to review current research evidence and conduct multi-stakeholder dialogue on the potential gaps, facilitators, and barriers to the provision of caregiver supports. A two-day meeting was attended by 123 individuals. Three target populations of family caregivers were identified for discussion: caregivers of seniors with dementia, caregivers in end-of-life care, and caregivers of frail seniors with complex health needs. The results of this meeting can and are being used to inform the development of implementation research endeavours and policies targeted at providing evidence-informed caregiver supports.

  20. A Cultural Approach to English Translating Strategies of Chinese Cuisine names

    Institute of Scientific and Technical Information of China (English)

    张昆鹏; 魏天婵

    2011-01-01

    Chinese food is not only characterized by its special cooking methods but also by its cultural implications. However, the status quo of the English translation of Chinese dish names is not satisfying. For the purpose of spreading Chinese cuisine culture, 4 translating principles and several translating methods are put forward in order to promote exchange between cultures.

  1. Designing System Reforms: Using a Systems Approach to Translate Incident Analyses into Prevention Strategies

    Science.gov (United States)

    Goode, Natassia; Read, Gemma J. M.; van Mulken, Michelle R. H.; Clacy, Amanda; Salmon, Paul M.

    2016-01-01

    Advocates of systems thinking approaches argue that accident prevention strategies should focus on reforming the system rather than on fixing the “broken components.” However, little guidance exists on how organizations can translate incident data into prevention strategies that address the systemic causes of accidents. This article describes and evaluates a series of systems thinking prevention strategies that were designed in response to the analysis of multiple incidents. The study was undertaken in the led outdoor activity (LOA) sector in Australia, which delivers supervised or instructed outdoor activities such as canyoning, sea kayaking, rock climbing and camping. The design process involved workshops with practitioners, and focussed on incident data analyzed using Rasmussen's AcciMap technique. A series of reflection points based on the systemic causes of accidents was used to guide the design process, and the AcciMap technique was used to represent the prevention strategies and the relationships between them, leading to the creation of PreventiMaps. An evaluation of the PreventiMaps revealed that all of them incorporated the core principles of the systems thinking approach and many proposed prevention strategies for improving vertical integration across the LOA system. However, the majority failed to address the migration of work practices and the erosion of risk controls. Overall, the findings suggest that the design process was partially successful in helping practitioners to translate incident data into prevention strategies that addressed the systemic causes of accidents; refinement of the design process is required to focus practitioners more on designing monitoring and feedback mechanisms to support decisions at the higher levels of the system. PMID:28066296

  2. Translating India

    CERN Document Server

    Kothari, Rita

    2014-01-01

    The cultural universe of urban, English-speaking middle class in India shows signs of growing inclusiveness as far as English is concerned. This phenomenon manifests itself in increasing forms of bilingualism (combination of English and one Indian language) in everyday forms of speech - advertisement jingles, bilingual movies, signboards, and of course conversations. It is also evident in the startling prominence of Indian Writing in English and somewhat less visibly, but steadily rising, activity of English translation from Indian languages. Since the eighties this has led to a frenetic activity around English translation in India's academic and literary circles. Kothari makes this very current phenomenon her chief concern in Translating India.   The study covers aspects such as the production, reception and marketability of English translation. Through an unusually multi-disciplinary approach, this study situates English translation in India amidst local and global debates on translation, representation an...

  3. Two Machine Learning Approaches for Short-Term Wind Speed Time-Series Prediction.

    Science.gov (United States)

    Ak, Ronay; Fink, Olga; Zio, Enrico

    2016-08-01

    The increasing liberalization of European electricity markets, the growing proportion of intermittent renewable energy being fed into the energy grids, and also new challenges in the patterns of energy consumption (such as electric mobility) require flexible and intelligent power grids capable of providing efficient, reliable, economical, and sustainable energy production and distribution. From the supplier side, particularly, the integration of renewable energy sources (e.g., wind and solar) into the grid imposes an engineering and economic challenge because of the limited ability to control and dispatch these energy sources due to their intermittent characteristics. Time-series prediction of wind speed for wind power production is a particularly important and challenging task, wherein prediction intervals (PIs) are preferable results of the prediction, rather than point estimates, because they provide information on the confidence in the prediction. In this paper, two different machine learning approaches to assess PIs of time-series predictions are considered and compared: 1) multilayer perceptron neural networks trained with a multiobjective genetic algorithm and 2) extreme learning machines combined with the nearest neighbors approach. The proposed approaches are applied for short-term wind speed prediction from a real data set of hourly wind speed measurements for the region of Regina in Saskatchewan, Canada. Both approaches demonstrate good prediction precision and provide complementary advantages with respect to different evaluation criteria.
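
    Producing prediction intervals rather than point forecasts can be illustrated with quantile regression, used here only as a stand-in for the paper's two methods (multiobjective-GA-trained neural networks and ELM with nearest neighbours); the hourly wind-speed series is synthetic.

        # Prediction intervals for wind speed via quantile gradient boosting.
        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(8)
        n = 2000
        speed = 8 + 3 * np.sin(np.arange(n) / 24) + rng.gamma(2.0, 1.0, n)
        X = np.column_stack([speed[i:n - 24 + i] for i in range(24)])   # last 24 hours as features
        y = speed[24:]

        split = 1500
        lower = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X[:split], y[:split])
        upper = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X[:split], y[:split])

        lo, hi = lower.predict(X[split:]), upper.predict(X[split:])
        coverage = np.mean((y[split:] >= lo) & (y[split:] <= hi))
        width = np.mean(hi - lo)
        print("PI coverage %.2f, mean width %.2f m/s" % (coverage, width))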

  4. A machine learning approach to galaxy-LSS classification - I. Imprints on halo merger trees

    Science.gov (United States)

    Hui, Jianan; Aragon, Miguel; Cui, Xinping; Flegal, James M.

    2018-04-01

    The cosmic web plays a major role in the formation and evolution of galaxies and defines, to a large extent, their properties. However, the relation between galaxies and environment is still not well understood. Here, we present a machine learning approach to study imprints of environmental effects on the mass assembly of haloes. We present a galaxy-LSS machine learning classifier based on galaxy properties sensitive to the environment. We then use the classifier to assess the relevance of each property. Correlations between galaxy properties and their cosmic environment can be used to predict galaxy membership to void/wall or filament/cluster with an accuracy of 93 per cent. Our study unveils environmental information encoded in properties of haloes not normally considered directly dependent on the cosmic environment such as merger history and complexity. Understanding the physical mechanism by which the cosmic web is imprinted in a halo can lead to significant improvements in galaxy formation models. This is accomplished by extracting features from galaxy properties and merger trees, computing feature scores for each feature and then applying support vector machine (SVM) to different feature sets. To this end, we have discovered that the shape and depth of the merger tree, formation time, and density of the galaxy are strongly associated with the cosmic environment. We describe a significant improvement in the original classification algorithm by performing LU decomposition of the distance matrix computed by the feature vectors and then using the output of the decomposition as input vectors for SVM.

  5. Classification of Breast Cancer Resistant Protein (BCRP) Inhibitors and Non-Inhibitors Using Machine Learning Approaches.

    Science.gov (United States)

    Belekar, Vilas; Lingineni, Karthik; Garg, Prabha

    2015-01-01

    The breast cancer resistant protein (BCRP) is an important transporter and its inhibitors play an important role in cancer treatment by improving the oral bioavailability as well as the blood brain barrier (BBB) permeability of anticancer drugs. In this work, a computational model was developed to predict whether compounds are BCRP inhibitors or non-inhibitors. Various machine learning approaches, such as support vector machine (SVM), k-nearest neighbor (k-NN) and artificial neural network (ANN), were used to develop the models. The Matthews correlation coefficients (MCC) of the models developed using ANN, k-NN and SVM are 0.67, 0.71 and 0.77, and the prediction accuracies are 85.2%, 88.3% and 90.8%, respectively. The developed models were tested with a test set of 99 compounds and further validated with an external set of 98 compounds. Distribution plot analysis was performed and various machine learning models were also developed based on drug-likeness descriptors. The applicability domain was used to check the prediction reliability of the new molecules.
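
    A sketch of the general modelling recipe, assuming RDKit is available as the descriptor software (the paper does not specify this tool) and using invented SMILES strings and labels:

        # Morgan fingerprints + SVM, evaluated with the Matthews correlation coefficient.
        import numpy as np
        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem
        from sklearn.metrics import matthews_corrcoef
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC", "C1CCCCC1", "c1ccncc1"]
        labels = [0, 1, 1, 0, 0, 1]              # 1 = inhibitor, 0 = non-inhibitor (invented)

        def morgan_bits(smi, n_bits=1024):
            mol = Chem.MolFromSmiles(smi)
            fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits)
            arr = np.zeros((n_bits,), dtype=np.int8)
            DataStructs.ConvertToNumpyArray(fp, arr)
            return arr

        X = np.array([morgan_bits(s) for s in smiles])
        X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.33, random_state=0)
        clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
        print("MCC on held-out set:", matthews_corrcoef(y_te, clf.predict(X_te)))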

  6. A Function-Behavior-State Approach to Designing Human Machine Interface for Nuclear Power Plant Operators

    Science.gov (United States)

    Lin, Y.; Zhang, W. J.

    2005-02-01

    This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine information content that needs to be displayed. The design methodology for this step is called the interface design framework (called framework ). Several frameworks have been proposed for applications at varying levels, including process plants. However, none is based on the design and manufacture of a plant system for which the interface is designed. This paper presents an interface design framework which originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of a function-behavior-state model originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility, and the seamless integration of the design (manufacture, maintenance) of plants and the design of human-machine interfaces. The missing linkage between design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; specifically, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.

  7. Spoken language identification based on the enhanced self-adjusting extreme learning machine approach

    Science.gov (United States)

    Tiun, Sabrina; AL-Dhief, Fahad Taha; Sammour, Mahmoud A. M.

    2018-01-01

    Spoken Language Identification (LID) is the process of determining and classifying natural language from a given content and dataset. Typically, data must be processed to extract useful features to perform LID. Feature extraction for LID is, based on the literature, a mature process in which the standard features for LID have already been developed, from Mel-Frequency Cepstral Coefficients (MFCC), Shifted Delta Cepstral (SDC) and the Gaussian Mixture Model (GMM) through to the i-vector based framework. However, the process of learning based on the extracted features remains to be improved (i.e. optimised) to capture all the knowledge embedded in the extracted features. The Extreme Learning Machine (ELM) is an effective learning model used to perform classification and regression analysis and is extremely useful for training a single hidden layer neural network. Nevertheless, the learning process of this model is not entirely effective (i.e. optimised) due to the random selection of weights within the input hidden layer. In this study, the ELM is selected as a learning model for LID based on standard feature extraction. One of the optimisation approaches for the ELM, the Self-Adjusting Extreme Learning Machine (SA-ELM), is selected as the benchmark and improved by altering the selection phase of the optimisation process. The selection process incorporates both the Split-Ratio and K-Tournament methods, and the improved SA-ELM is named the Enhanced Self-Adjusting Extreme Learning Machine (ESA-ELM). The results are generated for LID on datasets created from eight different languages. The results of the study showed the clear superiority of the Enhanced Self-Adjusting Extreme Learning Machine LID (ESA-ELM LID) over the SA-ELM LID, with ESA-ELM LID achieving an accuracy of 96.25%, compared with an accuracy of only 95.00% for SA-ELM LID. PMID:29672546

  8. Spoken language identification based on the enhanced self-adjusting extreme learning machine approach.

    Science.gov (United States)

    Albadr, Musatafa Abbas Abbood; Tiun, Sabrina; Al-Dhief, Fahad Taha; Sammour, Mahmoud A M

    2018-01-01

    Spoken Language Identification (LID) is the process of determining and classifying natural language from a given content and dataset. Typically, data must be processed to extract useful features to perform LID. Feature extraction for LID is, based on the literature, a mature process in which the standard features for LID have already been developed, from Mel-Frequency Cepstral Coefficients (MFCC), Shifted Delta Cepstral (SDC) and the Gaussian Mixture Model (GMM) through to the i-vector based framework. However, the process of learning based on the extracted features remains to be improved (i.e. optimised) to capture all the knowledge embedded in the extracted features. The Extreme Learning Machine (ELM) is an effective learning model used to perform classification and regression analysis and is extremely useful for training a single hidden layer neural network. Nevertheless, the learning process of this model is not entirely effective (i.e. optimised) due to the random selection of weights within the input hidden layer. In this study, the ELM is selected as a learning model for LID based on standard feature extraction. One of the optimisation approaches for the ELM, the Self-Adjusting Extreme Learning Machine (SA-ELM), is selected as the benchmark and improved by altering the selection phase of the optimisation process. The selection process incorporates both the Split-Ratio and K-Tournament methods, and the improved SA-ELM is named the Enhanced Self-Adjusting Extreme Learning Machine (ESA-ELM). The results are generated for LID on datasets created from eight different languages. The results of the study showed the clear superiority of the Enhanced Self-Adjusting Extreme Learning Machine LID (ESA-ELM LID) over the SA-ELM LID, with ESA-ELM LID achieving an accuracy of 96.25%, compared with an accuracy of only 95.00% for SA-ELM LID.
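
    For concreteness, a bare-bones Extreme Learning Machine looks like the sketch below: the input-layer weights are random and fixed, and only the output weights are solved in closed form. The SA-/ESA-ELM optimisation of that random selection is not reproduced here, and the features and labels are synthetic.

        # Minimal Extreme Learning Machine: random hidden layer + pseudo-inverse output weights.
        import numpy as np

        class ELM:
            def __init__(self, n_hidden=200, seed=0):
                self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

            def fit(self, X, y):
                n_classes = int(y.max()) + 1
                T = np.eye(n_classes)[y]                         # one-hot targets
                self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
                self.b = self.rng.normal(size=self.n_hidden)
                H = np.tanh(X @ self.W + self.b)                 # hidden-layer activations
                self.beta = np.linalg.pinv(H) @ T                # closed-form output weights
                return self

            def predict(self, X):
                return np.argmax(np.tanh(X @ self.W + self.b) @ self.beta, axis=1)

        rng = np.random.default_rng(9)
        X = rng.normal(size=(800, 56))            # stand-in acoustic feature vectors
        y = rng.integers(0, 8, size=800)          # eight languages
        model = ELM().fit(X[:600], y[:600])
        print("accuracy:", (model.predict(X[600:]) == y[600:]).mean())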

  9. A machine-learning approach for computation of fractional flow reserve from coronary computed tomography.

    Science.gov (United States)

    Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin

    2016-07-01

    Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994). Compared against invasively measured FFR, the machine-learning algorithm achieved a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%, with a correlation of 0.729, supporting its use for noninvasive assessment of FFR. Average execution time went down from 196.3 ± 78.5 s for the CFD model to ∼2.4 ± 0.44 s for the machine-learning model on a workstation with a 3.4-GHz Intel i7 8-core processor.

  10. Identifying seizure onset zone from electrocorticographic recordings: A machine learning approach based on phase locking value.

    Science.gov (United States)

    Elahian, Bahareh; Yeasin, Mohammed; Mudigoudar, Basanagoud; Wheless, James W; Babajani-Feremi, Abbas

    2017-10-01

    Using a novel technique based on phase locking value (PLV), we investigated the potential for features extracted from electrocorticographic (ECoG) recordings to serve as biomarkers to identify the seizure onset zone (SOZ). We computed the PLV between the phase of the amplitude of high gamma activity (80-150Hz) and the phase of lower frequency rhythms (4-30Hz) from ECoG recordings obtained from 10 patients with epilepsy (21 seizures). We extracted five features from the PLV and used a machine learning approach based on logistic regression to build a model that classifies electrodes as SOZ or non-SOZ. More than 96% of electrodes identified as the SOZ by our algorithm were within the resected area in six seizure-free patients. In four non-seizure-free patients, more than 31% of the identified SOZ electrodes by our algorithm were outside the resected area. In addition, we observed that the seizure outcome in non-seizure-free patients correlated with the number of non-resected SOZ electrodes identified by our algorithm. This machine learning approach, based on features extracted from the PLV, effectively identified electrodes within the SOZ. The approach has the potential to assist clinicians in surgical decision-making when pre-surgical intracranial recordings are utilized.
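
    The PLV feature itself can be sketched for a single channel as below, with the filter bands taken from the abstract and a synthetic signal standing in for ECoG; scipy's Hilbert transform supplies the instantaneous phases.

        # Phase locking value between the high-gamma amplitude envelope and a low-frequency rhythm.
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        FS = 1000
        rng = np.random.default_rng(10)
        ecog = rng.normal(size=FS * 20)                         # 20 s of stand-in ECoG

        def bandpass(x, lo, hi, fs=FS, order=4):
            b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
            return filtfilt(b, a, x)

        low = bandpass(ecog, 4, 30)                             # low-frequency rhythm
        gamma_env = np.abs(hilbert(bandpass(ecog, 80, 150)))    # high-gamma amplitude envelope
        phase_low = np.angle(hilbert(low))
        phase_env = np.angle(hilbert(bandpass(gamma_env, 4, 30)))

        plv = np.abs(np.mean(np.exp(1j * (phase_env - phase_low))))
        print("PLV for this channel: %.3f" % plv)               # feature fed to logistic regression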

  11. Machine Learning Approaches for Detecting Diabetic Retinopathy from Clinical and Public Health Records.

    Science.gov (United States)

    Ogunyemi, Omolola; Kermah, Dulcie

    2015-01-01

    Annual eye examinations are recommended for diabetic patients in order to detect diabetic retinopathy and other eye conditions that arise from diabetes. Medically underserved urban communities in the US have annual screening rates that are much lower than the national average and could benefit from informatics approaches to identify unscreened patients most at risk of developing retinopathy. Using clinical data from urban safety net clinics as well as public health data from the CDC's National Health and Nutrition Examination Survey, we examined different machine learning approaches for predicting retinopathy from clinical or public health data. All datasets utilized exhibited a class imbalance. Classifiers learned on the clinical data were modestly predictive of retinopathy, with the best model having an AUC of 0.72, sensitivity of 69.2% and specificity of 55.9%. Classifiers learned on public health data were not predictive of retinopathy. Successful approaches to detecting latent retinopathy using machine learning could help safety net and other clinics identify unscreened patients who are most at risk of developing retinopathy, and the use of ensemble classifiers on clinical data shows promise for this purpose.

  12. Tensor Voting A Perceptual Organization Approach to Computer Vision and Machine Learning

    CERN Document Server

    Mordohai, Philippos

    2006-01-01

    This lecture presents research on a general framework for perceptual organization that was conducted mainly at the Institute for Robotics and Intelligent Systems of the University of Southern California. It is not written as a historical recount of the work, since the sequence of the presentation is not in chronological order. It aims at presenting an approach to a wide range of problems in computer vision and machine learning that is data-driven, local and requires a minimal number of assumptions. The tensor voting framework combines these properties and provides a unified perceptual organiza

  13. Designing “Theory of Machines and Mechanisms” course on Project Based Learning approach

    DEFF Research Database (Denmark)

    Shinde, Vikas

    2013-01-01

    Theory of Machines and Mechanisms course is one of the essential courses of the Mechanical Engineering undergraduate curriculum practiced at Indian Institute. Previously, this course was taught by traditional instruction based pedagogy. In order to achieve profession specific skills demanded by the industry and the learning outcomes specified by the National Board of Accreditation (NBA), India, this course is restructured on a Project Based Learning approach. A mini project is designed to suit course objectives. An objective of this paper is to discuss the rationale of this course design and the process followed to design a project which meets diverse objectives.

  14. Machine-learning approach for local classification of crystalline structures in multiphase systems

    Science.gov (United States)

    Dietz, C.; Kretz, T.; Thoma, M. H.

    2017-07-01

    Machine learning is one of the most popular fields in computer science and has a vast number of applications. In this work we propose a method that uses a neural network to locally identify crystal structures in a mixed-phase Yukawa system consisting of fcc, hcp, and bcc clusters and disordered particles, similar to plasma crystals. We compare our approach to previously used methods and show that the quality of identification increases significantly. The technique works very well for highly disturbed lattices and provides a flexible and robust way to classify crystalline structures using only particle positions. This leads to insights into highly disturbed crystalline structures.

  15. Convergent functional genomics in addiction research - a translational approach to study candidate genes and gene networks.

    Science.gov (United States)

    Spanagel, Rainer

    2013-01-01

    Convergent functional genomics (CFG) is a translational methodology that integrates in a Bayesian fashion multiple lines of evidence from studies in human and animal models to get a better understanding of the genetics of a disease or pathological behavior. Here the integration of data sets that derive from forward genetics in animals and genetic association studies including genome wide association studies (GWAS) in humans is described for addictive behavior. The aim of forward genetics in animals and association studies in humans is to identify mutations (e.g. SNPs) that produce a certain phenotype; i.e. "from phenotype to genotype". Most powerful in terms of forward genetics is combined quantitative trait loci (QTL) analysis and gene expression profiling in recombinant inbred rodent lines or genetically selected animals for a specific phenotype, e.g. high vs. low drug consumption. By Bayesian scoring, genomic information from forward genetics in animals is then combined with human GWAS data on a similar addiction-relevant phenotype. This integrative approach generates a robust candidate gene list that has to be functionally validated by means of reverse genetics in animals; i.e. "from genotype to phenotype". It is proposed that studying addiction relevant phenotypes and endophenotypes by this CFG approach will allow a better determination of the genetics of addictive behavior.

  16. Functional renormalization group approach to electronic structure calculations for systems without translational symmetry

    Science.gov (United States)

    Seiler, Christian; Evers, Ferdinand

    2016-10-01

    A formalism for electronic-structure calculations is presented that is based on the functional renormalization group (FRG). The traditional FRG has been formulated for systems that exhibit a translational symmetry with an associated Fermi surface, which can provide the organization principle for the renormalization group (RG) procedure. We here advance an alternative formulation, where the RG flow is organized in the energy domain rather than in k space. This has the advantage that it can also be applied to inhomogeneous matter lacking a band structure, such as disordered metals or molecules. The energy-domain FRG (ɛFRG) presented here accounts for Fermi-liquid corrections to quasiparticle energies and particle-hole excitations. It goes beyond the state of the art GW-BSE, because in ɛFRG the Bethe-Salpeter equation (BSE) is solved in a self-consistent manner. An efficient implementation of the approach that has been tested against exact diagonalization calculations and calculations based on the density matrix renormalization group is presented. Similar to the conventional FRG, the ɛFRG is also able to signal the vicinity of an instability of the Fermi-liquid fixed point via runaway flow of the corresponding interaction vertex. Building on this fact, in an application of ɛFRG to the spinless disordered Hubbard model we calculate its phase boundary in the plane spanned by the interaction and disorder strength. Finally, an extension of the approach to finite temperatures and spin S = 1/2 is also given.

  17. Why Translation Is Difficult

    DEFF Research Database (Denmark)

    Carl, Michael; Schaeffer, Moritz Jonas

    2017-01-01

    The paper develops a definition of translation literality that is based on the syntactic and semantic similarity of the source and the target texts. We provide theoretical and empirical evidence that absolute literal translations are easy to produce. Based on a multilingual corpus of alternative translations, we investigate the effects of cross-lingual syntactic and semantic distance on translation production times and find that non-literality makes from-scratch translation and post-editing difficult. We show that statistical machine translation systems encounter even more difficulties with non-literality.

  18. Analysis of Arab Abbasid and Modern Turkish Period Translation Activities With the Approach of Andre Lefevere

    OpenAIRE

    M. Zahit Can

    2015-01-01

    Translation is not a phenomenon that occurs on its own; beyond doubt, there have to be reasons that bring it about. These can be expressed in one or more generic terms, such as geographical, historical, political, commercial, and social reasons. It is possible that, across different historical periods, geographies, cultures, religions and interactions of communities, there are some commonalities inducing translation. Given that the common work of translation environments implying such div...

  19. Prediction of selective estrogen receptor beta agonist using open data and machine learning approach

    Directory of Open Access Journals (Sweden)

    Niu AQ

    2016-07-01

    Full Text Available Ai-qin Niu,1 Liang-jun Xie,2 Hui Wang,1 Bing Zhu,1 Sheng-qi Wang3 1Department of Gynecology, the First People’s Hospital of Shangqiu, Shangqiu, Henan, People’s Republic of China; 2Department of Image Diagnoses, the Third Hospital of Jinan, Jinan, Shandong, People’s Republic of China; 3Department of Mammary Disease, Guangdong Provincial Hospital of Chinese Medicine, the Second Clinical College of Guangzhou University of Chinese Medicine, Guangzhou, People’s Republic of China Background: Estrogen receptors (ERs) are nuclear transcription factors that are involved in the regulation of many complex physiological processes in humans. ERs have been validated as important drug targets for the treatment of various diseases, including breast cancer, ovarian cancer, osteoporosis, and cardiovascular disease. ERs have two subtypes, ER-α and ER-β. Emerging data suggest that the development of subtype-selective ligands that specifically target ER-β could be a more optimal approach to elicit beneficial estrogen-like activities and reduce side effects. Methods: Herein, we focused on ER-β and developed its in silico quantitative structure-activity relationship models using machine learning (ML) methods. Results: The chemical structures and ER-β bioactivity data were extracted from public chemogenomics databases. Four types of popular fingerprint generation methods including MACCS fingerprint, PubChem fingerprint, 2D atom pairs, and Chemistry Development Kit extended fingerprint were used as descriptors. Four ML methods including Naïve Bayesian classifier, k-nearest neighbor, random forest, and support vector machine were used to train the models. The range of classification accuracies was 77.10% to 88.34%, and the range of area under the ROC (receiver operating characteristic) curve values was 0.8151 to 0.9475, evaluated by 5-fold cross-validation. Comparison analysis suggests that both the random forest and the support vector machine are superior for the classification of selective ER-β agonists.

  20. Pol II promoter prediction using characteristic 4-mer motifs: a machine learning approach

    Directory of Open Access Journals (Sweden)

    Shoyaib Mohammad

    2008-10-01

    Full Text Available Abstract Background Eukaryotic promoter prediction using computational analysis techniques is one of the most difficult jobs in computational genomics, and it is essential for constructing and understanding genetic regulatory networks. The increased availability of sequence data for various eukaryotic organisms in recent years has necessitated better tools and techniques for the prediction and analysis of promoters in eukaryotic sequences. Many promoter prediction methods and tools have been developed to date, but they have yet to provide acceptable predictive performance. One obvious criterion for improving on current methods is to devise a better system for selecting appropriate features of promoters that distinguish them from non-promoters. Secondly, improved performance can be achieved by enhancing the predictive ability of the machine learning algorithms used. Results In this paper, a novel approach is presented in which 128 4-mer motifs in conjunction with a non-linear machine-learning algorithm utilising a Support Vector Machine (SVM) are used to distinguish between promoter and non-promoter DNA sequences. By applying this approach to plant, Drosophila, human, mouse and rat sequences, the classification model showed 7-fold cross-validation accuracies of 83.81%, 94.82%, 91.25%, 90.77% and 82.35%, respectively. The high sensitivity and specificity values of 0.86 and 0.90 for plant; 0.96 and 0.92 for Drosophila; 0.88 and 0.92 for human; 0.78 and 0.84 for mouse and 0.82 and 0.80 for rat demonstrate that this technique is less prone to false positive results and exhibits better performance than many other tools. Moreover, this model successfully identifies the location of the promoter using a TATA weight matrix. Conclusion The high sensitivity and specificity indicate that 4-mer frequencies in conjunction with supervised machine-learning methods can be beneficial in the identification of RNA pol II promoters compared to other methods.
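
    The record above spells out the core recipe concretely enough to sketch: 4-mer frequency features fed to a support vector machine. The Python fragment below is a hedged illustration assuming scikit-learn; it uses all 256 possible 4-mers rather than the 128 selected motifs of the paper, and the toy sequences and parameters are invented for demonstration only.

```python
# Sketch: 4-mer frequency features + SVM for promoter vs. non-promoter
# classification (illustrative only; not the authors' implementation).
from itertools import product

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

KMERS = ["".join(p) for p in product("ACGT", repeat=4)]   # all 256 4-mers
INDEX = {k: i for i, k in enumerate(KMERS)}

def kmer_frequencies(seq):
    """Return the normalized 4-mer frequency vector of a DNA sequence."""
    counts = np.zeros(len(KMERS))
    seq = seq.upper()
    for i in range(len(seq) - 3):
        k = seq[i:i + 4]
        if k in INDEX:                    # skip ambiguous bases such as 'N'
            counts[INDEX[k]] += 1
    total = counts.sum()
    return counts / total if total else counts

# Hypothetical toy data: promoter-like vs. background-like sequences.
promoters = ["TATAAAAGGCTATAAAAGGC" * 5 for _ in range(50)]
background = ["ACGTGCATTGCAGCTAGCTA" * 5 for _ in range(50)]
X = np.array([kmer_frequencies(s) for s in promoters + background])
y = np.array([1] * 50 + [0] * 50)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
print(cross_val_score(clf, X, y, cv=7).mean())    # 7-fold CV, as in the abstract
```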

  1. Prediction of selective estrogen receptor beta agonist using open data and machine learning approach.

    Science.gov (United States)

    Niu, Ai-Qin; Xie, Liang-Jun; Wang, Hui; Zhu, Bing; Wang, Sheng-Qi

    2016-01-01

    Estrogen receptors (ERs) are nuclear transcription factors that are involved in the regulation of many complex physiological processes in humans. ERs have been validated as important drug targets for the treatment of various diseases, including breast cancer, ovarian cancer, osteoporosis, and cardiovascular disease. ERs have two subtypes, ER-α and ER-β. Emerging data suggest that the development of subtype-selective ligands that specifically target ER-β could be a more optimal approach to elicit beneficial estrogen-like activities and reduce side effects. Herein, we focused on ER-β and developed its in silico quantitative structure-activity relationship models using machine learning (ML) methods. The chemical structures and ER-β bioactivity data were extracted from public chemogenomics databases. Four types of popular fingerprint generation methods including MACCS fingerprint, PubChem fingerprint, 2D atom pairs, and Chemistry Development Kit extended fingerprint were used as descriptors. Four ML methods including Naïve Bayesian classifier, k-nearest neighbor, random forest, and support vector machine were used to train the models. The range of classification accuracies was 77.10% to 88.34%, and the range of area under the ROC (receiver operating characteristic) curve values was 0.8151 to 0.9475, evaluated by the 5-fold cross-validation. Comparison analysis suggests that both the random forest and the support vector machine are superior for the classification of selective ER-β agonists. Chemistry Development Kit extended fingerprints and MACCS fingerprint performed better in structural representation between active and inactive agonists. These results demonstrate that combining the fingerprint and ML approaches leads to robust ER-β agonist prediction models, which are potentially applicable to the identification of selective ER-β agonists.
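
    Because this record (and its duplicate above) names the modelling ingredients explicitly (binary fingerprints as descriptors, random forest and SVM classifiers, 5-fold cross-validated ROC AUC), a brief sketch may help. The code below assumes scikit-learn and uses random binary vectors in place of real MACCS or PubChem fingerprints, which would normally be computed with a cheminformatics toolkit such as RDKit or the CDK; all data sizes and the injected signal are illustrative.

```python
# Sketch: comparing random forest and SVM on binary fingerprint features with
# 5-fold cross-validated ROC AUC (synthetic stand-in data, not real compounds).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_actives, n_inactives, n_bits = 200, 200, 166             # 166 = MACCS key length
# Actives set their bits slightly more often, giving the models a weak signal.
X_act = (rng.random((n_actives, n_bits)) < 0.55).astype(float)
X_inact = (rng.random((n_inactives, n_bits)) < 0.45).astype(float)
X = np.vstack([X_act, X_inact])
y = np.array([1] * n_actives + [0] * n_inactives)           # 1 = selective agonist

models = {
    "random forest": RandomForestClassifier(n_estimators=500, random_state=0),
    "SVM (RBF kernel)": SVC(kernel="rbf", probability=True, random_state=0),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean 5-fold ROC AUC = {auc:.3f}")
```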

  2. Soft brain-machine interfaces for assistive robotics: A novel control approach.

    Science.gov (United States)

    Schiatti, Lucia; Tessadori, Jacopo; Barresi, Giacinto; Mattos, Leonardo S; Ajoudani, Arash

    2017-07-01

    Robotic systems offer the possibility of improving the life quality of people with severe motor disabilities, enhancing the individual's degree of independence and interaction with the external environment. In this direction, the operator's residual functions must be exploited for the control of the robot movements and the underlying dynamic interaction through intuitive and effective human-robot interfaces. Towards this end, this work aims at exploring the potential of a novel Soft Brain-Machine Interface (BMI), suitable for dynamic execution of remote manipulation tasks for a wide range of patients. The interface is composed of an eye-tracking system, for an intuitive and reliable control of a robotic arm system's trajectories, and a Brain-Computer Interface (BCI) unit, for the control of the robot Cartesian stiffness, which determines the interaction forces between the robot and environment. The latter control is achieved by estimating in real-time a unidimensional index from user's electroencephalographic (EEG) signals, which provides the probability of a neutral or active state. This estimated state is then translated into a stiffness value for the robotic arm, allowing a reliable modulation of the robot's impedance. A preliminary evaluation of this hybrid interface concept provided evidence on the effective execution of tasks with dynamic uncertainties, demonstrating the great potential of this control method in BMI applications for self-service and clinical care.

  3. Role of Shwachman-Bodian-Diamond syndrome protein in translation machinery and cell chemotaxis: a comparative genomics approach

    Directory of Open Access Journals (Sweden)

    Vasieva O

    2011-09-01

    Full Text Available Olga Vasieva, Institute of Integrative Biology, University of Liverpool, Liverpool, United Kingdom; Fellowship for the Interpretation of Genomes, Burr Ridge, IL, USA. Abstract: Shwachman-Bodian-Diamond syndrome (SBDS) is linked to a mutation in a single gene. The SBDS protein is involved in RNA metabolism and ribosome-associated functions, but SBDS mutation is primarily linked to a defect in polymorphonuclear leukocytes, which are unable to orient correctly in a spatial gradient of chemoattractants. Results of data mining and comparative genomic approaches undertaken in this study suggest that SBDS protein is also linked to tRNA metabolism and translation initiation. Analysis of crosstalk between translation machinery and cytoskeletal dynamics provides new insights into the cellular chemotactic defects caused by SBDS protein malfunction. The proposed functional interactions provide a new approach to exploit potential targets in the treatment and monitoring of this disease. Keywords: Shwachman-Bodian-Diamond syndrome, wybutosine, tRNA, chemotaxis, translation, genomics, gene proximity

  4. A new approach to the solution of the vacuum magnetic problem in fusion machines

    International Nuclear Information System (INIS)

    Zabeo, L.; Artaserse, G.; Cenedese, A.; Piccolo, F.; Sartori, F.

    2007-01-01

    The reconstruction of the vacuum magnetic topology from magnetic measurements is essential in controlling and understanding plasmas produced in magnetic confinement fusion devices. In a wide range of cases, the instruments used to approach the problem have been designed for a specific machine and to solve a specific plasma model. Recently, a new approach has been used to develop new magnetic software called FELIX. The solution adopted in the design allows the software to be used not only at JET but also at other machines. In order to reduce analysis and debugging time, the software has been designed with modularity and platform independence in mind. This results in high portability and, in particular, allows the same code to be used both offline and in real time. One of the main aspects of the tool is its capability to solve different plasma models of current distribution. Thanks to this feature, a set of different models has been run using FELIX in order to improve the plasma magnetic reconstruction in real time. FELIX is presently running at JET in different real-time analysis and control systems that need the vacuum magnetic topology

  5. Development of Type 2 Diabetes Mellitus Phenotyping Framework Using Expert Knowledge and Machine Learning Approach.

    Science.gov (United States)

    Kagawa, Rina; Kawazoe, Yoshimasa; Ida, Yusuke; Shinohara, Emiko; Tanaka, Katsuya; Imai, Takeshi; Ohe, Kazuhiko

    2017-07-01

    Phenotyping is an automated technique that can be used to distinguish patients based on electronic health records. To improve the quality of medical care and advance type 2 diabetes mellitus (T2DM) research, the demand for T2DM phenotyping has been increasing. Some existing phenotyping algorithms are not sufficiently accurate for screening or identifying clinical research subjects. We propose a practical phenotyping framework using both expert knowledge and a machine learning approach to develop 2 phenotyping algorithms: one is for screening; the other is for identifying research subjects. We employ expert knowledge as rules to exclude obvious control patients and machine learning to increase accuracy for complicated patients. We developed phenotyping algorithms on the basis of our framework and performed binary classification to determine whether a patient has T2DM. To facilitate development of practical phenotyping algorithms, this study introduces new evaluation metrics: area under the precision-sensitivity curve (AUPS) with a high sensitivity and AUPS with a high positive predictive value. The proposed phenotyping algorithms based on our framework show higher performance than baseline algorithms. Our proposed framework can be used to develop 2 types of phenotyping algorithms depending on the tuning approach: one for screening, the other for identifying research subjects. We develop a novel phenotyping framework that can be easily implemented on the basis of proper evaluation metrics, which are in accordance with users' objectives. The phenotyping algorithms based on our framework are useful for extraction of T2DM patients in retrospective studies.

  6. Capacity development for knowledge translation: evaluation of an experiential approach through secondment opportunities.

    Science.gov (United States)

    Gerrish, Kate; Piercy, Hilary

    2014-06-01

    Experiential approaches to skills development using secondment models are shown to benefit healthcare organizations more generally, but little is known about the potential of this approach to develop capacity for knowledge translation (KT). To evaluate the success of KT capacity development secondments from the perspective of multiple stakeholders. A pluralistic evaluation design was used. Data were collected during 2011-2012 using focus group and individual interviews with 14 clinical and academic secondees, and five managers from host and seconding organizations to gain insight into participants' perceptions of the success of secondments and the criteria by which they judged success. Six After Action Reviews were undertaken with KT project teams to explore participants' perceptions of the contribution secondees made to KT projects. Semistructured interviews were undertaken with three healthcare managers on completion of projects to explore the impact of secondments on the organization, staff, and patients. Qualitative content analysis was used to identify criteria for success. The criteria provided a framework through which the overall success of secondments could be judged. Six criteria for judging the success of the secondments at individual, team, and organization level were identified: KT skills development, effective workload management, team working, achieving KT objectives, enhanced care delivery, and enhanced education delivery. Benefits to the individual, KT team, seconding, and host organizations were identified. Hosting teams should provide mentorship support to secondees, and be flexible to accommodate secondees' needs as team members. Ongoing support of managers from seconding organizations is needed to maximize the benefits to individual secondees and the organization. Experiential approaches to KT capacity development using secondments can benefit individual secondees, project teams, seconding, and host organizations. © 2014 Sigma Theta Tau

  7. Prediction of biochar yield from cattle manure pyrolysis via least squares support vector machine intelligent approach.

    Science.gov (United States)

    Cao, Hongliang; Xin, Ya; Yuan, Qiaoxia

    2016-02-01

    To conveniently predict the biochar yield from cattle manure pyrolysis, an intelligent modeling approach was introduced in this research. A traditional artificial neural network (ANN) model and a novel least squares support vector machine (LS-SVM) model were developed. For the identification and prediction evaluation of the models, a data set of 33 experimental observations was used, obtained using a laboratory-scale fixed-bed reaction system. The results demonstrated that the intelligent modeling approach is highly convenient and effective for the prediction of the biochar yield. In particular, the novel LS-SVM model has more satisfactory predictive performance and better robustness than the traditional ANN model. The introduction and application of the LS-SVM modeling method provides a successful example and a useful reference for modeling the cattle manure pyrolysis process, as well as other similar processes. Copyright © 2015 Elsevier Ltd. All rights reserved.
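
    Since LS-SVM is less commonly packaged than standard SVMs, a compact sketch of its core linear system may be useful. The NumPy fragment below follows the standard LS-SVM regression formulation (kernel matrix plus a ridge term, solved together with a bias row); the toy inputs standing in for pyrolysis conditions, and the kernel and regularization parameters, are hypothetical and not taken from the paper.

```python
# Minimal least squares SVM (LS-SVM) regression sketch with an RBF kernel,
# implemented directly in NumPy (illustrative; not the authors' code or data).
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual system for the bias b and coefficients alpha."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                      # top row enforces sum(alpha) = 0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma   # kernel matrix plus ridge term
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]              # b, alpha

def lssvm_predict(X_train, X_new, b, alpha, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Hypothetical toy data: a yield-like response from two process inputs
# (standing in for pyrolysis conditions), 33 samples as in the study.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(33, 2))
y = 60.0 - 25.0 * X[:, 0] + 5.0 * X[:, 1] + rng.normal(0.0, 1.0, 33)

b, alpha = lssvm_fit(X, y)
print(lssvm_predict(X, X[:3], b, alpha))   # fitted values for the first 3 samples
```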

  8. Identifying reports of randomized controlled trials (RCTs) via a hybrid machine learning and crowdsourcing approach.

    Science.gov (United States)

    Wallace, Byron C; Noel-Storr, Anna; Marshall, Iain J; Cohen, Aaron M; Smalheiser, Neil R; Thomas, James

    2017-11-01

    Identifying all published reports of randomized controlled trials (RCTs) is an important aim, but it requires extensive manual effort to separate RCTs from non-RCTs, even using current machine learning (ML) approaches. We aimed to make this process more efficient via a hybrid approach using both crowdsourcing and ML. We trained a classifier to discriminate between citations that describe RCTs and those that do not. We then adopted a simple strategy of automatically excluding citations deemed very unlikely to be RCTs by the classifier and deferring to crowdworkers otherwise. Combining ML and crowdsourcing provides a highly sensitive RCT identification strategy (our estimates suggest 95%-99% recall) with substantially less effort (we observed a reduction of around 60%-80%) than relying on manual screening alone. Hybrid crowd-ML strategies warrant further exploration for biomedical curation/annotation tasks. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
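
    The triage logic in this record (automatically exclude citations the classifier deems very unlikely to be RCTs, and defer everything else to crowdworkers) can be sketched in a few lines. The snippet below is a hedged illustration assuming scikit-learn; the probability threshold, toy titles and classifier choice are invented for demonstration and are not the authors' actual pipeline.

```python
# Sketch of a hybrid ML + crowdsourcing triage: auto-exclude low-probability
# citations, send the rest to human screeners (all values are hypothetical).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

EXCLUDE_BELOW = 0.05          # assumed cut-off, chosen to favour recall

train_titles = ["randomized trial of drug A", "cohort study of diet",
                "double-blind placebo controlled trial", "case report of rare disease"]
train_labels = [1, 0, 1, 0]   # 1 = reports an RCT

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_titles, train_labels)

def triage(citations):
    """Split citations into auto-excluded and crowd-screened sets."""
    probs = model.predict_proba(citations)[:, 1]
    auto_excluded = [c for c, p in zip(citations, probs) if p < EXCLUDE_BELOW]
    to_crowd = [c for c, p in zip(citations, probs) if p >= EXCLUDE_BELOW]
    return auto_excluded, to_crowd

excluded, crowd = triage(["a randomised controlled trial of exercise",
                          "narrative review of policy options"])
print(len(excluded), "auto-excluded,", len(crowd), "sent to crowdworkers")
```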

  9. The importance of rat social behavior for translational research : An ethological approach

    NARCIS (Netherlands)

    Peters, S.M.

    2018-01-01

    At present, the preclinical research interest in rodent social behavior is focused on its use as readout parameter in animal models for neuropsychiatric disorders (‘translational research’). However, there are some major limitations that hamper progress. Pivotal is the limited translational value of

  10. Axiomatic Ontology Learning Approaches for English Translation of the Meaning of Quranic Texts

    Directory of Open Access Journals (Sweden)

    Saad Saidah

    2017-01-01

    Full Text Available Ontology learning (OL) is the computational task of generating a knowledge base in the form of an ontology, given an unstructured corpus in natural language (NL). While most works in the field of ontology learning have been primarily based on a statistical approach to extract lightweight OL, very few attempts have been made to extract axiomatic OL (called heavyweight OL) from NL text documents. Axiomatic OL supports more precise formal logic-based reasoning when compared to lightweight OL. Lexico-syntactic pattern matching and statistics alone cannot lead to very accurate learning, mostly because of several linguistic nuances in the NL. Axiomatic OL is an alternative methodology that has not been explored much, in which deep linguistic analysis in computational linguistics is used to generate formal axioms and definitions instead of simply inducing a taxonomy. The ontology that is created not only stores information about the application domain as explicit knowledge, but also allows implicit knowledge to be deduced from it. This research will explore the English translation of the meaning of Quranic texts.

  11. Identifying Meaning Components in the Translation of Medical Terms from English into Indonesian: A Semantic Approach

    Directory of Open Access Journals (Sweden)

    I Gusti Agung Sri Rwa Jayantini

    2017-10-01

    Full Text Available This paper focuses on identifying meaning components in the translation of English medical terms into Indonesian. The data used in this study are the English medical term disorder and its Indonesian equivalent penyakit (disease). The two terms are purposively chosen as the data of the present study, which is a comparative study of lexical meaning in two different languages. The investigation involving a particular term in one language and its equivalent in the other language is worth doing, since the lexicons in every language have their own specific concepts that may be synonymous, yet they are not always interchangeable in all contexts. The analysis of meaning components, called decomposition, draws on several semantic theories for analysing the meaning of a lexical item (Löbner 2013). Here, the meaning components of the two compared terms are demonstrated through a semantic approach, particularly Natural Semantic Metalanguage (NSM), supported by an investigation of their synonyms and of how the terms are used in different contexts. The results show that the meaning components of a particular term in one language, like the English term disorder, are not always found in the Indonesian term penyakit; conversely, some of the meaning components of the Indonesian term do not always exist in the English term.

  12. Support vector machine based fault detection approach for RFT-30 cyclotron

    Energy Technology Data Exchange (ETDEWEB)

    Kong, Young Bae, E-mail: ybkong@kaeri.re.kr; Lee, Eun Je; Hur, Min Goo; Park, Jeong Hoon; Park, Yong Dae; Yang, Seung Dae

    2016-10-21

    The RFT-30 is a 30 MeV cyclotron used for radioisotope applications and radiopharmaceutical research. The RFT-30 cyclotron is highly complex and includes many signals for control and monitoring of the system. It is quite difficult to detect and monitor system failures in real time. Moreover, continuous monitoring of the system is hard and time-consuming work for human operators. In this paper, we propose a support vector machine (SVM) based fault detection approach for the RFT-30 cyclotron. The proposed approach performs SVM learning with training samples to construct the classification model. To compensate for the system complexity of the large-scale accelerator, we utilize principal component analysis (PCA) to transform the original data. After the training procedure, the proposed approach detects system faults in real time. We analyzed the performance of the proposed approach using the experimental data of the RFT-30 cyclotron. The performance results show that the proposed SVM approach can provide an efficient way to control the cyclotron system.
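
    As a rough illustration of the PCA-plus-SVM idea in this record, the following scikit-learn pipeline reduces many monitoring channels to a few principal components and classifies samples as normal or faulty. The synthetic signals, channel count and component count are assumptions for demonstration; they are not the RFT-30 data or the authors' settings.

```python
# Sketch of PCA for dimensionality reduction followed by an SVM classifier,
# on synthetic stand-in signals (not the RFT-30 data or the authors' settings).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_channels = 40                                    # hypothetical monitoring channels
normal = rng.normal(0.0, 1.0, size=(300, n_channels))
faulty = rng.normal(0.8, 1.5, size=(60, n_channels))
X = np.vstack([normal, faulty])
y = np.array([0] * 300 + [1] * 60)                 # 1 = fault

detector = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
detector.fit(X, y)

# In operation, each new vector of monitoring signals would be classified as it arrives.
new_sample = rng.normal(0.8, 1.5, size=(1, n_channels))
print("fault detected" if detector.predict(new_sample)[0] == 1 else "normal")
```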

  13. Protocol: developing a conceptual framework of patient mediated knowledge translation, systematic review using a realist approach.

    Science.gov (United States)

    Gagliardi, Anna R; Légaré, France; Brouwers, Melissa C; Webster, Fiona; Wiljer, David; Badley, Elizabeth; Straus, Sharon

    2011-03-22

    Patient involvement in healthcare represents the means by which to achieve a healthcare system that is responsive to patient needs and values. Characterization and evaluation of strategies for involving patients in their healthcare may benefit from a knowledge translation (KT) approach. The purpose of this knowledge synthesis is to develop a conceptual framework for patient-mediated KT interventions. A preliminary conceptual framework for patient-mediated KT interventions was compiled to describe intended purpose, recipients, delivery context, intervention, and outcomes. A realist review will be conducted in consultation with stakeholders from the arthritis and cancer fields to explore how these interventions work, for whom, and in what contexts. To identify patient-mediated KT interventions in these fields, we will search MEDLINE, the Cochrane Library, and EMBASE from 1995 to 2010; scan references of all eligible studies; and examine five years of tables of contents for journals likely to publish quantitative or qualitative studies that focus on developing, implementing, or evaluating patient-mediated KT interventions. Screening and data collection will be performed independently by two individuals. The conceptual framework of patient-mediated KT options and outcomes could be used by healthcare providers, managers, educationalists, patient advocates, and policy makers to guide program planning, service delivery, and quality improvement and by us and other researchers to evaluate existing interventions or develop new interventions. By raising awareness of options for involving patients in improving their own care, outcomes based on using a KT approach may lead to greater patient-centred care delivery and improved healthcare outcomes.

  14. Protocol: developing a conceptual framework of patient mediated knowledge translation, systematic review using a realist approach

    Directory of Open Access Journals (Sweden)

    Wiljer David

    2011-03-01

    Full Text Available Abstract Background Patient involvement in healthcare represents the means by which to achieve a healthcare system that is responsive to patient needs and values. Characterization and evaluation of strategies for involving patients in their healthcare may benefit from a knowledge translation (KT) approach. The purpose of this knowledge synthesis is to develop a conceptual framework for patient-mediated KT interventions. Methods A preliminary conceptual framework for patient-mediated KT interventions was compiled to describe intended purpose, recipients, delivery context, intervention, and outcomes. A realist review will be conducted in consultation with stakeholders from the arthritis and cancer fields to explore how these interventions work, for whom, and in what contexts. To identify patient-mediated KT interventions in these fields, we will search MEDLINE, the Cochrane Library, and EMBASE from 1995 to 2010; scan references of all eligible studies; and examine five years of tables of contents for journals likely to publish quantitative or qualitative studies that focus on developing, implementing, or evaluating patient-mediated KT interventions. Screening and data collection will be performed independently by two individuals. Conclusions The conceptual framework of patient-mediated KT options and outcomes could be used by healthcare providers, managers, educationalists, patient advocates, and policy makers to guide program planning, service delivery, and quality improvement and by us and other researchers to evaluate existing interventions or develop new interventions. By raising awareness of options for involving patients in improving their own care, outcomes based on using a KT approach may lead to greater patient-centred care delivery and improved healthcare outcomes.

  15. Protocol: developing a conceptual framework of patient mediated knowledge translation, systematic review using a realist approach

    Science.gov (United States)

    2011-01-01

    Background Patient involvement in healthcare represents the means by which to achieve a healthcare system that is responsive to patient needs and values. Characterization and evaluation of strategies for involving patients in their healthcare may benefit from a knowledge translation (KT) approach. The purpose of this knowledge synthesis is to develop a conceptual framework for patient-mediated KT interventions. Methods A preliminary conceptual framework for patient-mediated KT interventions was compiled to describe intended purpose, recipients, delivery context, intervention, and outcomes. A realist review will be conducted in consultation with stakeholders from the arthritis and cancer fields to explore how these interventions work, for whom, and in what contexts. To identify patient-mediated KT interventions in these fields, we will search MEDLINE, the Cochrane Library, and EMBASE from 1995 to 2010; scan references of all eligible studies; and examine five years of tables of contents for journals likely to publish quantitative or qualitative studies that focus on developing, implementing, or evaluating patient-mediated KT interventions. Screening and data collection will be performed independently by two individuals. Conclusions The conceptual framework of patient-mediated KT options and outcomes could be used by healthcare providers, managers, educationalists, patient advocates, and policy makers to guide program planning, service delivery, and quality improvement and by us and other researchers to evaluate existing interventions or develop new interventions. By raising awareness of options for involving patients in improving their own care, outcomes based on using a KT approach may lead to greater patient-centred care delivery and improved healthcare outcomes. PMID:21426573

  16. Integrating network, sequence and functional features using machine learning approaches towards identification of novel Alzheimer genes.

    Science.gov (United States)

    Jamal, Salma; Goyal, Sukriti; Shanker, Asheesh; Grover, Abhinav

    2016-10-18

    Alzheimer's disease (AD) is a complex progressive neurodegenerative disorder commonly characterized by short-term memory loss. Presently no effective therapeutic treatments exist that can completely cure this disease. The cause of Alzheimer's is still unclear; however, genetic factors are among the major factors involved in AD pathogenesis, and around 70 % of disease risk is assumed to be due to the large number of genes involved. Although genetic association studies have revealed a number of potential AD susceptibility genes, there is still a need to identify further AD-associated genes and therapeutic targets in order to better understand the disease-causing mechanisms of Alzheimer's and to develop effective AD therapeutics. In the present study, we have used a machine learning approach to identify candidate AD-associated genes by integrating topological properties of the genes from protein-protein interaction networks, sequence features and functional annotations. We also used a molecular docking approach and screened already known anti-Alzheimer drugs against the novel predicted probable targets of AD, and observed that an investigational drug, AL-108, had high affinity for the majority of the possible therapeutic targets. Furthermore, we performed molecular dynamics simulations and MM/GBSA calculations on the docked complexes to validate our preliminary findings. To the best of our knowledge, this is the first comprehensive study of its kind for identification of putative Alzheimer-associated genes using machine learning approaches, and we propose that such computational studies can improve our understanding of the core etiology of AD, which could lead to the development of effective anti-Alzheimer drugs.

  17. Discovery of Intermetallic Compounds from Traditional to Machine-Learning Approaches.

    Science.gov (United States)

    Oliynyk, Anton O; Mar, Arthur

    2018-01-16

    Intermetallic compounds exhibit diverse compositions, complex structures, and useful properties for many materials applications. How metallic elements react to form these compounds and what structures they adopt remain challenging questions that defy predictability. Traditional approaches offer some rational strategies to prepare specific classes of intermetallics, such as targeting members within a modular homologous series, manipulating building blocks to assemble new structures, and filling interstitial sites to create stuffed variants. Because these strategies rely on precedent, they cannot foresee surprising results, by definition. Exploratory synthesis, whether through systematic phase diagram investigations or serendipity, is still essential for expanding our knowledge base. Eventually, the relationships may become too complex for the pattern recognition to be reliably or practically performed by humans. Complementing these traditional approaches, new machine-learning approaches may be a viable alternative for materials discovery, not only among intermetallics but also more generally for other chemical compounds. In this Account, we survey our own efforts to discover new intermetallic compounds, encompassing gallides, germanides, phosphides, arsenides, and others. We apply various machine-learning methods (such as support vector machine and random forest algorithms) to confront two significant questions in solid state chemistry. First, what crystal structures are adopted by a compound given an arbitrary composition? Initial efforts have focused on binary equiatomic phases AB, ternary equiatomic phases ABC, and full Heusler phases AB2C. Our analysis emphasizes the use of real experimental data and places special value on confirming predictions through experiment. Chemical descriptors are carefully chosen through a rigorous procedure called cluster resolution feature selection. Predictions for crystal structures are quantified by evaluating

  18. Statistical Machine Translation of Japanese

    Science.gov (United States)

    2007-03-01

    [Abstract not available in the indexed excerpt; only front-matter figure captions from the thesis survive, e.g. "Japanese kana syllabaries, hiragana for native Japanese words, word endings, and particles, and katakana for foreign ..." and "Simple Japanese sentence showing the use of kanji, hiragana, and katakana."]

  19. Fingerprint-Based Machine Learning Approach to Identify Potent and Selective 5-HT2BR Ligands

    Directory of Open Access Journals (Sweden)

    Krzysztof Rataj

    2018-05-01

    Full Text Available The identification of subtype-selective GPCR (G-protein coupled receptor) ligands is a challenging task. In this study, we developed a computational protocol to find compounds with 5-HT2BR versus 5-HT1BR selectivity. Our approach employs the hierarchical combination of machine learning methods, docking, and multiple scoring methods. First, we applied machine learning tools to filter a large database of druglike compounds by the new Neighbouring Substructures Fingerprint (NSFP). This two-dimensional fingerprint contains information on the connectivity of the substructural features of a compound. Preselected subsets of the database were then subjected to docking calculations. The main indicators of compounds’ selectivity were their different interactions with the secondary binding pockets of both target proteins, while binding modes within the orthosteric binding pocket were preserved. The combined methodology of ligand-based and structure-based methods was validated prospectively, resulting in the identification of hits with nanomolar affinity and ten-fold to ten thousand-fold selectivities.

  20. Mixed Integer Linear Programming based machine learning approach identifies regulators of telomerase in yeast.

    Science.gov (United States)

    Poos, Alexandra M; Maicher, André; Dieckmann, Anna K; Oswald, Marcus; Eils, Roland; Kupiec, Martin; Luke, Brian; König, Rainer

    2016-06-02

    Understanding telomere length maintenance mechanisms is central in cancer biology, as their dysregulation is one of the hallmarks of the immortalization of cancer cells. Important for this well-balanced control is the transcriptional regulation of the telomerase genes. We integrated Mixed Integer Linear Programming models into a comparative machine learning based approach to identify regulatory interactions that best explain the discrepancy of telomerase transcript levels in yeast mutants with deleted regulators showing aberrant telomere length, when compared to mutants with normal telomere length. We uncover novel regulators of telomerase expression, several of which affect histone levels or modifications. In particular, our results point to the transcription factors Sum1, Hst1 and Srb2 as being important for the regulation of EST1 transcription, and we validated the effect of Sum1 experimentally. We compiled our machine learning method into a user-friendly package for R, which can straightforwardly be applied to similar problems integrating gene regulator binding information and expression profiles of samples of e.g. different phenotypes, diseases or treatments. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. A machine learning approach for automated assessment of retinal vasculature in the oxygen induced retinopathy model.

    Science.gov (United States)

    Mazzaferri, Javier; Larrivée, Bruno; Cakir, Bertan; Sapieha, Przemyslaw; Costantino, Santiago

    2018-03-02

    Preclinical studies of vascular retinal diseases rely on the assessment of developmental dystrophies in the oxygen induced retinopathy rodent model. The quantification of vessel tufts and avascular regions is typically computed manually from flat-mounted retinas imaged using fluorescent probes that highlight the vascular network. Such manual measurements are time-consuming and hampered by user variability and bias, so a rapid and objective method is needed. Here, we introduce a machine learning approach to segment and characterize vascular tufts, delineate the whole vasculature network, and identify and analyze avascular regions. Our quantitative retinal vascular assessment (QuRVA) technique uses a simple machine learning method and morphological analysis to provide reliable computations of vascular density and pathological vascular tuft regions within seconds and without user intervention. We demonstrate the high degree of error and variability of manual segmentations, and we designed, coded, and implemented a set of algorithms to perform this task in a fully automated manner. We benchmark and validate the results of our analysis pipeline using the consensus of several manually curated segmentations obtained with commonly used computer tools. The source code of our implementation is released under version 3 of the GNU General Public License ( https://www.mathworks.com/matlabcentral/fileexchange/65699-javimazzaf-qurva ).

  2. An Approach for Predicting Essential Genes Using Multiple Homology Mapping and Machine Learning Algorithms.

    Science.gov (United States)

    Hua, Hong-Li; Zhang, Fa-Zhan; Labena, Abraham Alemayehu; Dong, Chuan; Jin, Yan-Ting; Guo, Feng-Biao

    Investigation of essential genes is significant for comprehending the minimal gene sets of cells and discovering potential drug targets. In this study, a novel approach based on multiple homology mapping and a machine learning method was introduced to predict essential genes. We focused on 25 bacteria which have characterized essential genes. The predictions yielded the highest area under the receiver operating characteristic (ROC) curve (AUC) of 0.9716 in a tenfold cross-validation test. Proper features were utilized to construct models to make predictions in distantly related bacteria. The accuracy of predictions was evaluated via the consistency of predictions with known essential genes of target species. The highest AUC of 0.9552 and an average AUC of 0.8314 were achieved when making predictions across organisms. An independent dataset from Synechococcus elongatus, which was released recently, was obtained for further assessment of the performance of our model. The AUC score of these predictions is 0.7855, which is higher than that of other methods. This research shows that features obtained by homology mapping alone can achieve results as good as, or even better than, integrated features. Meanwhile, the work indicates that a machine learning-based method can assign more effective weight coefficients than an empirical formula based on biological knowledge.

  3. Application of heuristic and machine-learning approach to engine model calibration

    Science.gov (United States)

    Cheng, Jie; Ryu, Kwang R.; Newman, C. E.; Davis, George C.

    1993-03-01

    Automation of engine model calibration procedures is a very challenging task because (1) the calibration process searches for a goal state in a huge, continuous state space, (2) calibration is often a lengthy and frustrating task because of complicated mutual interference among the target parameters, and (3) the calibration problem is heuristic by nature, and often heuristic knowledge for constraining a search cannot be easily acquired from domain experts. A combined heuristic and machine learning approach has, therefore, been adopted to improve the efficiency of model calibration. We developed an intelligent calibration program called ICALIB. It has been used on a daily basis for engine model applications, and has reduced the time required for model calibrations from many hours to a few minutes on average. In this paper, we describe the heuristic control strategies employed in ICALIB such as a hill-climbing search based on a state distance estimation function, incremental problem solution refinement by using a dynamic tolerance window, and calibration target parameter ordering for guiding the search. In addition, we present the application of a machine learning program called GID3* for automatic acquisition of heuristic rules for ordering target parameters.
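
    The heuristic strategy described above (a hill-climbing search driven by a state distance estimate, with a shrinking step and tolerance) is concrete enough to sketch. The Python fragment below is purely illustrative: the objective function, parameter targets and step schedule are hypothetical stand-ins, not ICALIB's actual logic or GID3*-derived rules.

```python
# Sketch of a hill-climbing calibration loop with a distance-to-target estimate
# and a slowly shrinking step size (all functions and values are hypothetical).
import random

def model_distance(params):
    """Stand-in for running the engine model and measuring distance to the goal state."""
    targets = [0.3, 0.7, 1.2]
    return sum((p - t) ** 2 for p, t in zip(params, targets))

def hill_climb(params, step=0.1, tolerance=1e-3, max_iter=5000):
    best = list(params)
    best_dist = model_distance(best)
    for _ in range(max_iter):
        if best_dist < tolerance:            # inside the current tolerance window
            break
        i = random.randrange(len(best))      # perturb one target parameter at a time
        candidate = list(best)
        candidate[i] += random.choice([-step, step])
        dist = model_distance(candidate)
        if dist < best_dist:                 # accept only improving moves
            best, best_dist = candidate, dist
        else:
            step *= 0.999                    # slowly shrink the step size
    return best, best_dist

print(hill_climb([1.0, 1.0, 1.0]))
```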

  4. Geologic Carbon Sequestration Leakage Detection: A Physics-Guided Machine Learning Approach

    Science.gov (United States)

    Lin, Y.; Harp, D. R.; Chen, B.; Pawar, R.

    2017-12-01

    One of the risks of large-scale geologic carbon sequestration is the potential migration of fluids out of the storage formations. Accurate and fast detection of this fluid migration is not only important but also challenging, due to the large subsurface uncertainty and complex governing physics. Traditional leakage detection and monitoring techniques rely on geophysical observations including pressure. However, the resulting accuracy of these methods is limited because of the indirect information they provide, which requires expert interpretation and therefore yields inaccurate estimates of leakage rates and locations. In this work, we develop a novel machine-learning technique based on support vector regression to effectively and efficiently predict the leakage locations and leakage rates based on a limited number of pressure observations. Compared to conventional data-driven approaches, which can usually be seen as a "black box" procedure, we develop a physics-guided machine learning method to incorporate the governing physics into the learning procedure. To validate the performance of our proposed leakage detection method, we apply our method to both 2D and 3D synthetic subsurface models. Our novel CO2 leakage detection method has shown high detection accuracy in the example problems.
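
    A plain, non-physics-guided version of the regression step can be sketched as follows: support vector regression mapping pressure observations at a few monitoring points to leak location and rate. The toy forward model, well layout and hyperparameters below are invented for illustration, and the physics-guided constraints described in the abstract are not reproduced.

```python
# Sketch: support vector regression from pressure observations to leak
# location and rate (synthetic data; not the authors' physics-guided method).
import numpy as np
from sklearn.multioutput import MultiOutputRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_wells = 8                                          # hypothetical monitoring wells
leak_xy = rng.uniform(0, 1, size=(500, 2))           # leak locations
leak_rate = rng.uniform(0.1, 1.0, size=(500, 1))     # leak rates

# Toy forward model: pressure anomaly decays with distance from the leak.
wells = rng.uniform(0, 1, size=(n_wells, 2))
d = np.linalg.norm(leak_xy[:, None, :] - wells[None, :, :], axis=-1)
pressure = leak_rate / (1.0 + 5.0 * d) + rng.normal(0, 0.01, (500, n_wells))

targets = np.hstack([leak_xy, leak_rate])            # (x, y, rate)
model = MultiOutputRegressor(SVR(kernel="rbf", C=10.0)).fit(pressure, targets)
print(model.predict(pressure[:2]))                   # predicted location and rate
```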

  5. Characterization of Adrenal Lesions on Unenhanced MRI Using Texture Analysis: A Machine-Learning Approach.

    Science.gov (United States)

    Romeo, Valeria; Maurea, Simone; Cuocolo, Renato; Petretta, Mario; Mainenti, Pier Paolo; Verde, Francesco; Coppola, Milena; Dell'Aversana, Serena; Brunetti, Arturo

    2018-01-17

    Adrenal adenomas (AA) are the most common benign adrenal lesions, often characterized based on intralesional fat content as either lipid-rich (LRA) or lipid-poor (LPA). The differentiation of AA, particularly LPA, from nonadenoma adrenal lesions (NAL) may be challenging. Texture analysis (TA) can extract quantitative parameters from MR images. Machine learning is a technique for recognizing patterns that can be applied to medical images by identifying the best combination of TA features to create a predictive model for the diagnosis of interest. To assess the diagnostic efficacy of TA-derived parameters extracted from MR images in characterizing LRA, LPA, and NAL using a machine-learning approach. Retrospective, observational study. Sixty MR examinations, including 20 LRA, 20 LPA, and 20 NAL. Unenhanced T1-weighted in-phase (IP) and out-of-phase (OP) as well as T2-weighted (T2-w) MR images acquired at 3T. Adrenal lesions were manually segmented, placing a spherical volume of interest on IP, OP, and T2-w images. Different selection methods were trained and tested using the J48 machine-learning classifier. The feature selection method that obtained the highest diagnostic performance using the J48 classifier was identified; the diagnostic performance was also compared with that of a senior radiologist by means of McNemar's test. A total of 138 TA-derived features were extracted; among these, four features were selected, extracted from the IP (Short_Run_High_Gray_Level_Emphasis), OP (Mean_Intensity and Maximum_3D_Diameter), and T2-w (Standard_Deviation) images; the J48 classifier obtained a diagnostic accuracy of 80%. The expert radiologist obtained a diagnostic accuracy of 73%. McNemar's test did not show significant differences in terms of diagnostic performance between the J48 classifier and the expert radiologist. Machine learning conducted on MR TA-derived features is a potential tool to characterize adrenal lesions. Level of Evidence: 4. Technical Efficacy: Stage 2.

  6. Global assessment of soil organic carbon stocks and spatial distribution of histosols: the Machine Learning approach

    Science.gov (United States)

    Hengl, Tomislav

    2016-04-01

    Preliminary results are presented of predicting the distribution of organic soils (Histosols) and soil organic carbon stock (in tonnes per ha) using global compilations of soil profiles (about 150,000 points) and covariates at 250 m spatial resolution (about 150 covariates; mainly MODIS seasonal land products, SRTM DEM derivatives, climatic images, lithological and land cover and landform maps). We focus on using a data-driven approach, i.e. machine learning techniques that often require no knowledge about the distribution of the target variable or about the possible relationships. Other advantages of using machine learning are (DOI: 10.1371/journal.pone.0125814): all rules required to produce outputs are formalized; the whole procedure is documented (the statistical model and associated computer script), enabling reproducible research; predicted surfaces can make use of various information sources and can be optimized relative to all available quantitative point and covariate data; there is more flexibility in terms of the spatial extent, resolution and support of requested maps; and automated mapping is more cost-effective: once the system is operational, maintenance and production of updates are an order of magnitude faster and cheaper, so prediction maps can be updated and improved at shorter and shorter time intervals. Some disadvantages of automated soil mapping based on machine learning are: models are data-driven and any serious blunders or artifacts in the input data can propagate to order-of-magnitude larger errors than in the case of expert-based systems; fitting machine learning models is an order of magnitude more computationally demanding, with computing effort that can be even tens of thousands of times higher than if e.g. linear geostatistics is used; and many machine learning models are fairly complex, often abstract, and any interpretation of such models is not trivial, requiring special multidimensional / multivariable plotting and data mining

  7. A support vector machine approach to detect financial statement fraud in South Africa: A first look

    CSIR Research Space (South Africa)

    Moepya, SO

    2014-04-01

    Full Text Available Auditors face the difficult task of detecting companies that issue manipulated financial statements. In recent years, machine learning methods have provided a feasible solution to this task. This study develops support vector machine (SVM) models...

  8. Learning Algorithms for Audio and Video Processing: Independent Component Analysis and Support Vector Machine Based Approaches

    National Research Council Canada - National Science Library

    Qi, Yuan

    2000-01-01

    In this thesis, we propose two new machine learning schemes, a subband-based Independent Component Analysis scheme and a hybrid Independent Component Analysis/Support Vector Machine scheme, and apply...

  9. The Application of Machine Learning Algorithms for Text Mining based on Sentiment Analysis Approach

    Directory of Open Access Journals (Sweden)

    Reza Samizade

    2018-06-01

    Full Text Available Classification of cyber texts and comments into the two categories of positive and negative sentiment among social media users is of high importance in research areas related to text mining. In this research, we applied supervised classification methods to classify Persian texts based on sentiment in cyberspace. The result of this research is a system that can decide whether a comment published in cyberspace, such as on social networks, is considered positive or negative. The comments published on Persian movie and movie review websites from 1392 to 1395 are considered as the data set for this research. A part of these data is used for training and the rest for testing. Prior to implementing the algorithms, pre-processing activities such as tokenizing, removing stop words, and extracting n-grams were applied to the texts. Naïve Bayes, neural networks and support vector machines were used for text classification in this study. Out-of-sample tests showed no evidence indicating that the accuracy of the SVM approach is statistically higher than Naïve Bayes, or that the accuracy of Naïve Bayes is not statistically higher than the NN approach. However, the researchers can conclude that the accuracy of classification using the SVM approach is statistically higher than the accuracy of the NN approach at the 5% confidence level.
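
    A minimal version of the comparison described above, TF-IDF features with Naïve Bayes and a linear SVM, is sketched below. The toy English comments and the resulting accuracies are purely illustrative; the study itself uses Persian movie-review comments and also evaluates a neural network.

```python
# Sketch: comparing Naïve Bayes and a linear SVM for sentiment classification
# on toy comments (illustrative; not the study's Persian data or pipeline).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

comments = ["great acting and a wonderful story", "boring plot and weak acting",
            "loved every minute of it", "a complete waste of time",
            "beautiful cinematography", "terrible script"] * 10
labels = [1, 0, 1, 0, 1, 0] * 10                      # 1 = positive sentiment

for name, clf in [("naive_bayes", MultinomialNB()), ("linear_svm", LinearSVC())]:
    pipe = make_pipeline(TfidfVectorizer(), clf)
    acc = cross_val_score(pipe, comments, labels, cv=5).mean()
    print(f"{name}: mean 5-fold accuracy = {acc:.2f}")
```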

  10. Thermo-energetic design of machine tools a systemic approach to solve the conflict between power efficiency, accuracy and productivity demonstrated at the example of machining production

    CERN Document Server

    2015-01-01

    The approach to the solution within the CRC/TR 96 financed by the German Research Foundation DFG aims at measures that will allow manufacturing accuracy to be maintained under thermally unstable conditions with increased productivity, without an additional demand for energy for tempering. The challenge of research in the CRC/TR 96 derives from the attempt to satisfy the conflicting goals of reducing energy consumption and increasing accuracy and productivity in machining. In the current research performed in 19 subprojects within the scope of the CRC/TR 96, correction and compensation solutions that influence the thermo-elastic machine tool behaviour efficiently and are oriented along the thermo-elastic functional chain are explored and implemented. As part of this general objective, the following issues must be researched and engineered in an interdisciplinary setting and brought together into useful overall solutions:   1.  Providing the modelling fundamentals to calculate the heat fluxes and the resulti...

  11. On Collocations and Their Interaction with Parsing and Translation

    Directory of Open Access Journals (Sweden)

    Violeta Seretan

    2013-10-01

    Full Text Available We address the problem of automatically processing collocations—a subclass of multi-word expressions characterized by a high degree of morphosyntactic flexibility—in the context of two major applications, namely, syntactic parsing and machine translation. We show that parsing and collocation identification are processes that are interrelated and that benefit from each other, inasmuch as syntactic information is crucial for acquiring collocations from corpora and, vice versa, collocational information can be used to improve parsing performance. Similarly, we focus on the interrelation between collocations and machine translation, highlighting the use of translation information for multilingual collocation identification, as well as the use of collocational knowledge for improving translation. We give a panorama of the existing relevant work, and we parallel the literature surveys with our own experiments involving a symbolic parser and a rule-based translation system. The results show a significant improvement over approaches in which the corresponding tasks are decoupled.

  12. Separating depressive comorbidity from panic disorder: A combined functional magnetic resonance imaging and machine learning approach.

    Science.gov (United States)

    Lueken, Ulrike; Straube, Benjamin; Yang, Yunbo; Hahn, Tim; Beesdo-Baum, Katja; Wittchen, Hans-Ulrich; Konrad, Carsten; Ströhle, Andreas; Wittmann, André; Gerlach, Alexander L; Pfleiderer, Bettina; Arolt, Volker; Kircher, Tilo

    2015-09-15

    Depression is frequent in panic disorder (PD); yet, little is known about its influence on the neural substrates of PD. Difficulties in fear inhibition during safety signal processing have been reported as a pathophysiological feature of PD that is attenuated by depression. We investigated the impact of comorbid depression in PD with agoraphobia (AG) on the neural correlates of fear conditioning and the potential of machine learning to predict comorbidity status on the individual patient level based on neural characteristics. Fifty-nine PD/AG patients including 26 (44%) with a comorbid depressive disorder (PD/AG+DEP) underwent functional magnetic resonance imaging (fMRI). Comorbidity status was predicted using a random undersampling tree ensemble in a leave-one-out cross-validation framework. PD/AG-DEP patients showed altered neural activation during safety signal processing, while +DEP patients exhibited generally decreased dorsolateral prefrontal and insular activation. Comorbidity status was correctly predicted in 79% of patients (sensitivity: 73%; specificity: 85%) based on brain activation during fear conditioning (corrected for potential confounders: accuracy: 73%; sensitivity: 77%; specificity: 70%). No primary depressed patients were available; only medication-free patients were included. Major depression and dysthymia were collapsed (power considerations). Neurofunctional activation during safety signal processing differed between patients with or without comorbid depression, a finding which may explain heterogeneous results across previous studies. These findings demonstrate the relevance of comorbidity when investigating neurofunctional substrates of anxiety disorders. Predicting individual comorbidity status may translate neurofunctional data into clinically relevant information which might aid in planning individualized treatment. The study was registered with the ISRCTN80046034. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. A hybrid least squares support vector machines and GMDH approach for river flow forecasting

    Science.gov (United States)

    Samsudin, R.; Saad, P.; Shabri, A.

    2010-06-01

    This paper proposes a novel hybrid forecasting model, which combines the group method of data handling (GMDH) and the least squares support vector machine (LSSVM), known as GLSSVM. The GMDH is used to determine the useful input variables for the LSSVM model, and the LSSVM model performs the time series forecasting. In this study, the application of GLSSVM to monthly river flow forecasting of the Selangor and Bernam Rivers is investigated. The results of the proposed GLSSVM approach are compared with conventional artificial neural network (ANN) models, an Autoregressive Integrated Moving Average (ARIMA) model, and the GMDH and LSSVM models using long-term observations of monthly river flow discharge. Standard statistical measures, the root mean square error (RMSE) and the coefficient of correlation (R), are employed to evaluate the performance of the various models developed. The experimental results indicate that the hybrid model is a powerful tool for modelling discharge time series and can be applied successfully in complex hydrological modelling.
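
    A minimal sketch of the two-stage idea in Python is shown below: select the most useful lagged inputs, then fit a support vector model, and score it with the RMSE and correlation coefficient R used in the abstract. GMDH and LSSVM are not available in scikit-learn, so univariate feature selection and an epsilon-SVR are used here as stand-ins, and the monthly flow series is synthetic.

        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_regression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)
        # Synthetic monthly flow series with a seasonal cycle plus noise.
        flow = 50 + 10 * np.sin(np.arange(240) * 2 * np.pi / 12) + rng.normal(0, 3, 240)

        # Lagged predictors (previous 1..6 months) and the current-month target.
        lags = 6
        X = np.column_stack([flow[i:len(flow) - lags + i] for i in range(lags)])
        y = flow[lags:]
        X_train, X_test, y_train, y_test = X[:180], X[180:], y[:180], y[180:]

        model = make_pipeline(StandardScaler(),
                              SelectKBest(f_regression, k=3),   # "useful input" selection step
                              SVR(C=10.0, epsilon=0.1))         # stand-in for LSSVM
        model.fit(X_train, y_train)
        pred = model.predict(X_test)

        rmse = np.sqrt(np.mean((pred - y_test) ** 2))            # root mean square error
        r = np.corrcoef(pred, y_test)[0, 1]                      # coefficient of correlation
        print(f"RMSE={rmse:.2f}  R={r:.2f}")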

  14. A SUPPORT VECTOR MACHINE APPROACH FOR DEVELOPING TELEMEDICINE SOLUTIONS: MEDICAL DIAGNOSIS

    Directory of Open Access Journals (Sweden)

    Mihaela GHEORGHE

    2015-06-01

    Full Text Available Support vector machines represent an important tool for machine learning techniques including classification and prediction. They offer a solution for a wide range of issues in which traditional optimization algorithms and methods cannot be applied directly due to different constraints, including memory restrictions, hidden relationships between variables, and a very high volume of computations that needs to be handled. One of these issues relates to medical diagnosis, a subset of the medical field. In this paper, the SVM learning algorithm is tested on a diabetes dataset and the results obtained for training with different kernel functions are presented and analyzed in order to determine a good approach from a telemedicine perspective.
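
    A hedged illustration of the kernel comparison described above: the snippet fits SVMs with several kernel functions to a synthetic binary classification problem standing in for the diabetes dataset (which is not bundled here) and reports cross-validated accuracy for each.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Synthetic stand-in for a diabetes dataset: 8 clinical-style features, binary label.
        X, y = make_classification(n_samples=768, n_features=8, n_informative=5, random_state=42)

        for kernel in ("linear", "poly", "rbf", "sigmoid"):
            clf = make_pipeline(StandardScaler(), SVC(kernel=kernel, C=1.0))
            scores = cross_val_score(clf, X, y, cv=5)
            print(f"{kernel:8s} mean cross-validated accuracy = {scores.mean():.3f}")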

  15. A Hierarchical Approach Using Machine Learning Methods in Solar Photovoltaic Energy Production Forecasting

    Directory of Open Access Journals (Sweden)

    Zhaoxuan Li

    2016-01-01

    Full Text Available We evaluate and compare two common methods, artificial neural networks (ANN) and support vector regression (SVR), for predicting energy production from a solar photovoltaic (PV) system in Florida 15 min, 1 h and 24 h ahead of time. A hierarchical approach is proposed based on the machine learning algorithms tested. The production data used in this work correspond to 15 min averaged power measurements collected during 2014. The accuracy of the models is determined by computing error statistics such as the mean bias error (MBE), mean absolute error (MAE), root mean square error (RMSE), relative MBE (rMBE), mean percentage error (MPE) and relative RMSE (rRMSE). This work provides findings on how forecasts from individual inverters will improve the total solar power generation forecast of the PV system.
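
    The error statistics listed above can be written out explicitly; the helper below computes them for a pair of observed/predicted series. The relative measures are expressed as percentages of the mean observed value, which is one common convention (the paper may normalize differently).

        import numpy as np

        def forecast_errors(observed, predicted):
            """MBE, MAE, RMSE plus relative/percentage variants for a forecast series."""
            observed = np.asarray(observed, dtype=float)
            predicted = np.asarray(predicted, dtype=float)
            diff = predicted - observed
            mbe = diff.mean()                                  # mean bias error
            mae = np.abs(diff).mean()                          # mean absolute error
            rmse = np.sqrt((diff ** 2).mean())                 # root mean square error
            rmbe = 100 * mbe / observed.mean()                 # relative MBE (%)
            mpe = 100 * (diff / observed).mean()               # mean percentage error
            rrmse = 100 * rmse / observed.mean()               # relative RMSE (%)
            return dict(MBE=mbe, MAE=mae, RMSE=rmse, rMBE=rmbe, MPE=mpe, rRMSE=rrmse)

        # Example with made-up 15 min averaged PV power values (kW).
        print(forecast_errors([4.0, 5.2, 6.1, 3.8], [4.3, 5.0, 5.7, 4.1]))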

  16. A Hybrid dasymetric and machine learning approach to high-resolution residential electricity consumption modeling

    Energy Technology Data Exchange (ETDEWEB)

    Morton, April M [ORNL; Nagle, Nicholas N [ORNL; Piburn, Jesse O [ORNL; Stewart, Robert N [ORNL; McManamay, Ryan A [ORNL

    2017-01-01

    As urban areas continue to grow and evolve in a world of increasing environmental awareness, the need for detailed information regarding residential energy consumption patterns has become increasingly important. Though current modeling efforts mark significant progress in the effort to better understand the spatial distribution of energy consumption, the majority of techniques are highly dependent on region-specific data sources and often require building- or dwelling-level details that are not publicly available for many regions in the United States. Furthermore, many existing methods do not account for errors in input data sources and may not accurately reflect inherent uncertainties in model outputs. We propose an alternative and more general hybrid approach to high-resolution residential electricity consumption modeling by merging a dasymetric model with a complementary machine learning algorithm. The method's flexible data requirements and statistical framework ensure that the model both is applicable to a wide range of regions and considers errors in input data sources.

  17. Prediction of outcome in internet-delivered cognitive behaviour therapy for paediatric obsessive-compulsive disorder: A machine learning approach.

    Science.gov (United States)

    Lenhard, Fabian; Sauer, Sebastian; Andersson, Erik; Månsson, Kristoffer Nt; Mataix-Cols, David; Rück, Christian; Serlachius, Eva

    2018-03-01

    There are no consistent predictors of treatment outcome in paediatric obsessive-compulsive disorder (OCD). One reason for this might be the use of suboptimal statistical methodology. Machine learning is an approach to efficiently analyse complex data. Machine learning has been widely used within other fields, but has rarely been tested in the prediction of paediatric mental health treatment outcomes. To test four different machine learning methods in the prediction of treatment response in a sample of paediatric OCD patients who had received Internet-delivered cognitive behaviour therapy (ICBT). Participants were 61 adolescents (12-17 years) who enrolled in a randomized controlled trial and received ICBT. All clinical baseline variables were used to predict strictly defined treatment response status three months after ICBT. Four machine learning algorithms were implemented. For comparison, we also employed a traditional logistic regression approach. Multivariate logistic regression could not detect any significant predictors. In contrast, all four machine learning algorithms performed well in the prediction of treatment response, with 75 to 83% accuracy. The results suggest that machine learning algorithms can successfully be applied to predict paediatric OCD treatment outcome. Validation studies and studies in other disorders are warranted. Copyright © 2017 John Wiley & Sons, Ltd.

  18. A comparative analysis of machine learning approaches for plant disease identification

    Directory of Open Access Journals (Sweden)

    Hidayat ur Rahman

    2017-08-01

    Full Text Available Background: Leaf diseases in plants are very severe and usually shorten the lifespan of plants. Leaf diseases are mainly caused by three types of attacks: viral, bacterial or fungal. Diseased leaves reduce crop production and affect the agricultural economy. Since agriculture plays a vital role in the economy, an effective mechanism is required to detect the problem in its early stages. Methods: Traditional approaches used for the identification of diseased plants are based on field visits, which are time-consuming and tedious. In this paper a comparative analysis of machine learning approaches is presented for the identification of healthy and non-healthy plant leaves. For experimental purposes three different types of plant leaves have been selected, namely cabbage, citrus and sorghum. In order to classify healthy and non-healthy plant leaves, color-based features such as pixel values, statistical features such as mean, standard deviation, min and max, and descriptors such as the Histogram of Oriented Gradients (HOG) have been used. Results: 382 images of cabbage, 539 images of citrus and 262 images of sorghum were used as the primary dataset. 40% of the data was utilized for testing and 60% for training, with both sets consisting of healthy and damaged leaves. The results showed that the random forest classifier is the best machine learning method for the classification of healthy and diseased plant leaves. Conclusion: From the extensive experimentation it is concluded that features such as color information, statistical distribution and histograms of gradients provide sufficient clues for the classification of healthy and non-healthy plants.
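
    As an illustration of the feature pipeline described above, the sketch below combines per-image color statistics with a HOG descriptor and feeds them to a random forest, using a 60/40 train/test split as in the paper. The image arrays are random placeholders; real leaf images would be loaded from disk and resized to a common shape first.

        import numpy as np
        from skimage.feature import hog
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        def leaf_features(rgb):
            """Color statistics (mean/std/min/max per channel) plus HOG of the grey image."""
            stats = np.concatenate([rgb.mean(axis=(0, 1)), rgb.std(axis=(0, 1)),
                                    rgb.min(axis=(0, 1)), rgb.max(axis=(0, 1))])
            grey = rgb.mean(axis=2)
            descriptor = hog(grey, pixels_per_cell=(16, 16), cells_per_block=(2, 2))
            return np.concatenate([stats, descriptor])

        rng = np.random.default_rng(0)
        images = rng.random((100, 64, 64, 3))    # placeholder "leaf" images
        labels = rng.integers(0, 2, 100)         # 1 = diseased, 0 = healthy (placeholder)

        X = np.array([leaf_features(img) for img in images])
        X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.4, random_state=0)
        clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
        print("test accuracy:", clf.score(X_test, y_test))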

  19. Technical and policy approaches to balancing patient privacy and data sharing in clinical and translational research.

    Science.gov (United States)

    Malin, Bradley; Karp, David; Scheuermann, Richard H

    2010-01-01

    Clinical researchers need to share data to support scientific validation and information reuse and to comply with a host of regulations and directives from funders. Various organizations are constructing informatics resources in the form of centralized databases to ensure reuse of data derived from sponsored research. The widespread use of such open databases is contingent on the protection of patient privacy. We review privacy-related problems associated with data sharing for clinical research from technical and policy perspectives. We investigate existing policies for secondary data sharing and privacy requirements in the context of data derived from research and clinical settings. In particular, we focus on policies specified by the US National Institutes of Health and the Health Insurance Portability and Accountability Act and touch on how these policies are related to current and future use of data stored in public database archives. We address aspects of data privacy and identifiability from a technical, although approachable, perspective and summarize how biomedical databanks can be exploited and seemingly anonymous records can be reidentified using various resources without hacking into secure computer systems. We highlight which clinical and translational data features, specified in emerging research models, are potentially vulnerable or exploitable. In the process, we recount a recent privacy-related concern associated with the publication of aggregate statistics from pooled genome-wide association studies that have had a significant impact on the data sharing policies of National Institutes of Health-sponsored databanks. Based on our analysis and observations we provide a list of recommendations that cover various technical, legal, and policy mechanisms that open clinical databases can adopt to strengthen data privacy protection as they move toward wider deployment and adoption.

  20. Behavioral profiling as a translational approach in an animal model of posttraumatic stress disorder.

    Science.gov (United States)

    Ardi, Ziv; Albrecht, Anne; Richter-Levin, Alon; Saha, Rinki; Richter-Levin, Gal

    2016-04-01

    Diagnosis of psychiatric disorders in humans is based on comparing individuals to the normal population. However, many animal models analyze averaged group effects, thus compromising their translational power. This discrepancy is particularly relevant in posttraumatic stress disorder (PTSD), where only a minority develop the disorder following a traumatic experience. In our PTSD rat model, we utilize a novel behavioral profiling approach that allows the classification of affected and unaffected individuals in a trauma-exposed population. Rats were exposed to underwater trauma (UWT) and four weeks later their individual performances in the open field and elevated plus maze were compared to those of the control group, allowing the identification of affected and resilient UWT-exposed rats. Behavioral profiling revealed that only a subset of the UWT-exposed rats developed long-lasting behavioral symptoms. The proportion of affected rats was further enhanced by pre-exposure to juvenile stress, a well-described risk factor of PTSD. For a biochemical proof of concept we analyzed the expression levels of the GABAA receptor subunits α1 and α2 in the ventral, dorsal hippocampus and basolateral amygdala. Increased expression, mainly of α1, was observed in ventral but not dorsal hippocampus of exposed animals, which would traditionally be interpreted as being associated with the exposure-resultant psychopathology. However, behavioral profiling revealed that this increased expression was confined to exposed-unaffected individuals, suggesting a resilience-associated expression regulation. The results provide evidence for the importance of employing behavioral profiling in animal models of PTSD, in order to better understand the neural basis of stress vulnerability and resilience. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Downscaling of MODIS One Kilometer Evapotranspiration Using Landsat-8 Data and Machine Learning Approaches

    Directory of Open Access Journals (Sweden)

    Yinghai Ke

    2016-03-01

    Full Text Available This study presented a MODIS 8-day 1 km evapotranspiration (ET) downscaling method based on Landsat 8 data (30 m) and machine learning approaches. Eleven indicators including albedo, land surface temperature (LST), and vegetation indices (VIs) derived from Landsat 8 data were first upscaled to 1 km resolution. Machine learning algorithms including Support Vector Regression (SVR), Cubist, and Random Forest (RF) were used to model the relationship between the Landsat indicators and MODIS 8-day 1 km ET. The models were then used to predict 30 m ET based on Landsat 8 indicators. A total of thirty-two pairs of Landsat 8 images/MODIS ET data were evaluated at four study sites including two in the United States and two in South Korea. Among the three models, RF produced the lowest error, with relative Root Mean Square Error (rRMSE) less than 20%. Vegetation greenness related indicators such as the Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), Soil Adjusted Vegetation Index (SAVI), and vegetation moisture related indicators such as the Normalized Difference Infrared Index—Landsat 8 OLI band 7 (NDIIb7) and Normalized Difference Water Index (NDWI) were the five most important features used in the RF model. Temperature-based indicators were less important than vegetation greenness and moisture-related indicators because LST could have considerable variation during each 8-day period. The predicted Landsat downscaled ET had good overall agreement with MODIS ET (average rRMSE = 22%) and showed a similar temporal trend as MODIS ET. Compared to the MODIS ET product, the downscaled product demonstrated more spatial details, and had better agreement with in situ ET observations (R2 = 0.56). However, we found that the accuracy of MODIS ET was the main control factor of the accuracy of the downscaled product. Improved coarse-resolution ET estimation would result in better finer-resolution estimation. This study proved the potential of using machine learning
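
    The downscaling step itself reduces to a regression trained at the coarse scale and applied at the fine scale; the sketch below shows that pattern with a random forest. The indicator arrays are random placeholders for the eleven Landsat-derived indicators, so the numbers are not meaningful, only the workflow.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        n_indicators = 11    # albedo, LST, NDVI, EVI, SAVI, NDWI, NDIIb7, ... (placeholders)

        # Training pairs at 1 km: one indicator vector per coarse pixel -> MODIS 8-day ET.
        X_coarse = rng.random((5000, n_indicators))
        et_modis = 3.0 + 2.5 * X_coarse[:, 0] - 1.0 * X_coarse[:, 5] + rng.normal(0, 0.2, 5000)

        rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_coarse, et_modis)

        # Apply the fitted model to the fine-resolution (30 m) indicator stack.
        X_fine = rng.random((20000, n_indicators))
        et_30m = rf.predict(X_fine)

        # Feature importances give the indicator ranking discussed in the abstract.
        print("most important indicator index:", int(np.argmax(rf.feature_importances_)))
        print("downscaled ET range:", et_30m.min().round(2), "-", et_30m.max().round(2))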

  2. A new approach to the solution of the vacuum magnetic problem in fusion machines

    International Nuclear Information System (INIS)

    Zabeo, L.; Piccolo, F.; Sartori, F.; Albanese, R.; Cenedese, A.

    2006-01-01

    The magnetic vacuum topology reconstruction using the magnetic measurements is essential in controlling and understanding plasmas produced by fusion machines. In a wide range of cases, the instruments to approach the problem have been designed for a specific machine and to solve a specific plasma model. Recently a new approach has been used by developing new magnetic software called Felix. The solution adopted in the design allows the use of the software not only at JET but also at different machines by simply changing a configuration file. A database describing the tokamak from the magnetic point of view is used to provide different vacuum magnetic models (polynomial, moments, filamentary) that can be solved by Felix without any recompiling or testing. In order to reduce the analysis and debugging time, the software has been designed with modularity and platform independence in mind. This results in large portability and, in particular, it allows use of the same code both offline and in real time. One of the main aspects of the tool is its capability to solve different plasma models of current distribution by changing its configuration file. In order to improve the plasma magnetic reconstruction in real time, a set of models has been run using Felix. An improved polynomial-based model, compared with the one presently used, and two models using current filaments have been tested and compared. The new system has also improved the calculation of plasma magnetic parameters. Smooth double-null configuration transitions, more accurate gap and strike-point calculations, and detailed boundary reconstruction are now systematically available. Felix is presently running at JET in different real-time analysis and control systems that need vacuum magnetic topology such as control of the plasma shape, the wall protection system [F. Piccolo et al., 'Upgrade of the protection system for the first wall at JET in the ITER Be and W tiles perspective', this conference], the magnetic

  3. A hybrid Taguchi-artificial neural network approach to predict surface roughness during electric discharge machining of titanium alloys

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Sanjeev; Batish, Ajay [Thapar University, Patiala (India); Singh, Rupinder [GNDEC, Ludhiana (India); Singh, T. P. [Symbiosis Institute of Technology, Pune (India)

    2014-07-15

    In the present study, electric discharge machining process was used for machining of titanium alloys. Eight process parameters were varied during the process. Experimental results showed that current and pulse-on-time significantly affected the performance characteristics. Artificial neural network coupled with Taguchi approach was applied for optimization and prediction of surface roughness. The experimental results and the predicted results showed good agreement. SEM was used to investigate the surface integrity. Analysis for migration of different chemical elements and formation of compounds on the surface was performed using EDS and XRD pattern. The results showed that high discharge energy caused surface defects such as cracks, craters, thick recast layer, micro pores, pin holes, residual stresses and debris. Also, migration of chemical elements both from electrode and dielectric media were observed during EDS analysis. Presence of carbon was seen on the machined surface. XRD results showed formation of titanium carbide compound which precipitated on the machined surface.

  4. A novel artificial bee colony approach of live virtual machine migration policy using Bayes theorem.

    Science.gov (United States)

    Xu, Gaochao; Ding, Yan; Zhao, Jia; Hu, Liang; Fu, Xiaodong

    2013-01-01

    The green cloud data center has become a research hotspot of virtualized cloud computing architecture. Since live virtual machine (VM) migration technology is widely used and studied in cloud computing, we have focused on the VM placement selection of live migration for power saving. We present a novel heuristic approach called PS-ABC. Its algorithm includes two parts. One is that it combines the artificial bee colony (ABC) idea with the uniform random initialization idea, the binary search idea, and the Boltzmann selection policy to achieve an improved ABC-based approach with better global exploration and local exploitation abilities. The other is that it uses the Bayes theorem to further optimize the improved ABC-based process to reach the final optimal solution faster. As a result, the whole approach achieves a longer-term efficient optimization for power saving. The experimental results demonstrate that PS-ABC evidently reduces the total incremental power consumption and better protects the performance of VM running and migration compared with the existing research. It makes the result of live VM migration more effective and meaningful.

  5. A Novel Artificial Bee Colony Approach of Live Virtual Machine Migration Policy Using Bayes Theorem

    Directory of Open Access Journals (Sweden)

    Gaochao Xu

    2013-01-01

    Full Text Available The green cloud data center has become a research hotspot of virtualized cloud computing architecture. Since live virtual machine (VM) migration technology is widely used and studied in cloud computing, we have focused on the VM placement selection of live migration for power saving. We present a novel heuristic approach called PS-ABC. Its algorithm includes two parts. One is that it combines the artificial bee colony (ABC) idea with the uniform random initialization idea, the binary search idea, and the Boltzmann selection policy to achieve an improved ABC-based approach with better global exploration and local exploitation abilities. The other is that it uses the Bayes theorem to further optimize the improved ABC-based process to reach the final optimal solution faster. As a result, the whole approach achieves a longer-term efficient optimization for power saving. The experimental results demonstrate that PS-ABC evidently reduces the total incremental power consumption and better protects the performance of VM running and migration compared with the existing research. It makes the result of live VM migration more effective and meaningful.

  6. Building a protein name dictionary from full text: a machine learning term extraction approach

    Directory of Open Access Journals (Sweden)

    Campagne Fabien

    2005-04-01

    Full Text Available Abstract Background The majority of information in the biological literature resides in full text articles, instead of abstracts. Yet, abstracts remain the focus of many publicly available literature data mining tools. Most literature mining tools rely on pre-existing lexicons of biological names, often extracted from curated gene or protein databases. This is a limitation, because such databases have low coverage of the many name variants which are used to refer to biological entities in the literature. Results We present an approach to recognize named entities in full text. The approach collects high frequency terms in an article, and uses support vector machines (SVM) to identify biological entity names. It is also computationally efficient and robust to noise commonly found in full text material. We use the method to create a protein name dictionary from a set of 80,528 full text articles. Only 8.3% of the names in this dictionary match SwissProt description lines. We assess the quality of the dictionary by studying its protein name recognition performance in full text. Conclusion This dictionary term lookup method compares favourably to other published methods, supporting the significance of our direct extraction approach. The method is strong in recognizing name variants not found in SwissProt.

  7. Employability and Related Context Prediction Framework for University Graduands: A Machine Learning Approach

    Directory of Open Access Journals (Sweden)

    Manushi P. Wijayapala

    2016-12-01

    Full Text Available In Sri Lanka (SL), graduands’ employability remains a national issue due to the increasing number of graduates produced by higher education institutions each year. Thus, predicting the employability of university graduands can mitigate this issue, since graduands can identify what qualifications or skills they need to strengthen in order to find a job in their desired field with a good salary, before they complete the degree. The main objective of the study is to discover the plausibility of applying a machine learning approach efficiently and effectively towards predicting the employability and related context of university graduands in Sri Lanka, by proposing an architectural framework which consists of four modules: employment status prediction, job salary prediction, job field prediction and job relevance prediction of graduands, while also comparing the performance of classification algorithms under each prediction module. A series of machine learning algorithms such as C4.5, Naïve Bayes and AODE has been experimented with on the Graduand Employment Census - 2014 data. A pre-processing step is proposed to overcome challenges embedded in graduand employability data and a feature selection process is proposed in order to reduce computational complexity. Additionally, parameter tuning is also done to get the most optimized parameters. More importantly, this study utilizes several types of Sampling (Oversampling, Undersampling) and Ensemble (Bagging, Boosting, RF) techniques as well as a newly proposed hybrid approach to overcome the limitations caused by the class imbalance phenomenon. For validation purposes, a wide range of evaluation measures was used to analyze the effectiveness of applying classification algorithms and class imbalance mitigation techniques on the dataset. The experimental results indicated that RandomForest recorded the highest classification performance for 3 modules, achieving the selected best predictive models under hybrid

  8. Medical subdomain classification of clinical notes using a machine learning-based natural language processing approach.

    Science.gov (United States)

    Weng, Wei-Hung; Wagholikar, Kavishwar B; McCray, Alexa T; Szolovits, Peter; Chueh, Henry C

    2017-12-01

    The medical subdomain of a clinical note, such as cardiology or neurology, is useful content-derived metadata for developing machine learning downstream applications. To classify the medical subdomain of a note accurately, we have constructed a machine learning-based natural language processing (NLP) pipeline and developed medical subdomain classifiers based on the content of the note. We constructed the pipeline using the clinical NLP system, clinical Text Analysis and Knowledge Extraction System (cTAKES), the Unified Medical Language System (UMLS) Metathesaurus, Semantic Network, and learning algorithms to extract features from two datasets - clinical notes from the Integrating Data for Analysis, Anonymization, and Sharing (iDASH) data repository (n = 431) and Massachusetts General Hospital (MGH) (n = 91,237), and built medical subdomain classifiers with different combinations of data representation methods and supervised learning algorithms. We evaluated the performance of the classifiers and their portability across the two datasets. The medical subdomain classifier trained as a convolutional recurrent neural network with neural word embeddings yielded the best performance on the iDASH and MGH datasets, with area under the receiver operating characteristic curve (AUC) of 0.975 and 0.991, and F1 scores of 0.845 and 0.870, respectively. Considering better clinical interpretability, the medical subdomain classifier trained as a linear support vector machine, using hybrid bag-of-words and clinically relevant UMLS concepts as the feature representation with term frequency-inverse document frequency (tf-idf) weighting, outperformed other shallow learning classifiers on the iDASH and MGH datasets, with AUC of 0.957 and 0.964, and F1 scores of 0.932 and 0.934 respectively. We trained classifiers on one dataset, applied them to the other dataset, and exceeded an F1 score threshold of 0.7 for half of the medical subdomains we studied. Our study shows that a supervised
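
    The interpretable variant described above (tf-idf-weighted bag-of-words with a linear SVM) is easy to sketch; the snippet below uses invented example notes and subdomain labels, and omits the UMLS concept features the paper adds.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        # Invented, non-clinical example notes and labels for illustration only.
        notes = [
            "chest pain radiating to left arm, troponin elevated",
            "episodes of vertigo and left-sided weakness, MRI ordered",
            "ejection fraction reduced, started on beta blocker",
            "seizure activity on EEG, levetiracetam initiated",
        ]
        subdomains = ["cardiology", "neurology", "cardiology", "neurology"]

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
        clf.fit(notes, subdomains)
        print(clf.predict(["new onset atrial fibrillation, rate control discussed"]))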

  9. Parenthetical Cohesive Explicitness: A Linguistic Approach for a Modified Translation of the Quranic Text

    Directory of Open Access Journals (Sweden)

    Mohammad Amin Hawamdeh

    2015-09-01

    Full Text Available Motivated by the severe criticism the Hilali and Khan (HK) Translation of the Holy Quran has received for its too many parenthetical insertions, this study aimed at linguistically realizing how such added pieces of information could be for necessary cohesive explicitness or worthless redundant interpolation. Methodically, the HK translation of the first 8 verses of Chapter 18 (The Cave, Surah Al Kahf) of the Holy Quran was selected to be a subject material. A number of 15 instances of explicitation put in parentheses were encountered; they were found to be based upon 23 cohesive (grammatical/lexical) relationships and, hence, to be considered as ones of cohesive explicitness. Eventually, such an analysis could be of use for modifying the available translations of the Holy Quran.

  10. Genome-scale identification of Legionella pneumophila effectors using a machine learning approach.

    Directory of Open Access Journals (Sweden)

    David Burstein

    2009-07-01

    Full Text Available A large number of highly pathogenic bacteria utilize secretion systems to translocate effector proteins into host cells. Using these effectors, the bacteria subvert host cell processes during infection. Legionella pneumophila translocates effectors via the Icm/Dot type-IV secretion system and to date, approximately 100 effectors have been identified by various experimental and computational techniques. Effector identification is a critical first step towards the understanding of the pathogenesis system in L. pneumophila as well as in other bacterial pathogens. Here, we formulate the task of effector identification as a classification problem: each L. pneumophila open reading frame (ORF was classified as either effector or not. We computationally defined a set of features that best distinguish effectors from non-effectors. These features cover a wide range of characteristics including taxonomical dispersion, regulatory data, genomic organization, similarity to eukaryotic proteomes and more. Machine learning algorithms utilizing these features were then applied to classify all the ORFs within the L. pneumophila genome. Using this approach we were able to predict and experimentally validate 40 new effectors, reaching a success rate of above 90%. Increasing the number of validated effectors to around 140, we were able to gain novel insights into their characteristics. Effectors were found to have low G+C content, supporting the hypothesis that a large number of effectors originate via horizontal gene transfer, probably from their protozoan host. In addition, effectors were found to cluster in specific genomic regions. Finally, we were able to provide a novel description of the C-terminal translocation signal required for effector translocation by the Icm/Dot secretion system. To conclude, we have discovered 40 novel L. pneumophila effectors, predicted over a hundred additional highly probable effectors, and shown the applicability of machine

  11. Peak detection method evaluation for ion mobility spectrometry by using machine learning approaches.

    Science.gov (United States)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna; Baumbach, Jörg Ingo; Rahmann, Sven; Baumbach, Jan

    2013-04-16

    Ion mobility spectrometry with pre-separation by multi-capillary columns (MCC/IMS) has become an established inexpensive, non-invasive bioanalytics technology for detecting volatile organic compounds (VOCs) with various metabolomics applications in medical research. To pave the way for this technology towards daily usage in medical practice, different steps still have to be taken. With respect to modern biomarker research, one of the most important tasks is the automatic classification of patient-specific data sets into different groups, healthy or not, for instance. Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compared the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods and systematically study their classification performance based on the four peak detectors' results. Second, we investigate the classification variance and robustness regarding perturbation and overfitting. Our main finding is that the power of the classification accuracy is almost equally good for all methods, the manually created gold standard as well as the four automatic peak finding methods. In addition, we note that all tools, manual and automatic, are similarly robust against perturbations. However, the classification performance is more robust against overfitting when using the PME as peak calling preprocessor. In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of real-world biomedical applications.
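
    Of the four peak callers compared above, the simplest (local maxima search) can be illustrated generically; the snippet applies it to a synthetic one-dimensional spectrum with scipy. Real MCC/IMS measurements are two-dimensional (retention time versus drift time), so this is only the one-dimensional analogue, with threshold and spacing values chosen for the toy signal.

        import numpy as np
        from scipy.signal import find_peaks

        rng = np.random.default_rng(0)
        x = np.linspace(0, 10, 2000)
        # Three Gaussian peaks of decreasing height on a noisy baseline.
        spectrum = (np.exp(-(x - 2.0) ** 2 / 0.01) + 0.6 * np.exp(-(x - 5.5) ** 2 / 0.02)
                    + 0.3 * np.exp(-(x - 8.0) ** 2 / 0.015) + rng.normal(0, 0.02, x.size))

        # Local maxima above a fixed height, separated by a minimum distance
        # to suppress noise spikes and shoulder artefacts.
        peaks, _ = find_peaks(spectrum, height=0.15, distance=50)
        print("detected peak positions:", x[peaks].round(2))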

  12. Electric machines

    CERN Document Server

    Gross, Charles A

    2006-01-01

    BASIC ELECTROMAGNETIC CONCEPTS: Basic Magnetic Concepts; Magnetically Linear Systems: Magnetic Circuits; Voltage, Current, and Magnetic Field Interactions; Magnetic Properties of Materials; Nonlinear Magnetic Circuit Analysis; Permanent Magnets; Superconducting Magnets; The Fundamental Translational EM Machine; The Fundamental Rotational EM Machine; Multiwinding EM Systems; Leakage Flux; The Concept of Ratings in EM Systems; Summary; Problems. TRANSFORMERS: The Ideal n-Winding Transformer; Transformer Ratings and Per-Unit Scaling; The Nonideal Three-Winding Transformer; The Nonideal Two-Winding Transformer; Transformer Efficiency and Voltage Regulation; Practical Considerations; The Autotransformer; Operation of Transformers in Three-Phase Environments; Sequence Circuit Models for Three-Phase Transformer Analysis; Harmonics in Transformers; Summary; Problems. BASIC MECHANICAL CONSIDERATIONS: Some General Perspectives; Efficiency; Load Torque-Speed Characteristics; Mass Polar Moment of Inertia; Gearing; Operating Modes; Translational Systems; A Comprehensive Example: The Elevator; P...

  13. A Hybrid Supervised/Unsupervised Machine Learning Approach to Solar Flare Prediction

    Science.gov (United States)

    Benvenuto, Federico; Piana, Michele; Campi, Cristina; Massone, Anna Maria

    2018-01-01

    This paper introduces a novel method for flare forecasting, combining prediction accuracy with the ability to identify the most relevant predictive variables. This result is obtained by means of a two-step approach: first, a supervised regularization method for regression, namely, LASSO is applied, where a sparsity-enhancing penalty term allows the identification of the significance with which each data feature contributes to the prediction; then, an unsupervised fuzzy clustering technique for classification, namely, Fuzzy C-Means, is applied, where the regression outcome is partitioned through the minimization of a cost function and without focusing on the optimization of a specific skill score. This approach is therefore hybrid, since it combines supervised and unsupervised learning; realizes classification in an automatic, skill-score-independent way; and provides effective prediction performances even in the case of imbalanced data sets. Its prediction power is verified against NOAA Space Weather Prediction Center data, using as a test set, data in the range between 1996 August and 2010 December and as training set, data in the range between 1988 December and 1996 June. To validate the method, we computed several skill scores typically utilized in flare prediction and compared the values provided by the hybrid approach with the ones provided by several standard (non-hybrid) machine learning methods. The results showed that the hybrid approach performs classification better than all other supervised methods and with an effectiveness comparable to the one of clustering methods; but, in addition, it provides a reliable ranking of the weights with which the data properties contribute to the forecast.
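
    The two-step structure (supervised regression, then unsupervised clustering of its output) can be sketched as follows. scikit-learn has no Fuzzy C-Means, so a crisp two-cluster KMeans on the LASSO output is used here as a simplified stand-in for the fuzzy clustering step, and the active-region features and flare labels are synthetic.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import Lasso
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 12))     # active-region feature vectors (placeholder)
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.3, 500) > 1.0).astype(float)  # 1 = flare

        Xs = StandardScaler().fit_transform(X)
        lasso = Lasso(alpha=0.01).fit(Xs, y)
        print("features kept by the sparsity penalty:", np.flatnonzero(lasso.coef_))

        # Cluster the continuous regression output into two groups; the group with
        # the higher mean output is interpreted as the "flare" class.
        propensity = lasso.predict(Xs).reshape(-1, 1)
        clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(propensity)
        flare_cluster = int(propensity[clusters == 1].mean() > propensity[clusters == 0].mean())
        predicted_flare = clusters == flare_cluster
        print("predicted flare fraction:", round(predicted_flare.mean(), 2))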

  14. Predictive Maintenance of Power Substation Equipment by Infrared Thermography Using a Machine-Learning Approach

    Directory of Open Access Journals (Sweden)

    Irfan Ullah

    2017-12-01

    Full Text Available A variety of reasons, specifically contact issues, irregular loads, cracks in insulation, defective relays, terminal junctions and other similar issues, increase the internal temperature of electrical instruments. This results in unexpected disturbances and potential damage to power equipment. Therefore, the initial prevention measures of thermal anomalies in electrical tools are essential to prevent power-equipment failure. In this article, we address this initial prevention mechanism for power substations using a computer-vision approach by taking advantage of infrared thermal images. The thermal images are taken through infrared cameras without disturbing the working operations of power substations. Thus, this article augments the non-destructive approach to defect analysis in electrical power equipment using computer vision and machine learning. We use a total of 150 thermal pictures of different electrical equipment in 10 different substations in operating conditions, using 300 different hotspots. Our approach uses multi-layered perceptron (MLP) to classify the thermal conditions of components of power substations into “defect” and “non-defect” classes. A total of eleven features, which are first-order and second-order statistical features, are calculated from the thermal sample images. The performance of MLP shows initial accuracy of 79.78%. We further augment the MLP with graph cut to increase accuracy to 84%. We argue that with the successful development and deployment of this new system, the Technology Department of Chongqing can arrange the recommended actions and thus save cost in repair and outages. This can play an important role in the quick and reliable inspection to potentially prevent power substation equipment from failure, which will save the whole system from breakdown. The increased 84% accuracy with the integration of the graph cut shows the efficacy of the proposed defect analysis approach.

  15. A novel featureless approach to mass detection in digital mammograms based on support vector machines

    Energy Technology Data Exchange (ETDEWEB)

    Campanini, Renato [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Dongiovanni, Danilo [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Iampieri, Emiro [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Lanconelli, Nico [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Masotti, Matteo [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Palermo, Giuseppe [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Riccardi, Alessandro [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Roffilli, Matteo [Department of Computer Science, University of Bologna, Bologna (Italy)

    2004-03-21

    In this work, we present a novel approach to mass detection in digital mammograms. The great variability of the appearance of masses is the main obstacle to building a mass detection method. It is indeed demanding to characterize all the varieties of masses with a reduced set of features. Hence, in our approach we have chosen not to extract any feature, for the detection of the region of interest; in contrast, we exploit all the information available on the image. A multiresolution overcomplete wavelet representation is performed, in order to codify the image with redundancy of information. The vectors of the very-large space obtained are then provided to a first support vector machine (SVM) classifier. The detection task is considered here as a two-class pattern recognition problem: crops are classified as suspect or not, by using this SVM classifier. False candidates are eliminated with a second cascaded SVM. To further reduce the number of false positives, an ensemble of experts is applied: the final suspect regions are achieved by using a voting strategy. The sensitivity of the presented system is nearly 80% with a false-positive rate of 1.1 marks per image, estimated on images coming from the USF DDSM database.

  16. CoSpa: A Co-training Approach for Spam Review Identification with Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Wen Zhang

    2016-03-01

    Full Text Available Spam reviews are increasingly appearing on the Internet to promote sales or defame competitors by misleading consumers with deceptive opinions. This paper proposes a co-training approach called CoSpa (Co-training for Spam review identification) to identify spam reviews by two views: one is the lexical terms derived from the textual content of the reviews and the other is the PCFG (Probabilistic Context-Free Grammars) rules derived from a deep syntax analysis of the reviews. Using SVM (Support Vector Machine) as the base classifier, we develop two strategies, CoSpa-C and CoSpa-U, embedded within the CoSpa approach. The CoSpa-C strategy selects unlabeled reviews classified with the largest confidence to augment the training dataset to retrain the classifier. The CoSpa-U strategy randomly selects unlabeled reviews with a uniform distribution of confidence. Experiments on the spam dataset and the deception dataset demonstrate that both the proposed CoSpa algorithms outperform the traditional SVM with lexical terms and PCFG rules in spam review identification. Moreover, the CoSpa-U strategy outperforms the CoSpa-C strategy when we use the absolute value of decision function of SVM as the confidence.
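
    A schematic co-training loop in the spirit of CoSpa-C is sketched below: two linear SVMs are trained on two feature views (random blocks standing in for the lexical-term and PCFG-rule representations), and each round the unlabeled reviews classified with the largest confidence receive that view's predicted labels and join the shared training pool.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        n = 400
        view_lexical = rng.normal(size=(n, 30))   # stand-in for lexical-term features
        view_pcfg = rng.normal(size=(n, 20))      # stand-in for PCFG-rule features
        truth = (view_lexical[:, 0] + view_pcfg[:, 0] > 0).astype(int)   # 1 = spam (placeholder)

        labels = np.full(n, -1)                   # -1 marks unlabeled reviews
        labels[:40] = truth[:40]                  # small initial labeled set

        for _ in range(5):                        # a few co-training rounds
            for view in (view_lexical, view_pcfg):
                labeled = np.flatnonzero(labels >= 0)
                unlabeled = np.flatnonzero(labels < 0)
                if unlabeled.size == 0:
                    break
                clf = SVC(kernel="linear").fit(view[labeled], labels[labeled])
                conf = np.abs(clf.decision_function(view[unlabeled]))
                top = unlabeled[np.argsort(conf)[-10:]]   # reviews classified with largest confidence
                labels[top] = clf.predict(view[top])      # adopt this view's labels for them

        pool = np.flatnonzero(labels >= 0)
        print("pseudo-labeled pool size:", pool.size,
              "| agreement with ground truth:", round(float((labels[pool] == truth[pool]).mean()), 2))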

  17. How long will my mouse live? Machine learning approaches for prediction of mouse life span.

    Science.gov (United States)

    Swindell, William R; Harper, James M; Miller, Richard A

    2008-09-01

    Prediction of individual life span based on characteristics evaluated at middle-age represents a challenging objective for aging research. In this study, we used machine learning algorithms to construct models that predict life span in a stock of genetically heterogeneous mice. Life-span prediction accuracy of 22 algorithms was evaluated using a cross-validation approach, in which models were trained and tested with distinct subsets of data. Using a combination of body weight and T-cell subset measures evaluated before 2 years of age, we show that the life-span quartile to which an individual mouse belongs can be predicted with an accuracy of 35.3% (+/-0.10%). This result provides a new benchmark for the development of life-span-predictive models, but improvement can be expected through identification of new predictor variables and development of computational approaches. Future work in this direction can provide tools for aging research and will shed light on associations between phenotypic traits and longevity.

  18. Comparison of machine learned approaches for thyroid nodule characterization from shear wave elastography images

    Science.gov (United States)

    Pereira, Carina; Dighe, Manjiri; Alessio, Adam M.

    2018-02-01

    Various Computer Aided Diagnosis (CAD) systems have been developed that characterize thyroid nodules using the features extracted from the B-mode ultrasound images and Shear Wave Elastography images (SWE). These features, however, are not perfect predictors of malignancy. In other domains, deep learning techniques such as Convolutional Neural Networks (CNNs) have outperformed conventional feature extraction based machine learning approaches. In general, fully trained CNNs require substantial volumes of data, motivating several efforts to use transfer learning with pre-trained CNNs. In this context, we sought to compare the performance of conventional feature extraction, fully trained CNNs, and transfer learning based, pre-trained CNNs for the detection of thyroid malignancy from ultrasound images. We compared these approaches applied to a data set of 964 B-mode and SWE images from 165 patients. The data were divided into 80% training/validation and 20% testing data. The highest accuracies achieved on the testing data for the conventional feature extraction, fully trained CNN, and pre-trained CNN were 0.80, 0.75, and 0.83 respectively. In this application, classification using a pre-trained network yielded the best performance, potentially due to the relatively limited sample size and sub-optimal architecture for the fully trained CNN.
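
    The transfer-learning variant can be sketched as a fixed, ImageNet-pre-trained CNN used as a feature extractor with a small classifier on top. The snippet below uses ResNet50 from Keras purely as an example backbone (the paper's exact architecture is not specified here), and the image patches and labels are random placeholders.

        import numpy as np
        import tensorflow as tf
        from sklearn.linear_model import LogisticRegression

        # Pre-trained backbone used as a frozen feature extractor (downloads ImageNet weights).
        base = tf.keras.applications.ResNet50(weights="imagenet", include_top=False,
                                              pooling="avg", input_shape=(224, 224, 3))

        rng = np.random.default_rng(0)
        patches = rng.random((32, 224, 224, 3)).astype("float32")   # placeholder image patches
        labels = rng.integers(0, 2, 32)                             # 1 = malignant (placeholder)

        features = base.predict(
            tf.keras.applications.resnet50.preprocess_input(patches * 255.0), verbose=0)
        clf = LogisticRegression(max_iter=1000).fit(features, labels)
        print("training accuracy on toy data:", clf.score(features, labels))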

  19. Machine learning approaches to decipher hormone and HER2 receptor status phenotypes in breast cancer.

    Science.gov (United States)

    Adabor, Emmanuel S; Acquaah-Mensah, George K

    2017-10-16

    Breast cancer prognosis and administration of therapies are aided by knowledge of hormonal and HER2 receptor status. Breast cancers lacking estrogen receptors, progesterone receptors and HER2 receptors are difficult to treat. Regarding large data repositories such as The Cancer Genome Atlas, available wet-lab methods for establishing the presence of these receptors do not always conclusively cover all available samples. To this end, we introduce median-supplement methods to identify hormonal and HER2 receptor status phenotypes of breast cancer patients using gene expression profiles. In these approaches, supplementary instances based on median patient gene expression are introduced to balance a training set from which we build simple models to identify the receptor expression status of patients. In addition, for the purpose of benchmarking, we examine major machine learning approaches that are also applicable to the problem of finding receptor status in breast cancer. We show that our methods are robust and have high sensitivity with extremely low false-positive rates compared with the well-established methods. A successful application of these methods will permit the simultaneous study of large collections of samples of breast cancer patients as well as save time and cost while standardizing interpretation of outcomes of such studies. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
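
    One way to read the median-supplement idea is sketched below: the minority receptor-status class is topped up with synthetic instances built around the median expression profile of the minority training samples, after which an ordinary classifier is trained. The noise scale, classifier choice and data are assumptions for illustration, not the authors' exact procedure.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def median_supplement(X, y, minority_label, rng, jitter=0.05):
            """Append median-based instances of the minority class until the classes balance."""
            n_needed = int((y != minority_label).sum() - (y == minority_label).sum())
            if n_needed <= 0:
                return X, y
            median_profile = np.median(X[y == minority_label], axis=0)
            supplements = median_profile + rng.normal(0, jitter, size=(n_needed, X.shape[1]))
            return np.vstack([X, supplements]), np.concatenate([y, np.full(n_needed, minority_label)])

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 50))               # gene-expression profiles (placeholder)
        y = (rng.random(200) < 0.15).astype(int)     # 1 = minority receptor status (placeholder)

        X_bal, y_bal = median_supplement(X, y, minority_label=1, rng=rng)
        clf = LogisticRegression(max_iter=1000).fit(X_bal, y_bal)
        print("balanced training size:", X_bal.shape[0])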

  20. Classifying injury narratives of large administrative databases for surveillance-A practical approach combining machine learning ensembles and human review.

    Science.gov (United States)

    Marucci-Wellman, Helen R; Corns, Helen L; Lehto, Mark R

    2017-01-01

    Injury narratives are now available real time and include useful information for injury surveillance and prevention. However, manual classification of the cause or events leading to injury found in large batches of narratives, such as workers compensation claims databases, can be prohibitive. In this study we compare the utility of four machine learning algorithms (Naïve Bayes, Single word and Bi-gram models, Support Vector Machine and Logistic Regression) for classifying narratives into Bureau of Labor Statistics Occupational Injury and Illness event leading to injury classifications for a large workers compensation database. These algorithms are known to do well classifying narrative text and are fairly easy to implement with off-the-shelf software packages such as Python. We propose human-machine learning ensemble approaches which maximize the power and accuracy of the algorithms for machine-assigned codes and allow for strategic filtering of rare, emerging or ambiguous narratives for manual review. We compare human-machine approaches based on filtering on the prediction strength of the classifier vs. agreement between algorithms. Regularized Logistic Regression (LR) was the best performing algorithm alone. Using this algorithm and filtering out the bottom 30% of predictions for manual review resulted in high accuracy (overall sensitivity/positive predictive value of 0.89) of the final machine-human coded dataset. The best pairings of algorithms included Naïve Bayes with Support Vector Machine, whereby the triple ensemble NB-SW = NB-BIGRAM = SVM had very high performance (0.93 overall sensitivity/positive predictive value and high accuracy (i.e. high sensitivity and positive predictive values)) across both large and small categories, leaving 41% of the narratives for manual review. Integrating LR into this ensemble mix improved performance only slightly. For large administrative datasets we propose incorporation of methods based on human-machine pairings such as
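
    The filtering rule at the heart of the human-machine ensemble can be sketched as follows: a regularized logistic regression scores each narrative, machine codes are kept for the most confident predictions, and the weakest 30% are routed to manual review. The narratives and event codes below are invented placeholders.

        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        narratives = [
            "slipped on wet floor and fell striking knee",
            "lifting box felt sharp pain in lower back",
            "hand caught in press machine",
            "struck by falling pallet in warehouse",
        ] * 25
        codes = ["fall", "overexertion", "caught_in", "struck_by"] * 25

        clf = make_pipeline(TfidfVectorizer(), LogisticRegression(C=1.0, max_iter=1000))
        clf.fit(narratives, codes)

        proba = clf.predict_proba(narratives)
        strength = proba.max(axis=1)                 # prediction strength per narrative
        cutoff = np.quantile(strength, 0.30)         # weakest 30% go to manual review
        auto_coded = strength >= cutoff
        print(f"machine-coded: {auto_coded.sum()}, sent to human review: {(~auto_coded).sum()}")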

  1. Evaluation of Machine Learning and Rules-Based Approaches for Predicting Antimicrobial Resistance Profiles in Gram-negative Bacilli from Whole Genome Sequence Data.

    Science.gov (United States)

    Pesesky, Mitchell W; Hussain, Tahir; Wallace, Meghan; Patel, Sanket; Andleeb, Saadia; Burnham, Carey-Ann D; Dantas, Gautam

    2016-01-01

    The time-to-result for culture-based microorganism recovery and phenotypic antimicrobial susceptibility testing necessitates initial use of empiric (frequently broad-spectrum) antimicrobial therapy. If the empiric therapy is not optimal, this can lead to adverse patient outcomes and contribute to increasing antibiotic resistance in pathogens. New, more rapid technologies are emerging to meet this need. Many of these are based on identifying resistance genes, rather than directly assaying resistance phenotypes, and thus require interpretation to translate the genotype into treatment recommendations. These interpretations, like other parts of clinical diagnostic workflows, are likely to be increasingly automated in the future. We set out to evaluate the two major approaches that could be amenable to automation pipelines: rules-based methods and machine learning methods. The rules-based algorithm makes predictions based upon current, curated knowledge of Enterobacteriaceae resistance genes. The machine-learning algorithm predicts resistance and susceptibility based on a model built from a training set of variably resistant isolates. As our test set, we used whole genome sequence data from 78 clinical Enterobacteriaceae isolates, previously identified to represent a variety of phenotypes, from fully-susceptible to pan-resistant strains for the antibiotics tested. We tested three antibiotic resistance determinant databases for their utility in identifying the complete resistome for each isolate. The predictions of the rules-based and machine learning algorithms for these isolates were compared to results of phenotype-based diagnostics. The rules-based and machine-learning predictions achieved agreement with standard-of-care phenotypic diagnostics of 89.0% and 90.3%, respectively, across twelve antibiotic agents from six major antibiotic classes. Several sources of disagreement between the algorithms were identified. Novel variants of known resistance factors and

  2. Evaluation of Machine Learning and Rules-Based Approaches for Predicting Antimicrobial Resistance Profiles in Gram-negative Bacilli from Whole Genome Sequence Data

    Directory of Open Access Journals (Sweden)

    Mitchell Pesesky

    2016-11-01

    Full Text Available The time-to-result for culture-based microorganism recovery and phenotypic antimicrobial susceptibility testing necessitates initial use of empiric (frequently broad-spectrum) antimicrobial therapy. If the empiric therapy is not optimal, this can lead to adverse patient outcomes and contribute to increasing antibiotic resistance in pathogens. New, more rapid technologies are emerging to meet this need. Many of these are based on identifying resistance genes, rather than directly assaying resistance phenotypes, and thus require interpretation to translate the genotype into treatment recommendations. These interpretations, like other parts of clinical diagnostic workflows, are likely to be increasingly automated in the future. We set out to evaluate the two major approaches that could be amenable to automation pipelines: rules-based methods and machine learning methods. The rules-based algorithm makes predictions based upon current, curated knowledge of Enterobacteriaceae resistance genes. The machine-learning algorithm predicts resistance and susceptibility based on a model built from a training set of variably resistant isolates. As our test set, we used whole genome sequence data from 78 clinical Enterobacteriaceae isolates, previously identified to represent a variety of phenotypes, from fully-susceptible to pan-resistant strains for the antibiotics tested. We tested three antibiotic resistance determinant databases for their utility in identifying the complete resistome for each isolate. The predictions of the rules-based and machine learning algorithms for these isolates were compared to results of phenotype-based diagnostics. The rules-based and machine-learning predictions achieved agreement with standard-of-care phenotypic diagnostics of 89.0% and 90.3%, respectively, across twelve antibiotic agents from six major antibiotic classes. Several sources of disagreement between the algorithms were identified. Novel variants of known resistance

  3. An Approach to Analysing the Quality of Menu Translations in Southern Spain Restaurants

    Science.gov (United States)

    Fuentes-Luque, Adrián

    2017-01-01

    Cuisine and restaurants are powerful tools for cultural, social and tourist image-building and projection for tourist promotion, particularly in the case of major tourist places and destinations which boast a well-deserved, long-standing history and reputation as gastronomical beacons. When menus are not properly translated (or transcreated)…

  4. 78 FR 12764 - Draft Office of Health Assessment and Translation Approach for Systematic Review and Evidence...

    Science.gov (United States)

    2013-02-25

    ... study will evaluate the association of bisphenol A (BPA) exposure with obesity and the other will... Health Assessments--February 2013; Request for Comments; Notice of a Meeting SUMMARY: The National Toxicology Program (NTP) requests public comments on the Draft Office of Health Assessment and Translation...

  5. A Hands-On Approach to Teaching Protein Translation & Translocation into the ER

    Science.gov (United States)

    LaBonte, Michelle L.

    2013-01-01

    The process of protein translation and translocation into the endoplasmic reticulum (ER) can often be challenging for introductory college biology students to visualize. To help them understand how proteins become oriented in the ER membrane, I developed a hands-on activity in which students use Play-Doh to simulate the process of protein…

  6. A practical approach to translating social cultural patterns into new design

    NARCIS (Netherlands)

    Mulder-Nijkamp, Maaike; Garde, Julia Anne

    2010-01-01

    People buy products to express their desired identity and therefore prefer products that fit their own personality. The personality of products is created by implicit and explicit design features. However the translation of implicit and explicit design characteristics into new designs is difficult

  7. Curriculum Assessment Using Artificial Neural Network and Support Vector Machine Modeling Approaches: A Case Study. IR Applications. Volume 29

    Science.gov (United States)

    Chen, Chau-Kuang

    2010-01-01

    Artificial Neural Network (ANN) and Support Vector Machine (SVM) approaches have been on the cutting edge of science and technology for pattern recognition and data classification. In the ANN model, classification accuracy can be achieved by using the feed-forward of inputs, back-propagation of errors, and the adjustment of connection weights. In…

  8. Machine learning approaches to analyze histological images of tissues from radical prostatectomies.

    Science.gov (United States)

    Gertych, Arkadiusz; Ing, Nathan; Ma, Zhaoxuan; Fuchs, Thomas J; Salman, Sadri; Mohanty, Sambit; Bhele, Sanica; Velásquez-Vacca, Adriana; Amin, Mahul B; Knudsen, Beatrice S

    2015-12-01

    Computerized evaluation of histological preparations of prostate tissues involves identification of tissue components such as stroma (ST), benign/normal epithelium (BN) and prostate cancer (PCa). Image classification approaches have been developed to identify and classify glandular regions in digital images of prostate tissues; however their success has been limited by difficulties in cellular segmentation and tissue heterogeneity. We hypothesized that utilizing image pixels to generate intensity histograms of hematoxylin (H) and eosin (E) stains deconvoluted from H&E images numerically captures the architectural difference between glands and stroma. In addition, we postulated that joint histograms of local binary patterns and local variance (LBPxVAR) can be used as sensitive textural features to differentiate benign/normal tissue from cancer. Here we utilized a machine learning approach comprising a support vector machine (SVM) followed by a random forest (RF) classifier to digitally stratify prostate tissue into ST, BN and PCa areas. Two pathologists manually annotated 210 images of low- and high-grade tumors from slides that were selected from 20 radical prostatectomies and digitized at high-resolution. The 210 images were split into the training (n=19) and test (n=191) sets. Local intensity histograms of H and E were used to train a SVM classifier to separate ST from epithelium (BN+PCa). The performance of SVM prediction was evaluated by measuring the accuracy of delineating epithelial areas. The Jaccard J=59.5 ± 14.6 and Rand Ri=62.0 ± 7.5 indices reported a significantly better prediction when compared to a reference method (Chen et al., Clinical Proteomics 2013, 10:18) based on the averaged values from the test set. To distinguish BN from PCa we trained a RF classifier with LBPxVAR and local intensity histograms and obtained separate performance values for BN and PCa: J(BN)=35.2 ± 24.9, O(BN)=49.6 ± 32, J(PCa)=49.5 ± 18.5, O(PCa)=72.7 ± 14.8 and Ri=60.6
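
    The two-stage structure described above (SVM on stain-intensity histograms to find epithelium, then a random forest on texture plus intensity features to separate benign from cancer) is sketched below with random placeholder feature vectors standing in for the per-region histograms.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        n_regions = 600
        intensity_hist = rng.random((n_regions, 32))   # H and E intensity histograms (placeholder)
        texture_hist = rng.random((n_regions, 64))     # LBPxVAR joint histograms (placeholder)
        tissue = rng.choice(["ST", "BN", "PCa"], size=n_regions)

        # Stage 1: stroma vs. epithelium from stain-intensity histograms.
        is_epithelium = (tissue != "ST").astype(int)
        stage1 = SVC(kernel="rbf").fit(intensity_hist, is_epithelium)
        epi_mask = stage1.predict(intensity_hist).astype(bool)

        # Stage 2: benign vs. cancer within predicted epithelial regions.
        X2 = np.hstack([texture_hist, intensity_hist])[epi_mask]
        y2 = (tissue[epi_mask] == "PCa").astype(int)
        stage2 = RandomForestClassifier(n_estimators=300, random_state=0).fit(X2, y2)
        print("epithelial regions kept for stage 2:", int(epi_mask.sum()))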

  9. Machine learning approaches to supporting the identification of photoreceptor-enriched genes based on expression data

    Directory of Open Access Journals (Sweden)

    Simpson David

    2006-03-01

    Full Text Available Abstract Background Retinal photoreceptors are highly specialised cells, which detect light and are central to mammalian vision. Many retinal diseases occur as a result of inherited dysfunction of the rod and cone photoreceptor cells. Development and maintenance of photoreceptors requires appropriate regulation of the many genes specifically or highly expressed in these cells. Over the last decades, different experimental approaches have been developed to identify photoreceptor enriched genes. Recent progress in RNA analysis technology has generated large amounts of gene expression data relevant to retinal development. This paper assesses a machine learning methodology for supporting the identification of photoreceptor enriched genes based on expression data. Results Based on the analysis of publicly-available gene expression data from the developing mouse retina generated by serial analysis of gene expression (SAGE), this paper presents a predictive methodology comprising several in silico models for detecting key complex features and relationships encoded in the data, which may be useful to distinguish genes in terms of their functional roles. In order to understand temporal patterns of photoreceptor gene expression during retinal development, a two-way cluster analysis was firstly performed. By clustering SAGE libraries, a hierarchical tree reflecting relationships between developmental stages was obtained. By clustering SAGE tags, a more comprehensive expression profile for photoreceptor cells was revealed. To demonstrate the usefulness of machine learning-based models in predicting functional associations from the SAGE data, three supervised classification models were compared. The results indicated that a relatively simple instance-based model (KStar model) performed significantly better than relatively more complex algorithms, e.g. neural networks. To deal with the problem of functional class imbalance occurring in the dataset, two data re
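
    A rough approximation of the workflow described above (two-way hierarchical clustering of the expression matrix, followed by a simple instance-based classifier) is sketched below. KStar is a Weka algorithm, so a k-nearest-neighbour classifier is used here as a stand-in, and the expression counts and labels are synthetic placeholders.

        # Sketch of two-way clustering of an expression matrix plus an instance-based
        # classifier, loosely following the workflow in the abstract. KStar is a Weka
        # algorithm; k-NN is used here as a rough stand-in. Data are synthetic.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        expr = rng.poisson(5, size=(300, 12)).astype(float)   # 300 tags x 12 SAGE libraries (placeholder)

        # Two-way hierarchical clustering: once over libraries, once over tags.
        library_tree = linkage(expr.T, method="average")
        tag_tree = linkage(expr, method="average")
        tag_clusters = fcluster(tag_tree, t=4, criterion="maxclust")
        print("tag cluster sizes:", np.bincount(tag_clusters)[1:])

        # Supervised step: predict a (synthetic) "photoreceptor-enriched" label per tag.
        labels = rng.integers(0, 2, size=expr.shape[0])
        knn = KNeighborsClassifier(n_neighbors=5)
        print("CV accuracy:", cross_val_score(knn, expr, labels, cv=5).mean())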

  10. Translator-computer interaction in action

    DEFF Research Database (Denmark)

    Bundgaard, Kristine; Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

    Though we lack empirically-based knowledge of the impact of computer-aided translation (CAT) tools on translation processes, it is generally agreed that all professional translators are now involved in some kind of translator-computer interaction (TCI), using O’Brien’s (2012) term. Taking a TCI perspective, this paper investigates the relationship between machines and humans in the field of translation, analysing a CAT process in which machine-translation (MT) technology was integrated into a translation-memory (TM) suite. After a review of empirical research into the impact of CAT tools..., the study indicates that the tool helps the translator conform to project and customer requirements.

  11. From translation to enactment: contributions of the Actor-Network Theory to the processual approach to organizations

    Directory of Open Access Journals (Sweden)

    Patricia Kinast De Camillis

    Full Text Available Abstract In the area of Administration, especially in Organizational Studies (OS), the Actor-Network Theory (ANT) has been regarded as part of a movement that aims to leave the functional emphasis of organization and pursue the study of the processes and practices of organizing - the processual approach to organizations. However, criticisms of ANT have led some authors to seek to overcome them through analytical twists concerning certain concepts. One of these "twists" involved the concept of translation and the inclusion of the concept of enactment. This article discusses both notions with the aid of two studies developed with these concepts as a basis, in order to indicate that the choice of enactment brings along a processual view different from that observed in translation. The concept of translation addresses the predominant and emphasizes understanding how networks of relationships and objects become "stable"; enactment, in turn, works with multiplicity and fluidity, where the process takes precedence over things. Although the proposed term enactment does not seek to directly face all criticism, it contributes so that ANT does not take a neutral or mechanical view in its analyses and descriptions. Enactment views organization as the result and product of a continuous process, and it allows understanding that this is not just about working or not (success or failure), but concerns the "production" of multiple realities when we conduct research in Administration with the processual approach to organizations as a basis.

  12. Rainfall Prediction of Indian Peninsula: Comparison of Time Series Based Approach and Predictor Based Approach using Machine Learning Techniques

    Science.gov (United States)

    Dash, Y.; Mishra, S. K.; Panigrahi, B. K.

    2017-12-01

    Prediction of northeast/post monsoon rainfall, which occurs during October, November and December (OND) over the Indian peninsula, is a challenging task due to the dynamic nature of the uncertain, chaotic climate. It is imperative to elucidate this issue by examining the performance of different machine learning (ML) approaches. The prime objective of this research is to compare (a) statistical prediction using historical rainfall observations and global atmosphere-ocean predictors like Sea Surface Temperature (SST) and Sea Level Pressure (SLP) with (b) empirical prediction based on a time series analysis of past rainfall data without using any other predictors. Initially, ML techniques were applied to SST and SLP data (1948-2014) obtained from the NCEP/NCAR reanalysis monthly means provided by the NOAA ESRL PSD. Later, this study investigated the applicability of ML methods using the OND rainfall time series for 1948-2014 and forecasted up to 2018. The predicted values of the aforementioned methods were verified using observed time series data collected from the Indian Institute of Tropical Meteorology, and the result revealed good performance of the ML algorithms with minimal error scores. Thus, it is found that both statistical and empirical methods are useful for long range climatic projections.
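
    The two strategies being compared (a predictor-based regression on SST/SLP indices versus a purely autoregressive model on past OND rainfall) might be contrasted roughly as follows; all data, lag counts and model choices here are illustrative placeholders, not the study's configuration.

        # Rough contrast of the two strategies in the abstract: (a) regression on
        # climate predictors (SST/SLP) and (b) an autoregressive model on past rainfall.
        # All data here are synthetic placeholders.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import mean_absolute_error

        rng = np.random.default_rng(2)
        years = 67
        rain = rng.gamma(2.0, 50.0, size=years)            # OND rainfall series (placeholder)
        predictors = rng.normal(size=(years, 6))           # SST/SLP indices (placeholder)

        # (a) predictor-based: fit rainfall against large-scale predictors.
        model_a = RandomForestRegressor(n_estimators=300, random_state=0)
        model_a.fit(predictors[:-10], rain[:-10])
        err_a = mean_absolute_error(rain[-10:], model_a.predict(predictors[-10:]))

        # (b) time-series-based: predict rainfall from its own lagged values.
        lags = np.column_stack([rain[i:years - 3 + i] for i in range(3)])
        target = rain[3:]
        model_b = LinearRegression().fit(lags[:-10], target[:-10])
        err_b = mean_absolute_error(target[-10:], model_b.predict(lags[-10:]))
        print("MAE predictor-based:", err_a, " MAE time-series:", err_b)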

  13. Machine Learning Approach for Prediction and Understanding of Glass-Forming Ability.

    Science.gov (United States)

    Sun, Y T; Bai, H Y; Li, M Z; Wang, W H

    2017-07-20

    The prediction of the glass-forming ability (GFA) by varying the composition of alloys is a challenging problem in glass physics, as well as a problem for industry, with enormous financial ramifications. Although different empirical guides for the prediction of GFA were established over decades, a comprehensive model or approach that is able to deal with as many variables as possible simultaneously for efficiently predicting good glass formers is still highly desirable. Here, by applying the support vector classification method, we develop models for predicting the GFA of binary metallic alloys from random compositions. The effect of different input descriptors on GFA was evaluated, and the best prediction model was selected, which shows that the information related to liquidus temperatures plays a key role in the GFA of alloys. On the basis of this model, good glass formers can be predicted with high efficiency. The prediction efficiency can be further enhanced by building a larger database and refining the input descriptor selection. Our findings suggest that machine learning is very powerful and efficient and has great potential for discovering new metallic glasses with good GFA.
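
    A minimal version of the support-vector-classification setup described above (composition-derived descriptors in, a binary glass-former label out) could look like the sketch below; the descriptor columns named in the comments are assumptions, and the data are synthetic.

        # Minimal support-vector-classification sketch for glass-forming ability (GFA):
        # composition-derived descriptors in, binary "good glass former" label out.
        # Descriptors and labels are synthetic placeholders.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        # Example descriptor columns (assumed): liquidus temperature, mixing enthalpy,
        # atomic-size mismatch, electronegativity difference.
        X = rng.normal(size=(400, 4))
        y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=400) > 0).astype(int)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())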

  14. Promises of Machine Learning Approaches in Prediction of Absorption of Compounds.

    Science.gov (United States)

    Kumar, Rajnish; Sharma, Anju; Siddiqui, Mohammed Haris; Tiwari, Rajesh Kumar

    2018-01-01

    Machine Learning (ML) is one of the fastest-developing techniques for the prediction and evaluation of important pharmacokinetic properties such as absorption, distribution, metabolism and excretion. The availability of a large number of robust validation techniques for prediction models devoted to pharmacokinetics has significantly enhanced the trust and authenticity in ML approaches. A series of prediction models has been generated and used for rapid screening of compounds on the basis of absorption over the last decade. Prediction of absorption of compounds using ML models has great potential across the pharmaceutical industry as a non-animal alternative to predict absorption. However, these prediction models still have a long way to go to reach a level of confidence similar to that of conventional experimental methods for the estimation of drug absorption. Some of the general concerns are the selection of appropriate ML methods and validation techniques, in addition to selecting relevant descriptors and authentic data sets for the generation of prediction models. The current review explores published ML models for the prediction of absorption that use physicochemical properties as descriptors, together with their important conclusions. In addition, some critical challenges in the acceptance of ML models for absorption are also discussed.

  15. A machine learning approach to automated structural network analysis: application to neonatal encephalopathy.

    Directory of Open Access Journals (Sweden)

    Etay Ziv

    Full Text Available Neonatal encephalopathy represents a heterogeneous group of conditions associated with life-long developmental disabilities and neurological deficits. Clinical measures and current anatomic brain imaging remain inadequate predictors of outcome in children with neonatal encephalopathy. Some studies have suggested that brain development and, therefore, brain connectivity may be altered in the subgroup of patients who subsequently go on to develop clinically significant neurological abnormalities. Large-scale structural brain connectivity networks constructed using diffusion tractography have been posited to reflect organizational differences in white matter architecture at the mesoscale, and thus offer a unique tool for characterizing brain development in patients with neonatal encephalopathy. In this manuscript we use diffusion tractography to construct structural networks for a cohort of patients with neonatal encephalopathy. We systematically map these networks to a high-dimensional space and then apply standard machine learning algorithms to predict neurological outcome in the cohort. Using nested cross-validation we demonstrate high prediction accuracy that is both statistically significant and robust over a broad range of thresholds. Our algorithm offers a novel tool to evaluate neonates at risk for developing neurological deficit. The described approach can be applied to any brain pathology that affects structural connectivity.
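
    The nested cross-validation mentioned in the abstract (an inner loop for hyper-parameter tuning, an outer loop for an unbiased accuracy estimate) is sketched below on placeholder connectivity features; the grid of C values and the fold counts are assumptions.

        # Sketch of nested cross-validation: the inner loop tunes hyper-parameters,
        # the outer loop estimates generalisation accuracy. Features stand in for
        # vectorised structural-connectivity matrices; labels for neurological outcome.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import GridSearchCV, cross_val_score, StratifiedKFold

        rng = np.random.default_rng(4)
        X = rng.normal(size=(80, 500))                 # subjects x network features (placeholder)
        y = rng.integers(0, 2, size=80)                # outcome label (placeholder)

        inner = GridSearchCV(SVC(kernel="linear"), {"C": [0.01, 0.1, 1, 10]},
                             cv=StratifiedKFold(3))
        outer_scores = cross_val_score(inner, X, y, cv=StratifiedKFold(5))
        print("nested CV accuracy: %.2f +/- %.2f" % (outer_scores.mean(), outer_scores.std()))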

  16. Machine Learning Approach for Software Reliability Growth Modeling with Infinite Testing Effort Function

    Directory of Open Access Journals (Sweden)

    Subburaj Ramasamy

    2017-01-01

    Full Text Available Reliability is one of the quantifiable software quality attributes. Software Reliability Growth Models (SRGMs) are used to assess the reliability achieved at different times of testing. Traditional time-based SRGMs may not be accurate enough in all situations where test effort varies with time. To overcome this lacuna, test effort was used instead of time in SRGMs. In the past, finite test effort functions were proposed, which may not be realistic as, at infinite testing time, test effort will be infinite. Hence in this paper, we propose an infinite test effort function in conjunction with a classical Nonhomogeneous Poisson Process (NHPP) model. We use an Artificial Neural Network (ANN) for training the proposed model with software failure data. Here it is possible to get a large set of weights for the same model to describe the past failure data equally well. We use a machine learning approach to select the appropriate set of weights for the model which will describe both the past and the future data well. We compare the performance of the proposed model with existing models using practical software failure data sets. The proposed log-power TEF based SRGM describes all types of failure data equally well, improves the accuracy of parameter estimation more than existing TEFs, and can be used for software release time determination as well.

  17. A Machine Learning Approach for Using the Postmortem Skin Microbiome to Estimate the Postmortem Interval.

    Directory of Open Access Journals (Sweden)

    Hunter R Johnson

    Full Text Available Research on the human microbiome, the microbiota that live in, on, and around the human person, has revolutionized our understanding of the complex interactions between microbial life and human health and disease. The microbiome may also provide a valuable tool in forensic death investigations by helping to reveal the postmortem interval (PMI) of a decedent that is discovered after an unknown amount of time since death. Current methods of estimating PMI for cadavers discovered in uncontrolled, unstudied environments have substantial limitations, some of which may be overcome through the use of microbial indicators. In this project, we sampled the microbiomes of decomposing human cadavers, focusing on the skin microbiota found in the nasal and ear canals. We then developed several models of statistical regression to establish an algorithm for predicting the PMI of microbial samples. We found that the complete data set, rather than a curated list of indicator species, was preferred for training the regressor. We further found that genus and family, rather than species, are the most informative taxonomic levels. Finally, we developed a k-nearest-neighbor regressor, tuned with the entire data set from all nasal and ear samples, that predicts the PMI of unknown samples with an average error of ±55 accumulated degree days (ADD). This study outlines a machine learning approach for the use of necrobiome data in the prediction of the PMI and thereby provides a successful proof-of-concept that skin microbiota is a promising tool in forensic death investigations.
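
    The final regressor described above (a k-nearest-neighbour regressor over taxon abundances, scored in accumulated degree days) can be sketched as follows; the abundance table, ADD values and neighbour count are synthetic placeholders rather than the study's data or tuning.

        # Sketch of a k-nearest-neighbour regressor predicting postmortem interval
        # (in accumulated degree days, ADD) from microbial taxon abundances.
        # The abundance table and ADD values are synthetic placeholders.
        import numpy as np
        from sklearn.neighbors import KNeighborsRegressor
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(5)
        abund = rng.dirichlet(np.ones(40), size=120)       # samples x genus/family abundances
        add = rng.uniform(0, 800, size=120)                # accumulated degree days (placeholder)

        knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
        pred = cross_val_predict(knn, abund, add, cv=10)
        print("mean absolute error (ADD):", np.mean(np.abs(pred - add)))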

  18. A data-driven predictive approach for drug delivery using machine learning techniques.

    Directory of Open Access Journals (Sweden)

    Yuanyuan Li

    Full Text Available In drug delivery, there is often a trade-off between effective killing of the pathogen, and harmful side effects associated with the treatment. Due to the difficulty in testing every dosing scenario experimentally, a computational approach will be helpful to assist with the prediction of effective drug delivery methods. In this paper, we have developed a data-driven predictive system, using machine learning techniques, to determine, in silico, the effectiveness of drug dosing. The system framework is scalable, autonomous, robust, and has the ability to predict the effectiveness of the current drug treatment and the subsequent drug-pathogen dynamics. The system consists of a dynamic model incorporating both the drug concentration and pathogen population into distinct states. These states are then analyzed using a temporal model to describe the drug-cell interactions over time. The dynamic drug-cell interactions are learned in an adaptive fashion and used to make sequential predictions on the effectiveness of the dosing strategy. Incorporated into the system is the ability to adjust the sensitivity and specificity of the learned models based on a threshold level determined by the operator for the specific application. As a proof-of-concept, the system was validated experimentally using the pathogen Giardia lamblia and the drug metronidazole in vitro.

  19. Remote sensing-based measurement of Living Environment Deprivation: Improving classical approaches with machine learning.

    Directory of Open Access Journals (Sweden)

    Daniel Arribas-Bel

    Full Text Available This paper provides evidence on the usefulness of very high spatial resolution (VHR) imagery in gathering socioeconomic information in urban settlements. We use land cover, spectral, structure and texture features extracted from a Google Earth image of Liverpool (UK) to evaluate their potential to predict Living Environment Deprivation at a small statistical area level. We also contribute to the methodological literature on the estimation of socioeconomic indices with remote-sensing data by introducing elements from modern machine learning. In addition to classical approaches such as Ordinary Least Squares (OLS) regression and a spatial lag model, we explore the potential of the Gradient Boost Regressor and Random Forests to improve predictive performance and accuracy. In addition to novel predicting methods, we also introduce tools for model interpretation and evaluation such as feature importance and partial dependence plots, or cross-validation. Our results show that Random Forest proved to be the best model with an R2 of around 0.54, followed by Gradient Boost Regressor with 0.5. Both the spatial lag model and the OLS fall behind with significantly lower performances of 0.43 and 0.3, respectively.
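
    The model comparison described above (an OLS baseline against Random Forest and Gradient Boosting, with cross-validation and feature importances for interpretation) might be set up roughly as below; the image-derived features and deprivation scores are synthetic placeholders.

        # Sketch comparing an OLS baseline with Random Forest and Gradient Boosting
        # regressors on image-derived features, reporting cross-validated R^2 and
        # random-forest feature importances. Features and targets are placeholders.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(6)
        X = rng.normal(size=(300, 20))                   # land cover / spectral / texture features
        y = X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=1.0, size=300)   # deprivation index (placeholder)

        models = {
            "OLS": LinearRegression(),
            "RandomForest": RandomForestRegressor(n_estimators=300, random_state=0),
            "GradientBoosting": GradientBoostingRegressor(random_state=0),
        }
        for name, model in models.items():
            print(name, "CV R^2:", cross_val_score(model, X, y, cv=5, scoring="r2").mean())

        rf = models["RandomForest"].fit(X, y)
        print("top feature importances:", np.argsort(rf.feature_importances_)[::-1][:5])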

  20. A support vector machine approach to the automatic identification of fluorescence spectra emitted by biological agents

    Science.gov (United States)

    Gelfusa, M.; Murari, A.; Lungaroni, M.; Malizia, A.; Parracino, S.; Peluso, E.; Cenciarelli, O.; Carestia, M.; Pizzoferrato, R.; Vega, J.; Gaudio, P.

    2016-10-01

    Two of the major new concerns of modern societies are biosecurity and biosafety. Several biological agents (BAs) such as toxins, bacteria, viruses, fungi and parasites are able to cause damage to living systems, whether humans, animals or plants. Optical techniques, in particular LIght Detection And Ranging (LIDAR), based on the transmission of laser pulses and analysis of the return signals, can be successfully applied to monitoring the release of biological agents into the atmosphere. It is well known that most biological agents tend to emit specific fluorescence spectra, which in principle allow their detection and identification, if excited by light of the appropriate wavelength. For these reasons, the detection of the UV Light Induced Fluorescence (UV-LIF) emitted by BAs is particularly promising. On the other hand, the stand-off detection of BAs poses a series of challenging issues; one of the most severe is the automatic discrimination between various agents which emit very similar fluorescence spectra. In this paper, a new data analysis method, based on a combination of advanced filtering techniques and Support Vector Machines, is described. The proposed approach covers all the aspects of the data analysis process, from filtering and denoising to automatic recognition of the agents. A systematic series of numerical tests has been performed to assess the potential and limits of the proposed methodology. The first investigations of experimental data have already given very encouraging results.
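
    A filter-then-classify pipeline of the general kind described (denoising of the spectra followed by an SVM) is sketched below. The Savitzky-Golay filter is only one plausible choice of smoothing, and the Gaussian-shaped spectra and agent labels are synthetic placeholders.

        # Sketch of a filter-then-classify pipeline for fluorescence spectra:
        # Savitzky-Golay smoothing (one plausible denoising choice) followed by an SVM.
        # The spectra and agent labels are synthetic placeholders.
        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(7)
        wavelengths = np.linspace(300, 600, 200)
        centers = {0: 420.0, 1: 450.0, 2: 480.0}           # three agents with similar spectra
        X, y = [], []
        for label, c in centers.items():
            for _ in range(60):
                spectrum = np.exp(-((wavelengths - c) ** 2) / (2 * 40.0 ** 2))
                X.append(spectrum + rng.normal(scale=0.2, size=wavelengths.size))
                y.append(label)
        X, y = np.array(X), np.array(y)

        X_smooth = savgol_filter(X, window_length=21, polyorder=3, axis=1)
        clf = SVC(kernel="rbf", gamma="scale")
        print("CV accuracy:", cross_val_score(clf, X_smooth, y, cv=5).mean())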

  1. A microscopy approach for in situ inspection of micro-coordinate measurement machine styli for contamination

    Science.gov (United States)

    Feng, Xiaobing; Pascal, Jonathan; Lawes, Simon

    2017-09-01

    During the process of measurement using a micro-coordinate measurement machine (µCMM) contamination gradually builds up on the surface of the stylus tip and affects the dimensional accuracy of the measurement. Regular inspection of the stylus for contamination is essential to determine the appropriate cleaning interval and prevent the dimensional error from becoming significant. However, in situ inspection of a µCMM stylus is challenging due to the size, spherical shape, material and surface properties of a typical stylus. To address this challenge, this study evaluates several non-contact measurement technologies for in situ stylus inspection and, based on those findings, proposes a cost-effective microscopy approach. The operational principle is then demonstrated by an automated prototype, coordinated directly by the CMM software MCOSMOS, with an effective threshold of detection as low as 400 nm and a large field of view and depth of field. The level of contamination on the stylus has been found to increase steadily with the number of measurement contacts made. Once excessive contamination is detected on the stylus, measurement should be stopped and a stylus cleaning procedure should be performed to avoid affecting measurement accuracy.

  2. A microscopy approach for in situ inspection of micro-coordinate measurement machine styli for contamination

    International Nuclear Information System (INIS)

    Feng, Xiaobing; Lawes, Simon; Pascal, Jonathan

    2017-01-01

    During the process of measurement using a micro-coordinate measurement machine (µCMM) contamination gradually builds up on the surface of the stylus tip and affects the dimensional accuracy of the measurement. Regular inspection of the stylus for contamination is essential to determine the appropriate cleaning interval and prevent the dimensional error from becoming significant. However, in situ inspection of a µCMM stylus is challenging due to the size, spherical shape, material and surface properties of a typical stylus. To address this challenge, this study evaluates several non-contact measurement technologies for in situ stylus inspection and, based on those findings, proposes a cost-effective microscopy approach. The operational principle is then demonstrated by an automated prototype, coordinated directly by the CMM software MCOSMOS, with an effective threshold of detection as low as 400 nm and a large field of view and depth of field. The level of contamination on the stylus has been found to increase steadily with the number of measurement contacts made. Once excessive contamination is detected on the stylus, measurement should be stopped and a stylus cleaning procedure should be performed to avoid affecting measurement accuracy. (paper)

  3. Translational approaches to understanding metabolic dysfunction and cardiovascular consequences of obstructive sleep apnea

    Science.gov (United States)

    Polotsky, Vsevolod Y.; O'Donnell, Christopher P.; Cravo, Sergio L.; Lorenzi-Filho, Geraldo; Machado, Benedito H.

    2015-01-01

    Obstructive sleep apnea (OSA) is known to be independently associated with several cardiovascular diseases including hypertension, myocardial infarction, and stroke. To determine how OSA can increase cardiovascular risk, animal models have been developed to explore the underlying mechanisms and the cellular and end-organ targets of the predominant pathophysiological disturbance in OSA–intermittent hypoxia. Despite several limitations in translating data from animal models to the clinical arena, significant progress has been made in our understanding of how OSA confers increased cardiovascular risk. It is clear now that the hypoxic stress associated with OSA can elicit a broad spectrum of pathological systemic events including sympathetic activation, systemic inflammation, impaired glucose and lipid metabolism, and endothelial dysfunction, among others. This review provides an update of the basic, clinical, and translational advances in our understanding of the metabolic dysfunction and cardiovascular consequences of OSA and highlights the most recent findings and perspectives in the field. PMID:26232233

  4. EXPLICITATION AND ADDITION TECHNIQUES IN AUDIOVISUAL TRANSLATION: A MULTIMODAL APPROACH OF ENGLISH-INDONESIAN SUBTITLES

    Directory of Open Access Journals (Sweden)

    Ichwan Suyudi

    2017-12-01

    Full Text Available In audiovisual translation, the multimodality of the audiovisual text is both a challenge and a resource for subtitlers. This paper illustrates how multi-modes provide information that helps subtitlers to gain a better understanding of meaning-making practices that will influence their decision-making in translating a certain verbal text. Subtitlers may explicitate, add, and condense the texts based on the multi-modes as seen on the visual frames. Subtitlers have to consider the distribution and integration of the meanings of multi-modes in order to create comprehensive equivalence between the source and target texts. Excerpts of visual frames in this paper are taken from the English films Forrest Gump (drama, 1996) and James Bond (thriller, 2010).

  5. Developing Evidence for Public Health Policy and Practice: The Implementation of a Knowledge Translation Approach in a Staged, Multi-Methods Study in England, 2007-09

    Science.gov (United States)

    South, Jane; Cattan, Mima

    2014-01-01

    Effective knowledge translation processes are critical for the development of evidence-based public health policy and practice. This paper reports on the design and implementation of an innovative approach to knowledge translation within a mixed methods study on lay involvement in public health programme delivery. The study design drew on…

  6. Molecular imaging of prostate cancer: translating molecular biology approaches into the clinical realm.

    Science.gov (United States)

    Vargas, Hebert Alberto; Grimm, Jan; F Donati, Olivio; Sala, Evis; Hricak, Hedvig

    2015-05-01

    The epidemiology of prostate cancer has dramatically changed since the introduction of prostate-specific antigen (PSA) screening in the 1980's. Most prostate cancers today are detected at early stages of the disease and are considered 'indolent'; however, some patients' prostate cancers demonstrate a more aggressive behaviour which leads to rapid progression and death. Increasing understanding of the biology underlying the heterogeneity that characterises this disease has led to a continuously evolving role of imaging in the management of prostate cancer. Functional and metabolic imaging techniques are gaining importance as the impact on the therapeutic paradigm has shifted from structural tumour detection alone to distinguishing patients with indolent tumours that can be managed conservatively (e.g., by active surveillance) from patients with more aggressive tumours that may require definitive treatment with surgery or radiation. In this review, we discuss advanced imaging techniques that allow direct visualisation of molecular interactions relevant to prostate cancer and their potential for translation to the clinical setting in the near future. The potential use of imaging to follow molecular events during drug therapy as well as the use of imaging agents for therapeutic purposes will also be discussed. • Advanced imaging techniques allow direct visualisation of molecular interactions in prostate cancer. • MRI/PET, optical and Cerenkov imaging facilitate the translation of molecular biology. • Multiple compounds targeting PSMA expression are currently undergoing clinical translation. • Other targets (e.g., PSA, prostate-stem cell antigen, GRPR) are in development.

  7. Chapter 16: text mining for translational bioinformatics.

    Science.gov (United States)

    Cohen, K Bretonnel; Hunter, Lawrence E

    2013-04-01

    Text mining for translational bioinformatics is a new field with tremendous research potential. It is a subfield of biomedical natural language processing that concerns itself directly with the problem of relating basic biomedical research to clinical practice, and vice versa. Applications of text mining fall both into the category of T1 translational research (translating basic science results into new interventions) and T2 translational research, or translational research for public health. Potential use cases include better phenotyping of research subjects, and pharmacogenomic research. A variety of methods for evaluating text mining applications exist, including corpora, structured test suites, and post hoc judging. Two basic principles of linguistic structure are relevant for building text mining applications. One is that linguistic structure consists of multiple levels. The other is that every level of linguistic structure is characterized by ambiguity. There are two basic approaches to text mining: rule-based, also known as knowledge-based; and machine-learning-based, also known as statistical. Many systems are hybrids of the two approaches. Shared tasks have had a strong effect on the direction of the field. Like all translational bioinformatics software, text mining software for translational bioinformatics can be considered health-critical and should be subject to the strictest standards of quality assurance and software testing.

  8. Improved Membership Probability for Moving Groups: Bayesian and Machine Learning Approaches

    Science.gov (United States)

    Lee, Jinhee; Song, Inseok

    2018-01-01

    Gravitationally unbound loose stellar associations (i.e., young nearby moving groups: moving groups hereafter) have been intensively explored because they are important in planet and disk formation studies, exoplanet imaging, and age calibration. Among the many efforts devoted to the search for moving group members, a Bayesian approach (e.g., using the code BANYAN) has become popular recently because of the many advantages it offers. However, the resultant membership probability needs to be carefully adopted because of its sensitive dependence on input models. In this study, we have developed an improved membership calculation tool focusing on the beta-Pic moving group. We made three improvements for building models used in BANYAN II: (1) updating a list of accepted members by re-assessing memberships in terms of position, motion, and age, (2) investigating member distribution functions in XYZ, and (3) exploring field star distribution functions in XYZUVW. Our improved tool can change membership probability up to 70%. Membership probability is critical and must be better defined. For example, our code identifies only one third of the candidate members in SIMBAD that are believed to be kinematically associated with the beta-Pic moving group. Additionally, we performed cluster analysis of young nearby stars using an unsupervised machine learning approach. As more moving groups and their members are identified, the complexity and ambiguity in moving group configuration have increased. To clarify this issue, we analyzed ~4,000 X-ray bright young stellar candidates. Here, we present the preliminary results. By re-identifying moving groups with the least human intervention, we expect to understand the composition of the solar neighborhood. Moreover, better-defined moving group membership will help us understand star formation and evolution in relatively low-density environments, especially for the low-mass stars that will be identified in the coming Gaia release.

  9. Other programmatic agencies in the metropolis: a machinic approach to urban reterritorialization processes

    Directory of Open Access Journals (Sweden)

    Igor Guatelli

    2013-06-01

    Full Text Available What if the strength of the architectural object were associated with program and spatial strategies engendered at the service of “habitability” and future sociabilities rather than with the building of monumental architectural gadgets and optical events in the landscape? Based on the Deleuzean (from the philosopher Gilles Deleuze) machinic phylum as well as concepts associated with it such as “bonding” and “agency,” using the Lacanian approach (from the psychiatrist Jacques Lacan) to the gadget concept and the Derridian concept (from the philosopher Jacques Derrida) of “supplement,” this article discusses a shift of the most current senses and representations of contemporary urban architectural design, historically associated with the notable (meaning the wish to be noticed) formal and composite materialization of the artistic object at the service of programmed sociabilities, towards another conceptualization. The building of architectural supports from residual (according to Deleuze, the possibility of producing other wishes, far from the dominant capitalist logic, lies in residues, in the residual flows produced by the capital itself) programmatic and spatial agencies emerges as a critical path to the categorical imperative of the generalizing global logic. It is a logic based on non-territorial landscapes and centered on investments in the composite view and intentional spatial and programmatic imprisonments in familiar formulae originating from domesticated and standardized prêt-à-utiliser thinking. To think about other architectural spatial and programmatic agencies originating from residues and flows that simultaneously rise from and escape the global logic is to bet on the chance of non-programmed sociabilities taking place. Ceasing to think about architecture as a formal object in its artistic and paradigmatic dimension would mean to conceive it as an urban syntagmatic machine of [de]constructive power

  10. TargetSpy: a supervised machine learning approach for microRNA target prediction.

    Science.gov (United States)

    Sturm, Martin; Hackenberg, Michael; Langenberger, David; Frishman, Dmitrij

    2010-05-28

    Virtually all currently available microRNA target site prediction algorithms require the presence of a (conserved) seed match to the 5' end of the microRNA. Recently however, it has been shown that this requirement might be too stringent, leading to a substantial number of missed target sites. We developed TargetSpy, a novel computational approach for predicting target sites regardless of the presence of a seed match. It is based on machine learning and automatic feature selection using a wide spectrum of compositional, structural, and base pairing features covering current biological knowledge. Our model does not rely on evolutionary conservation, which allows the detection of species-specific interactions and makes TargetSpy suitable for analyzing unconserved genomic sequences.In order to allow for an unbiased comparison of TargetSpy to other methods, we classified all algorithms into three groups: I) no seed match requirement, II) seed match requirement, and III) conserved seed match requirement. TargetSpy predictions for classes II and III are generated by appropriate postfiltering. On a human dataset revealing fold-change in protein production for five selected microRNAs our method shows superior performance in all classes. In Drosophila melanogaster not only our class II and III predictions are on par with other algorithms, but notably the class I (no-seed) predictions are just marginally less accurate. We estimate that TargetSpy predicts between 26 and 112 functional target sites without a seed match per microRNA that are missed by all other currently available algorithms. Only a few algorithms can predict target sites without demanding a seed match and TargetSpy demonstrates a substantial improvement in prediction accuracy in that class. Furthermore, when conservation and the presence of a seed match are required, the performance is comparable with state-of-the-art algorithms. TargetSpy was trained on mouse and performs well in human and drosophila
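
    The general idea of the method (compositional, structural and base-pairing features of candidate sites feeding a supervised classifier with automatic feature selection) could be skeletonised as below; the features, labels and the particular learner used here are illustrative assumptions, not TargetSpy's actual implementation.

        # Skeletal version of the feature-based target-site classification idea:
        # compositional / structural / base-pairing features feed a supervised model
        # with automatic feature selection. Features and labels are placeholders, and
        # the learner used here is illustrative, not TargetSpy's actual algorithm.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(8)
        X = rng.normal(size=(1000, 60))     # candidate-site features (composition, structure, pairing)
        y = rng.integers(0, 2, size=1000)   # 1 = functional target site, 0 = background

        clf = make_pipeline(SelectKBest(f_classif, k=20),
                            GradientBoostingClassifier(random_state=0))
        print("CV ROC AUC:", cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())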

  11. TargetSpy: a supervised machine learning approach for microRNA target prediction

    Directory of Open Access Journals (Sweden)

    Langenberger David

    2010-05-01

    Full Text Available Abstract Background Virtually all currently available microRNA target site prediction algorithms require the presence of a (conserved) seed match to the 5' end of the microRNA. Recently however, it has been shown that this requirement might be too stringent, leading to a substantial number of missed target sites. Results We developed TargetSpy, a novel computational approach for predicting target sites regardless of the presence of a seed match. It is based on machine learning and automatic feature selection using a wide spectrum of compositional, structural, and base pairing features covering current biological knowledge. Our model does not rely on evolutionary conservation, which allows the detection of species-specific interactions and makes TargetSpy suitable for analyzing unconserved genomic sequences. In order to allow for an unbiased comparison of TargetSpy to other methods, we classified all algorithms into three groups: I) no seed match requirement, II) seed match requirement, and III) conserved seed match requirement. TargetSpy predictions for classes II and III are generated by appropriate postfiltering. On a human dataset revealing fold-change in protein production for five selected microRNAs our method shows superior performance in all classes. In Drosophila melanogaster not only our class II and III predictions are on par with other algorithms, but notably the class I (no-seed) predictions are just marginally less accurate. We estimate that TargetSpy predicts between 26 and 112 functional target sites without a seed match per microRNA that are missed by all other currently available algorithms. Conclusion Only a few algorithms can predict target sites without demanding a seed match and TargetSpy demonstrates a substantial improvement in prediction accuracy in that class. Furthermore, when conservation and the presence of a seed match are required, the performance is comparable with state-of-the-art algorithms. TargetSpy was trained on

  12. Computational prediction of multidisciplinary team decision-making for adjuvant breast cancer drug therapies: a machine learning approach.

    Science.gov (United States)

    Lin, Frank P Y; Pokorny, Adrian; Teng, Christina; Dear, Rachel; Epstein, Richard J

    2016-12-01

    Multidisciplinary team (MDT) meetings are used to optimise expert decision-making about treatment options, but such expertise is not digitally transferable between centres. To help standardise medical decision-making, we developed a machine learning model designed to predict MDT decisions about adjuvant breast cancer treatments. We analysed MDT decisions regarding adjuvant systemic therapy for 1065 breast cancer cases over eight years. Machine learning classifiers with and without bootstrap aggregation were correlated with MDT decisions (recommended, not recommended, or discussable) regarding adjuvant cytotoxic, endocrine and biologic/targeted therapies, then tested for predictability using stratified ten-fold cross-validations. The predictions so derived were duly compared with those based on published (ESMO and NCCN) cancer guidelines. Machine learning more accurately predicted adjuvant chemotherapy MDT decisions than did simple application of guidelines. No differences were found between MDT- vs. ESMO/NCCN-based decisions to prescribe either adjuvant endocrine (97%, p = 0.44/0.74) or biologic/targeted therapies (98%, p = 0.82/0.59). In contrast, significant discrepancies were evident between MDT- and guideline-based decisions to prescribe chemotherapy (87%, p machine learning models. A machine learning approach based on clinicopathologic characteristics can predict MDT decisions about adjuvant breast cancer drug therapies. The discrepancy between MDT- and guideline-based decisions regarding adjuvant chemotherapy implies that certain non-clinicopathologic criteria, such as patient preference and resource availability, are factored into clinical decision-making by local experts but not captured by guidelines.
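
    The evaluation protocol described above (classifiers with and without bootstrap aggregation, assessed by stratified ten-fold cross-validation against three possible MDT decisions) might be set up as in the sketch below; the clinicopathologic features and decision labels are placeholders.

        # Sketch of the evaluation protocol described above: a base classifier with and
        # without bootstrap aggregation (bagging), assessed by stratified ten-fold
        # cross-validation. Clinicopathologic features and MDT decisions are placeholders.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.ensemble import BaggingClassifier
        from sklearn.model_selection import cross_val_score, StratifiedKFold

        rng = np.random.default_rng(9)
        X = rng.normal(size=(1065, 15))                    # clinicopathologic features (placeholder)
        y = rng.integers(0, 3, size=1065)                  # 0=recommended, 1=not recommended, 2=discussable

        cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
        base = DecisionTreeClassifier(random_state=0)
        bagged = BaggingClassifier(DecisionTreeClassifier(random_state=0), n_estimators=100, random_state=0)
        for name, clf in [("single tree", base), ("bagged trees", bagged)]:
            print(name, "10-fold accuracy:", cross_val_score(clf, X, y, cv=cv).mean())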

  13. Neuroimaging mechanisms of change in psychotherapy for addictive behaviors: emerging translational approaches that bridge biology and behavior.

    Science.gov (United States)

    Feldstein Ewing, Sarah W; Chung, Tammy

    2013-06-01

    Research on mechanisms of behavior change provides an innovative method to improve treatment for addictive behaviors. An important extension of mechanisms of change research involves the use of translational approaches, which examine how basic biological (i.e., brain-based mechanisms) and behavioral factors interact in initiating and sustaining positive behavior change as a result of psychotherapy. Articles in this special issue include integrative conceptual reviews and innovative empirical research on brain-based mechanisms that may underlie risk for addictive behaviors and response to psychotherapy from adolescence through adulthood. Review articles discuss hypothesized mechanisms of change for cognitive and behavioral therapies, mindfulness-based interventions, and neuroeconomic approaches. Empirical articles cover a range of addictive behaviors, including use of alcohol, cigarettes, marijuana, cocaine, and pathological gambling and represent a variety of imaging approaches including fMRI, magneto-encephalography, real-time fMRI, and diffusion tensor imaging. Additionally, a few empirical studies directly examine brain-based mechanisms of change, whereas others examine brain-based indicators as predictors of treatment outcome. Finally, two commentaries discuss craving as a core feature of addiction, and the importance of a developmental approach to examining mechanisms of change. Ultimately, translational research on mechanisms of behavior change holds promise for increasing understanding of how psychotherapy may modify brain structure and functioning and facilitate the initiation and maintenance of positive treatment outcomes for addictive behaviors.

  14. Smart Cutting Tools and Smart Machining: Development Approaches, and Their Implementation and Application Perspectives

    Science.gov (United States)

    Cheng, Kai; Niu, Zhi-Chao; Wang, Robin C.; Rakowski, Richard; Bateman, Richard

    2017-09-01

    Smart machining has tremendous potential and is becoming one of new generation high value precision manufacturing technologies in line with the advance of Industry 4.0 concepts. This paper presents some innovative design concepts and, in particular, the development of four types of smart cutting tools, including a force-based smart cutting tool, a temperature-based internally-cooled cutting tool, a fast tool servo (FTS) and smart collets for ultraprecision and micro manufacturing purposes. Implementation and application perspectives of these smart cutting tools are explored and discussed particularly for smart machining against a number of industrial application requirements. They are contamination-free machining, machining of tool-wear-prone Si-based infra-red devices and medical applications, high speed micro milling and micro drilling, etc. Furthermore, implementation techniques are presented focusing on: (a) plug-and-produce design principle and the associated smart control algorithms, (b) piezoelectric film and surface acoustic wave transducers to measure cutting forces in process, (c) critical cutting temperature control in real-time machining, (d) in-process calibration through machining trials, (e) FE-based design and analysis of smart cutting tools, and (f) application exemplars on adaptive smart machining.

  15. Bayesian analysis of rotating machines - A statistical approach to estimate and track the fundamental frequency

    DEFF Research Database (Denmark)

    Pedersen, Thorkild Find

    2003-01-01

    Rotating and reciprocating mechanical machines emit acoustic noise and vibrations when they operate. Typically, the noise and vibrations are concentrated in narrow frequency bands related to the running speed of the machine. The frequency of the running speed is referred to as the fundamental frequency and the related frequencies as orders of the fundamental frequency. When analyzing rotating or reciprocating machines it is important to know the running speed. Usually this requires direct access to the rotating parts in order to mount a dedicated tachometer probe. In this thesis different...
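
    As a much simpler point of reference than the Bayesian estimation and tracking developed in the thesis, the fundamental (running-speed) frequency can be picked from the vibration spectrum as below; the sampling rate, search band and synthetic signal are assumptions.

        # Naive spectral estimate of the fundamental (running-speed) frequency of a
        # rotating machine from a vibration signal. This is only a peak-picking
        # illustration, not the Bayesian estimation/tracking developed in the thesis.
        import numpy as np

        fs = 10_000.0                                   # sampling rate [Hz] (assumed)
        t = np.arange(0, 2.0, 1.0 / fs)
        f0 = 24.7                                       # true running speed [Hz] (synthetic)
        # Vibration with energy at the fundamental and its first orders, plus noise.
        signal = sum(a * np.sin(2 * np.pi * k * f0 * t) for k, a in [(1, 1.0), (2, 0.6), (3, 0.3)])
        signal += 0.5 * np.random.default_rng(10).normal(size=t.size)

        spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size)))
        freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
        band = (freqs > 5) & (freqs < 100)              # search band for the fundamental (assumed)
        print("estimated fundamental [Hz]:", freqs[band][np.argmax(spectrum[band])])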

  16. Efficient Machine Learning Approach for Optimizing Scientific Computing Applications on Emerging HPC Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Arumugam, Kamesh [Old Dominion Univ., Norfolk, VA (United States)

    2017-05-01

    the parallel implementation challenges of such irregular applications on different HPC architectures. In particular, we use supervised learning to predict the computation structure and use it to address the control-flow and memory access irregularities in the parallel implementation of such applications on GPUs, Xeon Phis, and heterogeneous architectures composed of multi-core CPUs with GPUs or Xeon Phis. We use numerical simulation of charged particle beam dynamics as a motivating example throughout the dissertation to present our new approach, though it should be equally applicable to a wide range of irregular applications. The machine learning approach presented here uses predictive analytics and forecasting techniques to adaptively model and track the irregular memory access pattern at each time step of the simulation to anticipate the future memory access pattern. Access pattern forecasts can then be used to formulate optimization decisions during application execution which improve the performance of the application at a future time step based on the observations from earlier time steps. In heterogeneous architectures, forecasts can also be used to improve the memory performance and resource utilization of all the processing units to deliver a good aggregate performance. We used these optimization techniques and anticipation strategy to design a cache-aware, memory-efficient parallel algorithm to address the irregularities in the parallel implementation of charged particle beam dynamics simulation on different HPC architectures. Experimental results using a diverse mix of HPC architectures show that our approach of using an anticipation strategy is effective in maximizing data reuse, ensuring workload balance, minimizing branch and memory divergence, and improving resource utilization.

  17. Poster abstract: A machine learning approach for vehicle classification using passive infrared and ultrasonic sensors

    KAUST Repository

    Warriach, Ehsan Ullah; Claudel, Christian G.

    2013-01-01

    This article describes the implementation of four different machine learning techniques for vehicle classification using dual ultrasonic/passive infrared traffic flow sensors. Using k-NN, Naive Bayes, SVM and KNN-SVM algorithms, we show that KNN

  18. A Two-Layer Least Squares Support Vector Machine Approach to Credit Risk Assessment

    Science.gov (United States)

    Liu, Jingli; Li, Jianping; Xu, Weixuan; Shi, Yong

    Least squares support vector machine (LS-SVM) is a revised version of the support vector machine (SVM) and has been proved to be a useful tool for pattern recognition. LS-SVM has excellent generalization performance and low computational cost. In this paper, we propose a new method called the two-layer least squares support vector machine, which combines kernel principal component analysis (KPCA) and the linear programming form of the least squares support vector machine. With this method, sparseness and robustness are obtained while handling high-dimensional, large-scale databases. A U.S. commercial credit card database is used to test the efficiency of our method, and the result proved to be a satisfactory one.
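
    The two-layer idea (kernel PCA extracting nonlinear features that feed a least-squares SVM) can be approximated with standard tools as below; a ridge classifier stands in for the LS-SVM layer, since both solve a regularised least-squares problem, and the credit data are synthetic placeholders.

        # Approximation of the two-layer idea: kernel PCA extracts nonlinear features,
        # which then feed a least-squares-style linear classifier. A ridge classifier
        # stands in for the LS-SVM layer (both solve regularised least squares); the
        # credit data are synthetic placeholders.
        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.linear_model import RidgeClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(11)
        X = rng.normal(size=(500, 30))                 # applicant attributes (placeholder)
        y = rng.integers(0, 2, size=500)               # 1 = default, 0 = non-default (placeholder)

        model = make_pipeline(StandardScaler(),
                              KernelPCA(n_components=10, kernel="rbf", gamma=0.05),
                              RidgeClassifier(alpha=1.0))
        print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())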

  19. The art and science of rotating field machines design a practical approach

    CERN Document Server

    Ostović, Vlado

    2017-01-01

    This book highlights procedures utilized by the design departments of leading global manufacturers, offering readers essential insights into the electromagnetic and thermal design of rotating field (induction and synchronous) electric machines. Further, it details the physics of the key phenomena involved in the machines’ operation, conducts a thorough analysis and synthesis of polyphase windings, and presents the tools and methods used in the evaluation of winding performance. The book develops and solves the machines’ magnetic circuits, and determines their electromagnetic forces and torques. Special attention is paid to thermal problems in electrical machines, along with fluid flow computations. With a clear emphasis on the practical aspects of electric machine design and synthesis, the author applies his nearly 40 years of professional experience with electric machine manufacturers – both as an employee and consultant – to provide readers with the tools they need to determine fluid flow parameters...

  20. Defining difficult laryngoscopy findings by using multiple parameters: A machine learning approach

    Directory of Open Access Journals (Sweden)

    Moustafa Abdelaziz Moustafa

    2017-04-01

    Conclusion: “Alex Difficult Laryngoscopy Software” (ADLS) is a machine learning program for the prediction of difficult laryngoscopy. New cases can be entered into the training set, thus improving the accuracy of the software.

  1. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining

    Directory of Open Access Journals (Sweden)

    Qiaokang Liang

    2016-11-01

    Full Text Available Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with the capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.

  2. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining.

    Science.gov (United States)

    Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin

    2016-11-16

    Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.

  3. Medical subdomain classification of clinical notes using a machine learning-based natural language processing approach

    OpenAIRE

    Weng, Wei-Hung; Wagholikar, Kavishwar B.; McCray, Alexa T.; Szolovits, Peter; Chueh, Henry C.

    2017-01-01

    Background The medical subdomain of a clinical note, such as cardiology or neurology, is useful content-derived metadata for developing machine learning downstream applications. To classify the medical subdomain of a note accurately, we have constructed a machine learning-based natural language processing (NLP) pipeline and developed medical subdomain classifiers based on the content of the note. Methods We constructed the pipeline using the clinical ...
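
    A bag-of-words baseline for the task described (TF-IDF features of the note text feeding a linear classifier per subdomain) is sketched below; the toy notes and labels are placeholders and the sketch omits the study's full NLP pipeline.

        # Bag-of-words baseline for medical subdomain classification of clinical notes:
        # TF-IDF features feed a linear classifier. The notes and labels below are toy
        # placeholders, not the study's corpus or its full NLP pipeline.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        notes = [
            "chest pain radiating to left arm, troponin elevated, started on heparin",
            "ejection fraction reduced, echocardiogram shows wall motion abnormality",
            "new onset seizures, EEG shows focal spikes, started levetiracetam",
            "progressive memory loss, MRI shows hippocampal atrophy",
        ]
        labels = ["cardiology", "cardiology", "neurology", "neurology"]

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
        clf.fit(notes, labels)
        print(clf.predict(["palpitations and atrial fibrillation on ECG"]))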

  4. Machine Learning

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Machine learning, which builds on ideas in computer science, statistics, and optimization, focuses on developing algorithms to identify patterns and regularities in data, and using these learned patterns to make predictions on new observations. Boosted by its industrial and commercial applications, the field of machine learning is quickly evolving and expanding. Recent advances have seen great success in the realms of computer vision, natural language processing, and broadly in data science. Many of these techniques have already been applied in particle physics, for instance for particle identification, detector monitoring, and the optimization of computer resources. Modern machine learning approaches, such as deep learning, are only just beginning to be applied to the analysis of High Energy Physics data to approach more and more complex problems. These classes will review the framework behind machine learning and discuss recent developments in the field.

  5. Mortality risk prediction in burn injury: Comparison of logistic regression with machine learning approaches.

    Science.gov (United States)

    Stylianou, Neophytos; Akbarov, Artur; Kontopantelis, Evangelos; Buchan, Iain; Dunn, Ken W

    2015-08-01

    Predicting mortality from burn injury has traditionally employed logistic regression models. Alternative machine learning methods have been introduced in some areas of clinical prediction as the necessary software and computational facilities have become accessible. Here we compare logistic regression and machine learning predictions of mortality from burn. An established logistic mortality model was compared to machine learning methods (artificial neural network, support vector machine, random forests and naïve Bayes) using a population-based (England & Wales) case-cohort registry. Predictive evaluation used: area under the receiver operating characteristic curve; sensitivity; specificity; positive predictive value and Youden's index. All methods had comparable discriminatory abilities, similar sensitivities, specificities and positive predictive values. Although some machine learning methods performed marginally better than logistic regression the differences were seldom statistically significant and clinically insubstantial. Random forests were marginally better for high positive predictive value and reasonable sensitivity. Neural networks yielded slightly better prediction overall. Logistic regression gives an optimal mix of performance and interpretability. The established logistic regression model of burn mortality performs well against more complex alternatives. Clinical prediction with a small set of strong, stable, independent predictors is unlikely to gain much from machine learning outside specialist research contexts. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
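
    The comparison reported above (logistic regression against machine-learning alternatives, evaluated with ROC AUC) can be reproduced in outline as below; the burn-injury predictors, outcome model and cross-validation settings are synthetic assumptions.

        # Outline of the comparison above: logistic regression versus machine-learning
        # alternatives for mortality prediction, evaluated by cross-validated ROC AUC.
        # Burn-injury predictors and outcomes are synthetic placeholders.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.naive_bayes import GaussianNB
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(12)
        X = rng.normal(size=(2000, 6))      # e.g. age, % total body surface area, inhalation injury (assumed)
        logit = 1.2 * X[:, 0] + 1.5 * X[:, 1] - 2.0
        y = (rng.random(2000) < 1 / (1 + np.exp(-logit))).astype(int)   # mortality (synthetic)

        for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                          ("random forest", RandomForestClassifier(n_estimators=300, random_state=0)),
                          ("naive Bayes", GaussianNB())]:
            auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
            print(f"{name}: ROC AUC = {auc:.3f}")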

  6. A geometric approach for fault detection and isolation of stator short circuit failure in a single asynchronous machine

    KAUST Repository

    Khelouat, Samir

    2012-06-01

    This paper deals with the problem of detection and isolation of stator short-circuit failure in a single asynchronous machine using a geometric approach. After recalling the basis of the geometric approach for fault detection and isolation in nonlinear systems, we will study some structural properties, namely fault detectability and the existence of a fault isolation filter. We will then design filters for residual generation. We will consider two approaches: a two-filter structure and a single-filter structure, both aiming at generating residuals which are sensitive to one fault and insensitive to the other faults. Some numerical tests will be presented to illustrate the efficiency of the method.

  7. A comparison of rule-based and machine learning approaches for classifying patient portal messages.

    Science.gov (United States)

    Cronin, Robert M; Fabbri, Daniel; Denny, Joshua C; Rosenbloom, S Trent; Jackson, Gretchen Purcell

    2017-09-01

    Secure messaging through patient portals is an increasingly popular way that consumers interact with healthcare providers. The increasing burden of secure messaging can affect clinic staffing and workflows. Manual management of portal messages is costly and time consuming. Automated classification of portal messages could potentially expedite message triage and delivery of care. We developed automated patient portal message classifiers with rule-based and machine learning techniques using bag of words and natural language processing (NLP) approaches. To evaluate classifier performance, we used a gold standard of 3253 portal messages manually categorized using a taxonomy of communication types (i.e., main categories of informational, medical, logistical, social, and other communications, and subcategories including prescriptions, appointments, problems, tests, follow-up, contact information, and acknowledgement). We evaluated our classifiers' accuracies in identifying individual communication types within portal messages with area under the receiver-operator curve (AUC). Portal messages often contain more than one type of communication. To predict all communication types within single messages, we used the Jaccard Index. We extracted the variables of importance for the random forest classifiers. The best performing approaches to classification for the major communication types were: logistic regression for medical communications (AUC: 0.899); basic (rule-based) for informational communications (AUC: 0.842); and random forests for social communications and logistical communications (AUCs: 0.875 and 0.925, respectively). The best performing classification approach of classifiers for individual communication subtypes was random forests for Logistical-Contact Information (AUC: 0.963). The Jaccard Indices by approach were: basic classifier, Jaccard Index: 0.674; Naïve Bayes, Jaccard Index: 0.799; random forests, Jaccard Index: 0.859; and logistic regression, Jaccard
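
    Because a single message can carry several communication types, the task is naturally multi-label; a one-vs-rest bag-of-words classifier scored with the Jaccard index is sketched below on toy messages. The taxonomy labels shown follow the abstract, but the messages and model choices are placeholders.

        # Multi-label sketch for portal-message classification: a one-vs-rest
        # bag-of-words model predicts all communication types in a message, scored
        # with the Jaccard index. Messages and labels are toy placeholders.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.multiclass import OneVsRestClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import MultiLabelBinarizer
        from sklearn.metrics import jaccard_score

        messages = ["please refill my blood pressure prescription",
                    "can I move my appointment to next Tuesday",
                    "my test results came back, what do they mean",
                    "thanks for the quick reply, see you at the appointment"]
        labels = [{"prescriptions"}, {"appointments"}, {"tests"}, {"acknowledgement", "appointments"}]

        mlb = MultiLabelBinarizer()
        Y = mlb.fit_transform(labels)
        clf = make_pipeline(CountVectorizer(), OneVsRestClassifier(LogisticRegression(max_iter=1000)))
        clf.fit(messages, Y)
        pred = clf.predict(messages)
        print("Jaccard index (samples):", jaccard_score(Y, pred, average="samples"))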

  8. An empirical comparison of different approaches for combining multimodal neuroimaging data with Support Vector Machine

    Directory of Open Access Journals (Sweden)

    William ePettersson-Yeo

    2014-07-01

    Full Text Available In the pursuit of clinical utility, neuroimaging researchers of psychiatric and neurological illness are increasingly using analyses, such as support vector machine (SVM), that allow inference at the single-subject level. Recent studies employing single-modality data, however, suggest that classification accuracies must be improved for such utility to be realised. One possible solution is to integrate different data types to provide a single combined output classification; either by generating a single decision function based on an integrated kernel matrix, or, by creating an ensemble of multiple single modality classifiers and integrating their predictions. Here, we describe four integrative approaches: (1) an un-weighted sum of kernels, (2) multi-kernel learning, (3) prediction averaging, and (4) majority voting, and compare their ability to enhance classification accuracy relative to the best single-modality classification accuracy. We achieve this by integrating structural, functional and diffusion tensor magnetic resonance imaging data, in order to compare ultra-high risk (UHR; n=19), first episode psychosis (FEP; n=19) and healthy control subjects (HCs; n=19). Our results show that (i) whilst integration can enhance classification accuracy by up to 13%, the frequency of such instances may be limited, (ii) where classification can be enhanced, simple methods may yield greater increases relative to more computationally complex alternatives, and, (iii) the potential for classification enhancement is highly influenced by the specific diagnostic comparison under consideration. In conclusion, our findings suggest that for moderately sized clinical neuroimaging datasets, combining different imaging modalities in a data-driven manner is no magic bullet for increasing classification accuracy.
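
    Two of the four integration schemes, the un-weighted sum of kernels and prediction averaging, can be sketched with scikit-learn SVMs and precomputed kernels. The snippet below uses random stand-in feature matrices for two modalities and is not the study's code.

```python
# Minimal sketch (synthetic features, not the study's data): an un-weighted sum
# of kernels feeding a single SVM, and prediction averaging over single-modality SVMs.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X_struct, X_func = rng.normal(size=(40, 30)), rng.normal(size=(40, 25))
y = np.repeat([0, 1], 20)
tr, te = np.arange(0, 30), np.arange(30, 40)

# (1) un-weighted sum of kernels -> a single decision function
K_sum = rbf_kernel(X_struct) + rbf_kernel(X_func)
clf = SVC(kernel="precomputed").fit(K_sum[np.ix_(tr, tr)], y[tr])
pred_kernel_sum = clf.predict(K_sum[np.ix_(te, tr)])

# (3) prediction averaging -> an ensemble of single-modality classifiers
scores = np.mean(
    [SVC().fit(X[tr], y[tr]).decision_function(X[te]) for X in (X_struct, X_func)],
    axis=0,
)
pred_averaged = (scores > 0).astype(int)
print(pred_kernel_sum, pred_averaged)
```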

  9. A Machine Learning Approach to Discover Rules for Expressive Performance Actions in Jazz Guitar Music

    Science.gov (United States)

    Giraldo, Sergio I.; Ramirez, Rafael

    2016-01-01

    Expert musicians introduce expression in their performances by manipulating sound properties such as timing, energy, pitch, and timbre. Here, we present a data driven computational approach to induce expressive performance rule models for note duration, onset, energy, and ornamentation transformations in jazz guitar music. We extract high-level features from a set of 16 commercial audio recordings (and corresponding music scores) of jazz guitarist Grant Green in order to characterize the expression in the pieces. We apply machine learning techniques to the resulting features to learn expressive performance rule models. We (1) quantitatively evaluate the accuracy of the induced models, (2) analyse the relative importance of the considered musical features, (3) discuss some of the learnt expressive performance rules in the context of previous work, and (4) assess their generality. The accuracies of the induced predictive models are significantly above baseline levels, indicating that the audio performances and the musical features extracted contain sufficient information to automatically learn informative expressive performance patterns. Feature analysis shows that the most important musical features for predicting expressive transformations are note duration, pitch, metrical strength, phrase position, Narmour structure, and tempo and key of the piece. Similarities and differences between the induced expressive rules and the rules reported in the literature were found. Differences may be due to the fact that most previously studied performance data has consisted of classical music recordings. Finally, the rules' performer specificity/generality is assessed by applying the induced rules to performances of the same pieces performed by two other professional jazz guitar players. Results show a consistency in the ornamentation patterns between Grant Green and the other two musicians, which may be interpreted as a good indicator for generality of the ornamentation rules.
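
    A rough sense of how interpretable expressive-performance rules can be induced is given by the sketch below, which is not the study's model: it fits a shallow decision tree on synthetic note descriptors (hypothetical feature names) and prints the learnt rules.

```python
# Minimal sketch (synthetic data, hypothetical features; not the paper's models):
# learning human-readable rules that predict whether a note is ornamented from
# score-level descriptors, then printing the induced tree as text.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.RandomState(0)
n = 300
features = np.column_stack([
    rng.uniform(0.1, 2.0, n),    # note duration (beats)
    rng.randint(40, 90, n),      # MIDI pitch
    rng.randint(1, 5, n),        # metrical strength (1 = weak, 4 = strong)
])
# synthetic rule: long notes on strong beats tend to be ornamented
ornamented = ((features[:, 0] > 1.0) & (features[:, 2] >= 3)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(features, ornamented)
print(export_text(tree, feature_names=["duration", "pitch", "metrical_strength"]))
```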

  10. A Machine Learning Approach to Discover Rules for Expressive Performance Actions in Jazz Guitar Music

    Directory of Open Access Journals (Sweden)

    Sergio Ivan Giraldo

    2016-12-01

    Full Text Available Expert musicians introduce expression in their performances by manipulating sound properties such as timing, energy, pitch, and timbre. Here, we present a data driven computational approach to induce expressive performance rule models for note duration, onset, energy, and ornamentation transformations in jazz guitar music. We extract high-level features from a set of 16 commercial audio recordings (and corresponding music scores) of jazz guitarist Grant Green in order to characterize the expression in the pieces. We apply machine learning techniques to the resulting features to learn expressive performance rule models. We (1) quantitatively evaluate the accuracy of the induced models, (2) analyse the relative importance of the considered musical features, (3) discuss some of the learnt expressive performance rules in the context of previous work, and (4) assess their generality. The accuracies of the induced predictive models are significantly above baseline levels, indicating that the audio performances and the musical features extracted contain sufficient information to automatically learn informative expressive performance patterns. Feature analysis shows that the most important musical features for predicting expressive transformations are note duration, pitch, metrical strength, phrase position, Narmour structure, and tempo and key of the piece. Similarities and differences between the induced expressive rules and the rules reported in the literature were found. Differences may be due to the fact that most previously studied performance data has consisted of classical music recordings. Finally, the rules' performer specificity/generality is assessed by applying the induced rules to performances of the same pieces performed by two other professional jazz guitar players. Results show a consistency in the ornamentation patterns between Grant Green and the other two musicians, which may be interpreted as a good indicator for generality of the ornamentation rules.

  11. A Machine Learning Approach to Discover Rules for Expressive Performance Actions in Jazz Guitar Music.

    Science.gov (United States)

    Giraldo, Sergio I; Ramirez, Rafael

    2016-01-01

    Expert musicians introduce expression in their performances by manipulating sound properties such as timing, energy, pitch, and timbre. Here, we present a data driven computational approach to induce expressive performance rule models for note duration, onset, energy, and ornamentation transformations in jazz guitar music. We extract high-level features from a set of 16 commercial audio recordings (and corresponding music scores) of jazz guitarist Grant Green in order to characterize the expression in the pieces. We apply machine learning techniques to the resulting features to learn expressive performance rule models. We (1) quantitatively evaluate the accuracy of the induced models, (2) analyse the relative importance of the considered musical features, (3) discuss some of the learnt expressive performance rules in the context of previous work, and (4) assess their generality. The accuracies of the induced predictive models are significantly above baseline levels, indicating that the audio performances and the musical features extracted contain sufficient information to automatically learn informative expressive performance patterns. Feature analysis shows that the most important musical features for predicting expressive transformations are note duration, pitch, metrical strength, phrase position, Narmour structure, and tempo and key of the piece. Similarities and differences between the induced expressive rules and the rules reported in the literature were found. Differences may be due to the fact that most previously studied performance data has consisted of classical music recordings. Finally, the rules' performer specificity/generality is assessed by applying the induced rules to performances of the same pieces performed by two other professional jazz guitar players. Results show a consistency in the ornamentation patterns between Grant Green and the other two musicians, which may be interpreted as a good indicator for generality of the ornamentation rules.

  12. Designing Green Stormwater Infrastructure for Hydrologic and Human Benefits: An Image Based Machine Learning Approach

    Science.gov (United States)

    Rai, A.; Minsker, B. S.

    2014-12-01

    Urbanization over the last century has degraded our natural water resources by increasing storm-water runoff, reducing nutrient retention, and creating poor ecosystem health downstream. The loss of tree canopy and expansion of impervious area and storm sewer systems have significantly decreased infiltration and evapotranspiration, increased stream-flow velocities, and increased flood risk. These problems have brought increasing attention to catchment-wide implementation of green infrastructure (e.g., decentralized green storm water management practices such as bioswales, rain gardens, permeable pavements, tree box filters, cisterns, urban wetlands, urban forests, stream buffers, and green roofs) to replace or supplement conventional storm water management practices and create more sustainable urban water systems. Current green infrastructure (GI) practice aims at mitigating the negative effects of urbanization by restoring pre-development hydrology and ultimately addressing water quality issues at an urban catchment scale. The benefits of green infrastructure extend well beyond local storm water management, as urban green spaces are also major contributors to human health. Considerable research in the psychological sciences has shown significant human health benefits from appropriately designed green spaces, yet impacts on human wellbeing have not yet been formally considered in GI design frameworks. This research is developing a novel computational green infrastructure (GI) design framework that integrates hydrologic requirements with criteria for human wellbeing. A supervised machine learning model is created to identify specific patterns in urban green spaces that promote human wellbeing; the model is linked to the RHESSys model to evaluate GI designs in terms of both hydrologic and human health benefits. An application of the models to Dead Run Watershed in Baltimore showed that image mining methods were able to capture key elements of human preferences that could

  13. A Machine Learning Approach to Identifying Placebo Responders in Late-Life Depression Trials.

    Science.gov (United States)

    Zilcha-Mano, Sigal; Roose, Steven P; Brown, Patrick J; Rutherford, Bret R

    2018-01-11

    Despite efforts to identify characteristics associated with medication-placebo differences in antidepressant trials, few consistent findings have emerged to guide participant selection in drug development settings and differential therapeutics in clinical practice. Limitations in the methodologies used, particularly searching for a single moderator while treating all other variables as noise, may partially explain the failure to generate consistent results. The present study tested whether interactions between pretreatment patient characteristics, rather than a single-variable solution, may better predict who is most likely to benefit from placebo versus medication. Data were analyzed from 174 patients aged 75 years and older with unipolar depression who were randomly assigned to citalopram or placebo. Model-based recursive partitioning analysis was conducted to identify the most robust significant moderators of placebo versus citalopram response. The greatest signal detection between medication and placebo in favor of medication was among patients with fewer years of education (≤12) who suffered from a longer duration of depression since their first episode (>3.47 years) (B = 2.53, t(32) = 3.01, p = 0.004). Compared with medication, placebo had the greatest response for those who were more educated (>12 years), to the point where placebo almost outperformed medication (B = -0.57, t(96) = -1.90, p = 0.06). Machine learning approaches capable of evaluating the contributions of multiple predictor variables may offer a promising methodology for identifying placebo versus medication responders. Duration of depression and education should be considered in the efforts to modulate placebo magnitude in drug development settings and in clinical practice. Copyright © 2018 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  14. Sign language recognition and translation: a multidisciplined approach from the field of artificial intelligence.

    Science.gov (United States)

    Parton, Becky Sue

    2006-01-01

    In recent years, research has progressed steadily in regard to the use of computers to recognize and render sign language. This paper reviews significant projects in the field beginning with finger-spelling hands such as "Ralph" (robotics), CyberGloves (virtual reality sensors to capture isolated and continuous signs), camera-based projects such as the CopyCat interactive American Sign Language game (computer vision), and sign recognition software (Hidden Markov Modeling and neural network systems). Avatars such as "Tessa" (Text and Sign Support Assistant; three-dimensional imaging) and spoken language to sign language translation systems such as Poland's project entitled "THETOS" (Text into Sign Language Automatic Translator, which operates in Polish; natural language processing) are addressed. The application of this research to education is also explored. The "ICICLE" (Interactive Computer Identification and Correction of Language Errors) project, for example, uses intelligent computer-aided instruction to build a tutorial system for deaf or hard-of-hearing children that analyzes their English writing and makes tailored lessons and recommendations. Finally, the article considers synthesized sign, which is being added to educational material and has the potential to be developed by students themselves.

  15. Advancing tuberculosis drug regimen development through innovative quantitative translational pharmacology methods and approaches.

    Science.gov (United States)

    Hanna, Debra; Romero, Klaus; Schito, Marco

    2017-03-01

    The development of novel tuberculosis (TB) multi-drug regimens that are more efficacious and of shorter duration requires a robust drug development pipeline. Advances in quantitative modeling and simulation can be used to maximize the utility of patient-level data from prior and contemporary clinical trials, thus optimizing study design for anti-TB regimens. This perspective article highlights the work of seven project teams developing first-in-class translational and quantitative methodologies that aim to inform drug development decision-making, dose selection, trial design, and safety assessments, in order to achieve shorter and safer therapies for patients in need. These tools offer the opportunity to evaluate multiple hypotheses and provide a means to identify, quantify, and understand relevant sources of variability, to optimize translation and clinical trial design. When incorporated into the broader regulatory sciences framework, these efforts have the potential to transform the development paradigm for TB combination development, as well as other areas of global health. Copyright © 2016. Published by Elsevier Ltd.

  16. Prediction of In-hospital Mortality in Emergency Department Patients With Sepsis: A Local Big Data-Driven, Machine Learning Approach.

    Science.gov (United States)

    Taylor, R Andrew; Pare, Joseph R; Venkatesh, Arjun K; Mowafi, Hani; Melnick, Edward R; Fleischman, William; Hall, M Kennedy

    2016-03-01

    Of the 4,222 patients in the training group, 210 (5.0%) died during hospitalization, and of the 1,056 patients in the validation group, 50 (4.7%) died during hospitalization. The AUCs with 95% confidence intervals (CIs) for the different models were as follows: random forest model, 0.86 (95% CI = 0.82 to 0.90); CART model, 0.69 (95% CI = 0.62 to 0.77); logistic regression model, 0.76 (95% CI = 0.69 to 0.82); CURB-65, 0.73 (95% CI = 0.67 to 0.80); MEDS, 0.71 (95% CI = 0.63 to 0.77); and mREMS, 0.72 (95% CI = 0.65 to 0.79). The random forest model AUC was statistically different from those of all other models (p ≤ 0.003 for all comparisons). In this proof-of-concept study, a local big data-driven, machine learning approach outperformed existing CDRs as well as traditional analytic techniques for predicting in-hospital mortality of ED patients with sepsis. Future research should prospectively evaluate the effectiveness of this approach and whether it translates into improved clinical outcomes for high-risk sepsis patients. The methods developed serve as an example of a new model for predictive analytics in emergency care that can be automated, applied to other clinical outcomes of interest, and deployed in EHRs to enable locally relevant clinical predictions. © 2015 by the Society for Academic Emergency Medicine.
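
    Model comparisons of the kind reported above are often accompanied by confidence intervals on the AUC difference. The sketch below illustrates a bootstrap interval on synthetic data; it is not the study's analysis and the variable names are placeholders.

```python
# Minimal sketch (synthetic data, not the sepsis registry): a 95% bootstrap
# confidence interval for the AUC difference between a random forest and a
# logistic regression on a held-out set.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(1)
X = rng.normal(size=(2000, 8))
y = (X[:, :3].sum(axis=1) + rng.normal(size=2000) > 1.0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

p_rf = RandomForestClassifier(n_estimators=300, random_state=1).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
p_lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

diffs = []
for _ in range(500):                          # bootstrap the held-out set
    idx = rng.randint(0, len(y_te), len(y_te))
    if len(np.unique(y_te[idx])) < 2:         # skip degenerate resamples
        continue
    diffs.append(roc_auc_score(y_te[idx], p_rf[idx]) - roc_auc_score(y_te[idx], p_lr[idx]))
print("AUC difference 95% CI:", np.percentile(diffs, [2.5, 97.5]))
```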

  17. A Comparative Study of "Google Translate" Translations: An Error Analysis of English-to-Persian and Persian-to-English Translations

    Science.gov (United States)

    Ghasemi, Hadis; Hashemian, Mahmood

    2016-01-01

    Both lack of time and the need to translate texts for numerous reasons brought about an increase in studying machine translation with a history spanning over 65 years. During the last decades, Google Translate, as a statistical machine translation (SMT), was in the center of attention for supporting 90 languages. Although there are many studies on…

  18. A Taxonomy of Human Translation Styles

    DEFF Research Database (Denmark)

    Carl, Michael; Dragsted, Barbara; Lykke Jakobsen, Arnt

    2011-01-01

    While the translation profession becomes increasingly technological, we are still far from understanding how humans actually translate and how they could be best supported by machines. In this paper we outline a method which helps to uncover characteristics of human translation processes. Based on the translators' activity data, we develop a taxonomy of translation styles. The taxonomy could serve to inform the development of advanced translation assistance tools and provide a basis for a felicitous and grounded integration of human machine interaction in translation.

  19. Pattern recognition & machine learning

    CERN Document Server

    Anzai, Y

    1992-01-01

    This is the first text to provide a unified and self-contained introduction to visual pattern recognition and machine learning. It is useful as a general introduction to artificial intelligence and knowledge engineering, and no previous knowledge of pattern recognition or machine learning is necessary. It covers the basics of various pattern recognition and machine learning methods. Translated from Japanese, the book also features chapter exercises, keywords, and summaries.

  20. An Open-Source Web-Based Tool for Resource-Agnostic Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Daniel Torregrosa

    2014-09-01

    Full Text Available We present a web-based open-source tool for interactive translation prediction (ITP) and describe its underlying architecture. ITP systems assist human translators by making context-based computer-generated suggestions as they type. Most of the ITP systems in the literature are strongly coupled with a statistical machine translation system that is conveniently adapted to provide the suggestions. Our system, however, follows a resource-agnostic approach and suggestions are obtained from any unmodified black-box bilingual resource. This paper reviews our ITP method and describes the architecture of Forecat, a web tool, partly based on the recent technology of web components, that eases the use of our ITP approach in any web application requiring this kind of translation assistance. We also evaluate the performance of our method when using an unmodified Moses-based statistical machine translation system as the bilingual resource.
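
    The resource-agnostic idea can be reduced to a very small sketch: query an unmodified black-box bilingual resource for candidate target phrases and filter them against the prefix the translator has typed. The code below only illustrates that idea and is not Forecat's architecture; the glossary and function names are invented for the example.

```python
# Minimal illustrative sketch (not Forecat's actual code): suggestions are taken
# from any black-box bilingual resource and filtered by the typed prefix.
def suggest(source_segment, typed_prefix, bilingual_resource):
    """Return suggestions compatible with what the translator has typed so far."""
    candidates = bilingual_resource(source_segment)        # black-box lookup
    return [c for c in candidates if c.startswith(typed_prefix) and c != typed_prefix]

# Hypothetical black-box resource: here just a dictionary-backed function.
glossary = {"machine translation": ["traducción automática", "traducción mecánica"]}
resource = lambda seg: glossary.get(seg, [])

print(suggest("machine translation", "traducción a", resource))
# -> ['traducción automática']
```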

  1. Optimization of classification and regression analysis of four monoclonal antibodies from Raman spectra using collaborative machine learning approach.

    Science.gov (United States)

    Le, Laetitia Minh Maï; Kégl, Balázs; Gramfort, Alexandre; Marini, Camille; Nguyen, David; Cherti, Mehdi; Tfaili, Sana; Tfayli, Ali; Baillet-Guffroy, Arlette; Prognon, Patrice; Chaminade, Pierre; Caudron, Eric

    2018-07-01

    The use of monoclonal antibodies (mAbs) constitutes one of the most important strategies to treat patients suffering from cancers such as hematological malignancies and solid tumors. These antibodies are prescribed by the physician and prepared by hospital pharmacists. An analytical control enables the quality of the preparations to be ensured. The aim of this study was to explore the development of a rapid analytical method for quality control. The method used four mAbs (Infliximab, Bevacizumab, Rituximab and Ramucirumab) at various concentrations and was based on recording Raman data and coupling them to a traditional chemometric and machine learning approach for data analysis. Compared to a conventional linear approach, prediction errors are reduced with a data-driven approach using statistical machine learning methods, in which preprocessing and predictive models are jointly optimized. An additional original aspect of the work involved submitting the problem to a collaborative data challenge platform called Rapid Analytics and Model Prototyping (RAMP), which allowed solutions from about 300 data scientists to be used in collaborative work. Using machine learning, the prediction of the four mAbs samples was considerably improved. The best predictive model showed a combined error of 2.4% versus 14.6% using the linear approach. The concentration and classification errors were 5.8% and 0.7%; only three of the 429 test-set spectra were misclassified. This large improvement obtained with machine learning techniques was uniform for all molecules but maximal for Bevacizumab, with an 88.3% reduction in combined errors (2.1% versus 17.9%). Copyright © 2018 Elsevier B.V. All rights reserved.
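
    The notion of jointly optimising preprocessing and the predictive model can be illustrated with a cross-validated grid search over a preprocessing-plus-regression pipeline. The sketch below uses synthetic spectra and is not the challenge-winning solution.

```python
# Minimal sketch (synthetic spectra; not the study's models): jointly tuning
# preprocessing (PCA dimension) and the regressor (ridge penalty) by grid search.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.RandomState(0)
spectra = rng.normal(size=(200, 600))             # 200 Raman spectra, 600 wavenumbers
concentration = spectra[:, :50].mean(axis=1) * 10 + rng.normal(scale=0.1, size=200)

pipe = Pipeline([("scale", StandardScaler()), ("pca", PCA()), ("model", Ridge())])
grid = GridSearchCV(
    pipe,
    {"pca__n_components": [5, 10, 20], "model__alpha": [0.1, 1.0, 10.0]},
    cv=5, scoring="neg_mean_absolute_error",
)
grid.fit(spectra, concentration)
print(grid.best_params_, -grid.best_score_)
```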

  2. The Scenario Approach to the Development of Strategy of Prevention of Raider Seizure for Machine-Building Enterprise

    Directory of Open Access Journals (Sweden)

    Momot Tetiana V.

    2017-12-01

    Full Text Available The article proposes a methodical approach to selecting and substantiating the efficiency of managerial decisions aimed at ensuring economic security when counteracting raiding, based on intelligent instrumental analysis. Management alternatives are ranked on the basis of the obtained weighted estimates and their fuzzy composition. A graphical interpretation of the membership functions of the computed fuzzy expected utilities of the management alternatives for machine-building enterprises has been constructed and is presented.

  3. Pre-clinical research in small animals using radiotherapy technology. A bidirectional translational approach

    International Nuclear Information System (INIS)

    Tillner, Falk; Buetof, Rebecca; Krause, Mechthild; Enghardt, Wolfgang; Helmholtz-Zentrum Dresden-Rossendorf, Dresden; Technische Univ. Dresden; Helmholtz-Zentrum Dresden-Rossendorf, Dresden

    2014-01-01

    For translational cancer research, pre-clinical in-vivo studies using small animals have become indispensable in bridging the gap between in-vitro cell experiments and clinical implementation. When setting up such small animal experiments, various biological, technical and methodical aspects have to be considered. In this work we present a comprehensive topical review based on relevant publications on irradiation techniques used for pre-clinical cancer research in mice and rats. Clinical radiotherapy treatment devices for the application of external beam radiotherapy and brachytherapy as well as dedicated research irradiation devices are feasible for small animal irradiation depending on the animal model and the experimental goals. In this work, appropriate solutions for the technological transfer of human radiation oncology to small animal radiation research are summarised. Additionally, important information concerning the experimental design is provided such that reliable and clinically relevant results can be attained.

  4. Translating tDCS into the field of obesity: mechanism-driven approaches

    Directory of Open Access Journals (Sweden)

    Miguel eAlonso-Alonso

    2013-08-01

    Full Text Available Transcranial direct current stimulation (tDCS) is emerging as a promising technique for neuromodulation in a variety of clinical conditions. Recent neuroimaging studies suggest that modifying the activity of brain circuits involved in eating behavior could provide therapeutic benefits in obesity. One session of tDCS over the dorsolateral prefrontal cortex can induce an acute decrease in food craving, according to three small clinical trials, but the extension of these findings into the field of obesity remains unexplored. Importantly, there has been little/no interaction of our current understanding of tDCS and its mechanisms with obesity-related research. How can we start closing this gap and rationally guide the translation of tDCS into the field of obesity? In this mini-review I summarize some of the challenges and questions ahead, related to basic science and technical aspects, and suggest future directions.

  5. Pre-clinical research in small animals using radiotherapy technology--a bidirectional translational approach.

    Science.gov (United States)

    Tillner, Falk; Thute, Prasad; Bütof, Rebecca; Krause, Mechthild; Enghardt, Wolfgang

    2014-12-01

    For translational cancer research, pre-clinical in-vivo studies using small animals have become indispensable in bridging the gap between in-vitro cell experiments and clinical implementation. When setting up such small animal experiments, various biological, technical and methodical aspects have to be considered. In this work we present a comprehensive topical review based on relevant publications on irradiation techniques used for pre-clinical cancer research in mice and rats. Clinical radiotherapy treatment devices for the application of external beam radiotherapy and brachytherapy as well as dedicated research irradiation devices are feasible for small animal irradiation depending on the animal model and the experimental goals. In this work, appropriate solutions for the technological transfer of human radiation oncology to small animal radiation research are summarised. Additionally, important information concerning the experimental design is provided such that reliable and clinically relevant results can be attained. Copyright © 2014. Published by Elsevier GmbH.

  6. Pre-clinical research in small animals using radiotherapy technology. A bidirectional translational approach

    Energy Technology Data Exchange (ETDEWEB)

    Tillner, Falk; Buetof, Rebecca [Technische Univ. Dresden (Germany). OncoRay - National Center for Radiation Research in Oncology; Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Technische Univ. Dresden (Germany). Dept. of Radiation Oncology; Thute, Prasad [Technische Univ. Dresden (Germany). OncoRay - National Center for Radiation Research in Oncology; Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Krause, Mechthild [Technische Univ. Dresden (Germany). OncoRay - National Center for Radiation Research in Oncology; Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Technische Univ. Dresden (Germany). Dept. of Radiation Oncology; German Cancer Consortium (DKTK), Dresden (Germany); German Cancer Research Center (DKFZ), Heidelberg (Germany); Enghardt, Wolfgang [Technische Univ. Dresden (Germany). OncoRay - National Center for Radiation Research in Oncology; Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Technische Univ. Dresden (Germany). Dept. of Radiation Oncology; Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany). Inst. of Radiooncology

    2014-07-01

    For translational cancer research, pre-clinical in-vivo studies using small animals have become indispensable in bridging the gap between in-vitro cell experiments and clinical implementation. When setting up such small animal experiments, various biological, technical and methodical aspects have to be considered. In this work we present a comprehensive topical review based on relevant publications on irradiation techniques used for pre-clinical cancer research in mice and rats. Clinical radiotherapy treatment devices for the application of external beam radiotherapy and brachytherapy as well as dedicated research irradiation devices are feasible for small animal irradiation depending on the animal model and the experimental goals. In this work, appropriate solutions for the technological transfer of human radiation oncology to small animal radiation research are summarised. Additionally, important information concerning the experimental design is provided such that reliable and clinically relevant results can be attained.

  7. Investigation of High-Speed Cryogenic Machining Based on Finite Element Approach

    Directory of Open Access Journals (Sweden)

    Pooyan Vahidi Pashaki

    Full Text Available Abstract The simulation of the cryogenic machining process has rarely been studied with the finite element method because it requires a three-dimensional model and long process durations. In this study, to overcome this limitation, a 2.5D finite element model of the cryogenic machining process has been developed using the commercial finite element software ABAQUS, and the chip formation procedure was investigated under more realistic assumptions. In the proposed method, liquid nitrogen is used as the coolant. Coulomb's law was used to model friction at the tool-chip interface. The Johnson-Cook model was used to describe the plastic behavior and the failure criterion and, unlike previous investigations, thermal and mechanical material properties were supplied to the software as functions of temperature. After verifying the accuracy of the model against available experimental data, the effects of parameters such as rake angle and cutting speed were studied for both cryogenic and dry machining of an aluminium alloy using a coupled dynamic temperature solution. Results indicated that at a cutting velocity of 10 m/s, cryogenic cooling reduced the tool temperature by 60 percent compared with dry machining. Furthermore, the chip produced by cryogenic machining was continuous and free of fracture, in contrast to dry machining.
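
    The Johnson-Cook flow stress mentioned above has a standard closed form. The sketch below implements it with illustrative constants (not the paper's calibration) to show how temperature and strain rate enter the plasticity model.

```python
# Minimal sketch (illustrative constants, not the paper's calibration): the
# Johnson-Cook flow stress with strain-hardening, strain-rate and thermal terms.
import numpy as np

def johnson_cook_stress(strain, strain_rate, T,
                        A=352e6, B=440e6, n=0.42, C=0.0083, m=1.0,
                        ref_rate=1.0, T_room=293.0, T_melt=925.0):
    """Flow stress (Pa) = (A + B*eps^n) * (1 + C*ln(rate/rate0)) * (1 - T*^m)."""
    T_star = np.clip((T - T_room) / (T_melt - T_room), 0.0, 1.0)
    return (A + B * strain**n) * (1.0 + C * np.log(strain_rate / ref_rate)) * (1.0 - T_star**m)

# Example: stress at 10% plastic strain and 1e4 1/s, cryogenically cooled vs hot shear zone
print(johnson_cook_stress(0.10, 1e4, T=150.0))   # cold (cryogenic) region
print(johnson_cook_stress(0.10, 1e4, T=600.0))   # hot shear zone, as in dry cutting
```
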

  8. Virtual screening by a new Clustering-based Weighted Similarity Extreme Learning Machine approach.

    Science.gov (United States)

    Pasupa, Kitsuchart; Kudisthalert, Wasu

    2018-01-01

    Machine learning techniques are becoming popular in virtual screening tasks. One of the powerful machine learning algorithms is the Extreme Learning Machine (ELM), which has been applied to many applications and has recently been applied to virtual screening. We propose the Weighted Similarity ELM (WS-ELM), which is based on a single layer feed-forward neural network in conjunction with 16 different similarity coefficients as activation functions in the hidden layer. It is known that the performance of conventional ELM is not robust due to random weight selection in the hidden layer. Thus, we propose a Clustering-based WS-ELM (CWS-ELM) that deterministically assigns weights by utilising clustering algorithms, i.e. k-means clustering and support vector clustering. The experiments were conducted on one of the most challenging datasets, the Maximum Unbiased Validation Dataset, which contains 17 activity classes carefully selected from PubChem. The proposed algorithms were then compared with other machine learning techniques such as support vector machine, random forest, and similarity searching. The results show that CWS-ELM in conjunction with support vector clustering yields the best performance when utilised together with the Sokal/Sneath(1) coefficient. Furthermore, the ECFP_6 fingerprint presents the best results in our framework compared to the other types of fingerprints, namely ECFP_4, FCFP_4, and FCFP_6.
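
    A plain Extreme Learning Machine, the starting point for the WS-ELM and CWS-ELM variants proposed here, can be written in a few lines: random fixed hidden-layer weights, a nonlinear activation, and output weights obtained in closed form with a pseudoinverse. The sketch below uses synthetic data and is not the paper's weighted-similarity version.

```python
# Minimal sketch of a plain ELM (not the WS-ELM/CWS-ELM variants): random hidden
# weights, tanh activation, and a closed-form least-squares output layer.
import numpy as np

class SimpleELM:
    def __init__(self, n_hidden=200, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.RandomState(seed)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))  # random, fixed
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                # hidden-layer activations
        self.beta = np.linalg.pinv(H) @ y               # closed-form output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

rng = np.random.RandomState(1)
X = rng.normal(size=(500, 64))                          # e.g. molecular fingerprints in practice
y = (X[:, 0] * X[:, 1] > 0).astype(float)               # synthetic activity label
scores = SimpleELM().fit(X[:400], y[:400]).predict(X[400:])
print("accuracy:", ((scores > 0.5) == y[400:]).mean())
```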

  9. The reflection of evolving bearing faults in the stator current's extended park vector approach for induction machines

    Science.gov (United States)

    Corne, Bram; Vervisch, Bram; Derammelaere, Stijn; Knockaert, Jos; Desmet, Jan

    2018-07-01

    Stator current analysis has the potential of becoming the most cost-effective condition monitoring technology regarding electric rotating machinery. Since both electrical and mechanical faults are detected by inexpensive and robust current-sensors, measuring current is advantageous over other techniques such as vibration, acoustic or temperature analysis. However, this technology is struggling to break into the market of condition monitoring as the electrical interpretation of mechanical machine-problems is highly complicated. Recently, the authors built a test-rig which facilitates the emulation of several representative mechanical faults on an 11 kW induction machine with high accuracy and reproducibility. Operating this test-rig, the stator current of the induction machine under test can be analyzed while mechanical faults are emulated. Furthermore, while emulating, the fault-severity can be manipulated adaptively under controllable environmental conditions. This creates the opportunity of examining the relation between the magnitude of the well-known current fault components and the corresponding fault-severity. This paper presents the emulation of evolving bearing faults and their reflection in the Extended Park Vector Approach for the 11 kW induction machine under test. The results confirm the strong relation between the bearing faults and the stator current fault components in both identification and fault-severity. Conclusively, stator current analysis increases reliability in the application as a complete, robust, on-line condition monitoring technology.
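
    The Extended Park Vector Approach itself is compact: build the Park vector from the three stator currents and inspect the spectrum of its modulus, where fault-related lines appear. The sketch below uses synthetic currents with a hypothetical 78 Hz bearing-fault modulation, not the test-rig data.

```python
# Minimal sketch (synthetic currents, hypothetical fault frequency): the Extended
# Park Vector Approach as the spectrum of the Park vector modulus.
import numpy as np

fs, f_supply, t = 10_000, 50.0, np.arange(0, 2.0, 1 / 10_000)
phase = 2 * np.pi * f_supply * t
# healthy three-phase currents plus a small fault-related amplitude modulation
mod = 1 + 0.02 * np.cos(2 * np.pi * 78.0 * t)        # assumed bearing-fault frequency
ia, ib, ic = (mod * np.cos(phase - k * 2 * np.pi / 3) for k in range(3))

# Park transform in the form commonly used in EPVA studies
i_d = np.sqrt(2 / 3) * ia - ib / np.sqrt(6) - ic / np.sqrt(6)
i_q = (ib - ic) / np.sqrt(2)
modulus = np.sqrt(i_d**2 + i_q**2)

spectrum = np.abs(np.fft.rfft(modulus - modulus.mean()))
freqs = np.fft.rfftfreq(len(modulus), d=1 / fs)
print("dominant fault-related line near:", freqs[np.argmax(spectrum)], "Hz")
```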

  10. Clinical translation and regulatory aspects of CAR/TCR-based adoptive cell therapies-the German Cancer Consortium approach.

    Science.gov (United States)

    Krackhardt, Angela M; Anliker, Brigitte; Hildebrandt, Martin; Bachmann, Michael; Eichmüller, Stefan B; Nettelbeck, Dirk M; Renner, Matthias; Uharek, Lutz; Willimsky, Gerald; Schmitt, Michael; Wels, Winfried S; Schüssler-Lenz, Martina

    2018-04-01

    Adoptive transfer of T cells genetically modified by TCRs or CARs represents a highly attractive novel therapeutic strategy to treat malignant diseases. Various approaches for the development of such gene therapy medicinal products (GTMPs) have been initiated by scientists in recent years. To date, however, the number of clinical trials commenced in Germany and Europe is still low. Several hurdles may contribute to the delay in clinical translation of these therapeutic innovations including the significant complexity of manufacture and non-clinical testing of these novel medicinal products, the limited knowledge about the intricate regulatory requirements of the academic developers as well as limitations of funds for clinical testing. A suitable good manufacturing practice (GMP) environment is a key prerequisite and platform for the development, validation, and manufacture of such cell-based therapies, but may also represent a bottleneck for clinical translation. The German Cancer Consortium (DKTK) and the Paul-Ehrlich-Institut (PEI) have initiated joint efforts of researchers and regulators to facilitate and advance early phase, academia-driven clinical trials. Starting with a workshop held in 2016, stakeholders from academia and regulatory authorities in Germany have entered into continuing discussions on a diversity of scientific, manufacturing, and regulatory aspects, as well as the benefits and risks of clinical application of CAR/TCR-based cell therapies. This review summarizes the current state of discussions of this cooperative approach providing a basis for further policy-making and suitable modification of processes.

  11. On Literal Translation of English Idioms

    Science.gov (United States)

    Chen, Linli

    2009-01-01

    There are six translation tactics in translating English idioms into Chinese: literal translation, compensatory translation, free translation, explanational translation, borrowing, integrated approach. Each tactic should be reasonably employed in the process of translating, so as to keep the flavor of the original English idioms as well as to…

  12. Quality assurance of a helical tomotherapy machine

    International Nuclear Information System (INIS)

    Fenwick, J D; Tome, W A; Jaradat, H A; Hui, S K; James, J A; Balog, J P; DeSouza, C N; Lucas, D B; Olivera, G H; Mackie, T R; Paliwal, B R

    2004-01-01

    Helical tomotherapy has been developed at the University of Wisconsin, and 'Hi-Art II' clinical machines are now commercially manufactured. At the core of each machine lies a ring-gantry-mounted short linear accelerator which generates x-rays that are collimated into a fan beam of intensity-modulated radiation by a binary multileaf, the modulation being variable with gantry angle. Patients are treated lying on a couch which is translated continuously through the bore of the machine as the gantry rotates. Highly conformal dose-distributions can be delivered using this technique, which is the therapy equivalent of spiral computed tomography. The approach requires synchrony of gantry rotation, couch translation, accelerator pulsing and the opening and closing of the leaves of the binary multileaf collimator used to modulate the radiation beam. In the course of clinically implementing helical tomotherapy, we have developed a quality assurance (QA) system for our machine. The system is analogous to that recommended for conventional clinical linear accelerator QA by AAPM Task Group 40 but contains some novel components, reflecting differences between the Hi-Art devices and conventional clinical accelerators. Here the design and dosimetric characteristics of Hi-Art machines are summarized and the QA system is set out along with experimental details of its implementation. Connections between this machine-based QA work, pre-treatment patient-specific delivery QA and fraction-by-fraction dose verification are discussed

  13. Numerical approach for optimum electromagnetic parameters of electrical machines used in vehicle traction applications

    International Nuclear Information System (INIS)

    Fodorean, D.; Giurgea, S.; Djerdir, A.; Miraoui, A.

    2009-01-01

    A large speed variation range is an essential requirement in the automobile industry. In order to compete with diesel engines, the flux weakening technique has to be employed on the electrical machines. In this way, appropriate electromagnetic and geometrical parameters can give the desired speed. Using the inverse problem method coupled with numerical analysis by the finite element method (FEM), the authors propose an optimal parameter configuration that maximizes the speed operation domain. Several types of electrical machines are under study: induction, synchronous permanent magnet, variable reluctance and transverse flux machines. With a proper non-linear model, using analytical and numerical calculation, the authors propose an optimum solution for the speed variation of the studied drives, which serves as the basis for a final comparison.

  14. A FLEXIBLE APPROACH TO THE BENEFICIAL USE OF MACHINING POWER IN MILLING PROCESS

    Directory of Open Access Journals (Sweden)

    Faruk MENDİ

    2000-01-01

    Full Text Available In this study, a computer program has been developed to calculate the cutting speed (Vc) and depth of cut (a) under cutting conditions determined by the user, in order to make maximum use of the various milling machines currently in service. Within the scope of the study, data related to the cutting speed (Vc) and depth of cut (a) for St 37 steel and the power ratings of the machines have been obtained. The aim is to make maximum use of the machines by choosing the cutting speed (Vc) and depth of cut (a) with the help of these data. The effect of changes in cutting speed (Vc) and depth of cut (a) on productivity has also been studied.
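
    A textbook-style check of whether chosen cutting parameters fit within the available spindle power is sketched below. The specific cutting force, efficiency and machine power are assumptions for illustration, not values from the study.

```python
# Minimal sketch (illustrative values only, not the study's program): estimating
# net milling power and checking it against the machine's available power.
def milling_power_kw(depth_of_cut_mm, width_of_cut_mm, feed_mm_per_min,
                     kc_n_per_mm2=2100.0, efficiency=0.8):
    """Power estimate: Pc = (ap * ae * vf * kc) / (60e6 * efficiency), in kW."""
    return depth_of_cut_mm * width_of_cut_mm * feed_mm_per_min * kc_n_per_mm2 / (60e6 * efficiency)

machine_power_kw = 7.5                     # assumed available spindle power
for depth in (1.0, 2.0, 4.0, 6.0):
    p = milling_power_kw(depth, width_of_cut_mm=40.0, feed_mm_per_min=300.0)
    print(f"ap={depth} mm -> {p:.2f} kW", "OK" if p <= machine_power_kw else "exceeds machine power")
```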

  15. Self-commissioning of permanent magnet synchronous machine drives using hybrid approach

    DEFF Research Database (Denmark)

    Basar, Mehmet Sertug

    2016-01-01

    Self-commissioning of permanent-magnet (PM) synchronous machines (PMSMs) is of prime importance in an industrial drive system because control performance and system stability depend heavily on accurate machine parameter information. This article focuses on a combination of offline and online parameter estimation for a non-salient pole PMSM which eliminates the need for any prior knowledge of machine parameters. Stator resistance and inductance are first identified at standstill utilising fundamental and high-frequency excitation signals, respectively. A novel method has been developed and employed for inductance estimation. Then, stator resistance, inductance and PM flux are updated online using a recursive least-squares (RLS) algorithm. The proposed controllers are designed using MATLAB/Simulink® and implemented on a dSPACE® real-time system incorporating a commercially available PMSM drive.
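
    The online stage described above rests on a recursive least-squares update. The sketch below shows RLS tracking a resistance-inductance pair on a simulated plant; it is a simplified, discrete-time illustration and not the article's drive code.

```python
# Minimal sketch (simplified R-L plant, not the article's drive): recursive
# least squares with a forgetting factor tracking resistance and inductance.
import numpy as np

rng = np.random.RandomState(0)
dt, R_true, L_true = 1e-4, 0.8, 3e-3
theta = np.zeros(2)                     # estimated [R, L]
P = np.eye(2) * 1e3                     # covariance
lam = 0.999                             # forgetting factor

i_prev = 0.0
for k in range(5000):
    v = np.sin(2 * np.pi * 50 * k * dt) + 0.3 * np.sin(2 * np.pi * 500 * k * dt)
    i = i_prev + dt / L_true * (v - R_true * i_prev)          # simulated R-L plant (Euler)
    di = (i - i_prev) / dt
    phi = np.array([i_prev, di])                              # regressor: v = R*i + L*di/dt
    y = v + rng.normal(scale=1e-3)                            # noisy voltage measurement
    K = P @ phi / (lam + phi @ P @ phi)                       # RLS gain
    theta = theta + K * (y - phi @ theta)
    P = (P - np.outer(K, phi @ P)) / lam
    i_prev = i

print("estimated R, L:", theta)          # should approach [0.8, 3e-3]
```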

  16. Expression, Purification, and Analysis of Unknown Translation Factors from "Escherichia Coli": A Synthesis Approach

    Science.gov (United States)

    Walter, Justin D.; Littlefield, Peter; Delbecq, Scott; Prody, Gerry; Spiegel, P. Clint

    2010-01-01

    New approaches are currently being developed to expose biochemistry and molecular biology undergraduates to a more interactive learning environment. Here, we propose a unique project-based laboratory module, which incorporates exposure to biophysical chemistry approaches to address problems in protein chemistry. Each of the experiments described…

  17. Prediction of breast cancer risk using a machine learning approach embedded with a locality preserving projection algorithm

    Science.gov (United States)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Hollingsworth, Alan B.; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin

    2018-02-01

    In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate advantages of applying a machine learning approach embedded with a locality preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset involving negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes of 250 high risk cases in which cancer was detected in the next subsequent mammography screening and 250 low risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between left and right breasts. Next, a multi-feature fusion based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with an LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increased trend of adjusted odds ratios was also detected, in which odds ratios increased from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality, and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.
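
    Locality preserving projection reduces the feature set by solving a generalized eigenproblem built from a neighbourhood graph. The sketch below is a bare-bones LPP on synthetic data (it omits the paper's feature combination and regeneration steps) that projects 44 hypothetical features onto a 4-dimensional subspace.

```python
# Minimal sketch of locality preserving projection (synthetic data, not the
# paper's implementation): heat-kernel neighbourhood graph, then the generalized
# eigenproblem X^T L X a = lambda X^T D X a, keeping the smallest eigenvalues.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

rng = np.random.RandomState(0)
X = rng.normal(size=(500, 44))                     # 500 cases, 44 asymmetry features

dist = cdist(X, X)
t = np.median(dist) ** 2
W = np.exp(-dist**2 / t)                           # heat-kernel weights
k = 10
mask = dist <= np.sort(dist, axis=1)[:, [k]]       # keep k nearest neighbours per row
W = W * np.maximum(mask, mask.T)                   # symmetrise the graph
D = np.diag(W.sum(axis=1))
L = D - W

A, B = X.T @ L @ X, X.T @ D @ X + 1e-6 * np.eye(X.shape[1])
eigvals, eigvecs = eigh(A, B)                      # ascending eigenvalues
projection = X @ eigvecs[:, :4]                    # new 4-feature representation
print(projection.shape)                            # (500, 4)
```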

  18. Vibration Monitoring of Gas Turbine Engines: Machine-Learning Approaches and Their Challenges

    Directory of Open Access Journals (Sweden)

    Ioannis Matthaiou

    2017-09-01

    Full Text Available In this study, condition monitoring strategies are examined for gas turbine engines using vibration data. The focus is on data-driven approaches; for this reason a novelty detection framework is considered for the development of reliable data-driven models that can describe the underlying relationships of the processes taking place during an engine's operation. From a data analysis perspective, the high dimensionality of features extracted and the data complexity are two problems that need to be dealt with throughout analyses of this type. The latter refers to the fact that the healthy engine state data can be non-stationary. To address this, the implementation of the wavelet transform is examined to get a set of features from vibration signals that describe the non-stationary parts. The problem of high dimensionality of the features is addressed by "compressing" them using kernel principal component analysis so that more meaningful, lower-dimensional features can be used to train the pattern recognition algorithms. For feature discrimination, a novelty detection scheme that is based on the one-class support vector machine (OCSVM) algorithm is chosen for investigation. The main advantage, when compared to other pattern recognition algorithms, is that the learning problem is cast as a quadratic program. The developed condition monitoring strategy can be applied for detecting excessive vibration levels that can lead to engine component failure. Here, we demonstrate its performance on vibration data from an experimental gas turbine engine operating under different conditions. Engine vibration data that are designated as belonging to the engine's "normal" condition correspond to fuel and air-to-fuel ratio combinations in which the engine experienced low levels of vibration. Results demonstrate that such novelty detection schemes can achieve a satisfactory validation accuracy through appropriate selection of two parameters of the
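
    The monitoring chain described above, features compressed with kernel PCA and a one-class SVM trained on healthy data, can be sketched directly with scikit-learn. The snippet below uses random stand-in features rather than engine vibration data.

```python
# Minimal sketch (synthetic features, not the engine data): kernel PCA
# compression followed by a one-class SVM novelty detector trained on "normal" runs.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
normal = rng.normal(size=(300, 100))               # stand-in for wavelet-based features, healthy runs
faulty = rng.normal(loc=1.5, size=(20, 100))       # elevated-vibration condition

kpca = KernelPCA(n_components=10, kernel="rbf", gamma=1e-3).fit(normal)
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(kpca.transform(normal))

print("flagged as novel (normal set):", (ocsvm.predict(kpca.transform(normal)) == -1).mean())
print("flagged as novel (faulty set):", (ocsvm.predict(kpca.transform(faulty)) == -1).mean())
```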

  19. An empirical comparison of different approaches for combining multimodal neuroimaging data with support vector machine.

    Science.gov (United States)

    Pettersson-Yeo, William; Benetti, Stefania; Marquand, Andre F; Joules, Richard; Catani, Marco; Williams, Steve C R; Allen, Paul; McGuire, Philip; Mechelli, Andrea

    2014-01-01

    In the pursuit of clinical utility, neuroimaging researchers of psychiatric and neurological illness are increasingly using analyses, such as support vector machine, that allow inference at the single-subject level. Recent studies employing single-modality data, however, suggest that classification accuracies must be improved for such utility to be realized. One possible solution is to integrate different data types to provide a single combined output classification; either by generating a single decision function based on an integrated kernel matrix, or, by creating an ensemble of multiple single modality classifiers and integrating their predictions. Here, we describe four integrative approaches: (1) an un-weighted sum of kernels, (2) multi-kernel learning, (3) prediction averaging, and (4) majority voting, and compare their ability to enhance classification accuracy relative to the best single-modality classification accuracy. We achieve this by integrating structural, functional, and diffusion tensor magnetic resonance imaging data, in order to compare ultra-high risk (n = 19), first episode psychosis (n = 19) and healthy control subjects (n = 23). Our results show that (i) whilst integration can enhance classification accuracy by up to 13%, the frequency of such instances may be limited, (ii) where classification can be enhanced, simple methods may yield greater increases relative to more computationally complex alternatives, and, (iii) the potential for classification enhancement is highly influenced by the specific diagnostic comparison under consideration. In conclusion, our findings suggest that for moderately sized clinical neuroimaging datasets, combining different imaging modalities in a data-driven manner is no "magic bullet" for increasing classification accuracy. However, it remains possible that this conclusion is dependent on the use of neuroimaging modalities that had little, or no, complementary information to offer one another, and that the

  20. Efficient approach to simulate EM loads on massive structures in ITER machine

    Energy Technology Data Exchange (ETDEWEB)

    Alekseev, A. [ITER Organization, Route de Vinon sur Verdon, 13115 St. Paul-Lez-Durance (France); Andreeva, Z.; Belov, A.; Belyakov, V.; Filatov, O. [D.V. Efremov Scientific Research Institute, 196641 St. Petersburg (Russian Federation); Gribov, Yu.; Ioki, K. [ITER Organization, Route de Vinon sur Verdon, 13115 St. Paul-Lez-Durance (France); Kukhtin, V.; Labusov, A.; Lamzin, E.; Lyublin, B.; Malkov, A.; Mazul, I. [D.V. Efremov Scientific Research Institute, 196641 St. Petersburg (Russian Federation); Rozov, V.; Sugihara, M. [ITER Organization, Route de Vinon sur Verdon, 13115 St. Paul-Lez-Durance (France); Sychevsky, S., E-mail: sytch@sintez.niiefa.spb.su [D.V. Efremov Scientific Research Institute, 196641 St. Petersburg (Russian Federation)

    2013-10-15

    Highlights: ► A modelling technique to predict EM loads in ITER conducting structures is presented. ► The technique provides low computational cost and parallel computations. ► Detailed models were built for the system “vacuum vessel, cryostat, thermal shields”. ► EM loads on massive in-vessel structures were simulated with the use of local models. ► A flexible combination of models enables desired accuracy of load distributions. -- Abstract: Operation of the ITER machine is associated with high electromagnetic (EM) loads. An essential contributor to EM loads is eddy currents induced in passive conductive structures. Reasoning from the ITER construction, a modelling technique has been developed and applied in computations to efficiently predict anticipated loads. The technique allows us to avoid building a global 3D finite-element (FE) model that requires meshing of the conducting structures and their vacuum environment into 3D solid elements that leads to high computational cost. The key features of the proposed technique are: (i) the use of an existing shell model for the system “vacuum vessel (VV), cryostat, and thermal shields (TS)” implementing the magnetic shell approach. A solution is obtained in terms of a single-component, in this case, vector electric potential taken within the conducting shells of the “VV + cryostat + TS” system. (ii) EM loads on in-vessel conducting structures are simulated with the use of local FE models. The local models use either the 3D solid body or shell approximations. Reasoning from the simulation efficiency, the local boundary conditions are put with respect to the total field or an external field. The use of an integral-differential formulation and special procedures ensures smooth and accurate simulated distributions of fields from current sources of any geometry. The local FE models have been developed and applied for EM analyses of a variety of the ITER components including the diagnostic systems

  1. An information-theoretic machine learning approach to expression QTL analysis.

    Directory of Open Access Journals (Sweden)

    Tao Huang

    Full Text Available Expression Quantitative Trait Locus (eQTL) analysis is a powerful tool to study the biological mechanisms linking the genotype with gene expression. Such analyses can identify genomic locations where genotypic variants influence the expression of genes, both in close proximity to the variant (cis-eQTL), and on other chromosomes (trans-eQTL). Many traditional eQTL methods are based on a linear regression model. In this study, we propose a novel method by which to identify eQTL associations with information theory and machine learning approaches. Mutual Information (MI) is used to describe the association between genetic marker and gene expression. MI can detect both linear and non-linear associations, and it can also capture the heterogeneity of the population. Advanced feature selection methods, Maximum Relevance Minimum Redundancy (mRMR) and Incremental Feature Selection (IFS), were applied to optimize the selection of the genes affected by the genetic marker. When we applied our method to a study of apoE-deficient mice, it was found that the cis-acting eQTLs are stronger than trans-acting eQTLs but there are more trans-acting eQTLs than cis-acting eQTLs. We compared our results (mRMR.eQTL) with R/qtl and MatrixEQTL (modelLINEAR and modelANOVA). In female mice, 67.9% of mRMR.eQTL results can be confirmed by at least two other methods while only 14.4% of R/qtl results can be confirmed by at least two other methods. In male mice, 74.1% of mRMR.eQTL results can be confirmed by at least two other methods while only 18.2% of R/qtl results can be confirmed by at least two other methods. Our methods provide a new way to identify the association between genetic markers and gene expression. Our software is available from the supporting information.
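
    Ranking markers by mutual information with a gene's expression is the core association step of the method. The sketch below does this on synthetic genotypes with a deliberately non-linear (heterozygote) effect; it is not the study's mRMR/IFS pipeline.

```python
# Minimal sketch (synthetic genotypes and expression, not the mouse data):
# ranking candidate eQTL markers by mutual information with gene expression.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.RandomState(0)
n_mice, n_snps = 120, 200
genotypes = rng.randint(0, 3, size=(n_mice, n_snps))        # 0/1/2 allele counts
# expression driven non-linearly by SNP 7 (heterozygote effect) plus noise
expression = (genotypes[:, 7] == 1) * 1.5 + rng.normal(scale=0.5, size=n_mice)

mi = mutual_info_regression(genotypes, expression, discrete_features=True, random_state=0)
top = np.argsort(mi)[::-1][:5]
print("top candidate eQTL markers:", top, mi[top].round(3))
```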

  2. Detecting epileptic seizure with different feature extracting strategies using robust machine learning classification techniques by applying advance parameter optimization approach.

    Science.gov (United States)

    Hussain, Lal

    2018-06-01

    Epilepsy is a neurological disorder produced by abnormal excitability of neurons in the brain. Research shows that brain activity can be monitored through the electroencephalogram (EEG) of patients suffering from seizures in order to detect epileptic events. The performance of EEG-based seizure detection depends on the feature extraction strategy. In this research, we extracted a variety of features based on time and frequency domain characteristics, nonlinear measures, wavelet-based entropy and a few statistical features. A deeper study was undertaken using novel machine learning classifiers by considering multiple factors. The support vector machine kernels were evaluated based on the multiclass kernel and the box constraint level. Likewise, for K-nearest neighbors (KNN) we evaluated different distance metrics, neighbor weights and numbers of neighbors. Similarly, for decision trees we tuned the parameters based on the maximum number of splits and the split criterion, and ensemble classifiers were evaluated based on different ensemble methods and learning rates. For training/testing, tenfold cross-validation was employed and performance was evaluated in terms of TPR, NPR, PPV, accuracy and AUC. In this research, a deeper analysis was performed using diverse feature extraction strategies and robust machine learning classifiers with more advanced optimal options. The support vector machine with a linear kernel and KNN with the city block distance metric gave the overall highest accuracy of 99.5%, which was higher than that obtained using the default parameters for these classifiers. Moreover, the highest separation (AUC = 0.9991, 0.9990) was obtained at different kernel scales using SVM. Additionally, K-nearest neighbors with inverse squared distance weighting gave higher performance for different numbers of neighbors. Moreover, in distinguishing postictal heart rate oscillations from epileptic ictal subjects, the highest performance of 100% was obtained using different machine learning classifiers.
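
    The parameter optimisation described above amounts to a cross-validated search over classifier settings. The sketch below runs a tenfold grid search over SVM and KNN options on synthetic stand-in features; it is not the EEG study itself.

```python
# Minimal sketch (synthetic features, not EEG data): tenfold cross-validated grid
# search over SVM box constraint/kernel and KNN metric/weights/neighbour settings.
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.normal(size=(400, 30))                      # stand-in for extracted EEG features
y = (X[:, :5].sum(axis=1) > 0).astype(int)          # synthetic seizure/non-seizure label
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

searches = {
    "SVM": GridSearchCV(SVC(), {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]},
                        cv=cv, scoring="roc_auc"),
    "KNN": GridSearchCV(KNeighborsClassifier(),
                        {"metric": ["euclidean", "manhattan"],   # manhattan = city block
                         "n_neighbors": [3, 5, 11],
                         "weights": ["uniform", "distance"]},
                        cv=cv, scoring="roc_auc"),
}
for name, search in searches.items():
    search.fit(X, y)
    print(name, search.best_params_, round(search.best_score_, 3))
```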

  3. Translational systems biology: introduction of an engineering approach to the pathophysiology of the burn patient.

    Science.gov (United States)

    An, Gary; Faeder, James; Vodovotz, Yoram

    2008-01-01

    The pathophysiology of the burn patient manifests the full spectrum of the complexity of the inflammatory response. In the acute phase, inflammation may have negative effects via capillary leak, the propagation of inhalation injury, and development of multiple organ failure. Attempts to mediate these processes remain a central subject of burn care research. Conversely, inflammation is a necessary prologue and component in the later stage processes of wound healing. Despite the volume of information concerning the cellular and molecular processes involved in inflammation, there exists a significant gap between the knowledge of mechanistic pathophysiology and the development of effective clinical therapeutic regimens. Translational systems biology (TSB) is the application of dynamic mathematical modeling and certain engineering principles to biological systems to integrate mechanism with phenomenon and, importantly, to revise clinical practice. This study will review the existing applications of TSB in the areas of inflammation and wound healing, relate them to specific areas of interest to the burn community, and present an integrated framework that links TSB with traditional burn research.

  4. NOVEL APPROACH TO IMPROVE GEOCENTRIC TRANSLATION MODEL PERFORMANCE USING ARTIFICIAL NEURAL NETWORK TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Yao Yevenyo Ziggah

Full Text Available The geocentric translation model (GTM) has in recent times not gained much popularity in coordinate transformation research because of its limited attainable accuracy. Accurate transformation of coordinates is a major goal and an essential procedure in the solution of a number of important geodetic problems. Motivated by the successful application of artificial intelligence techniques in geodesy, this study therefore developed, tested and compared a novel technique capable of improving the accuracy of the GTM. First, the GTM based on official parameters (OP) and on new parameters determined using the arithmetic mean (AM) was applied to transform coordinates from the global WGS84 datum to the local Accra datum. The new parameters (AM) attained a maximum horizontal position error of 1.99 m compared to the 2.75 m attained by the OP. Artificial neural network technology, in the form of the backpropagation neural network (BPNN), radial basis function neural network (RBFNN) and generalized regression neural network (GRNN), was then used to compensate for the GTM-generated errors based on the AM parameters, yielding a new coordinate transformation model. The newly implemented models offered a significant improvement in the maximum horizontal position error, from 1.99 m to 0.93 m.
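    A minimal sketch of the residual-correction idea described above, assuming a hypothetical gtm_transform() function standing in for the geocentric translation model and synthetic common points with coordinates known in both datums: a small backpropagation network learns the GTM residuals, and its prediction is added back to the GTM output.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def gtm_transform(xyz):
    """Hypothetical stand-in for the geocentric translation model:
    a fixed three-parameter shift of geocentric coordinates."""
    return xyz + np.array([-199.0, 32.0, 322.0])      # illustrative shift values only

# Hypothetical common points known in both datums (WGS84 and the local datum).
rng = np.random.default_rng(1)
wgs84 = rng.normal(scale=1e3, size=(80, 3)) + np.array([6.33e6, -1.0e4, 6.0e5])
systematic = np.column_stack([2.0 * np.sin(wgs84[:, 0] / 5e4),
                              1.5 * np.cos(wgs84[:, 1] / 5e4),
                              np.zeros(len(wgs84))])
local = gtm_transform(wgs84) + systematic + rng.normal(scale=0.1, size=wgs84.shape)

# Train a backpropagation network on the residuals the GTM leaves behind.
residuals = local - gtm_transform(wgs84)
bpnn = make_pipeline(StandardScaler(),
                     MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0))
bpnn.fit(wgs84, residuals)

# Corrected transformation = GTM output + predicted residual.
corrected = gtm_transform(wgs84) + bpnn.predict(wgs84)
print("max horizontal error, GTM only:     ", np.abs(residuals[:, :2]).max().round(2))
print("max horizontal error, GTM + network:", np.abs((local - corrected)[:, :2]).max().round(2))
```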

  5. A modern approach for the realisation of the man-machine information system

    International Nuclear Information System (INIS)

    Martin, D.; Grensemann, D.

    1980-01-01

    The tool for man-machine communication has been given the name VISCOMP-VISual COmmunication Man-Process. Dependent on the project implementation phase VISCOMP functions either autonomously (planning and definition phases) or in conjunction with a data acquisition system during the operational phase on the plant. The relationship of the required components to the project phases is illustrated. (orig./HP)

  6. SVM-Maj: a majorization approach to linear support vector machines with different hinge errors

    NARCIS (Netherlands)

    P.J.F. Groenen (Patrick); G.I. Nalbantov (Georgi); J.C. Bioch (Cor)

    2007-01-01

    textabstractSupport vector machines (SVM) are becoming increasingly popular for the prediction of a binary dependent variable. SVMs perform very well with respect to competing techniques. Often, the solution of an SVM is obtained by switching to the dual. In this paper, we stick to the primal
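    The record above is truncated, but the underlying problem, a linear SVM fitted directly in the primal with a hinge-type loss, can be sketched. The snippet below uses plain subgradient descent rather than the SVM-Maj majorization algorithm, and the data are synthetic.

```python
import numpy as np

def primal_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Minimize lam*||w||^2 + mean hinge loss by subgradient descent.
    Labels y must be coded as -1/+1; this is a plain primal solver, not SVM-Maj."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        active = margins < 1                       # points violating the margin
        grad_w = 2 * lam * w - (y[active, None] * X[active]).sum(axis=0) / n
        grad_b = -y[active].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(1.0, 1.0, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
w, b = primal_linear_svm(X, y)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```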

  7. A Machine Learning Approach for Hot-Spot Detection at Protein-Protein Interfaces

    NARCIS (Netherlands)

    Melo, Rita; Fieldhouse, Robert; Melo, André; Correia, João D G; Cordeiro, Maria Natália D S; Gümüş, Zeynep H; Costa, Joaquim; Bonvin, Alexandre M J J; de Sousa Moreira, Irina

    2016-01-01

    Understanding protein-protein interactions is a key challenge in biochemistry. In this work, we describe a more accurate methodology to predict Hot-Spots (HS) in protein-protein interfaces from their native complex structure compared to previous published Machine Learning (ML) techniques. Our model

  8. An empirical comparison of different approaches for combining multimodal neuroimaging data with support vector machine

    NARCIS (Netherlands)

    Pettersson-Yeo, W.; Benetti, S.; Marquand, A.F.; Joules, R.; Catani, M.; Williams, S.C.; Allen, P.; McGuire, P.; Mechelli, A.

    2014-01-01

    In the pursuit of clinical utility, neuroimaging researchers of psychiatric and neurological illness are increasingly using analyses, such as support vector machine, that allow inference at the single-subject level. Recent studies employing single-modality data, however, suggest that classification

  9. Reliability Evaluation and Improvement Approach of Chemical Production Man - Machine - Environment System

    Science.gov (United States)

    Miao, Yongchun; Kang, Rongxue; Chen, Xuefeng

    2017-12-01

In recent years, with the gradual extension of reliability research, the study of production system reliability has become a hot topic in various industries. A man-machine-environment system is a complex system composed of human factors, machinery equipment and the environment. The reliability of each individual factor must be analyzed first in order to move on to the reliability of the combined three-factor system. At the same time, the dynamic relationships among man, machine and environment should be considered in order to establish an effective fuzzy evaluation mechanism that can truly and effectively analyze the reliability of such systems. In this paper, based on systems engineering, fuzzy theory, reliability theory, and theories of human error, environmental impact and machinery equipment failure, the reliabilities of the human factor, machinery equipment and environment of a chemical production system were studied by the method of fuzzy evaluation. Finally, the reliability of the man-machine-environment system was calculated as a weighted result, which indicated that the reliability value of this chemical production system was 86.29. Against the given evaluation domain, this shows that the reliability of the integrated man-machine-environment system is in good status, and effective measures for further improvement were proposed according to the fuzzy calculation results.
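    A minimal sketch of a fuzzy comprehensive evaluation of this kind, with all membership values, weights and grade scores being illustrative assumptions rather than figures from the paper: factor memberships are combined with factor weights, and the result is defuzzified against grade scores to give a single reliability value.

```python
import numpy as np

# Evaluation grades and their representative scores (illustrative).
grades = ["excellent", "good", "average", "poor"]
grade_scores = np.array([95.0, 85.0, 70.0, 50.0])

# Membership of each factor (man, machine, environment) in each grade (rows sum to 1).
R = np.array([
    [0.20, 0.55, 0.20, 0.05],   # human factor
    [0.30, 0.50, 0.15, 0.05],   # machinery equipment
    [0.10, 0.60, 0.25, 0.05],   # environment
])

# Relative importance of the three factors (illustrative weights, sum to 1).
w = np.array([0.40, 0.35, 0.25])

# Weighted fuzzy evaluation vector and defuzzified overall reliability score.
B = w @ R
reliability = B @ grade_scores
print("fuzzy evaluation vector:", np.round(B, 3))
print("overall reliability score:", round(float(reliability), 2))
```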

  10. Introduction to special issue on machine learning approaches to shallow parsing

    NARCIS (Netherlands)

    Hammerton, J; Osborne, M; Armstrong, S; Daelemans, W

    2002-01-01

    This article introduces the problem of partial or shallow parsing (assigning partial syntactic structure to sentences) and explains why it is an important natural language processing (NLP) task. The complexity of the task makes Machine Learning an attractive option in comparison to the handcrafting

  11. A Support Vector Machine Approach to Dutch Part-of-Speech Tagging

    NARCIS (Netherlands)

    Poel, Mannes; Stegeman, L.; op den Akker, Hendrikus J.A.; Berthold, M.R.; Shawe-Taylor, J.; Lavrac, N.

    Part-of-Speech tagging, the assignment of Parts-of-Speech to the words in a given context of use, is a basic technique in many systems that handle natural languages. This paper describes a method for supervised training of a Part-of-Speech tagger using a committee of Support Vector Machines on a

  12. Feature selection in wind speed prediction systems based on a hybrid coral reefs optimization – Extreme learning machine approach

    International Nuclear Information System (INIS)

    Salcedo-Sanz, S.; Pastor-Sánchez, A.; Prieto, L.; Blanco-Aguilera, A.; García-Herrera, R.

    2014-01-01

Highlights: • A novel approach for short-term wind speed prediction is presented. • The system is formed by a coral reefs optimization algorithm and an extreme learning machine. • Feature selection is carried out with the CRO to improve the ELM performance. • The method is tested on real wind farm data in the USA, for the period 2007–2008. - Abstract: This paper presents a novel approach for short-term wind speed prediction based on a Coral Reefs Optimization algorithm (CRO) and an Extreme Learning Machine (ELM), using meteorological predictive variables from a physical model (the Weather Research and Forecast model, WRF). The approach is based on a Feature Selection Problem (FSP) carried out with the CRO, which must obtain a reduced number of predictive variables out of the total available from the WRF. This set of features will be the input of an ELM, which finally provides the wind speed prediction. The CRO is a novel bio-inspired approach, based on the simulation of reef formation and coral reproduction, able to obtain excellent results in optimization problems. On the other hand, the ELM is a new paradigm in neural networks’ training, which provides a robust and extremely fast training of the network. Together, these algorithms are able to successfully solve this problem of feature selection in short-term wind speed prediction. Experiments in a real wind farm in the USA show the excellent performance of the CRO–ELM approach in this FSP wind speed prediction problem.
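    A minimal numpy sketch of the extreme learning machine component described above (the coral-reefs feature-selection step is omitted): hidden-layer weights are drawn at random and only the output weights are solved for with a least-squares fit, which is what makes ELM training so fast. The predictors and wind-speed target below are synthetic placeholders, not WRF variables.

```python
import numpy as np

class ExtremeLearningMachine:
    """Single-hidden-layer ELM: random hidden weights, least-squares output weights."""

    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Output weights via the Moore-Penrose pseudoinverse (no iterative training).
        self.beta = np.linalg.pinv(H) @ y
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Synthetic stand-in for (selected) meteorological predictors and observed wind speed.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))
y = 3.0 + X[:, 0] - 0.5 * X[:, 3] + 0.1 * rng.normal(size=300)

elm = ExtremeLearningMachine(n_hidden=40).fit(X[:250], y[:250])
rmse = np.sqrt(np.mean((elm.predict(X[250:]) - y[250:]) ** 2))
print("hold-out RMSE:", round(float(rmse), 3))
```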

  13. On the Systematicity of Human Translation Processes

    DEFF Research Database (Denmark)

    Carl, Michael; Dragsted, Barbara; Lykke Jakobsen, Arnt

While translation careers and the translation profession become more globalised and more technological, we are still far from understanding how humans actually translate and how they could be best supported by machines. In this paper we attempt to outline a method which helps to uncover characteristic steps in human translation processes. Based on the translators' activity data, we develop a taxonomy of translation styles, which are characteristic for different kinds of translators. The taxonomy could serve to inform the development of advanced translation assistance tools and provide a basis...

  14. Heparan Sulfate Induces Necroptosis in Murine Cardiomyocytes: A Medical-In silico Approach Combining In vitro Experiments and Machine Learning.

    Science.gov (United States)

    Zechendorf, Elisabeth; Vaßen, Phillip; Zhang, Jieyi; Hallawa, Ahmed; Martincuks, Antons; Krenkel, Oliver; Müller-Newen, Gerhard; Schuerholz, Tobias; Simon, Tim-Philipp; Marx, Gernot; Ascheid, Gerd; Schmeink, Anke; Dartmann, Guido; Thiemermann, Christoph; Martin, Lukas

    2018-01-01

Life-threatening cardiomyopathy is a severe, but common, complication associated with severe trauma or sepsis. Several signaling pathways involved in apoptosis and necroptosis are linked to trauma- or sepsis-associated cardiomyopathy. However, the underlying causative factors are still debatable. Heparan sulfate (HS) fragments belong to the class of danger/damage-associated molecular patterns liberated from endothelial-bound proteoglycans by heparanase during tissue injury associated with trauma or sepsis. We hypothesized that HS induces apoptosis or necroptosis in murine cardiomyocytes. By using a novel Medical-In silico approach that combines conventional cell culture experiments with machine learning algorithms, we aimed to reduce a significant part of the expensive and time-consuming cell culture experiments and data generation by using computational intelligence (refinement and replacement). Cardiomyocytes exposed to HS showed an activation of the intrinsic apoptosis signal pathway via cytochrome C and the activation of caspase 3 (both p  machine learning algorithms.

  15. Quantitative diagnosis of breast tumors by morphometric classification of microenvironmental myoepithelial cells using a machine learning approach.

    Science.gov (United States)

    Yamamoto, Yoichiro; Saito, Akira; Tateishi, Ayako; Shimojo, Hisashi; Kanno, Hiroyuki; Tsuchiya, Shinichi; Ito, Ken-Ichi; Cosatto, Eric; Graf, Hans Peter; Moraleda, Rodrigo R; Eils, Roland; Grabe, Niels

    2017-04-25

    Machine learning systems have recently received increased attention for their broad applications in several fields. In this study, we show for the first time that histological types of breast tumors can be classified using subtle morphological differences of microenvironmental myoepithelial cell nuclei without any direct information about neoplastic tumor cells. We quantitatively measured 11661 nuclei on the four histological types: normal cases, usual ductal hyperplasia and low/high grade ductal carcinoma in situ (DCIS). Using a machine learning system, we succeeded in classifying the four histological types with 90.9% accuracy. Electron microscopy observations suggested that the activity of typical myoepithelial cells in DCIS was lowered. Through these observations as well as meta-analytic database analyses, we developed a paracrine cross-talk-based biological mechanism of DCIS progressing to invasive cancer. Our observations support novel approaches in clinical computational diagnostics as well as in therapy development against progression.

  16. Combining macula clinical signs and patient characteristics for age-related macular degeneration diagnosis: a machine learning approach.

    Science.gov (United States)

    Fraccaro, Paolo; Nicolo, Massimo; Bonetto, Monica; Giacomini, Mauro; Weller, Peter; Traverso, Carlo Enrico; Prosperi, Mattia; OSullivan, Dympna

    2015-01-27

To investigate machine learning methods, ranging from simpler interpretable techniques to complex (non-linear) "black-box" approaches, for automated diagnosis of Age-related Macular Degeneration (AMD). Data from healthy subjects and patients diagnosed with AMD or other retinal diseases were collected during routine visits via an Electronic Health Record (EHR) system. Patients' attributes included demographics and, for each eye, presence/absence of major AMD-related clinical signs (soft drusen, retinal pigment epithelium defects/pigment mottling, depigmentation area, subretinal haemorrhage, subretinal fluid, macula thickness, macular scar, subretinal fibrosis). Interpretable techniques known as white-box methods, including logistic regression and decision trees, as well as less interpretable techniques known as black-box methods, such as support vector machines (SVM), random forests and AdaBoost, were used to develop models (trained and validated on unseen data) to diagnose AMD. The gold standard was confirmed diagnosis of AMD by physicians. Sensitivity, specificity and area under the receiver operating characteristic curve (AUC) were used to assess performance. The study population included 487 patients (912 eyes). In terms of AUC, random forests, logistic regression and AdaBoost showed a mean performance of 0.92, followed by SVM and decision trees (0.90). All machine learning models identified soft drusen and age as the most discriminating variables in clinicians' decision pathways to diagnose AMD. Both black-box and white-box methods performed well in identifying diagnoses of AMD and their decision pathways. Machine learning models developed through the proposed approach, relying on clinical signs identified by retinal specialists, could be embedded into the EHR to provide physicians with real-time (interpretable) support.
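    A minimal sketch of the white-box versus black-box comparison described above, using cross-validated AUC in scikit-learn; the clinical-sign and age features are synthetic stand-ins, not the study's EHR data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.svm import SVC

# Stand-in data: binary clinical signs plus age, with a synthetic AMD label.
rng = np.random.default_rng(0)
signs = rng.integers(0, 2, size=(500, 8))          # presence/absence of 8 signs
age = rng.integers(50, 90, size=(500, 1))
X = np.hstack([signs, age])
y = (signs[:, 0] + (age[:, 0] > 70) + rng.normal(0, 0.5, 500) > 1.2).astype(int)

models = {
    "logistic regression (white box)": LogisticRegression(max_iter=1000),
    "decision tree (white box)":       DecisionTreeClassifier(max_depth=4),
    "random forest (black box)":       RandomForestClassifier(n_estimators=200),
    "AdaBoost (black box)":            AdaBoostClassifier(),
    "SVM (black box)":                 SVC(probability=True),
}

for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.2f}")
```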

  17. The psychopharmacology of aggressive behavior: a translational approach: part 2: clinical studies using atypical antipsychotics, anticonvulsants, and lithium.

    Science.gov (United States)

    Comai, Stefano; Tau, Michael; Pavlovic, Zoran; Gobbi, Gabriella

    2012-04-01

Patients experiencing mental disorders are at an elevated risk for developing aggressive behavior. In the past 10 years, the psychopharmacological treatment of aggression has changed dramatically owing to the introduction of atypical antipsychotics on the market and the increased use of anticonvulsants and lithium in the treatment of aggressive patients. This review (second of 2 parts) uses a translational medicine approach to examine the neurobiology of aggression, discussing the major neurotransmitter systems implicated in its pathogenesis (serotonin, glutamate, norepinephrine, dopamine, and γ-aminobutyric acid) and the neuropharmacological rationale for using atypical antipsychotics, anticonvulsants, and lithium in the therapeutics of aggressive behavior. A critical review of all clinical trials using atypical antipsychotics (aripiprazole, clozapine, loxapine, olanzapine, quetiapine, risperidone, ziprasidone, and amisulpride), anticonvulsants (topiramate, valproate, lamotrigine, and gabapentin), and lithium is presented. Given the complex, multifaceted nature of aggression, a multifunctional combined therapy, targeting different receptors, seems to be the best strategy for treating aggressive behavior. This therapeutic strategy is supported by translational studies and a few human studies, even if additional randomized, double-blind, clinical trials are needed to confirm the clinical efficacy of this framework.

  18. Mass-spectrometry analysis of histone post-translational modifications in pathology tissue using the PAT-H-MS approach

    Directory of Open Access Journals (Sweden)

    Roberta Noberini

    2016-06-01

Full Text Available Aberrant histone post-translational modifications (hPTMs) have been implicated in various pathologies, including cancer, and may represent useful epigenetic biomarkers. The data described here provide a mass spectrometry-based quantitative analysis of hPTMs from formalin-fixed paraffin-embedded (FFPE) tissues, from which histones were extracted through the recently developed PAT-H-MS method. First, we analyzed FFPE samples of mouse spleen and liver or human breast cancer stored for up to six years, together with their corresponding fresh frozen tissue. We then combined the PAT-H-MS approach with a histone-focused version of the super-SILAC strategy, using a mix of histones from four breast cancer cell lines as a spike-in standard, to accurately quantify hPTMs from breast cancer specimens belonging to different subtypes. The data, which are associated with a recent publication (Pathology tissue-quantitative mass spectrometry analysis to profile histone post-translational modification patterns in patient samples, Noberini, 2015 [1]), are deposited at the ProteomeXchange Consortium via the PRIDE partner repository with the dataset identifier http://www.ebi.ac.uk/pride/archive/projects/PXD002669.

  19. Translating Developmental Origins: Improving the Health of Women and Their Children Using a Sustainable Approach to Behaviour Change

    Directory of Open Access Journals (Sweden)

    Mary Barker

    2017-03-01

Full Text Available Theories of the developmental origins of health and disease imply that optimising the growth and development of babies is an essential route to improving the health of populations. A key factor in the growth of babies is the nutritional status of their mothers. Since women from more disadvantaged backgrounds have poorer quality diets and the worst pregnancy outcomes, they need to be a particular focus. The behavioural sciences have made a substantial contribution to the development of interventions to support dietary changes in disadvantaged women. Translation of such interventions into routine practice is an ideal that is rarely achieved, however. This paper illustrates how re-orientating health and social care services towards an empowerment approach to behaviour change might underpin a new developmental focus to improving long-term health, using learning from a community-based intervention to improve the diets and lifestyles of disadvantaged women. The Southampton Initiative for Health aimed to improve the diets and lifestyles of women of child-bearing age through training health and social care practitioners in skills to support behaviour change. Analysis illustrates the necessary steps in mounting such an intervention: building trust, matching agendas and changing culture. The Southampton Initiative for Health demonstrates that developing sustainable, workable interventions and effective community partnerships requires commitment beginning long before intervention delivery, but is key to the translation of developmental origins research into improvements in human health.

  20. Developing an evidence-based approach to Public Health Nutrition: translating evidence into policy.

    Science.gov (United States)

    Margetts, B; Warm, D; Yngve, A; Sjöström, M

    2001-12-01

    The aim of this paper is to highlight the importance of an evidence-based approach to the development, implementation and evaluation of policies aimed at improving nutrition-related health in the population. Public Health Nutrition was established to realise a population-level approach to the prevention of the major nutrition-related health problems world-wide. The scope is broad and integrates activity from local, national, regional and international levels. The aim is to inform and develop coherent and effective policies that address the key rate-limiting steps critical to improving nutrition-related public health. This paper sets out the rationale for an evidence-based approach to Public Health Nutrition developed under the umbrella of the European Network for Public Health Nutrition.

  1. Translational genomics

    Directory of Open Access Journals (Sweden)

    Martin Kussmann

    2014-09-01

Full Text Available The term “Translational Genomics” reflects both the title and the mission of this new journal. “Translational” has traditionally been understood as “applied research” or “development”, different from or even opposed to “basic research”. Recent scientific and societal developments have triggered a re-assessment of the connotation that “translational” and “basic” are either/or activities: translational research nowadays aims at feeding the best science into applications and solutions for human society. We therefore argue here that basic science should be challenged and leveraged for its relevance to human health and societal benefits. This more recent approach and attitude are catalyzed by four trends or developments: evidence-based solutions; large-scale, high dimensional data; consumer/patient empowerment; and systems-level understanding.

  2. Machine Learning Approach to Deconvolution of Thermal Infrared (TIR) Spectrum of Mercury Supporting MERTIS Onboard ESA/JAXA BepiColombo

    Science.gov (United States)

    Varatharajan, I.; D'Amore, M.; Maturilli, A.; Helbert, J.; Hiesinger, H.

    2018-04-01

A machine learning approach to the spectral unmixing of emissivity spectra of Mercury is carried out using an endmember spectral library measured at simulated daytime surface conditions of Mercury. The study supports the MERTIS payload onboard ESA/JAXA BepiColombo.

  3. Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.; Carroll, Thomas E.; Muller, George

    2017-04-21

    The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine learning based data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning based models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and Probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian Networks and Hidden Markov Models are introduced as an example of a widely used data driven classification/modeling strategy.
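    A minimal sketch of one of the probabilistic models mentioned above: a discrete hidden Markov model whose forward algorithm scores an observation sequence. The two hidden states (benign versus compromised host) and all probabilities are illustrative assumptions, not parameters from the chapter.

```python
import numpy as np

# Hidden states: 0 = benign host, 1 = compromised host (illustrative labels).
start = np.array([0.9, 0.1])                 # initial state distribution
trans = np.array([[0.95, 0.05],              # P(next state | current state)
                  [0.10, 0.90]])
# Observations: 0 = normal traffic, 1 = suspicious traffic, 2 = alert.
emit = np.array([[0.80, 0.15, 0.05],         # P(observation | state)
                 [0.20, 0.40, 0.40]])

def forward_likelihood(obs):
    """Forward algorithm: P(observation sequence) under the HMM above."""
    alpha = start * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
    return alpha.sum()

print(forward_likelihood([0, 0, 1, 2, 2]))   # likelihood of an escalating sequence
print(forward_likelihood([0, 0, 0, 0, 0]))   # likelihood of all-normal traffic
```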

  4. Translating Quality in Higher Education: US Approaches to Accreditation of Institutions from around the World

    Science.gov (United States)

    Blanco Ramírez, Gerardo

    2015-01-01

    This article reports on findings from a sociolinguistic qualitative study exploring inter-discursive relations manifested in the approaches and strategies that regional accrediting agencies in the United States utilise when recognising foreign universities. Even as most countries have developed national quality assurance systems and whilst…

  5. Translating Basic Behavioral and Social Science Research to Clinical Application: The EVOLVE Mixed Methods Approach

    Science.gov (United States)

    Peterson, Janey C.; Czajkowski, Susan; Charlson, Mary E.; Link, Alissa R.; Wells, Martin T.; Isen, Alice M.; Mancuso, Carol A.; Allegrante, John P.; Boutin-Foster, Carla; Ogedegbe, Gbenga; Jobe, Jared B.

    2013-01-01

    Objective: To describe a mixed-methods approach to develop and test a basic behavioral science-informed intervention to motivate behavior change in 3 high-risk clinical populations. Our theoretically derived intervention comprised a combination of positive affect and self-affirmation (PA/SA), which we applied to 3 clinical chronic disease…

  6. Effective knowledge translation approaches and practices in Indigenous health research: a systematic review protocol

    Directory of Open Access Journals (Sweden)

    Melody E. Morton Ninomiya

    2017-02-01

    Full Text Available Abstract Background Effective knowledge translation (KT is critical to implementing program and policy changes that require shared understandings of knowledge systems, assumptions, and practices. Within mainstream research institutions and funding agencies, systemic and insidious inequities, privileges, and power relationships inhibit Indigenous peoples’ control, input, and benefits over research. This systematic review will examine literature on KT initiatives in Indigenous health research to help identify wise and promising Indigenous KT practices and language in Canada and abroad. Methods Indexed databases including Aboriginal Health Abstract Database, Bibliography of Native North Americans, CINAHL, Circumpolar Health Bibliographic Database, Dissertation Abstracts, First Nations Periodical Index, Medline, National Indigenous Studies Portal, ProQuest Conference Papers Index, PsycInfo, Social Services Abstracts, Social Work Abstracts, and Web of Science will be searched. A comprehensive list of non-indexed and grey literature sources will also be searched. For inclusion, documents must be published in English; linked to Indigenous health and wellbeing; focused on Indigenous people; document KT goals, activities, and rationale; and include an evaluation of their KT strategy. Identified quantitative, qualitative, and mixed methods’ studies that meet the inclusion criteria will then be appraised using a quality appraisal tool for research with Indigenous people. Studies that score 6 or higher on the quality appraisal tool will be included for analysis. Discussion This unique systematic review involves robust Indigenous community engagement strategies throughout the life of the project, starting with the development of the review protocol. The review is being guided by senior Indigenous researchers who will purposefully include literature sources characterized by Indigenous authorship, community engagement, and representation; screen and

  7. PredPsych: A toolbox for predictive machine learning based approach in experimental psychology research

    OpenAIRE

    Cavallo, Andrea; Becchio, Cristina; Koul, Atesh

    2016-01-01

    Recent years have seen an increased interest in machine learning based predictive methods for analysing quantitative behavioural data in experimental psychology. While these methods can achieve relatively greater sensitivity compared to conventional univariate techniques, they still lack an established and accessible software framework. The goal of this work was to build an open-source toolbox – “PredPsych” – that could make these methods readily available to all psychologists. PredPsych is a...

  8. Poster abstract: A machine learning approach for vehicle classification using passive infrared and ultrasonic sensors

    KAUST Repository

    Warriach, Ehsan Ullah

    2013-01-01

    This article describes the implementation of four different machine learning techniques for vehicle classification in a dual ultrasonic/passive infrared traffic flow sensors. Using k-NN, Naive Bayes, SVM and KNN-SVM algorithms, we show that KNN-SVM significantly outperforms other algorithms in terms of classification accuracy. We also show that some of these algorithms could run in real time on the prototype system. Copyright © 2013 ACM.

  9. Machine Learning Approach to Optimizing Combined Stimulation and Medication Therapies for Parkinson's Disease.

    Science.gov (United States)

    Shamir, Reuben R; Dolber, Trygve; Noecker, Angela M; Walter, Benjamin L; McIntyre, Cameron C

    2015-01-01

Deep brain stimulation (DBS) of the subthalamic region is an established therapy for advanced Parkinson's disease (PD). However, patients often require time-intensive post-operative management to balance their coupled stimulation and medication treatments. Given the large and complex parameter space associated with this task, we propose that clinical decision support systems (CDSS) based on machine learning algorithms could assist in treatment optimization. Our objective was to develop a proof-of-concept implementation of a CDSS that incorporates patient-specific details on both stimulation and medication. Clinical data from 10 patients, and 89 post-DBS surgery visits, were used to create a prototype CDSS. The system was designed to provide three key functions: (1) information retrieval; (2) visualization of treatment; and (3) recommendation on expected effective stimulation and drug dosages, based on three machine learning methods that included support vector machines, Naïve Bayes, and random forest. Measures of medication dosages, time factors, and symptom-specific pre-operative response to levodopa were significantly correlated with post-operative outcomes (P < 0.05) and their effect on outcomes was of similar magnitude to that of DBS. Using those results, the combined machine learning algorithms were able to accurately predict 86% (12/14) of the motor improvement scores at one year after surgery. Using patient-specific details, an appropriately parameterized CDSS could help select theoretically optimal DBS parameter settings and medication dosages that have potential to improve the clinical management of PD patients. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Machine Learning Approaches for Predicting Radiation Therapy Outcomes: A Clinician's Perspective.

    Science.gov (United States)

    Kang, John; Schwartz, Russell; Flickinger, John; Beriwal, Sushil

    2015-12-01

    Radiation oncology has always been deeply rooted in modeling, from the early days of isoeffect curves to the contemporary Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) initiative. In recent years, medical modeling for both prognostic and therapeutic purposes has exploded thanks to increasing availability of electronic data and genomics. One promising direction that medical modeling is moving toward is adopting the same machine learning methods used by companies such as Google and Facebook to combat disease. Broadly defined, machine learning is a branch of computer science that deals with making predictions from complex data through statistical models. These methods serve to uncover patterns in data and are actively used in areas such as speech recognition, handwriting recognition, face recognition, "spam" filtering (junk email), and targeted advertising. Although multiple radiation oncology research groups have shown the value of applied machine learning (ML), clinical adoption has been slow due to the high barrier to understanding these complex models by clinicians. Here, we present a review of the use of ML to predict radiation therapy outcomes from the clinician's point of view with the hope that it lowers the "barrier to entry" for those without formal training in ML. We begin by describing 7 principles that one should consider when evaluating (or creating) an ML model in radiation oncology. We next introduce 3 popular ML methods--logistic regression (LR), support vector machine (SVM), and artificial neural network (ANN)--and critique 3 seminal papers in the context of these principles. Although current studies are in exploratory stages, the overall methodology has progressively matured, and the field is ready for larger-scale further investigation. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Trip time prediction in mass transit companies. A machine learning approach

    OpenAIRE

    João M. Moreira; Alípio Jorge; Jorge Freire de Sousa; Carlos Soares

    2005-01-01

In this paper we discuss how trip time prediction can be useful for operational optimization in mass transit companies and which machine learning techniques can be used to improve results. Firstly, we analyze which departments need trip time prediction and when. Secondly, we review related work and thirdly we present the analysis of trip time over a particular path. We proceed by presenting experimental results conducted on real data with the forecasting techniques we found most adequate, and concl...

  12. Machine Learning Approaches for Predicting Radiation Therapy Outcomes: A Clinician's Perspective

    International Nuclear Information System (INIS)

    Kang, John; Schwartz, Russell; Flickinger, John; Beriwal, Sushil

    2015-01-01

    Radiation oncology has always been deeply rooted in modeling, from the early days of isoeffect curves to the contemporary Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) initiative. In recent years, medical modeling for both prognostic and therapeutic purposes has exploded thanks to increasing availability of electronic data and genomics. One promising direction that medical modeling is moving toward is adopting the same machine learning methods used by companies such as Google and Facebook to combat disease. Broadly defined, machine learning is a branch of computer science that deals with making predictions from complex data through statistical models. These methods serve to uncover patterns in data and are actively used in areas such as speech recognition, handwriting recognition, face recognition, “spam” filtering (junk email), and targeted advertising. Although multiple radiation oncology research groups have shown the value of applied machine learning (ML), clinical adoption has been slow due to the high barrier to understanding these complex models by clinicians. Here, we present a review of the use of ML to predict radiation therapy outcomes from the clinician's point of view with the hope that it lowers the “barrier to entry” for those without formal training in ML. We begin by describing 7 principles that one should consider when evaluating (or creating) an ML model in radiation oncology. We next introduce 3 popular ML methods—logistic regression (LR), support vector machine (SVM), and artificial neural network (ANN)—and critique 3 seminal papers in the context of these principles. Although current studies are in exploratory stages, the overall methodology has progressively matured, and the field is ready for larger-scale further investigation.

  13. Machine Learning Approaches for Predicting Radiation Therapy Outcomes: A Clinician's Perspective

    Energy Technology Data Exchange (ETDEWEB)

    Kang, John [Medical Scientist Training Program, University of Pittsburgh-Carnegie Mellon University, Pittsburgh, Pennsylvania (United States); Schwartz, Russell [Department of Biological Sciences, Carnegie Mellon University, Pittsburgh, Pennsylvania (United States); Flickinger, John [Departments of Radiation Oncology and Neurological Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania (United States); Beriwal, Sushil, E-mail: beriwals@upmc.edu [Department of Radiation Oncology, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania (United States)

    2015-12-01

    Radiation oncology has always been deeply rooted in modeling, from the early days of isoeffect curves to the contemporary Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) initiative. In recent years, medical modeling for both prognostic and therapeutic purposes has exploded thanks to increasing availability of electronic data and genomics. One promising direction that medical modeling is moving toward is adopting the same machine learning methods used by companies such as Google and Facebook to combat disease. Broadly defined, machine learning is a branch of computer science that deals with making predictions from complex data through statistical models. These methods serve to uncover patterns in data and are actively used in areas such as speech recognition, handwriting recognition, face recognition, “spam” filtering (junk email), and targeted advertising. Although multiple radiation oncology research groups have shown the value of applied machine learning (ML), clinical adoption has been slow due to the high barrier to understanding these complex models by clinicians. Here, we present a review of the use of ML to predict radiation therapy outcomes from the clinician's point of view with the hope that it lowers the “barrier to entry” for those without formal training in ML. We begin by describing 7 principles that one should consider when evaluating (or creating) an ML model in radiation oncology. We next introduce 3 popular ML methods—logistic regression (LR), support vector machine (SVM), and artificial neural network (ANN)—and critique 3 seminal papers in the context of these principles. Although current studies are in exploratory stages, the overall methodology has progressively matured, and the field is ready for larger-scale further investigation.

  14. A Machine-Learning Approach to Predict Main Energy Consumption under Realistic Operational Conditions

    DEFF Research Database (Denmark)

    Petersen, Joan P; Winther, Ole; Jacobsen, Daniel J

    2012-01-01

The paper presents a novel and publicly available set of high-quality sensory data collected from a ferry over a period of two months and overviews existing machine-learning methods for the prediction of main propulsion efficiency. Neural networks are applied in both real-time and predictive settings. Performance results for the real-time models are shown. The presented models were successfully developed in a trim optimisation application onboard a product tanker.

  15. Experts and Machines against Bullies: A Hybrid Approach to Detect Cyberbullies

    OpenAIRE

    Dadvar, M.; Trieschnigg, Rudolf Berend; de Jong, Franciska M.G.

    2014-01-01

    Cyberbullying is becoming a major concern in online environments with troubling consequences. However, most of the technical studies have focused on the detection of cyberbullying through identifying harassing comments rather than preventing the incidents by detecting the bullies. In this work we study the automatic detection of bully users on YouTube. We compare three types of automatic detection: an expert system, supervised machine learning models, and a hybrid type combining the two. All ...

  16. Translation in ESL Classes

    Directory of Open Access Journals (Sweden)

    Nagy Imola Katalin

    2015-12-01

    Full Text Available The problem of translation in foreign language classes cannot be dealt with unless we attempt to make an overview of what translation meant for language teaching in different periods of language pedagogy. From the translation-oriented grammar-translation method through the complete ban on translation and mother tongue during the times of the audio-lingual approaches, we have come today to reconsider the role and status of translation in ESL classes. This article attempts to advocate for translation as a useful ESL class activity, which can completely fulfil the requirements of communicativeness. We also attempt to identify some activities and games, which rely on translation in some books published in the 1990s and the 2000s.

  17. PredPsych: A toolbox for predictive machine learning-based approach in experimental psychology research.

    Science.gov (United States)

    Koul, Atesh; Becchio, Cristina; Cavallo, Andrea

    2017-12-12

    Recent years have seen an increased interest in machine learning-based predictive methods for analyzing quantitative behavioral data in experimental psychology. While these methods can achieve relatively greater sensitivity compared to conventional univariate techniques, they still lack an established and accessible implementation. The aim of current work was to build an open-source R toolbox - "PredPsych" - that could make these methods readily available to all psychologists. PredPsych is a user-friendly, R toolbox based on machine-learning predictive algorithms. In this paper, we present the framework of PredPsych via the analysis of a recently published multiple-subject motion capture dataset. In addition, we discuss examples of possible research questions that can be addressed with the machine-learning algorithms implemented in PredPsych and cannot be easily addressed with univariate statistical analysis. We anticipate that PredPsych will be of use to researchers with limited programming experience not only in the field of psychology, but also in that of clinical neuroscience, enabling computational assessment of putative bio-behavioral markers for both prognosis and diagnosis.

  18. Explosion Monitoring with Machine Learning: A LSTM Approach to Seismic Event Discrimination

    Science.gov (United States)

    Magana-Zook, S. A.; Ruppert, S. D.

    2017-12-01

The streams of seismic data that analysts look at to discriminate natural from man-made events will soon grow from gigabytes of data per day to exponentially larger rates. This is an interesting problem as the requirement for real-time answers to questions of non-proliferation will remain the same, and the analyst pool cannot grow as fast as the data volume and velocity will. Machine learning is a tool that can solve the problem of seismic explosion monitoring at scale. Using machine learning, and Long Short-term Memory (LSTM) models in particular, analysts can become more efficient by focusing their attention on signals of interest. From a global dataset of earthquake and explosion events, a model was trained to recognize the different classes of events, given their spectrograms. Optimal recurrent node count and training iterations were found, and cross validation was performed to evaluate model performance. A 10-fold mean accuracy of 96.92% was achieved on a balanced dataset of 30,002 instances. Given that the model is 446.52 MB, it can be used to simultaneously characterize all incoming signals by researchers looking at events in isolation on desktop machines, as well as at scale on all of the nodes of a real-time streaming platform. LLNL-ABS-735911
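    A minimal sketch of a spectrogram-based LSTM discriminator of the kind described above, using the Keras API; the spectrogram dimensions, network size and random data are placeholders rather than the values behind the reported 96.92% accuracy.

```python
import numpy as np
import tensorflow as tf

# Placeholder spectrograms: (events, time frames, frequency bins), binary labels
# (0 = earthquake, 1 = explosion). Real work would use computed spectrograms.
rng = np.random.default_rng(0)
X = rng.normal(size=(512, 128, 64)).astype("float32")
y = rng.integers(0, 2, size=512)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 64)),          # time frames x frequency bins
    tf.keras.layers.LSTM(64),                 # recurrent layer over the time axis
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, validation_split=0.2, epochs=3, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))
```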

  19. A Finite State Machine Approach to Algorithmic Lateral Inhibition for Real-Time Motion Detection †

    Directory of Open Access Journals (Sweden)

    María T. López

    2018-05-01

Full Text Available Many researchers have explored the relationship between recurrent neural networks and finite state machines. Finite state machines constitute the best-characterized computational model, whereas artificial neural networks have become a very successful tool for modeling and problem solving. The neurally-inspired lateral inhibition method, and its application to motion detection tasks, have been successfully implemented in recent years. In this paper, control knowledge of the algorithmic lateral inhibition (ALI) method is described and applied by means of finite state machines, in which the state space is constituted from the set of distinguishable cases of accumulated charge in a local memory. The article describes an ALI implementation for a motion detection task. For the implementation, we have chosen to use one of the members of the 16-nm Kintex UltraScale+ family of Xilinx FPGAs. FPGAs provide the necessary accuracy, resolution, and precision to run neural algorithms alongside current sensor technologies. The results offered in this paper demonstrate that this implementation provides accurate object tracking performance on several datasets, obtaining a high F-score value (0.86) for the most complex sequence used. Moreover, it outperforms implementations of a complete ALI algorithm and a simplified version of the ALI algorithm—named “accumulative computation”—which was run about ten years ago, now reaching real-time processing times that were simply not achievable at that time for ALI.
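    A minimal sketch of the accumulated-charge state machine idea described above, in Python rather than on an FPGA: each pixel carries a charge that increases when a frame-to-frame change is detected and decays otherwise. The thresholds and charge levels are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

# Per-pixel finite state machine over accumulated charge (illustrative constants).
CHARGE_MAX, CHARGE_STEP, DISCHARGE_STEP, DIFF_THRESHOLD = 255, 64, 16, 20

def update_charge(charge, prev_frame, frame):
    """One state transition per pixel: charge up where the frame changed, decay elsewhere."""
    motion = np.abs(frame.astype(int) - prev_frame.astype(int)) > DIFF_THRESHOLD
    return np.where(motion,
                    np.minimum(charge + CHARGE_STEP, CHARGE_MAX),
                    np.maximum(charge - DISCHARGE_STEP, 0))

# Toy sequence: a bright square moving one pixel to the right per frame.
frames = np.zeros((10, 32, 32), dtype=np.uint8)
for t in range(10):
    frames[t, 10:16, 5 + t:11 + t] = 200

charge = np.zeros((32, 32), dtype=int)
for prev_frame, frame in zip(frames[:-1], frames[1:]):
    charge = update_charge(charge, prev_frame, frame)

# High accumulated charge marks pixels that kept changing, i.e. the moving object.
print("max accumulated charge:", int(charge.max()))
print("pixels currently flagged as moving:", int((charge > 0).sum()))
```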

  20. MULTIFUNCTION OF INTERNET IN TRANSLATION

    Directory of Open Access Journals (Sweden)

    Bayu Budiharjo

    2017-04-01

Full Text Available Technology affects almost all areas, including translation. Many products of technology have made translation work easier, one of which is the internet. Despite the wide use of the internet, its potential often goes unnoticed. While web-based dictionaries or thesauruses often serve as translators’ assistants and online machine translation has become the topic of much research, other uses of the internet related to translation may be less well known. The internet can help disseminate new ideas, theories and findings worldwide and thereby enhance translation theory. In addition, the contact between the internet and translation generates new areas to examine. The internet also lends a helping hand in translation research: anyone conducting research in the field of translation can find a range of research gaps as well as references, and researchers who need group discussions to collect data from informants, or who share the same interest but come from all over the world, can meet and conduct Focus Group Discussions (FGD) in the virtual world. Furthermore, the internet offers various forms of assistance to translation practitioners. The most commonly used are the dictionaries, thesauruses and machine translation systems available online; other forms of aid take the form of parallel texts, images and videos, which can be very helpful. The internet provides many things that can be utilized for the purpose of translation, and it keeps providing more as it develops from time to time in line with the development of technology. The internet awaits the use of theorists, researchers, practitioners and anyone with an interest in translation.

  1. Automated Classification of Radiology Reports for Acute Lung Injury: Comparison of Keyword and Machine Learning Based Natural Language Processing Approaches.

    Science.gov (United States)

    Solti, Imre; Cooke, Colin R; Xia, Fei; Wurfel, Mark M

    2009-11-01

    This paper compares the performance of keyword and machine learning-based chest x-ray report classification for Acute Lung Injury (ALI). ALI mortality is approximately 30 percent. High mortality is, in part, a consequence of delayed manual chest x-ray classification. An automated system could reduce the time to recognize ALI and lead to reductions in mortality. For our study, 96 and 857 chest x-ray reports in two corpora were labeled by domain experts for ALI. We developed a keyword and a Maximum Entropy-based classification system. Word unigram and character n-grams provided the features for the machine learning system. The Maximum Entropy algorithm with character 6-gram achieved the highest performance (Recall=0.91, Precision=0.90 and F-measure=0.91) on the 857-report corpus. This study has shown that for the classification of ALI chest x-ray reports, the machine learning approach is superior to the keyword based system and achieves comparable results to highest performing physician annotators.
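    A minimal sketch of a character n-gram maximum entropy classifier of the kind described above, approximated with scikit-learn's logistic regression over character 6-gram counts; the toy reports are invented placeholders, not items from the study's corpora.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented toy reports standing in for de-identified chest x-ray reports.
reports = [
    "bilateral patchy airspace opacities consistent with diffuse alveolar damage",
    "diffuse bilateral infiltrates, no evidence of cardiac enlargement",
    "clear lungs, no acute cardiopulmonary process",
    "mild cardiomegaly, lungs are clear without focal consolidation",
]
labels = [1, 1, 0, 0]   # 1 = ALI-consistent report, 0 = not

# Character 6-grams as features; logistic regression plays the maximum-entropy role.
clf = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(6, 6)),
    LogisticRegression(max_iter=1000),
)
clf.fit(reports, labels)
print(clf.predict(["new bilateral opacities with diffuse infiltrates noted"]))
```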

  2. Translation procedures for standardised quality of life questionnaires: The European Organisation for Research and Treatment of Cancer (EORTC) approach.

    Science.gov (United States)

    Koller, Michael; Aaronson, Neil K; Blazeby, Jane; Bottomley, Andrew; Dewolf, Linda; Fayers, Peter; Johnson, Colin; Ramage, John; Scott, Neil; West, Karen

    2007-08-01

    The European Organisation for Research and Treatment of Cancer quality of life (EORTC QL) questionnaires are used in international trials and therefore standardised translation procedures are required. This report summarises the EORTC translation procedure, recent accomplishments and challenges. Translations follow a forward-backward procedure, independently carried out by two native-speakers of the target language. Discrepancies are arbitrated by a third consultant, and solutions are reached by consensus. Translated questionnaires undergo a pilot-testing. Suggestions are incorporated into the final questionnaire. Requests for translations originate from the module developers, physicians or pharmaceutical industry, and most translations are performed by professional translators. The translation procedure is managed and supervised by a Translation Coordinator within the EORTC QL Unit in Brussels. To date, the EORTC QLQ-C30 has been translated and validated into more than 60 languages, with further translations in progress. Translations include all major Western, and many African and Asian languages. The following translation problems were encountered: lack of expressions for specific symptoms in various languages, the use of old-fashioned language, recent spelling reforms in several European countries and different priorities of social issues between Western and Eastern cultures. The EORTC measurement system is now registered for use in over 9000 clinical trials worldwide. The EORTC provides strong infrastructure and quality control to produce robust translated questionnaires. Nevertheless, translation problems have been identified. The key to improvements may lie in the particular features and strengths of the group, consisting of researchers from 21 countries representing 25 languages and include the development of simple source versions, the use of advanced computerised tools, rigorous pilot-testing, certification procedures and insights from a unique cross

  3. Understanding Translation

    DEFF Research Database (Denmark)

    Schjoldager, Anne Gram; Gottlieb, Henrik; Klitgård, Ida

Understanding Translation is designed as a textbook for courses on the theory and practice of translation in general and of particular types of translation - such as interpreting, screen translation and literary translation. The aim of the book is to help you gain an in-depth understanding of the phenomenon of translation and to provide you with a conceptual framework for the analysis of various aspects of professional translation. Intended readers are students of translation and languages, but the book will also be relevant for others who are interested in the theory and practice of translation - translators, language teachers, translation users and literary, TV and film critics, for instance. Discussions focus on translation between Danish and English.

  4. Computer-aided translation tools

    DEFF Research Database (Denmark)

    Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

The paper reports on a questionnaire survey from 2013 of the uptake and use of computer-aided translation (CAT) tools by Danish translation service providers (TSPs) and discusses how these tools appear to have impacted on the Danish translation industry. According to our results, the uptake in Denmark is rather high in general, but limited in the case of machine translation (MT) tools: While most TSPs use translation-memory (TM) software, often in combination with a terminology management system (TMS), only very few have implemented MT, which is criticised for its low quality output, especially...

  5. Predicting diabetes mellitus using SMOTE and ensemble machine learning approach: The Henry Ford ExercIse Testing (FIT) project.

    Science.gov (United States)

    Alghamdi, Manal; Al-Mallah, Mouaz; Keteyian, Steven; Brawner, Clinton; Ehrman, Jonathan; Sakr, Sherif

    2017-01-01

Machine learning is becoming a popular and important approach in the field of medical research. In this study, we investigate the relative performance of various machine learning methods such as Decision Tree, Naïve Bayes, Logistic Regression, Logistic Model Tree and Random Forests for predicting incident diabetes using medical records of cardiorespiratory fitness. In addition, we apply different techniques to uncover potential predictors of diabetes. This FIT project study used data of 32,555 patients who were free of any known coronary artery disease or heart failure, who underwent clinician-referred exercise treadmill stress testing at Henry Ford Health Systems between 1991 and 2009 and had a complete 5-year follow-up. At the completion of the fifth year, 5,099 of those patients had developed diabetes. The dataset contained 62 attributes classified into four categories: demographic characteristics, disease history, medication use history, and stress test vital signs. We developed an Ensembling-based predictive model using 13 attributes that were selected based on their clinical importance, Multiple Linear Regression, and Information Gain Ranking methods. The negative effect of the imbalanced class distribution on the constructed model was handled by the Synthetic Minority Oversampling Technique (SMOTE). The overall performance of the predictive model classifier was improved by the Ensemble machine learning approach using the Vote method with three Decision Trees (Naïve Bayes Tree, Random Forest, and Logistic Model Tree) and achieved high accuracy of prediction (AUC = 0.92). The study shows the potential of ensembling and SMOTE approaches for predicting incident diabetes using cardiorespiratory fitness data.
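    A minimal sketch of the SMOTE-plus-voting-ensemble idea described above, using scikit-learn and imbalanced-learn on synthetic data; since Naïve Bayes Tree and Logistic Model Tree have no direct scikit-learn equivalents, a decision tree, a random forest and logistic regression stand in for the three voters.

```python
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic imbalanced data standing in for the fitness/diabetes records.
X, y = make_classification(n_samples=3000, n_features=13, weights=[0.85, 0.15],
                           random_state=0)

# Soft-voting ensemble of three base learners (stand-ins for NBTree, RF and LMT).
voters = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(max_depth=6)),
        ("forest", RandomForestClassifier(n_estimators=200)),
        ("logistic", LogisticRegression(max_iter=1000)),
    ],
    voting="soft",
)

# SMOTE sits inside the pipeline, so oversampling only touches the training folds.
model = Pipeline([("smote", SMOTE(random_state=0)), ("vote", voters)])
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
print("cross-validated AUC with SMOTE + voting ensemble:", round(float(auc), 2))
```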

  6. Predicting diabetes mellitus using SMOTE and ensemble machine learning approach: The Henry Ford ExercIse Testing (FIT) project.

    Directory of Open Access Journals (Sweden)

    Manal Alghamdi

Full Text Available Machine learning is becoming a popular and important approach in the field of medical research. In this study, we investigate the relative performance of various machine learning methods such as Decision Tree, Naïve Bayes, Logistic Regression, Logistic Model Tree and Random Forests for predicting incident diabetes using medical records of cardiorespiratory fitness. In addition, we apply different techniques to uncover potential predictors of diabetes. This FIT project study used data of 32,555 patients who were free of any known coronary artery disease or heart failure, who underwent clinician-referred exercise treadmill stress testing at Henry Ford Health Systems between 1991 and 2009 and had a complete 5-year follow-up. At the completion of the fifth year, 5,099 of those patients had developed diabetes. The dataset contained 62 attributes classified into four categories: demographic characteristics, disease history, medication use history, and stress test vital signs. We developed an Ensembling-based predictive model using 13 attributes that were selected based on their clinical importance, Multiple Linear Regression, and Information Gain Ranking methods. The negative effect of the imbalanced class distribution on the constructed model was handled by the Synthetic Minority Oversampling Technique (SMOTE). The overall performance of the predictive model classifier was improved by the Ensemble machine learning approach using the Vote method with three Decision Trees (Naïve Bayes Tree, Random Forest, and Logistic Model Tree) and achieved high accuracy of prediction (AUC = 0.92). The study shows the potential of ensembling and SMOTE approaches for predicting incident diabetes using cardiorespiratory fitness data.

  7. An effective secondary decomposition approach for wind power forecasting using extreme learning machine trained by crisscross optimization

    International Nuclear Information System (INIS)

    Yin, Hao; Dong, Zhen; Chen, Yunlong; Ge, Jiafei; Lai, Loi Lei; Vaccaro, Alfredo; Meng, Anbo

    2017-01-01

    Highlights: • A secondary decomposition approach is applied in the data pre-processing. • The empirical mode decomposition is used to decompose the original time series. • IMF1 continues to be decomposed by applying wavelet packet decomposition. • Crisscross optimization algorithm is applied to train extreme learning machine. • The proposed SHD-CSO-ELM outperforms other pervious methods in the literature. - Abstract: Large-scale integration of wind energy into electric grid is restricted by its inherent intermittence and volatility. So the increased utilization of wind power necessitates its accurate prediction. The contribution of this study is to develop a new hybrid forecasting model for the short-term wind power prediction by using a secondary hybrid decomposition approach. In the data pre-processing phase, the empirical mode decomposition is used to decompose the original time series into several intrinsic mode functions (IMFs). A unique feature is that the generated IMF1 continues to be decomposed into appropriate and detailed components by applying wavelet packet decomposition. In the training phase, all the transformed sub-series are forecasted with extreme learning machine trained by our recently developed crisscross optimization algorithm (CSO). The final predicted values are obtained from aggregation. The results show that: (a) The performance of empirical mode decomposition can be significantly improved with its IMF1 decomposed by wavelet packet decomposition. (b) The CSO algorithm has satisfactory performance in addressing the premature convergence problem when applied to optimize extreme learning machine. (c) The proposed approach has great advantage over other previous hybrid models in terms of prediction accuracy.
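    A minimal sketch of the secondary decomposition step described above, assuming the PyEMD package for empirical mode decomposition and PyWavelets for the wavelet packet decomposition of IMF1; the series is synthetic and the forecasting stage (the CSO-trained ELM in the paper) is left as a comment.

```python
import numpy as np
import pywt                      # PyWavelets, for the wavelet packet decomposition
from PyEMD import EMD            # assumes the PyEMD (EMD-signal) package is installed

# Synthetic stand-in for a wind power time series.
t = np.linspace(0, 10, 1024)
signal = (np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.sin(2 * np.pi * 3 * t)
          + 0.1 * np.random.default_rng(0).normal(size=t.size))

# Primary decomposition: EMD into intrinsic mode functions (IMFs).
imfs = EMD().emd(signal)
imf1 = imfs[0]                   # highest-frequency, hardest-to-predict component

# Secondary decomposition: wavelet packet decomposition of IMF1 only.
wp = pywt.WaveletPacket(data=imf1, wavelet="db4", mode="symmetric", maxlevel=3)
wp_components = [node.data for node in wp.get_level(3, order="natural")]

# All remaining sub-series would each be forecast separately (an ELM trained by
# crisscross optimization in the paper) and the forecasts aggregated at the end.
sub_series = wp_components + list(imfs[1:])
print("number of sub-series to forecast:", len(sub_series))
```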

  8. Two computational approaches for Monte Carlo based shutdown dose rate calculation with applications to the JET fusion machine

    Energy Technology Data Exchange (ETDEWEB)

    Petrizzi, L.; Batistoni, P.; Migliori, S. [Associazione EURATOM ENEA sulla Fusione, Frascati (Roma) (Italy); Chen, Y.; Fischer, U.; Pereslavtsev, P. [Association FZK-EURATOM Forschungszentrum Karlsruhe (Germany); Loughlin, M. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, Oxfordshire, OX (United Kingdom); Secco, A. [Nice Srl Via Serra 33 Camerano Casasco AT (Italy)

    2003-07-01

    In deuterium-deuterium (D-D) and deuterium-tritium (D-T) fusion plasmas, neutrons are produced that cause activation of JET machine components. For safe operation and maintenance it is important to be able to predict the induced activation and the resulting shutdown dose rates. This requires a suitable system of codes capable of simulating both the neutron-induced material activation during operation and the decay gamma radiation transport after shutdown in the proper 3-D geometry. Two methodologies for calculating the dose rate in fusion devices have been developed recently and applied to fusion machines, both using the MCNP Monte Carlo code. FZK has developed a more classical approach, the rigorous 2-step (R2S) system, in which MCNP is coupled to the FISPACT inventory code with an automated routing. ENEA, in collaboration with the ITER Team, has developed an alternative approach, the direct 1-step method (D1S), in which neutron and decay gamma transport are handled in one single MCNP run using an ad hoc cross section library. The intention was to tightly couple the neutron-induced production of a radio-isotope and the emission of its decay gammas, giving an accurate spatial distribution and a reliable calculated statistical error. The two methods have been used by the two Associations to calculate the dose rate at five positions of the JET machine, two inside the vacuum chamber and three outside, at cooling times between 1 second and 1 year after shutdown. The same MCNP model and irradiation conditions have been assumed. The exercise has been proposed and financed in the frame of the Fusion Technological Program of the JET machine. The aim is to supply designers with the most reliable tools and data to calculate the dose rate on fusion machines. Results showed good agreement: the differences range between 5% and 35%. The next step to be considered in 2003 will be an exercise in which the comparison will be done with dose-rate data from JET taken during and
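
    A schematic Python sketch of the data flow only, not of the real MCNP/FISPACT interfaces: every function below is a hypothetical stub with dummy numbers, meant to contrast the three coupled R2S steps with the single D1S run described above.

      # All functions are hypothetical stubs; values are placeholders, not JET results.
      from typing import Dict

      def mcnp_neutron_transport(geometry: str) -> Dict[str, float]:
          # R2S step 1: neutron flux per material cell from an MCNP run (stub).
          return {"vessel": 1.2e14, "divertor": 3.4e13}          # n/cm^2/s, dummy values

      def fispact_activation(flux: Dict[str, float], cooling_s: float) -> Dict[str, float]:
          # R2S step 2: decay-gamma source per cell after a cooling time (stub).
          return {cell: f * 1e-9 / (1.0 + cooling_s) for cell, f in flux.items()}

      def mcnp_gamma_transport(source: Dict[str, float], position: str) -> float:
          # R2S step 3: transport the decay gammas to the detector position (stub).
          return 1e-6 * sum(source.values())

      def d1s_single_run(geometry: str, position: str, cooling_s: float) -> float:
          # D1S: one MCNP run with an ad hoc neutron/decay-gamma cross section library (stub).
          return 1e-6 * 1.2e5 / (1.0 + cooling_s)

      cooling = 3600.0                                            # 1 hour after shutdown
      flux = mcnp_neutron_transport("jet_model")
      r2s = mcnp_gamma_transport(fispact_activation(flux, cooling), "outside_position_1")
      d1s = d1s_single_run("jet_model", "outside_position_1", cooling)
      print(f"R2S dose rate: {r2s:.3e}   D1S dose rate: {d1s:.3e}  (arbitrary units)")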

  9. Two computational approaches for Monte Carlo based shutdown dose rate calculation with applications to the JET fusion machine

    International Nuclear Information System (INIS)

    Petrizzi, L.; Batistoni, P.; Migliori, S.; Chen, Y.; Fischer, U.; Pereslavtsev, P.; Loughlin, M.; Secco, A.

    2003-01-01

    In deuterium-deuterium (D-D) and deuterium-tritium (D-T) fusion plasmas, neutrons are produced that cause activation of JET machine components. For safe operation and maintenance it is important to be able to predict the induced activation and the resulting shutdown dose rates. This requires a suitable system of codes capable of simulating both the neutron-induced material activation during operation and the decay gamma radiation transport after shutdown in the proper 3-D geometry. Two methodologies for calculating the dose rate in fusion devices have been developed recently and applied to fusion machines, both using the MCNP Monte Carlo code. FZK has developed a more classical approach, the rigorous 2-step (R2S) system, in which MCNP is coupled to the FISPACT inventory code with an automated routing. ENEA, in collaboration with the ITER Team, has developed an alternative approach, the direct 1-step method (D1S), in which neutron and decay gamma transport are handled in one single MCNP run using an ad hoc cross section library. The intention was to tightly couple the neutron-induced production of a radio-isotope and the emission of its decay gammas, giving an accurate spatial distribution and a reliable calculated statistical error. The two methods have been used by the two Associations to calculate the dose rate at five positions of the JET machine, two inside the vacuum chamber and three outside, at cooling times between 1 second and 1 year after shutdown. The same MCNP model and irradiation conditions have been assumed. The exercise has been proposed and financed in the frame of the Fusion Technological Program of the JET machine. The aim is to supply designers with the most reliable tools and data to calculate the dose rate on fusion machines. Results showed good agreement: the differences range between 5% and 35%. The next step to be considered in 2003 will be an exercise in which the comparison will be done with dose-rate data from JET taken during and

  10. Translation Techniques

    OpenAIRE

    Marcia Pinheiro

    2015-01-01

    In this paper, we discuss three translation techniques: literal, cultural, and artistic. Literal translation is a well-known technique, which means that it is quite easy to find sources on the topic. Cultural and artistic translation may be new terms. Whilst cultural translation focuses on matching contexts, artistic translation focuses on matching reactions. Because literal translation matches only words, it is not hard to find situations in which we should not use this technique.  Because a...

  11. A least square support vector machine-based approach for contingency classification and ranking in a large power system

    Directory of Open Access Journals (Sweden)

    Bhanu Pratap Soni

    2016-12-01

    Full Text Available This paper proposes an effective supervised learning approach for static security assessment of a large power system. The supervised learning approach employs a least square support vector machine (LS-SVM) to rank the contingencies and predict the system severity level. The severity of a contingency is measured by two scalar performance indices (PIs): the line MVA performance index (PIMVA) and the voltage-reactive power performance index (PIVQ). The LS-SVM works in two steps: in Step I, both standard indices (PIMVA and PIVQ) are estimated under different operating scenarios; in Step II, contingency ranking is carried out based on the values of the PIs. The effectiveness of the proposed methodology is demonstrated on the IEEE 39-bus (New England) system. The approach can be a beneficial tool for less time-consuming and accurate security assessment and contingency analysis at the energy management center.
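
    A hedged sketch of the two-step scheme on synthetic data: the line MVA performance index is computed as PIMVA = Σ(Si/Simax)^(2n), and scikit-learn's kernel ridge regression is used as a stand-in for the LS-SVM regressor that predicts and ranks contingency severity.

      # Stand-in for LS-SVM: kernel ridge regression with an RBF kernel.
      import numpy as np
      from sklearn.kernel_ridge import KernelRidge

      rng = np.random.default_rng(0)
      n_cases, n_lines, n = 500, 20, 2                            # synthetic contingency cases, exponent n

      loading = rng.uniform(0.2, 1.3, size=(n_cases, n_lines))    # per-line S_i / S_i^max ratios
      pi_mva = np.sum(loading ** (2 * n), axis=1)                 # Step I target: PI_MVA per case

      model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.5).fit(loading[:400], pi_mva[:400])

      # Step II: rank unseen contingencies by the predicted severity index.
      ranking = np.argsort(model.predict(loading[400:]))[::-1]
      print("most severe contingency cases (by predicted PI_MVA):", ranking[:5])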

  12. Sound Effects in Translation

    DEFF Research Database (Denmark)

    Mees, Inger M.; Dragsted, Barbara; Gorm Hansen, Inge

    2013-01-01

    On the basis of a pilot study using speech recognition (SR) software, this paper attempts to illustrate the benefits of adopting an interdisciplinary approach in translator training. It shows how the collaboration between phoneticians, translators and interpreters can (1) advance research, (2) have......), Translog was employed to measure task times. The quality of the products was assessed by three experienced translators, and the number and types of misrecognitions were identified by a phonetician. Results indicate that SR translation provides a potentially useful supplement to written translation...

  13. Icing Detection over East Asia from Geostationary Satellite Data Using Machine Learning Approaches

    Directory of Open Access Journals (Sweden)

    Seongmun Sim

    2018-04-01

    Full Text Available Even though deicing and airframe coating technologies continue to develop, aircraft icing is still one of the critical threats to aviation. While the detection of potential icing clouds has been conducted using geostationary satellite data in the US and Europe, there is not yet a robust model that detects potential icing areas in East Asia. In this study, we proposed machine-learning-based icing detection models using data from two geostationary satellites—the Communication, Ocean, and Meteorological Satellite (COMS) Meteorological Imager (MI) and the Himawari-8 Advanced Himawari Imager (AHI)—over Northeast Asia. Two machine learning techniques—random forest (RF) and multinomial log-linear (MLL) models—were evaluated with quality-controlled pilot reports (PIREPs) as the reference data. The machine-learning-based models were compared to the existing models through five-fold cross-validation. The RF model for COMS MI produced the best performance, resulting in a mean probability of detection (POD) of 81.8%, a mean overall accuracy (OA) of 82.1%, and a mean true skill statistic (TSS) of 64.0%. One of the existing models, the flight icing threat (FIT), produced relatively poor performance, providing a mean POD of 36.4%, a mean OA of 61.0%, and a mean TSS of 9.7%. The Himawari-8-based models also produced performance comparable to the COMS models. However, it should be noted that very limited PIREP reference data were available, especially for the Himawari-8 models, which requires further evaluation in the future with more reference data. The spatio-temporal patterns of the icing areas detected using the developed models were also visually examined using time-series satellite data.
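
    A short sketch of the random forest variant and of how the reported scores (POD, OA, TSS) are obtained from a 2x2 confusion matrix; the satellite channel features and PIREP labels are replaced here by synthetic data.

      # Random forest icing classifier plus POD / OA / TSS from the confusion matrix.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import confusion_matrix
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=3000, n_features=16, weights=[0.7, 0.3],
                                 random_state=1)            # stand-in for MI/AHI channels and PIREP labels
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)

      rf = RandomForestClassifier(n_estimators=300, random_state=1).fit(X_tr, y_tr)
      tn, fp, fn, tp = confusion_matrix(y_te, rf.predict(X_te)).ravel()

      pod = tp / (tp + fn)                                  # probability of detection
      pofd = fp / (fp + tn)                                 # probability of false detection
      oa = (tp + tn) / (tp + tn + fp + fn)                  # overall accuracy
      tss = pod - pofd                                      # true skill statistic
      print(f"POD={pod:.3f}  OA={oa:.3f}  TSS={tss:.3f}")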

  14. Classification of suicide attempters in schizophrenia using sociocultural and clinical features: A machine learning approach.

    Science.gov (United States)

    Hettige, Nuwan C; Nguyen, Thai Binh; Yuan, Chen; Rajakulendran, Thanara; Baddour, Jermeen; Bhagwat, Nikhil; Bani-Fatemi, Ali; Voineskos, Aristotle N; Mallar Chakravarty, M; De Luca, Vincenzo

    2017-07-01

    Suicide is a major concern for those afflicted by schizophrenia. Identifying patients at the highest risk for future suicide attempts remains a complex problem for psychiatric interventions. Machine learning models allow for the integration of many risk factors in order to build an algorithm that predicts which patients are likely to attempt suicide. Currently it is unclear how to integrate previously identified risk factors into a clinically relevant predictive tool to estimate the probability that a patient with schizophrenia will attempt suicide. We conducted a cross-sectional assessment of a sample of 345 participants diagnosed with schizophrenia spectrum disorders. Suicide attempters and non-attempters were clearly identified using the Columbia Suicide Severity Rating Scale (C-SSRS) and the Beck Suicide Ideation Scale (BSS). We developed four classification algorithms using regularized logistic regression, random forest, elastic net and support vector machine models with sociocultural and clinical variables as features to train the models. All classification models performed similarly in identifying suicide attempters and non-attempters. Our regularized logistic regression model demonstrated an accuracy of 67% and an area under the curve (AUC) of 0.71, while the random forest model demonstrated 66% accuracy and an AUC of 0.67. The support vector classifier (SVC) model demonstrated an accuracy of 67% and an AUC of 0.70, and the elastic net model demonstrated an accuracy of 65% and an AUC of 0.71. Machine learning algorithms offer a relatively successful method for incorporating many clinical features to predict individuals at risk for future suicide attempts. Increased performance of these models using clinically relevant variables offers the potential to facilitate early treatment and intervention to prevent future suicide attempts. Copyright © 2017 Elsevier Inc. All rights reserved.
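
    A minimal sketch comparing the four classifier families named in this record by cross-validated AUC with scikit-learn; the sociocultural and clinical features are replaced by synthetic data of the same sample size.

      # Regularized logistic regression, random forest, elastic net and SVC compared by AUC.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=345, n_features=25, random_state=2)

      models = {
          "ridge logistic": LogisticRegression(penalty="l2", C=1.0, max_iter=2000),
          "random forest": RandomForestClassifier(n_estimators=300, random_state=2),
          "elastic net": LogisticRegression(penalty="elasticnet", l1_ratio=0.5,
                                            solver="saga", max_iter=5000),
          "SVC": SVC(kernel="rbf", probability=True),
      }
      for name, clf in models.items():
          auc = cross_val_score(make_pipeline(StandardScaler(), clf),
                                X, y, cv=5, scoring="roc_auc").mean()
          print(f"{name:15s} AUC = {auc:.2f}")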

  15. Multisensory-Based Rehabilitation Approach: Translational Insights from Animal Models to Early Intervention

    Directory of Open Access Journals (Sweden)

    Giulia Purpura

    2017-07-01

    Full Text Available Multisensory processes permit the combination of several inputs coming from different sensory systems, allowing for a coherent representation of biological events and facilitating adaptation to the environment. For these reasons, their application in neurological and neuropsychological rehabilitation has expanded in recent decades. Recent studies on animal and human models have indicated that, on the one hand, multisensory integration matures gradually during post-natal life and its development is closely linked to environment and experience and, on the other hand, that modality-specific information does not seem to benefit from redundancy across multiple sense modalities and is more readily perceived in unimodal than in multimodal stimulation. In this review, the development of multisensory processes is analyzed, highlighting in animal and human models the clinical effects of their manipulation for the rehabilitation of sensory disorders. In addition, new methods of early intervention based on a multisensory rehabilitation approach and their applications to different infant populations at risk of neurodevelopmental disabilities are discussed.

  16. Impact of Health Care Employees’ Job Satisfaction on Organizational Performance: Support Vector Machine Approach

    Directory of Open Access Journals (Sweden)

    CEMIL KUZEY

    2018-01-01

    Full Text Available This study is undertaken to search for key factors that contribute to job satisfaction among health care workers, and also to determine the impact of these underlying dimensions of employee satisfaction on organizational performance. Exploratory Factor Analysis (EFA) is applied to initially uncover the key factors, and then, in the next stage of analysis, a popular data mining technique, Support Vector Machine (SVM), is employed on a sample of 249 to determine the impact of job satisfaction factors on organizational performance. According to the proposed model, the main factors are revealed to be management’s attitude, pay/reward, job security and colleagues.
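
    A hedged two-stage sketch of the analysis: a factor-analysis step (scikit-learn's FactorAnalysis stands in for the EFA) feeds an SVM that relates the latent satisfaction dimensions to a performance label; the survey responses are simulated.

      # EFA-like dimensionality reduction followed by an SVM classifier.
      from sklearn.datasets import make_classification
      from sklearn.decomposition import FactorAnalysis
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # Stand-in for 249 respondents answering roughly 30 satisfaction items.
      X, y = make_classification(n_samples=249, n_features=30, n_informative=8, random_state=3)

      model = make_pipeline(
          StandardScaler(),
          FactorAnalysis(n_components=4, random_state=3),   # e.g. management, pay, security, colleagues
          SVC(kernel="rbf", C=1.0))
      print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())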

  17. Complete clinical responses to cancer therapy caused by multiple divergent approaches: a repeating theme lost in translation

    Directory of Open Access Journals (Sweden)

    Coventry BJ

    2012-05-01

    Full Text Available Brendon J Coventry, Martin L Ashdown; Discipline of Surgery, University of Adelaide, Royal Adelaide Hospital and Faculty of Medicine, University of Melbourne, Australia. Abstract: Over 50 years of cancer therapy history reveals that complete clinical responses (CRs) from remarkably divergent forms of therapies (eg, chemotherapy, radiotherapy, surgery, vaccines, autologous cell transfers, cytokines, monoclonal antibodies) for advanced solid malignancies occur with an approximately similar frequency of 5%–10%. This has remained frustratingly almost static. However, CRs usually underpin strong durable 5-year patient survival. How can this apparent paradox be explained? Over some 20 years, the realization that (1) chronic inflammation is intricately associated with cancer, and (2) the immune system is delicately balanced between responsiveness to and tolerance of cancer, provides a highly significant insight into ways cancer might be more effectively treated. In this review, divergent aspects from the largely segmented literature and recent conferences are drawn together to provide observations revealing some emerging reasoning, in terms of "final common pathways" of cancer cell damage, immune stimulation, and auto-vaccination events, ultimately leading to cancer cell destruction. Created from this is a unifying overarching concept to explain why multiple approaches to cancer therapy can provide complete responses at almost equivalent rates. This "missing" aspect provides a reasoned explanation for what has been, and is being, increasingly reported in the mainstream literature – that inflammatory and immune responses appear intricately associated with, if not causative of, complete responses induced by divergent forms of cancer therapy. Curiously, whether by chemotherapy, radiation, surgery, or other means, therapy-induced cell injury results, leaving inflammation and immune system stimulation as a final common denominator across all of these mechanisms of cancer

  18. An efficient approach for improving virtual machine placement in cloud computing environment

    Science.gov (United States)

    Ghobaei-Arani, Mostafa; Shamsi, Mahboubeh; Rahmanian, Ali A.

    2017-11-01

    The ever-increasing demand for cloud services requires more data centres. The power consumption in data centres is a challenging problem for cloud computing, one that has not been considered properly by data centre developer companies. Large data centres in particular struggle with power costs and greenhouse gas production. Hence, employing power-efficient mechanisms is necessary to mitigate these effects. Moreover, virtual machine (VM) placement can be used as an effective method to reduce the power consumption in data centres. In this paper, by grouping both virtual and physical machines and taking into account the maximum absolute deviation during VM placement, the power consumption as well as the service level agreement (SLA) violation in data centres are reduced. To this end, the best-fit decreasing algorithm is utilised in the simulation to reduce the power consumption by about 5% compared to the modified best-fit decreasing algorithm, and at the same time, the SLA violation is improved by 6%. Finally, learning automata are used to trade off power consumption reduction on one side against the SLA violation percentage on the other.
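
    An illustrative best-fit decreasing placement in Python under a simplified single-resource model (demands and capacities are assumptions): VMs are sorted by demand and each is placed on the active host that leaves the least slack, so fewer hosts stay powered on.

      # Best-fit decreasing VM placement on identical hosts (single CPU dimension).
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class Host:
          capacity: float
          used: float = 0.0
          vms: List[float] = field(default_factory=list)

          def fits(self, demand: float) -> bool:
              return self.used + demand <= self.capacity

      def best_fit_decreasing(vm_demands: List[float], host_capacity: float) -> List[Host]:
          hosts: List[Host] = []
          for demand in sorted(vm_demands, reverse=True):        # decreasing order of demand
              candidates = [h for h in hosts if h.fits(demand)]
              if candidates:                                     # best fit: least remaining slack
                  target = min(candidates, key=lambda h: h.capacity - h.used - demand)
              else:                                              # open a new host only when forced
                  target = Host(host_capacity)
                  hosts.append(target)
              target.used += demand
              target.vms.append(demand)
          return hosts

      for i, h in enumerate(best_fit_decreasing([0.4, 0.2, 0.7, 0.1, 0.5, 0.3, 0.6], 1.0)):
          print(f"host {i}: load={h.used:.1f} vms={h.vms}")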

  19. A Support Vector Machine Approach for Truncated Fingerprint Image Detection from Sweeping Fingerprint Sensors

    Science.gov (United States)

    Chen, Chi-Jim; Pai, Tun-Wen; Cheng, Mox

    2015-01-01

    A sweeping fingerprint sensor converts fingerprints on a row-by-row basis through image reconstruction techniques. However, the reconstructed fingerprint image might appear truncated and distorted when the finger is swept across the fingerprint sensor at a non-linear speed. If truncated fingerprint images were enrolled as reference targets and collected by an automated fingerprint identification system (AFIS), successful prediction rates for fingerprint matching applications would decrease significantly. In this paper, a novel and effective methodology with low computational time complexity was developed for detecting truncated fingerprints in real time. Several filtering rules were implemented to validate the existence of truncated fingerprints. In addition, a machine learning method, the support vector machine (SVM), based on the principle of structural risk minimization, was applied to reject pseudo-truncated fingerprints containing characteristics similar to truncated ones. The experimental results show that an accuracy rate of 90.7% was achieved by successfully identifying truncated fingerprint images from testing images before AFIS enrollment procedures. The proposed effective and efficient methodology can be extensively applied to all existing fingerprint matching systems as a preliminary quality control prior to the construction of fingerprint templates. PMID:25835186
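
    A toy two-stage sketch of the detection flow: cheap filtering rules flag candidate truncated images, then an SVM makes the final call. The two features, thresholds and labels below are illustrative assumptions, not the paper's.

      # Stage 1: rule-based pre-filter.  Stage 2: SVM trained on labelled examples.
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(5)

      def rule_filter(row_overlap: float, height_ratio: float) -> bool:
          # Loose rules flagging an image as possibly truncated.
          return row_overlap < 0.15 or height_ratio < 0.6

      # Toy features: row-to-row overlap and reconstructed-to-expected height ratio.
      X_train = rng.uniform(0.0, 1.0, size=(400, 2))
      y_train = ((X_train[:, 0] < 0.15) & (X_train[:, 1] < 0.6)).astype(int)   # toy labels
      svm = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)

      candidate = np.array([[0.10, 0.55]])
      if rule_filter(*candidate[0]) and svm.predict(candidate)[0] == 1:
          print("flagged as truncated: reject before AFIS enrollment")
      else:
          print("accepted for enrollment")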

  20. A Machine Learning Approach for Hot-Spot Detection at Protein-Protein Interfaces

    Directory of Open Access Journals (Sweden)

    Rita Melo

    2016-07-01

    Full Text Available Understanding protein-protein interactions is a key challenge in biochemistry. In this work, we describe a methodology to predict Hot-Spots (HS) in protein-protein interfaces from their native complex structure that is more accurate than previously published Machine Learning (ML) techniques. Our model is trained on a large number of complexes and on a significantly larger number of different structural- and evolutionary sequence-based features. In particular, we added interface size, type of interaction between residues at the interface of the complex, number of different types of residues at the interface and the Position-Specific Scoring Matrix (PSSM), for a total of 79 features. We used twenty-seven algorithms, ranging from a simple linear-based function to support-vector machine models with different cost functions. The best model was achieved by the use of the conditional inference random forest (c-forest) algorithm with a dataset pre-processed by the normalization of features and with up-sampling of the minority class. The method has an overall accuracy of 0.80, an F1-score of 0.73, a sensitivity of 0.76 and a specificity of 0.82 on the independent test set.
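
    A sketch of the winning configuration on synthetic data: minority-class up-sampling of the training split plus a forest classifier (scikit-learn's RandomForestClassifier stands in for the conditional inference forest), scored with the same metrics reported above.

      # Up-sample the hot-spot (minority) class, train a forest, report the four metrics.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import confusion_matrix, f1_score
      from sklearn.model_selection import train_test_split
      from sklearn.utils import resample

      X, y = make_classification(n_samples=1500, n_features=79, weights=[0.8, 0.2],
                                 random_state=6)           # 79 features, hot-spots in the minority
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=6)

      # Up-sample the minority class in the training split only.
      boosted = resample(X_tr[y_tr == 1], n_samples=int((y_tr == 0).sum()), random_state=6)
      X_bal = np.vstack([X_tr[y_tr == 0], boosted])
      y_bal = np.array([0] * int((y_tr == 0).sum()) + [1] * len(boosted))

      forest = RandomForestClassifier(n_estimators=500, random_state=6).fit(X_bal, y_bal)
      pred = forest.predict(X_te)
      tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
      print(f"accuracy={(tp + tn) / len(y_te):.2f}  F1={f1_score(y_te, pred):.2f}  "
            f"sensitivity={tp / (tp + fn):.2f}  specificity={tn / (tn + fp):.2f}")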