WorldWideScience

Sample records for knowledge-based machine indexing

  1. Knowledge-based machine indexing from natural language text: Knowledge base design, development, and maintenance

    Science.gov (United States)

    Genuardi, Michael T.

    1993-01-01

    One strategy for machine-aided indexing (MAI) is to provide a concept-level analysis of the textual elements of documents or document abstracts. In such systems, natural-language phrases are analyzed in order to identify and classify concepts related to a particular subject domain. The overall performance of these MAI systems is largely dependent on the quality and comprehensiveness of their knowledge bases. These knowledge bases function to (1) define the relations between a controlled indexing vocabulary and natural language expressions; (2) provide a simple mechanism for disambiguation and the determination of relevancy; and (3) allow the extension of concept-hierarchical structure to all elements of the knowledge file. After a brief description of the NASA Machine-Aided Indexing system, concerns related to the development and maintenance of MAI knowledge bases are discussed. Particular emphasis is given to statistically-based text analysis tools designed to aid the knowledge base developer. One such tool, the Knowledge Base Building (KBB) program, presents the domain expert with a well-filtered list of synonyms and conceptually-related phrases for each thesaurus concept. Another tool, the Knowledge Base Maintenance (KBM) program, functions to identify areas of the knowledge base affected by changes in the conceptual domain (for example, the addition of a new thesaurus term). An alternate use of the KBM as an aid in thesaurus construction is also discussed.
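The core of such a knowledge base, a mapping from natural-language phrases to controlled thesaurus terms, can be sketched in a few lines. The phrases, thesaurus terms, and the longest-match-first strategy below are illustrative assumptions, not the actual NASA MAI implementation.

```python
# Toy knowledge base: natural-language phrases mapped to controlled terms.
# Entries are invented for illustration.
KNOWLEDGE_BASE = {
    "neural network": "NEURAL NETS",
    "wind tunnel test": "WIND TUNNEL TESTS",
    "wind tunnel": "WIND TUNNELS",
    "machine learning": "MACHINE LEARNING",
}

def index_text(text, kb=KNOWLEDGE_BASE):
    """Return controlled terms for every KB phrase found, preferring longer matches."""
    text = text.lower()
    found = []
    # Scan phrases longest-first so "wind tunnel test" wins over "wind tunnel".
    for phrase in sorted(kb, key=len, reverse=True):
        if phrase in text:
            found.append(kb[phrase])
            text = text.replace(phrase, " ")  # block shorter sub-phrase matches
    return sorted(set(found))
```

A real MAI knowledge base adds the disambiguation and concept-hierarchy machinery the abstract describes; this sketch only shows the vocabulary-mapping step.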

  2. Machine intelligence and knowledge bases

    Energy Technology Data Exchange (ETDEWEB)

    Furukawa, K

    1981-09-01

The basic functions necessary in machine intelligence are a knowledge base and a logic programming language, such as PROLOG, using deductive reasoning. Recently, inductive reasoning based on meta-knowledge and default reasoning have been developed. The creative thought model of Lenat is reviewed and the concept of knowledge engineering is introduced. 17 references.

  3. The research on construction and application of machining process knowledge base

    Science.gov (United States)

    Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai

    2018-03-01

In order to realize the application of knowledge in machining process design, from the perspective of knowledge use in computer-aided process planning (CAPP), a hierarchical knowledge classification structure is established according to the characteristics of the mechanical engineering field. Machining process knowledge is expressed by means of production rules and object-oriented methods, and three kinds of knowledge base models are constructed according to this representation. This paper gives the definition and classification of machining process knowledge, the knowledge models, and the application flow of knowledge-based process design, and carries out the main steps of machine tool design decisions as an application of the knowledge base.

  4. Development and evaluation of intelligent machine tools based on knowledge evolution in M2M environment

    International Nuclear Information System (INIS)

    Kim, Dong Hoon; Song, Jun Yeob; Lee, Jong Hyun; Cha, Suk Keun

    2009-01-01

In the near future, the foreseen improvement in machine tools will take the form of a knowledge-evolution-based intelligent device. The goal of this study is to develop intelligent machine tools with knowledge-evolution capability in a Machine-to-Machine (M2M) wired and wireless environment. Knowledge-evolution-based intelligent machine tools are expected to be capable of gathering knowledge autonomously, producing knowledge, understanding knowledge, reasoning over knowledge, making new decisions, dialoguing with other machines, and so on. The concept of the knowledge-evolution intelligent machine originated from the way a human expert operates machine control through sensing, dialogue and decision. This paper presents the structure of knowledge evolution in M2M and the scheme for a dialogue agent among agent-based modules such as a sensory agent, a dialogue agent and an expert system (decision support agent); work-offset compensation for thermal change and recommendation of cutting conditions are performed on-line to verify knowledge evolution.

  5. Hybrid forecasting of chaotic processes: Using machine learning in conjunction with a knowledge-based model

    Science.gov (United States)

    Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward

    2018-04-01

A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of machine learning known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
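A drastically simplified, numpy-only illustration of the hybrid idea (not the paper's reservoir-computing implementation): an imperfect "knowledge-based" model of the logistic map, with a deliberately wrong growth parameter, is corrected by a data-driven term fitted to past observations by least squares. All parameters and features here are invented for illustration.

```python
import numpy as np

def true_map(x):        # the unknown real dynamics
    return 3.9 * x * (1 - x)

def imperfect_model(x):  # knowledge-based model with a parameter error
    return 3.6 * x * (1 - x)

rng = np.random.default_rng(0)
x = rng.uniform(0.1, 0.9, 500)          # past observations (training data)
y = true_map(x)

# Fit a correction y - imperfect(x) ≈ w0 + w1*x + w2*x^2 by least squares.
A = np.column_stack([np.ones_like(x), x, x**2])
w, *_ = np.linalg.lstsq(A, y - imperfect_model(x), rcond=None)

def hybrid_model(x):
    """Knowledge-based prediction plus the learned data-driven correction."""
    return imperfect_model(x) + w[0] + w[1] * x + w[2] * x**2

x_test = rng.uniform(0.1, 0.9, 100)
err_kb = np.abs(imperfect_model(x_test) - true_map(x_test)).mean()
err_hy = np.abs(hybrid_model(x_test) - true_map(x_test)).mean()
```

The learned term compensates for the model's parameter error, so the hybrid's one-step error falls well below that of the knowledge-based model alone, which is the qualitative behavior the abstract reports for the reservoir-computing hybrid.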

  6. Three dimensional pattern recognition using feature-based indexing and rule-based search

    Science.gov (United States)

    Lee, Jae-Kyu

    In flexible automated manufacturing, robots can perform routine operations as well as recover from atypical events, provided that process-relevant information is available to the robot controller. Real time vision is among the most versatile sensing tools, yet the reliability of machine-based scene interpretation can be questionable. The effort described here is focused on the development of machine-based vision methods to support autonomous nuclear fuel manufacturing operations in hot cells. This thesis presents a method to efficiently recognize 3D objects from 2D images based on feature-based indexing. Object recognition is the identification of correspondences between parts of a current scene and stored views of known objects, using chains of segments or indexing vectors. To create indexed object models, characteristic model image features are extracted during preprocessing. Feature vectors representing model object contours are acquired from several points of view around each object and stored. Recognition is the process of matching stored views with features or patterns detected in a test scene. Two sets of algorithms were developed, one for preprocessing and indexed database creation, and one for pattern searching and matching during recognition. At recognition time, those indexing vectors with the highest match probability are retrieved from the model image database, using a nearest neighbor search algorithm. The nearest neighbor search predicts the best possible match candidates. Extended searches are guided by a search strategy that employs knowledge-base (KB) selection criteria. The knowledge-based system simplifies the recognition process and minimizes the number of iterations and memory usage. Novel contributions include the use of a feature-based indexing data structure together with a knowledge base. 
Both components improve the efficiency of the recognition process, by better structuring the database of object features and by reducing database size.
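The retrieval step described above, a nearest-neighbor search over stored model-view feature vectors, can be sketched as follows. The view names and vectors are invented for illustration; a real system would index many views per object and use the knowledge-based search strategy on top of this.

```python
import numpy as np

# Toy index: each stored model view is a feature vector (values invented).
model_index = {
    "fuel_rod/view_00": np.array([0.9, 0.1, 0.4]),
    "fuel_rod/view_45": np.array([0.8, 0.3, 0.5]),
    "end_cap/view_00":  np.array([0.1, 0.9, 0.2]),
}

def nearest_models(query, index, k=2):
    """Return the k stored views closest to the query vector (Euclidean distance)."""
    dists = {name: np.linalg.norm(vec - query) for name, vec in index.items()}
    return sorted(dists, key=dists.get)[:k]
```

The returned candidates correspond to the "best possible match candidates" that the extended, knowledge-base-guided search then refines.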

  7. Research on cylindrical indexing cam’s unilateral machining

    Directory of Open Access Journals (Sweden)

    Junhua Chen

    2015-08-01

The cylindrical cam ridge of the indexer is a spatial curved surface that is difficult to design and machine, and cylindrical cams show defects after machining because conventional machining methods are inaccurate. This article proposes a precise way to machine an indexing cam using basic motion analysis and an analytic-geometry approach. Analytical methodology is first applied to the cam's motion analysis to obtain an error-free formula for the cam follower's trajectory; the continuous trajectory curve is then discretized at a resolution of one thousandth to create a three-dimensional discrete trajectory curve. Planar and spherical formulae can be built on these loci. Based on the machining principle, the cutting tool's position and orientation are taken into account. The article evaluates the resulting formula set and obtains the final cutter-path coordinate values; the error-free cutter-path trajectory is called the unilateral machining trajectory, and it is compiled into a numerical control processing schedule. This methodology provides a convenient and precise way to manufacture a cylindrical indexing cam, and experimental results support it.

  8. Robust Visual Knowledge Transfer via Extreme Learning Machine Based Domain Adaptation.

    Science.gov (United States)

    Zhang, Lei; Zhang, David

    2016-08-10

We address the problem of visual knowledge adaptation by leveraging labeled patterns from a source domain and a very limited number of labeled instances in a target domain to learn a robust classifier for visual categorization. This paper proposes a new extreme learning machine based cross-domain network learning framework, called Extreme Learning Machine (ELM) based Domain Adaptation (EDA). It allows us to learn a category transformation and an ELM classifier with random projection by simultaneously minimizing the norm of the network output weights and the learning error. The unlabeled target data, as useful knowledge, are also integrated as a fidelity term to guarantee stability during cross-domain learning. The framework minimizes the matching error between the learned classifier and a base classifier, such that many existing classifiers can be readily incorporated as base classifiers. The network output weights can not only be analytically determined but are also transferrable. Additionally, a manifold regularization with a Laplacian graph is incorporated, which benefits semi-supervised learning. We further extend the model to multiple views, referred to as MvEDA. Experiments on benchmark visual datasets for video event recognition and object recognition demonstrate that our EDA methods outperform existing cross-domain learning methods.
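The ELM core that EDA builds on, a single hidden layer with random fixed weights whose output weights are determined analytically by least squares, can be sketched in a few lines. The domain-adaptation, fidelity, and manifold-regularization terms of EDA are omitted, and the toy classification task is invented.

```python
import numpy as np

rng = np.random.default_rng(42)

def elm_fit(X, y, hidden=50):
    """Fit a basic ELM: random hidden layer, analytic (least-squares) output weights."""
    W = rng.normal(size=(X.shape[1], hidden))     # random input weights, never trained
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                        # random nonlinear feature map
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # analytically determined output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy binary task: classify points by which side of a circle they fall on.
X = rng.uniform(-1, 1, size=(400, 2))
y = (X[:, 0]**2 + X[:, 1]**2 > 0.5).astype(float)
model = elm_fit(X, y)
acc = ((elm_predict(model, X) > 0.5) == y).mean()
```

Because only the output weights are solved for, training reduces to one linear least-squares problem, which is what makes the "analytically determined" and transferrable output weights of the abstract possible.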

  9. Novel nonlinear knowledge-based mean force potentials based on machine learning.

    Science.gov (United States)

    Dong, Qiwen; Zhou, Shuigeng

    2011-01-01

The prediction of 3D structures of proteins from amino acid sequences is one of the most challenging problems in molecular biology. An essential task in solving this problem with coarse-grained models is to deduce effective interaction potentials. The development and evaluation of new energy functions are critical to accurately modeling the properties of biological macromolecules. Knowledge-based mean force potentials are derived from statistical analysis of proteins of known structure, and current knowledge-based potentials almost all take the form of a weighted linear sum of interaction pairs. In this study, a class of novel nonlinear knowledge-based mean force potentials is presented, whose parameters are obtained by nonlinear classifiers instead of relative frequencies of interaction pairs against a reference state or linear classifiers. A support vector machine is used to derive the potential parameters on data sets that contain both native structures and decoy structures. Five Boltzmann-based or linear knowledge-based mean force potentials are introduced and their corresponding nonlinear potentials implemented: the DIH potential (single-body residue-level Boltzmann-based potential), the DFIRE-SCM potential (two-body residue-level Boltzmann-based potential), the FS potential (two-body atom-level Boltzmann-based potential), the HR potential (two-body residue-level linear potential), and the T32S3 potential (two-body atom-level linear potential). Experiments are performed on well-established decoy sets, including the LKF data set, the CASP7 data set, and the Decoys 'R' Us data set. The evaluation metrics include the energy Z score and the ability of each potential to discriminate native structures from a set of decoy structures. Experimental results show that all nonlinear potentials significantly outperform the corresponding Boltzmann-based or linear potentials, and that the proposed discriminative framework is effective in developing knowledge-based mean force potentials.
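A minimal sketch of the discriminative idea, with plain logistic regression standing in for the support vector machine and synthetic two-dimensional "interaction" features standing in for real structural descriptors: the classifier is trained to separate native-like from decoy-like feature vectors, and its score is then read as an energy.

```python
import numpy as np

# Synthetic features: natives and decoys drawn from separated Gaussians (invented).
rng = np.random.default_rng(1)
natives = rng.normal(loc=[5.0, 2.0], scale=0.5, size=(100, 2))
decoys  = rng.normal(loc=[3.0, 4.0], scale=0.5, size=(100, 2))
X = np.vstack([natives, decoys])
y = np.array([0] * 100 + [1] * 100)       # 0 = native (low energy), 1 = decoy

# Plain gradient descent on the logistic loss.
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.1 * X.T @ g / len(y)
    b -= 0.1 * g.mean()

def energy(features):
    """Discriminatively learned pseudo-energy: lower means more native-like."""
    return features @ w + b

e_native = energy(natives).mean()
e_decoy = energy(decoys).mean()
```

The learned decision function assigns natives lower scores than decoys, which is exactly the discrimination criterion the abstract uses to evaluate the real SVM-derived potentials.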

  10. Knowledge-based support for design and operational use of human-machine interfaces

    International Nuclear Information System (INIS)

    Johannsen, G.

    1994-01-01

    The possibilities for knowledge support of different human user classes, namely operators, operational engineers and designers of human-machine interfaces, are discussed. Several human-machine interface functionalities are briefly explained. The paper deals with such questions as which type of knowledge is needed for design and operation, how to represent it, where to get it from, how to process it, and how to consider and use it. The relationships between design and operational use are thereby emphasised. (author)

  11. MEDLINE MeSH Indexing: Lessons Learned from Machine Learning and Future Directions

    DEFF Research Database (Denmark)

    Jimeno-Yepes, Antonio; Mork, James G.; Wilkowski, Bartlomiej

    2012-01-01

The Medical Text Indexer (MTI) consists of MetaMap and a k-NN approach called PubMed Related Citations (PRC). Our motivation is to improve the quality of MTI based on machine learning. Typical machine learning approaches fit this indexing task into text categorization. In this work, we have studied some Medical Subject Headings (MeSH) recommended by MTI and analyzed the issues when using standard machine learning algorithms. We show that in some cases machine learning can improve the annotations already recommended by MTI, that machine learning based on low variance methods achieves better performance, and that each MeSH heading presents a different behavior.

  12. A SEMI-AUTOMATIC RULE SET BUILDING METHOD FOR URBAN LAND COVER CLASSIFICATION BASED ON MACHINE LEARNING AND HUMAN KNOWLEDGE

    Directory of Open Access Journals (Sweden)

    H. Y. Gu

    2017-09-01

A classification rule set, comprising features and decision rules, is important for land cover classification. In GEOBIA, the selection of features and decisions is often based on an iterative trial-and-error approach, which is time-consuming and has poor versatility. This study puts forward a rule-set building method for land cover classification based on human knowledge and machine learning. Machine learning is used to build rule sets efficiently, replacing the iterative trial-and-error approach; human knowledge compensates for the insufficient use of prior knowledge in existing machine learning methods and improves the versatility of the rule sets. A two-step workflow is introduced: first, an initial rule is built based on Random Forest and a CART decision tree; second, the initial rule is analyzed and validated against human knowledge, with a statistical confidence interval used to determine its threshold. The test site is located in Potsdam City, using the TOP, DSM and ground truth data. The results show that the method can determine a rule set for land cover classification semi-automatically, and that there are static features for the different land cover classes.
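The validation step, replacing a learned threshold with one derived from a statistical confidence interval of the feature within each class, can be sketched as follows. The feature (NDVI-like values), class distributions, and the midpoint rule for placing the threshold are illustrative assumptions.

```python
import numpy as np

def ci95(samples):
    """95% confidence interval of the mean (normal approximation)."""
    m = samples.mean()
    se = samples.std(ddof=1) / np.sqrt(len(samples))
    return m - 1.96 * se, m + 1.96 * se

# Synthetic per-class feature samples (invented): vegetation vs built-up NDVI.
rng = np.random.default_rng(7)
ndvi_tree = rng.normal(0.7, 0.05, 200)    # vegetation class
ndvi_roof = rng.normal(0.2, 0.05, 200)    # built-up class

lo_tree, _ = ci95(ndvi_tree)
_, hi_roof = ci95(ndvi_roof)
threshold = (lo_tree + hi_roof) / 2       # rule: NDVI > threshold → vegetation
```

Because the two class intervals do not overlap, a threshold between them yields a rule that a domain expert can read and verify, which is the point of validating learned rules with human knowledge.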

  13. Evaluation on knowledge extraction and machine learning in ...

    African Journals Online (AJOL)

Evaluation on knowledge extraction and machine learning in resolving Malay word ambiguity. Keywords: ambiguity; lexical knowledge; machine learning; Malay word ...

  14. Development of a Late-Life Dementia Prediction Index with Supervised Machine Learning in the Population-Based CAIDE Study.

    Science.gov (United States)

    Pekkala, Timo; Hall, Anette; Lötjönen, Jyrki; Mattila, Jussi; Soininen, Hilkka; Ngandu, Tiia; Laatikainen, Tiina; Kivipelto, Miia; Solomon, Alina

    2017-01-01

    This study aimed to develop a late-life dementia prediction model using a novel validated supervised machine learning method, the Disease State Index (DSI), in the Finnish population-based CAIDE study. The CAIDE study was based on previous population-based midlife surveys. CAIDE participants were re-examined twice in late-life, and the first late-life re-examination was used as baseline for the present study. The main study population included 709 cognitively normal subjects at first re-examination who returned to the second re-examination up to 10 years later (incident dementia n = 39). An extended population (n = 1009, incident dementia 151) included non-participants/non-survivors (national registers data). DSI was used to develop a dementia index based on first re-examination assessments. Performance in predicting dementia was assessed as area under the ROC curve (AUC). AUCs for DSI were 0.79 and 0.75 for main and extended populations. Included predictors were cognition, vascular factors, age, subjective memory complaints, and APOE genotype. The supervised machine learning method performed well in identifying comprehensive profiles for predicting dementia development up to 10 years later. DSI could thus be useful for identifying individuals who are most at risk and may benefit from dementia prevention interventions.
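The evaluation metric used above, the area under the ROC curve for an index score, can be computed directly from the pairwise-comparison identity AUC = P(score of a case > score of a control). The scores and labels below are invented toy data, not CAIDE values.

```python
import numpy as np

def auc(scores, labels):
    """AUC via the rank-sum identity: fraction of case/control pairs ranked correctly."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]             # cases (incident dementia)
    neg = scores[labels == 0]             # controls
    # Count pairwise wins; ties count half.
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

labels = [1, 1, 0, 0, 0]
scores = [0.9, 0.4, 0.5, 0.3, 0.1]        # toy risk-index values
```

With these toy values, five of the six case/control pairs are ranked correctly, giving an AUC of 5/6; the reported DSI AUCs of 0.79 and 0.75 are interpreted the same way.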

  15. Machine learning versus knowledge based classification of legal texts

    NARCIS (Netherlands)

    de Maat, E.; Krabben, K.; Winkels, R.; Winkels, R.G.F.

    2010-01-01

This paper presents results of an experiment in which we used machine learning (ML) techniques to classify sentences in Dutch legislation. These results are compared to those of a pattern-based classifier. Overall, the ML classifier performs as accurately (>90%) as the pattern-based one, but ...

  16. Virtual NC machine model with integrated knowledge data

    International Nuclear Information System (INIS)

    Sidorenko, Sofija; Dukovski, Vladimir

    2002-01-01

The concept of virtual NC machining was established to provide a virtual product that can be compared with the corresponding designed product, in order to evaluate NC program correctness without real experiments. This concept is applied in the intelligent CAD/CAM system named VIRTUAL MANUFACTURE. This paper presents the first intelligent module, which enables the creation of virtual models of existing NC machines and the virtual creation of new ones through modular composition. Creation of a virtual NC machine is carried out via automatic saving of knowledge data (the features of the created NC machine). (Author)

  17. Structural Damage Detection using Frequency Response Function Index and Surrogate Model Based on Optimized Extreme Learning Machine Algorithm

    Directory of Open Access Journals (Sweden)

    R. Ghiasi

    2017-09-01

Utilizing surrogate models based on artificial intelligence methods for detecting structural damage has attracted the attention of many researchers in recent decades. In this study, a new kernel based on the Littlewood-Paley Wavelet (LPW) is proposed for the Extreme Learning Machine (ELM) algorithm to improve the accuracy of detecting multiple damages in structural systems. ELM is used as a metamodel (surrogate model) of exact finite element analysis of structures, in order to efficiently reduce the computational cost of the updating process. In the proposed two-step method, a damage index based on the Frequency Response Function (FRF) of the structure is first used to identify the location of damage. In the second step, the severity of damage in the identified elements is detected using ELM. To evaluate the efficacy of ELM, the results obtained with the proposed kernel were compared with other kernels proposed for ELM, as well as with the Least Squares Support Vector Machine algorithm. The numerical problems solved indicate that the accuracy of the ELM algorithm in detecting structural damage increases drastically when the LPW kernel is used.

  18. Automated knowledge acquisition for second generation knowledge base systems: A conceptual analysis and taxonomy

    Energy Technology Data Exchange (ETDEWEB)

    Williams, K.E.; Kotnour, T.

    1991-01-01

    In this paper, we present a conceptual analysis of knowledge-base development methodologies. The purpose of this research is to help overcome the high cost and lack of efficiency in developing knowledge base representations for artificial intelligence applications. To accomplish this purpose, we analyzed the available methodologies and developed a knowledge-base development methodology taxonomy. We review manual, machine-aided, and machine-learning methodologies. A set of developed characteristics allows description and comparison among the methodologies. We present the results of this conceptual analysis of methodologies and recommendations for development of more efficient and effective tools.

  20. Using Blood Indexes to Predict Overweight Statuses: An Extreme Learning Machine-Based Approach.

    Directory of Open Access Journals (Sweden)

    Huiling Chen

The number of overweight people continues to rise across the world. Studies have shown that being overweight can increase health risks, such as high blood pressure, diabetes mellitus, coronary heart disease, and certain forms of cancer. Identifying overweight status is therefore critical to preventing and decreasing health risks. This study explores a new technique that uses blood and biochemical measurements to recognize the overweight condition. A machine learning technique, the extreme learning machine, was developed to accurately detect overweight status from a pool of 225 overweight and 251 healthy subjects (179 males and 297 females). The detection method was rigorously evaluated against the real-life dataset for accuracy, sensitivity, specificity, and the AUC (area under the receiver operating characteristic (ROC) curve) criterion. Additionally, feature selection was investigated to identify factors correlated with overweight status. The results demonstrate significant differences in blood and biochemical indexes between healthy and overweight people (p-value < 0.01). According to the feature selection, the most important correlated indexes are creatinine, hemoglobin, hematocrit, uric acid, red blood cells, high-density lipoprotein, alanine transaminase, triglyceride, and γ-glutamyl transpeptidase. These are consistent with the results of a Spearman test analysis. The proposed method holds promise as a new, accurate method for identifying overweight status in subjects.
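The cross-check mentioned above, a Spearman test, reduces to the Pearson correlation of the ranks. A minimal version without tie handling follows; the blood-index values are invented toy data, not the study's measurements.

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation for distinct values: Pearson correlation of ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)   # rank of each value in x
    ry = np.argsort(np.argsort(y)).astype(float)   # rank of each value in y
    rx -= rx.mean()
    ry -= ry.mean()
    return (rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry))

# Toy, monotonically related measurements (invented): rho should be exactly 1.
creatinine = [60, 70, 80, 90, 100]
weight     = [55, 62, 70, 78, 90]
```

Because Spearman depends only on ranks, it captures the monotone associations between blood indexes and overweight status without assuming linearity.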

  1. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    OpenAIRE

    Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S.; Xian, Xuefeng; Wu, Jian; Cui, Zhiming

    2017-01-01

Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. ...

  2. Machine Learning for Knowledge Extraction from PHR Big Data.

    Science.gov (United States)

    Poulymenopoulou, Michaela; Malamateniou, Flora; Vassilacopoulos, George

    2014-01-01

Cloud computing, Internet of Things (IoT) and NoSQL database technologies can support a new generation of cloud-based PHR services that contain heterogeneous (unstructured, semi-structured and structured) patient data (health, social and lifestyle) from various sources, including automatically transmitted data from Internet-connected devices in the patient's living space (e.g. medical devices connected to patients in home care). The patient data stored in such PHR systems constitute big data, whose analysis with appropriate machine learning algorithms is expected to improve diagnosis and treatment accuracy, to cut healthcare costs and, hence, to improve the overall quality and efficiency of healthcare. This paper describes a health data analytics engine which uses machine learning algorithms to analyze cloud-based PHR big data for knowledge extraction, supporting better healthcare delivery as regards disease diagnosis and prognosis. The engine comprises data preparation, model generation and data analysis modules, and runs on the cloud, taking advantage of the map/reduce paradigm provided by Apache Hadoop.

  3. Connection machine: a computer architecture based on cellular automata

    Energy Technology Data Exchange (ETDEWEB)

    Hillis, W D

    1984-01-01

    This paper describes the connection machine, a programmable computer based on cellular automata. The essential idea behind the connection machine is that a regular locally-connected cellular array can be made to behave as if the processing cells are connected into any desired topology. When the topology of the machine is chosen to match the topology of the application program, the result is a fast, powerful computing engine. The connection machine was originally designed to implement knowledge retrieval operations in artificial intelligence programs, but the hardware and the programming techniques are apparently applicable to a much larger class of problems. A machine with 100000 processing cells is currently being constructed. 27 references.
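The regular, locally connected cellular array at the heart of the connection machine can be illustrated with the simplest such system, a one-dimensional elementary cellular automaton, here rule 90 with wrap-around neighbours. This is a conceptual sketch of locally connected parallel update, not the connection machine's actual architecture.

```python
def ca_step(cells, rule=90):
    """One synchronous update of a 1D elementary cellular automaton (wrap-around)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (centre << 1) | right   # 3-cell neighbourhood as a 3-bit index
        out.append((rule >> idx) & 1)               # look up the next state in the rule number
    return out

# A single live cell spreads to its neighbours under rule 90 (left XOR right).
state = [0, 0, 0, 1, 0, 0, 0]
state = ca_step(state)
```

Every cell computes its next state from only its immediate neighbours, yet the array as a whole performs a global computation in one parallel step; the connection machine generalizes this by routing messages so the local grid behaves as any desired topology.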

  4. DROUGHT FORECASTING BASED ON MACHINE LEARNING OF REMOTE SENSING AND LONG-RANGE FORECAST DATA

    Directory of Open Access Journals (Sweden)

    J. Rhee

    2016-06-01

The reduction of drought impacts may be achieved through sustainable drought management and proactive measures against drought disaster, for which accurate and timely provision of drought information is essential. In this study, drought forecasting models were developed to provide high-resolution drought information, based on drought indicators, for ungauged areas. The models predict two drought indices: the 6-month Standardized Precipitation Index (SPI6) and the 6-month Standardized Precipitation Evapotranspiration Index (SPEI6). An interpolation method based on multiquadric splines was tested, as well as three machine learning models: Decision Tree, Random Forest, and Extremely Randomized Trees. The machine learning models were tested to enhance the provision of drought initial conditions based on remote sensing data, since initial conditions are among the most important factors for drought forecasting. Machine learning-based methods performed better than the interpolation method for both classification and regression, and the methods using climatology data outperformed those using long-range forecasts. Overall, the model based on climatology data and machine learning performed best.
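The predicted SPI6 index is, in essence, a standardized 6-month precipitation total. Operational SPI fits a gamma distribution and transforms to a standard normal; the sketch below uses a plain z-score of the 6-month rolling total, a simplification, just to show the shape of the computation. The precipitation series is synthetic.

```python
import numpy as np

def spi6_zscore(monthly_precip):
    """Simplified SPI6: z-score of the 6-month rolling precipitation total.

    Real SPI fits a gamma distribution per calendar window; this is the
    z-score shortcut, adequate only as an illustration.
    """
    totals = np.convolve(monthly_precip, np.ones(6), mode="valid")  # 6-month sums
    return (totals - totals.mean()) / totals.std()

# 20 years of synthetic monthly precipitation totals (gamma-distributed, invented).
rng = np.random.default_rng(3)
spi = spi6_zscore(rng.gamma(2.0, 30.0, 240))
```

Values around -1 to -2 then flag moderate to severe drought, which is the quantity the forecasting models above are trained to predict for ungauged locations.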

  5. A knowledge-based diagnosis system for welding machine problem solving

    International Nuclear Information System (INIS)

    Bonnieres, P. de; Boutes, J.L.; Calas, M.A.; Para, S.

    1986-06-01

    This paper presents a knowledge-based diagnosis system which can be a valuable aid in resolving malfunctions and failures encountered using the automatic hot-wire TIG weld cladding process. This knowledge-based system is currently under evaluation by welding operators at the Framatome heavy fabricating facility. Extension to other welding processes is being considered

  6. The relationship between the dental health knowledge and oral hygiene index of the deaf

    Directory of Open Access Journals (Sweden)

    Lilis Nurliyanasari

    2009-07-01

The oral hygiene index can be influenced by behavioural factors. Behaviour comprises three domains: knowledge, attitude, and practice. Knowledge changes the behaviour of society, which in turn affects the oral hygiene index. The purpose of the research was to determine the relationship between dental health knowledge and the oral hygiene index of the deaf. The research was analytic, using a cross-sectional method on 63 subjects in grade levels 3-6 at a school for the hearing-impaired in Magelang, obtained by total sampling. Dental health knowledge was assessed with a questionnaire, and the Oral Hygiene Index-Simplified (OHI-S) of Green and Vermillion was used to measure the oral hygiene index. The results showed that the dental health knowledge of 65.08% of the subjects was in the good category, while OHI-S was in the moderate category. Based on a Chi-square test, there was no significant relationship between dental health knowledge and the oral hygiene index of the deaf at the school in Magelang.

  7. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing.

    Science.gov (United States)

    Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S; Xian, Xuefeng; Wu, Jian; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.
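The semantic-constraint idea above can be sketched with the simplest constraint type, a functional property (each subject has at most one value for the relation): any second value for the same subject flags a candidate error to send to the crowd. The facts and relation names are invented for illustration.

```python
# Toy extracted knowledge base as (subject, relation, object) triples (invented).
facts = [
    ("einstein", "born_in", "Ulm"),
    ("einstein", "born_in", "Bern"),      # conflicts with the fact above
    ("einstein", "field", "physics"),
    ("curie", "born_in", "Warsaw"),
]

# Constraint: these relations admit at most one object per subject.
FUNCTIONAL = {"born_in"}

def conflicting_facts(facts, functional=FUNCTIONAL):
    """Flag (subject, relation) pairs that violate a functional constraint."""
    seen, flagged = {}, []
    for s, r, o in facts:
        if r in functional:
            key = (s, r)
            if key in seen and seen[key] != o:
                flagged.append(key)       # candidate error: ask the crowd which value holds
            seen.setdefault(key, o)
    return flagged
```

Routing only the flagged pairs to crowd workers is one way "limited human resources" get spent on the candidate facts most likely to be wrong, as the abstract proposes.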

  8. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Chunhua Li

    2017-01-01

    Full Text Available Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.

  9. An expert system for vibration based diagnostics of rotating machines

    International Nuclear Information System (INIS)

    Korteniemi, A.

    1990-01-01

    Changes in the mechanical condition of rotating machinery can very often be observed as changes in its vibration. This paper presents an expert system for vibration-based diagnosis of rotating machines by describing the architecture of the developed prototype system. The importance of modelling the problem-solving knowledge as well as the domain knowledge is emphasized by representing the knowledge at several levels.

  10. A neurite quality index and machine vision software for improved quantification of neurodegeneration.

    Science.gov (United States)

    Romero, Peggy; Miller, Ted; Garakani, Arman

    2009-12-01

    Current methods to assess neurodegeneration in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase sensitivity for the detection of early degeneration events. Neurosight is a machine-vision-based method that recapitulates many of the strengths of the NQI while enabling high-throughput screening applications at decreased cost.

  11. An Associative Index Model for the Results List Based on Vannevar Bush's Selection Concept

    Science.gov (United States)

    Cole, Charles; Julien, Charles-Antoine; Leide, John E.

    2010-01-01

    Introduction: We define the results list problem in information search and suggest the "associative index model", an ad-hoc, user-derived indexing solution based on Vannevar Bush's description of an associative indexing approach for his memex machine. We further define what selection means in indexing terms with reference to Charles…

  12. Quantum neural network based machine translator for Hindi to English.

    Science.gov (United States)

    Narayan, Ravi; Singh, V P; Chakraverty, S

    2014-01-01

    This paper presents a machine-learning-based machine translation system for Hindi to English, which learns from a semantically correct corpus. A quantum-neural pattern recognizer is used to recognize and learn the patterns of the corpus, using the part-of-speech information of each word in the corpus, much as a human would. The system performs machine translation using the knowledge gained during learning from input pairs of Devanagari-Hindi and English sentences. To analyze the effectiveness of the proposed approach, 2,600 sentences were evaluated during simulation and evaluation. The accuracy achieved is 0.7502 on the BLEU score, 6.5773 on the NIST score, 0.9233 on the ROUGE-L score, and 0.5456 on the METEOR score, which is significantly higher than Google Translation and Bing Translation for Hindi-to-English machine translation.
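For reference, BLEU is computed from modified n-gram precisions and a brevity penalty. The sketch below is a simplified single-reference, unsmoothed version for illustration, not the exact scorer used in the paper's evaluation:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def sentence_bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU: single reference, no smoothing."""
    c, r = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand, ref = ngrams(c, n), ngrams(r, n)
        # Modified precision: clip candidate counts by reference counts.
        overlap = sum(min(count, ref[g]) for g, count in cand.items())
        total = max(sum(cand.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # unsmoothed: any empty n-gram match zeroes the score
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty discourages overly short candidates.
    bp = 1.0 if len(c) > len(r) else math.exp(1 - len(r) / max(len(c), 1))
    return bp * math.exp(log_avg)

print(sentence_bleu("the cat sat on the mat", "the cat sat on the mat"))  # 1.0
```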

  13. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    Science.gov (United States)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

    The paper presents an approach to the intelligent design and planning of process routes based on the gun breech machining process, addressing several problems of the traditional, hard-to-manage process route: the complexity of gun breech machining, tedious route design, and long design lead times. Based on the gun breech machining process, an intelligent process-route design and planning system was developed using DEST and VC++. The system includes two functional modules: intelligent process-route design and process-route planning. The intelligent design module analyzes the gun breech machining process and summarizes breech process knowledge to build the knowledge base and inference engine, from which the gun breech process route is output intelligently. On the basis of the intelligent design module, the final process route is created, edited, and managed in the planning module.

  14. Assessing Implicit Knowledge in BIM Models with Machine Learning

    DEFF Research Database (Denmark)

    Krijnen, Thomas; Tamke, Martin

    2015-01-01

    architects and engineers are able to deduce non-explicitly stated information, which is often the core of the transported architectural information. This paper investigates how machine learning approaches allow a computational system to deduce implicit knowledge from a set of BIM models....

  15. KNOWLEDGE-BASED OBJECT DETECTION IN LASER SCANNING POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    F. Boochs

    2012-07-01

    Full Text Available Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects, being accessible to numerical models. We present an approach that brings the human expert knowledge about the scene, the objects inside, and their representation by the data and the behavior of algorithms to the machine. This “understanding” enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement in knowledge technologies within the Semantic Web framework. This advancement has provided a strong base for applications based on knowledge management. In the article we will present and describe the knowledge technologies used for our approach such as Web Ontology Language (OWL, used for formulating the knowledge base and the Semantic Web Rule Language (SWRL with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds, and specialists’ knowledge of the scene and algorithmic processing.

  16. Knowledge-Based Object Detection in Laser Scanning Point Clouds

    Science.gov (United States)

    Boochs, F.; Karmacharya, A.; Marbs, A.

    2012-07-01

    Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects, being accessible to numerical models. We present an approach that brings the human expert knowledge about the scene, the objects inside, and their representation by the data and the behavior of algorithms to the machine. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement in knowledge technologies within the Semantic Web framework. This advancement has provided a strong base for applications based on knowledge management. In the article we will present and describe the knowledge technologies used for our approach such as Web Ontology Language (OWL), used for formulating the knowledge base and the Semantic Web Rule Language (SWRL) with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds, and specialists' knowledge of the scene and algorithmic processing.

  17. APPLICATION OF THE PERFORMANCE SELECTION INDEX METHOD FOR SOLVING MACHINING MCDM PROBLEMS

    Directory of Open Access Journals (Sweden)

    Dušan Petković

    2017-04-01

    Full Text Available The complex nature of machining processes requires the use of different methods and techniques for process optimization. Over the past few years a number of different optimization methods have been proposed for solving continuous machining optimization problems. In the manufacturing environment, engineers also face a number of discrete machining optimization problems. In order to help decision makers solve this type of optimization problem, a number of multi-criteria decision making (MCDM) methods have been proposed. This paper introduces the use of an almost unexplored MCDM method, the performance selection index (PSI) method, for solving machining MCDM problems. The main motivation for using the PSI method is that, unlike other MCDM methods, it does not require the determination of criteria weights. The applicability and effectiveness of the PSI method have been demonstrated by solving two case studies dealing with the machinability of materials and the selection of the most suitable cutting fluid for a given machining application. The obtained rankings correlate well with those derived by past researchers using other MCDM methods, which validates the usefulness of this method for solving machining MCDM problems.
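The PSI computation can be sketched as follows. The steps mirror the method as commonly described in the MCDM literature (normalization, preference variation, weights derived from the data itself, overall index); the decision matrix and criterion types below are illustrative and not taken from the paper's case studies:

```python
# Sketch of the performance selection index (PSI) method. Note that no
# user-supplied criteria weights are needed: the weights fall out of the
# variation in the normalized data.

def psi_rank(matrix, beneficial):
    """matrix: rows = alternatives, columns = criteria.
    beneficial[j] is True for larger-is-better criteria.
    Returns alternative indices ranked best first."""
    m, n = len(matrix), len(matrix[0])
    # 1. Normalize the decision matrix.
    norm = [[0.0] * n for _ in range(m)]
    for j in range(n):
        col = [row[j] for row in matrix]
        for i in range(m):
            norm[i][j] = col[i] / max(col) if beneficial[j] else min(col) / col[i]
    # 2. Preference variation of each criterion.
    pv = []
    for j in range(n):
        mean = sum(norm[i][j] for i in range(m)) / m
        pv.append(sum((norm[i][j] - mean) ** 2 for i in range(m)))
    # 3. Criterion weights derived from the deviations.
    phi = [1 - v for v in pv]
    psi = [p / sum(phi) for p in phi]
    # 4. Overall performance index of each alternative.
    index = [sum(norm[i][j] * psi[j] for j in range(n)) for i in range(m)]
    return sorted(range(m), key=lambda i: -index[i])

# Three hypothetical materials scored on two beneficial criteria.
print(psi_rank([[10, 10], [5, 5], [8, 6]], [True, True]))  # best first
```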

  18. KNOWLEDGE AND XML BASED CAPP SYSTEM

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shijie; SONG Laigang

    2006-01-01

    In order to enhance the intelligence level of the system and improve interoperability with other systems, a knowledge- and XML-based computer aided process planning (CAPP) system was implemented. It includes user management, bill of materials (BOM) management, knowledge-based process planning, knowledge management, and database maintenance sub-systems. The nested knowledge representation method the system provides can represent complicated arithmetic and logical relationships to handle process planning tasks. Through the representation and manipulation of XML-based technological files, the system solves some important problems in the web environment, such as the efficiency of information interaction and the refreshing of web pages. The CAPP system is written in ASP VBScript, JavaScript, and Visual C++ and uses an Oracle database. At present, the CAPP system is running at Shenyang Machine Tools, and its functions meet the requirements of enterprise production.

  19. Prediction of Baseflow Index of Catchments using Machine Learning Algorithms

    Science.gov (United States)

    Yadav, B.; Hatfield, K.

    2017-12-01

    We present the results of eight machine learning techniques for predicting the baseflow index (BFI) of ungauged basins using a surrogate of catchment-scale climate and physiographic data. The tested algorithms include ordinary least squares, ridge regression, least absolute shrinkage and selection operator (lasso), elastic net, support vector machine, gradient boosted regression trees, random forests, and extremely randomized trees. Our work seeks to identify the dominant controls of BFI that can be readily obtained from ancillary geospatial databases and remote sensing measurements, such that the developed techniques can be extended to ungauged catchments. More than 800 gauged catchments spanning the continental United States were selected to develop the general methodology. The BFI calculation was based on the baseflow separated from the daily streamflow hydrograph using the HYSEP filter. The surrogate catchment attributes were compiled from multiple sources, including digital elevation models, soil, land use, and climate data, and other publicly available ancillary and geospatial data. 80% of the catchments were used to train the ML algorithms, and the remaining 20% were used as an independent test set to measure the generalization performance of the fitted models. A k-fold cross-validation using exhaustive grid search was used to fit the hyperparameters of each model. Initial model development was based on 19 independent variables, but after variable selection and feature ranking, we generated revised sparse models of BFI prediction based on only six catchment attributes. These key predictive variables, selected after careful evaluation of the bias-variance tradeoff, include average catchment elevation, slope, fraction of sand, permeability, temperature, and precipitation. The most promising algorithms, exceeding an accuracy score (r-squared) of 0.7 on test data, include support vector machine, gradient boosted regression trees, random forests, and extremely randomized trees.
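The core idea of a BFI calculation can be illustrated with a simple sliding-minimum baseflow filter. This is a loose sketch in the spirit of hydrograph separation, not the actual HYSEP implementation; the window length and hydrograph values are hypothetical:

```python
# Minimal sketch of the idea behind baseflow separation and the BFI:
# a sliding-minimum filter estimates the slowly varying baseflow, and
# BFI is the ratio of baseflow volume to total streamflow volume.

def baseflow_index(flow, window=5):
    """flow: daily streamflow values. Returns BFI in [0, 1]."""
    half = window // 2
    baseflow = []
    for i in range(len(flow)):
        lo, hi = max(0, i - half), min(len(flow), i + half + 1)
        baseflow.append(min(flow[lo:hi]))  # local minimum as baseflow
    return sum(baseflow) / sum(flow)

# Synthetic daily hydrograph: steady baseflow with one storm peak.
hydrograph = [10, 10, 12, 40, 80, 35, 15, 11, 10, 10]
print(round(baseflow_index(hydrograph), 3))
```

A catchment with flashy storm response yields a low BFI, while a groundwater-dominated catchment yields a BFI near 1, which is why the index is a useful target for the regression models described above.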

  20. Predicting the Performance of Chain Saw Machines Based on Shore Scleroscope Hardness

    Science.gov (United States)

    Tumac, Deniz

    2014-03-01

    Shore hardness has been used to estimate several physical and mechanical properties of rocks over the last few decades. However, the number of studies correlating Shore hardness with rock cutting performance is quite limited, and relatively few studies have been carried out on predicting the performance of chain saw machines. This study differs from previous investigations in that Shore hardness values (SH1, SH2, and the deformation coefficient) are used to determine the field performance of chain saw machines. The measured Shore hardness values are correlated with the physical and mechanical properties of natural stone samples, with cutting parameters (normal force, cutting force, and specific energy) obtained from linear cutting tests in unrelieved cutting mode, and with the areal net cutting rate of chain saw machines. Two empirical models developed previously are improved for the prediction of the areal net cutting rate of chain saw machines. The first model is based on a revised chain saw penetration index, which uses SH1, machine weight, and useful arm cutting depth as predictors. The second model is based on the power consumed for cutting the stone alone, arm thickness, and specific energy as a function of the deformation coefficient. While cutting force has a strong relationship with Shore hardness values, normal force has a weak to moderate correlation. Uniaxial compressive strength, Cerchar abrasivity index, and density can also be predicted from Shore hardness values.

  1. Lung cancer gene expression database analysis incorporating prior knowledge with support vector machine-based classification method

    Directory of Open Access Journals (Sweden)

    Huang Desheng

    2009-07-01

    Full Text Available Abstract Background A reliable and precise classification is essential for successful diagnosis and treatment of cancer. Gene expression microarrays have provided the high-throughput platform to discover genomic biomarkers for cancer diagnosis and prognosis. Rational use of the available bioinformation can not only effectively remove or suppress noise in gene chips, but also avoid the one-sided results of separate experiments. However, only some studies have been aware of the importance of prior information in cancer classification. Methods Together with the application of a support vector machine as the discriminant approach, we proposed a modified method that incorporates prior knowledge into cancer classification based on gene expression data to improve accuracy. A well-known public dataset, the malignant pleural mesothelioma and lung adenocarcinoma gene expression database, was used in this study. Prior knowledge is viewed here as a means of directing the classifier using known lung adenocarcinoma related genes. The procedures were performed using the software R 2.80. Results The modified method performed better after incorporating prior knowledge. Accuracy of the modified method improved from 98.86% to 100% in the training set and from 98.51% to 99.06% in the test set. The standard deviations of the modified method decreased from 0.26% to 0 in the training set and from 3.04% to 2.10% in the test set. Conclusion The method that incorporates prior knowledge into discriminant analysis could effectively improve the capacity and reduce the impact of noise. This idea may have a good future not only in practice but also in methodology.
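One generic way to inject such prior knowledge is to re-weight the expression features of known disease-related genes before training a margin-based classifier. The sketch below illustrates that idea only; the gene names and scaling factor are hypothetical and not necessarily the authors' exact mechanism:

```python
def weight_prior_genes(X, gene_names, prior_genes, factor=2.0):
    """Scale up expression columns for genes already known to be
    disease-related, so that a downstream margin-based classifier
    (e.g. an SVM) relies on them more heavily."""
    col = {g: j for j, g in enumerate(gene_names)}
    Xw = [row[:] for row in X]  # copy, leave the input intact
    for g in prior_genes:
        j = col.get(g)
        if j is not None:
            for row in Xw:
                row[j] *= factor
    return Xw

# Two samples, three hypothetical gene columns; "NKX2-1" plays the role
# of a known adenocarcinoma-related gene here purely for illustration.
X = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
print(weight_prior_genes(X, ["EGFR", "NKX2-1", "TP53"], ["NKX2-1"]))
```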

  2. DNA-based machines.

    Science.gov (United States)

    Wang, Fuan; Willner, Bilha; Willner, Itamar

    2014-01-01

    The base sequence in nucleic acids encodes substantial structural and functional information into the biopolymer. This encoded information provides the basis for the tailoring and assembly of DNA machines. A DNA machine is defined as a molecular device that exhibits the following fundamental features. (1) It performs a fuel-driven mechanical process that mimics macroscopic machines. (2) The mechanical process requires an energy input, "fuel." (3) The mechanical operation is accompanied by an energy consumption process that leads to "waste products." (4) The cyclic operation of the DNA devices involves the use of "fuel" and "anti-fuel" ingredients. A variety of DNA-based machines are described, including the construction of "tweezers," "walkers," "robots," "cranes," "transporters," "springs," "gears," and interlocked cyclic DNA structures acting as reconfigurable catenanes, rotaxanes, and rotors. Different "fuels," such as nucleic acid strands, pH (H⁺/OH⁻), metal ions, and light, are used to trigger the mechanical functions of the DNA devices. The operation of the devices in solution and on surfaces is described, and a variety of optical, electrical, and photoelectrochemical methods to follow the operations of the DNA machines are presented. We further address the possible applications of DNA machines and the future perspectives of molecular DNA devices. These include the application of DNA machines as functional structures for the construction of logic gates and computing, for the programmed organization of metallic nanoparticle structures and the control of plasmonic properties, and for controlling chemical transformations by DNA machines. We further discuss the future applications of DNA machines for intracellular sensing, controlling intracellular metabolic pathways, and the use of the functional nanostructures for drug delivery and medical applications.

  3. Short-Term Electricity-Load Forecasting Using a TSK-Based Extreme Learning Machine with Knowledge Representation

    Directory of Open Access Journals (Sweden)

    Chan-Uk Yeom

    2017-10-01

    Full Text Available This paper discusses short-term electricity-load forecasting using an extreme learning machine (ELM) with automatic knowledge representation from a given input-output data set. For this purpose, we use a Takagi-Sugeno-Kang (TSK)-based ELM to develop a systematic approach to generating if-then rules, whereas the conventional ELM operates without knowledge information. The TSK-ELM design includes a two-phase development. First, we generate an initial random-partition matrix and estimate cluster centers for random clustering. The obtained cluster centers are used to determine the premise parameters of the fuzzy if-then rules. Next, the linear weights of the TSK fuzzy type are estimated using the least squares estimate (LSE) method. These linear weights are used as the consequent parameters in the TSK-ELM design. The experiments were performed on short-term electricity-load data for forecasting. The electricity-load data were used to forecast hourly day-ahead loads given temperature forecasts, holiday information, and historical loads from the New England ISO. In order to quantify the performance of the forecaster, we use metrics and statistical characteristics such as root mean squared error (RMSE), mean absolute error (MAE), mean absolute percent error (MAPE), and R-squared. The experimental results revealed that the proposed method performed well compared with a conventional ELM with four activation functions (sigmoid, sine, radial basis function, and rectified linear unit (ReLU)). It possessed superior prediction performance, provided knowledge information, and used a small number of rules.
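The evaluation metrics named above are standard and can be sketched directly; the load values used below are hypothetical:

```python
import math

def forecast_metrics(actual, predicted):
    """RMSE, MAE, MAPE (%), and R-squared for a load forecast."""
    n = len(actual)
    errors = [a - p for a, p in zip(actual, predicted)]
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mae = sum(abs(e) for e in errors) / n
    mape = 100 * sum(abs(e) / abs(a) for e, a in zip(errors, actual)) / n
    mean_a = sum(actual) / n
    ss_res = sum(e * e for e in errors)
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    r2 = 1 - ss_res / ss_tot  # 1.0 means a perfect fit
    return rmse, mae, mape, r2

# Hypothetical hourly loads (MW) and model forecasts.
actual = [620.0, 650.0, 700.0, 680.0]
predicted = [610.0, 655.0, 690.0, 685.0]
print(forecast_metrics(actual, predicted))
```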

  4. Islanding Detection of Synchronous Machine-Based DGs using Average Frequency Based Index

    Directory of Open Access Journals (Sweden)

    M. Bakhshi

    2013-06-01

    Full Text Available The identification of intentional and unintentional islanding situations of dispersed generators (DGs) is one of the most important protection concerns in power systems. Considering the safety and reliability problems of distribution networks, an exact diagnosis index is required to discriminate the loss of the main network from normal parallel operation. Hence, this paper introduces a new islanding detection method for synchronous machine-based DGs. This method uses the average value of the generator frequency to calculate a new detection index. The proposed method is an effective supplement to the over/under frequency protection (OFP/UFP) system. Analytical equations and simulation results are used to assess the performance of the proposed method under various scenarios such as different types of faults, load changes, and capacitor bank switching. To show the effectiveness of the proposed method, it is compared with the performance of both the ROCOF and ROCOFOP methods.
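The general principle of an average-frequency index can be sketched as follows. The window length, threshold, and frequency traces are hypothetical and do not reproduce the paper's exact formula:

```python
# Illustrative sketch only: after islanding, the generator frequency
# drifts away from nominal, so averaging the frequency deviation over a
# sliding window can discriminate islanding from short disturbances that
# a raw over/under-frequency relay might misread.

NOMINAL_HZ = 50.0

def islanding_detected(freq_samples, window=10, threshold=0.3):
    """Trip when the mean |f - f_nominal| over the last `window`
    samples exceeds `threshold` Hz (hypothetical setting)."""
    recent = freq_samples[-window:]
    avg_dev = sum(abs(f - NOMINAL_HZ) for f in recent) / len(recent)
    return avg_dev > threshold

grid_connected = [50.0, 50.01, 49.99, 50.02, 50.0,
                  49.98, 50.01, 50.0, 49.99, 50.0]
islanded = [50.0, 50.2, 50.5, 50.9, 51.3,
            51.6, 51.9, 52.1, 52.3, 52.4]
print(islanding_detected(grid_connected), islanding_detected(islanded))
```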

  5. Bridging the gap between human knowledge and machine learning

    Directory of Open Access Journals (Sweden)

    Juan Carlos ALVARADO-PÉREZ

    2015-12-01

    Full Text Available Nowadays, a great amount of data is being created by many sources in academic, scientific, business, and industrial activities. Such data intrinsically contain meaningful information, allowing for the development of techniques to explore that information with scientific validity. In this connection, the aim of artificial intelligence (AI) is to obtain new knowledge in order to make decisions properly. AI has taken an important place in scientific and technological development communities, and recently computer-based processing devices for modern machines have been developed. Under the premise that the feedback provided by human reasoning (which is holistic, flexible, and parallel) may enhance data analysis, the need for the integration of natural and artificial intelligence has emerged. Such an integration makes the process of knowledge discovery more effective, providing the ability to easily find hidden trends and patterns belonging to the database's predictive model, and allowing for new observations and considerations on previously known data by using both data analysis methods and the knowledge and skills of human reasoning. In this work, we review the basics of and recent work on the integration of artificial and natural intelligence in order to introduce users and researchers to this emergent field, and we provide key aspects for conceptually comparing the two.

  6. Combining machine learning, crowdsourcing and expert knowledge to detect chemical-induced diseases in text.

    Science.gov (United States)

    Bravo, Àlex; Li, Tong Shu; Su, Andrew I; Good, Benjamin M; Furlong, Laura I

    2016-01-01

    Drug toxicity is a major concern for both regulatory agencies and the pharmaceutical industry. In this context, text-mining methods for the identification of drug side effects from free text are key for the development of up-to-date knowledge sources on drug adverse reactions. We present a new system for identification of drug side effects from the literature that combines three approaches: machine learning, rule- and knowledge-based approaches. This system has been developed to address Task 3.B of the BioCreative V challenge (BC5) dealing with chemical-induced disease (CID) relations. The first two approaches focus on identifying relations at the sentence level, while the knowledge-based approach is applied at both sentence and abstract levels. The machine learning method is based on the BeFree system using two corpora as training data: the annotated data provided by the CID task organizers and a new CID corpus developed by crowdsourcing. Different combinations of results from the three strategies were selected for each run of the challenge. In the final evaluation setting, the system achieved the highest recall of the challenge (63%). By performing an error analysis, we identified the main causes of misclassifications and areas for improvement of our system, and highlighted the need for consistent gold standard data sets for advancing the state of the art in text mining of drug side effects. Database URL: https://zenodo.org/record/29887?ln=en#.VsL3yDLWR_V. © The Author(s) 2016. Published by Oxford University Press.

  7. Development of Web-based Virtual Training Environment for Machining

    Science.gov (United States)

    Yang, Zhixin; Wong, S. F.

    2010-05-01

    With the boom in the manufacturing of shoes, garments, toys, etc. in the Pearl River Delta region, training in the usage of various facilities and the design of facility layouts have become crucial for the success of industrial companies. There is evidence that the use of virtual training may provide benefits in improving the effect of learning and reducing risk in the physical work environment. This paper proposes an advanced web-based training environment that demonstrates the usage of a CNC machine in terms of working conditions and parameter selection. The developed virtual environment provides training at a junior level and an advanced level. Junior-level training explains machining knowledge, including safety factors and machine parameters (e.g., material, speed, feed rate). Advanced-level training enables interactive programming of NC code and simulation of its effect. An operation sequence is used to assist the user in choosing the appropriate machining conditions. Several case studies were also carried out with animations of milling and turning operations.

  8. An Indexing Scheme for Case-Based Manufacturing Vision Development

    DEFF Research Database (Denmark)

    Wang, Chengbo; Johansen, John; Luxhøj, James T.

    2004-01-01

    This paper focuses on one critical element, indexing: retaining and representing knowledge in an applied case-based reasoning (CBR) model for supporting strategic manufacturing vision development (CBRM). Manufacturing vision (MV) is a kind of knowledge management concept and process concerned with the competence improvement of an enterprise's manufacturing system. There are two types of cases within the CBRM: an event case (EC) and a general supportive case (GSC). We designed one set of indexing vocabulary for the two types of cases, but a different indexing representation structure for each of them.

  9. Knowledge based systems advanced concepts, techniques and applications

    CERN Document Server

    1997-01-01

    The field of knowledge-based systems (KBS) has expanded enormously during the last years, and many important techniques and tools are currently available. Applications of KBS range from medicine to engineering and aerospace. This book provides a selected set of state-of-the-art contributions that present advanced techniques, tools and applications. These contributions have been prepared by a group of eminent researchers and professionals in the field. The theoretical topics covered include: knowledge acquisition, machine learning, genetic algorithms, knowledge management and processing under uncertainty.

  10. Machine Learning Methods for Knowledge Discovery in Medical Data on Atherosclerosis

    Czech Academy of Sciences Publication Activity Database

    Serrano, J.I.; Tomečková, Marie; Zvárová, Jana

    2006-01-01

    Roč. 1, - (2006), s. 6-33 ISSN 1801-5603 Institutional research plan: CEZ:AV0Z10300504 Keywords : knowledge discovery * supervised machine learning * biomedical data mining * risk factors of atherosclerosis Subject RIV: BB - Applied Statistics, Operational Research

  11. Advanced Electrical Machines and Machine-Based Systems for Electric and Hybrid Vehicles

    Directory of Open Access Journals (Sweden)

    Ming Cheng

    2015-09-01

    Full Text Available The paper presents a number of advanced solutions for electric machines and machine-based systems for the powertrains of electric vehicles (EVs). Two types of systems are considered, namely the drive systems designed for EV propulsion and the power split devices utilized in the popular series-parallel hybrid electric vehicle architecture. After reviewing the main requirements for the electric drive systems, the paper illustrates advanced electric machine topologies, including a stator permanent magnet (stator-PM) motor, a hybrid-excitation motor, a flux memory motor and a redundant motor structure. Then, it illustrates advanced electric drive systems, such as the magnetic-geared in-wheel drive and the integrated starter generator (ISG). Finally, three machine-based implementations of the power split devices are expounded, built around the dual-rotor PM machine, the dual-stator PM brushless machine and the magnetic-geared dual-rotor machine. As a conclusion, the development trends in the field of electric machines and machine-based systems for EVs are summarized.

  12. Knowledge machines digital transformations of the sciences and humanities

    CERN Document Server

    Meyer, Eric T

    2015-01-01

    In Knowledge Machines, Eric Meyer and Ralph Schroeder argue that digital technologies have fundamentally changed research practices in the sciences, social sciences, and humanities. Meyer and Schroeder show that digital tools and data, used collectively and in distributed mode -- which they term e-research -- have transformed not just the consumption of knowledge but also the production of knowledge. Digital technologies for research are reshaping how knowledge advances in disciplines that range from physics to literary analysis. Meyer and Schroeder map the rise of digital research and offer case studies from many fields, including biomedicine, social science uses of the Web, astronomy, and large-scale textual analysis in the humanities. They consider such topics as the challenges of sharing research data and of big data approaches, disciplinary differences and new forms of interdisciplinary collaboration, the shifting boundaries between researchers and their publics, and the ways that digital tools promote o...

  13. Knowledge-based automated radiopharmaceutical manufacturing for Positron Emission Tomography

    International Nuclear Information System (INIS)

    Alexoff, D.L.

    1991-01-01

This article describes the application of basic knowledge engineering principles to the design of automated synthesis equipment for radiopharmaceuticals used in Positron Emission Tomography (PET). Before discussing knowledge programming, an overview of the development of automated radiopharmaceutical synthesis systems for PET will be presented. Since knowledge systems will rely on information obtained from machine transducers, a discussion of the uses of sensory feedback in today's automated systems follows. Next, the operation of these automated systems is contrasted to radiotracer production carried out by chemists, and the rationale for and basic concepts of knowledge-based programming are explained. Finally, a prototype knowledge-based system supporting automated radiopharmaceutical manufacturing of 18FDG at Brookhaven National Laboratory (BNL) is described using 1stClass, a commercially available PC-based expert system shell.

  14. Research on knowledge support technology for product innovation design based on quality function knowledge deployment

    Directory of Open Access Journals (Sweden)

    Kai Zhang

    2016-06-01

Full Text Available Based on the analysis of the relationship between the process of product innovation design and knowledge, this article proposes a theoretical model of quality function knowledge deployment. In order to link up the product innovation design and the knowledge required by the designer, the iterative method of quality function knowledge deployment is refined, and the knowledge retrieval model and knowledge support model based on quality function knowledge deployment are established. In the whole life cycle of product design, in view of the different requirements for knowledge in the conceptual design, component configuration, process planning, and production planning stages, the quality function knowledge deployment model can link up the required knowledge with the engineering characteristics, component characteristics, process characteristics, and production characteristics in the four stages, using the mapping relationship between the function characteristics and the knowledge, and help the designer to track the knowledge required for realizing product innovation design. In this article, an instance about a rewinding machine is given to demonstrate the practicability and validity of product innovation design knowledge support technology based on quality function knowledge deployment.

  15. Applicability of internet search index for asthma admission forecast using machine learning.

    Science.gov (United States)

    Luo, Li; Liao, Chengcheng; Zhang, Fengyi; Zhang, Wei; Li, Chunyang; Qiu, Zhixin; Huang, Debin

    2018-04-15

This study aimed to determine whether a search index could provide insight into trends in asthma admission in China. An Internet search index is a powerful tool to monitor and predict epidemic outbreaks. However, whether using an internet search index can significantly improve asthma admission forecasts remains unknown. The long-term goal is to develop a surveillance system to help early detection and intervention for asthma and to avoid asthma health care resource shortages in advance. In this study, we used a search index combined with air pollution data, weather data, and historical admissions data to forecast asthma admissions using machine learning. Results demonstrated that the best area under the curve in the test set that can be achieved is 0.832, using all predictors mentioned earlier. A search index is a powerful predictor in asthma admission forecasting, and a recent search index can reflect current asthma admissions with a lag effect to a certain extent. The addition of a real-time, easily accessible search index improves forecasting capabilities and demonstrates the predictive potential of the search index. Copyright © 2018 John Wiley & Sons, Ltd.
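The 0.832 figure above is an area under the ROC curve (AUC). As a minimal illustration of how such a score is obtained from forecast outputs, the sketch below uses the rank-based (Mann-Whitney) formulation; the scores and labels are invented, not the study's data:

```python
# Illustrative sketch (not the paper's code): computing the area under the
# ROC curve (AUC) for a binary admission forecast from predicted scores.

def auc(scores, labels):
    """AUC = P(score of a random positive > score of a random negative)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical forecast scores (e.g. from a model fed search-index,
# air-pollution and weather features) and observed admission surges.
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,   0,   0]
print(round(auc(scores, labels), 3))  # → 0.917
```

An AUC of 0.5 corresponds to a random forecast, 1.0 to a perfect ranking of surge days above quiet days.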

  16. Comparative analysis of machine learning methods in ligand-based virtual screening of large compound libraries.

    Science.gov (United States)

    Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z

    2009-05-01

Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds of specific pharmacodynamic, pharmacokinetic or toxicological properties based on their structure-derived structural and physicochemical properties. Increasing attention has been directed at these methods because of their capability in predicting compounds of diverse structures and complex structure-activity relationships without requiring the knowledge of target 3D structure. This article reviews current progress in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performances of machine learning tools with those of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility of improving the performance of machine learning methods in screening large libraries is discussed.

  17. Model-based machine learning.

    Science.gov (United States)

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
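To make the model-based idea concrete, here is a minimal sketch (not Infer.NET, and not the paper's code): a bespoke probabilistic model is written down directly, y_i ~ Bernoulli(theta) with theta ~ Uniform(0, 1), and handed to a generic inference routine, here a simple grid approximation of the posterior. All data below are illustrative.

```python
# Minimal model-based-ML illustration: specify the model, then let a
# generic inference routine (grid approximation) do the work.

def posterior_mean(observations, grid_size=1000):
    """Posterior mean of theta under a uniform prior, via a grid."""
    grid = [(i + 0.5) / grid_size for i in range(grid_size)]
    heads = sum(observations)
    tails = len(observations) - heads
    weights = [t**heads * (1 - t)**tails for t in grid]  # likelihood x prior
    z = sum(weights)  # normalizing constant
    return sum(t * w for t, w in zip(grid, weights)) / z

data = [1, 1, 0, 1, 1, 0, 1, 1]  # hypothetical observations
print(round(posterior_mean(data), 3))
```

Swapping the model (e.g. adding a hierarchy or a time component) leaves the inference loop untouched, which is the separation the abstract argues for.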

  18. Fabrication Quality Analysis of a Fiber Optic Refractive Index Sensor Created by CO2 Laser Machining

    Directory of Open Access Journals (Sweden)

    Wei-Te Wu

    2013-03-01

Full Text Available This study investigates the CO2 laser-stripped partial cladding of silica-based optic fibers with a core diameter of 400 μm, which enables them to sense the refractive index of the surrounding environment. However, inappropriate treatments during the machining process can generate a number of defects in the optic fiber sensors. Therefore, the quality of optic fiber sensors fabricated using CO2 laser machining must be analyzed. The results show that analysis of the fiber core size after machining can provide preliminary defect detection, and qualitative analysis of the optical transmission defects can be used to identify imperfections that are difficult to observe through size analysis. To more precisely and quantitatively detect fabrication defects, we included a tensile test and numerical aperture measurements in this study. After a series of quality inspections, we proposed improvements to the existing CO2 laser machining parameters, namely, a vertical scanning pathway, 4 W of power, and a feed rate of 9.45 cm/s. Using these improved parameters, we created optical fiber sensors with a core diameter of approximately 400 μm, no obvious optical transmission defects, a numerical aperture of 0.52 ± 0.019, a 0.886 Weibull modulus, and a 1.186 Weibull shape parameter. Finally, we used the optical fiber sensor fabricated using the improved parameters to measure the refractive indices of various solutions. The results show that a refractive-index resolution of 1.8 × 10−4 RIU (linear fitting R2 = 0.954) was achieved for sucrose solutions with refractive indices ranging between 1.333 and 1.383. We also adopted the particle plasmon resonance sensing scheme using the fabricated optical fibers. The results provided additional information, specifically, a superior sensor resolution of 5.73 × 10−5 RIU, and greater linearity at R2 = 0.999.

  19. A deviation based assessment methodology for multiple machine health patterns classification and fault detection

    Science.gov (United States)

    Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay

    2018-01-01

Successful applications of Diffusion Map (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize the high-dimensional, complex and nonlinear machine data, and thus suggests more knowledge about the machine under monitoring. In this paper, a DM-based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring have been analyzed and addressed. Based on the proposed DM-EVD, a deviation based methodology is then proposed to include more dimension reduction methods. In this work, the incorporation of Laplacian Eigen-map and Principal Component Analysis (PCA) is explored, and the latter algorithm is named PCA-Dev and is validated in the case study. To show the successful application of the proposed methodology, case studies from diverse fields are presented and investigated in this work. Improved results are reported by benchmarking with other machine learning algorithms.
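The deviation-based idea can be sketched roughly as follows (a PCA-flavoured simplification only, not the paper's DM-EVD implementation): fit a principal direction to baseline "healthy" data, then score new samples by their residual distance from that direction. The 2-D data and sample points below are invented.

```python
# Hedged sketch of a deviation score: distance of a new sample from the
# principal axis of the healthy baseline (PCA via power iteration).

def principal_direction(data, iters=100):
    """Mean and first principal direction of centred 2-D data."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    c = [(x - mx, y - my) for x, y in data]
    cxx = sum(a * a for a, _ in c) / n
    cyy = sum(b * b for _, b in c) / n
    cxy = sum(a * b for a, b in c) / n
    vx, vy = 1.0, 0.0
    for _ in range(iters):  # power iteration on the 2x2 covariance
        wx, wy = cxx * vx + cxy * vy, cxy * vx + cyy * vy
        norm = (wx * wx + wy * wy) ** 0.5
        vx, vy = wx / norm, wy / norm
    return (mx, my), (vx, vy)

def deviation(sample, mean, direction):
    """Residual (off-axis) distance of a sample from the baseline axis."""
    dx, dy = sample[0] - mean[0], sample[1] - mean[1]
    proj = dx * direction[0] + dy * direction[1]
    rx, ry = dx - proj * direction[0], dy - proj * direction[1]
    return (rx * rx + ry * ry) ** 0.5

baseline = [(i, 2 * i + 0.1 * (-1) ** i) for i in range(10)]  # healthy data
mean, direction = principal_direction(baseline)
# An on-trend sample scores low; an off-trend one scores high.
print(deviation((5.0, 10.0), mean, direction) < deviation((5.0, 14.0), mean, direction))
```

A threshold on this score would separate "healthy" from "abnormal" in the same spirit as the paper's degradation assessment, though DM-EVD operates in a learned diffusion space rather than a linear one.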

  20. Machine Learning-based Intelligent Formal Reasoning and Proving System

    Science.gov (United States)

    Chen, Shengqing; Huang, Xiaojian; Fang, Jiaze; Liang, Jia

    2018-03-01

The reasoning system can be used in many fields. How to improve reasoning efficiency is at the core of the system design. Through the formal description of formal proofs and a regular matching algorithm, and after introducing a machine learning algorithm, the intelligent formal reasoning and verification system achieves high efficiency. The experimental results show that the system can verify the correctness of propositional logic reasoning and reuse propositional logical reasoning results, so as to obtain the implicit knowledge in the knowledge base and provide a basic reasoning model for the construction of intelligent systems.

  1. MySQL based selection of appropriate indexing technique in ...

    African Journals Online (AJOL)

    Administrator

    Keywords: B-tree indexing, MySQL, Support vector machine, Smart card. 1. ..... SVM are strong classifiers in the field of machine learning and we will be using ..... We acknowledge Mr. Abhishek Roy, student of NIT Surathkal for his help in the ...

  2. Enhancing acronym/abbreviation knowledge bases with semantic information.

    Science.gov (United States)

    Torii, Manabu; Liu, Hongfang

    2007-10-11

In the biomedical domain, a terminology knowledge base that associates acronyms/abbreviations (denoted as SFs) with the definitions (denoted as LFs) is highly needed. For the construction of such a terminology knowledge base, we investigate the feasibility of building a system automatically assigning semantic categories to LFs extracted from text. Given a collection of pairs (SF,LF) derived from text, we i) assess the coverage of LFs and pairs (SF,LF) in the UMLS and justify the need for a semantic category assignment system; and ii) automatically derive name phrases annotated with semantic category and construct a system using machine learning. Utilizing ADAM, an existing collection of (SF,LF) pairs extracted from MEDLINE, our system achieved an f-measure of 87% when assigning eight UMLS-based semantic groups to LFs. The system has been incorporated into a web interface which integrates SF knowledge from multiple SF knowledge bases. Web site: http://gauss.dbb.georgetown.edu/liblab/SFThesurus.
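For reference, the 87% figure is an F-measure, the harmonic mean of precision and recall. A tiny sketch of its computation from true-positive, false-positive and false-negative counts (the counts below are illustrative, not the paper's):

```python
# F-measure from raw assignment counts.

def f_measure(tp, fp, fn):
    precision = tp / (tp + fp)  # fraction of assignments that are correct
    recall = tp / (tp + fn)     # fraction of gold labels recovered
    return 2 * precision * recall / (precision + recall)

# e.g. 87 correct semantic-group assignments, 13 spurious, 13 missed
print(round(f_measure(87, 13, 13), 2))  # → 0.87
```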

  3. Reverse hypothesis machine learning a practitioner's perspective

    CERN Document Server

    Kulkarni, Parag

    2017-01-01

This book introduces a paradigm of reverse hypothesis machines (RHM), focusing on knowledge innovation and machine learning. Knowledge-acquisition-based learning is constrained by large volumes of data and is time consuming. Hence knowledge-innovation-based learning is the need of the hour. Since under-learning results in cognitive inabilities and over-learning compromises freedom, there is a need for optimal machine learning. All existing learning techniques rely on mapping input and output and establishing mathematical relationships between them. Though methods change, the paradigm remains the same: the forward hypothesis machine paradigm, which tries to minimize uncertainty. The RHM, on the other hand, makes use of uncertainty for creative learning. The approach uses limited data to help identify new and surprising solutions. It focuses on improving learnability, unlike traditional approaches, which focus on accuracy. The book is useful as a reference book for machine learning researchers and professionals as ...

  4. Nonlinear Knowledge in Kernel-Based Multiple Criteria Programming Classifier

    Science.gov (United States)

    Zhang, Dongling; Tian, Yingjie; Shi, Yong

Kernel-based Multiple Criteria Linear Programming (KMCLP) model is used as a classification method, which can learn from training examples. In the traditional machine learning area, by contrast, data sets are classified only by prior knowledge. Some works combine the above two classification principles to overcome the drawbacks of each approach. In this paper, we propose a model to incorporate nonlinear knowledge into KMCLP in order to solve the problem when the input consists of not only training examples but also nonlinear prior knowledge. In dealing with a real-world breast cancer diagnosis case, the model shows better performance than the model based solely on training data.

  5. The Total Energy Efficiency Index for machine tools

    International Nuclear Information System (INIS)

    Schudeleit, Timo; Züst, Simon; Weiss, Lukas; Wegener, Konrad

    2016-01-01

Energy efficiency in industries is one of the dominating challenges of the 21st century. Since the release of the eco-design directive 2005/32/EC in 2005, great research effort has been spent on the energy efficiency assessment for energy-using products. The ISO (International Organization for Standardization) standardization body (ISO/TC 39 WG 12) currently works on the ISO 14955 series in order to enable the assessment of energy efficient design of machine tools. A missing piece for completion of the ISO 14955 series is a metric to quantify the design of machine tools regarding energy efficiency based on the respective assembly of components. The metric needs to take into account each machine tool component's efficiency and the need-oriented utilization in combination with the other components while referring to efficiency limits. However, a state-of-the-art review reveals that none of the existing metrics adequately meets this goal. This paper presents a metric that matches all these criteria to promote the development of the ISO 14955 series. The applicability of the metric is proven in a practical case study on a turning machine. - Highlights: • Study for pushing forward the standardization work on the ISO 14955 series. • Review of existing energy efficiency indicators regarding three basic strategies to foster sustainability. • Development of a metric comprising the three basic strategies to foster sustainability. • Metric application for quantifying the energy efficiency of a turning machine.
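The paper's actual metric is not reproduced here. Purely as a hypothetical illustration of the ingredients the abstract names (component efficiencies, need-oriented utilization, and efficiency limits), one plausible shape for such an index is an energy-weighted ratio of each component's achieved efficiency to its technological limit; every number and component name below is invented.

```python
# Hypothetical sketch only, not the Total Energy Efficiency Index as
# standardized: demand-weighted ratio of achieved to limit efficiency.

def efficiency_index(components):
    """components: list of (energy_used, achieved_eff, limit_eff)."""
    total = sum(e for e, _, _ in components)
    # Each component contributes in proportion to the energy it consumes.
    return sum(e / total * (eff / limit) for e, eff, limit in components)

machine = [
    (10.0, 0.80, 0.90),  # spindle drive: 10 kWh, 80% achieved, 90% limit
    (4.0,  0.60, 0.75),  # hydraulics
    (2.0,  0.50, 0.70),  # cooling
]
print(round(efficiency_index(machine), 3))
```

An index of 1.0 would mean every component runs at its efficiency limit under its actual utilization; values below 1.0 flag improvement potential.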

  6. Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.; Carroll, Thomas E.; Muller, George

    2017-04-21

The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine learning based data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning based models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian Networks and Hidden Markov Models are introduced as an example of a widely used data-driven classification/modeling strategy.
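As a concrete taste of one model family named above, here is a minimal forward-algorithm sketch for a Hidden Markov Model; the two hidden states, all probabilities and the alert-level observations are invented for illustration and are not from the chapter.

```python
# HMM forward algorithm: likelihood of an observation sequence under a
# tiny made-up model with hidden "benign"/"compromised" states emitting
# coarse alert levels.

states = ["benign", "compromised"]
start = {"benign": 0.9, "compromised": 0.1}
trans = {"benign": {"benign": 0.95, "compromised": 0.05},
         "compromised": {"benign": 0.10, "compromised": 0.90}}
emit = {"benign": {"low": 0.8, "high": 0.2},
        "compromised": {"low": 0.3, "high": 0.7}}

def forward(observations):
    """Sum over all hidden state paths via dynamic programming."""
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: sum(alpha[p] * trans[p][s] for p in states) * emit[s][obs]
                 for s in states}
    return sum(alpha.values())

print(round(forward(["low", "high", "high"]), 4))
```

Comparing such likelihoods across competing models (e.g. a "normal traffic" HMM vs. an "attack" HMM) is one standard way to turn the model into a classifier.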

  7. The Optimisation of Processing Condition for Injected Mould Polypropylene-Nanoclay-Gigantochloa Scortechinii based on Melt Flow Index

    Science.gov (United States)

    Othman, M. H.; Rosli, M. S.; Hasan, S.; Amin, A. M.; Hashim, M. Y.; Marwah, O. M. F.; Amin, S. Y. M.

    2018-03-01

The fundamental knowledge of flow behaviour is essential in producing various plastic parts via the injection moulding process. Moreover, the adaptation of advanced polymer-nanocomposites such as polypropylene-nanoclay with natural fibres, for instance Gigantochloa Scortechinii, may boost the mechanical properties of the parts. Therefore, this project was proposed with the objective of optimising the processing condition of injected mould polypropylene-nanoclay-Gigantochloa Scortechinii fibres based on the flow behaviour, which was the melt flow index. At first, Gigantochloa Scortechinii fibres have to be preheated at a temperature of 120°C and then mixed with polypropylene, maleic anhydride modified polypropylene oligomers (PPgMA) and nanoclay by using a Brabender Plastograph machine. Next, pellets were produced from the samples by using a granulator machine for use in the injection moulding process. The design of experiments used in the injection moulding process was the Taguchi method with an L9(3^4) orthogonal array. The Melt Flow Index (MFI) was selected as the response. Based on the results, the value of MFI increased as the fibre content increased from 0% to 3% (from 17.78 g/10 min to 22.07 g/10 min) and decreased from 3% to 6% (from 22.07 g/10 min to 20.05 g/10 min), so 3% fibre content gives the highest value of MFI. Based on the signal-to-noise ratio analysis, the most influential parameter affecting the value of MFI was the melt temperature. The optimum parameters at 3% fibre content were a melt temperature of 170°C, 35% packing pressure, 30% screw speed and a 3 s filling time.
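The Taguchi signal-to-noise analysis used in such studies can be sketched with the larger-the-better criterion, which suits a response (MFI) one wants to maximise: S/N = -10 · log10(mean(1/y²)). The replicate values below are invented around the reported 3% and 0% means and are not the study's raw data.

```python
# Larger-the-better signal-to-noise ratio, as used in Taguchi analysis
# to compare trial conditions from replicated response values.
import math

def sn_larger_the_better(values):
    return -10 * math.log10(sum(1 / y**2 for y in values) / len(values))

# Hypothetical MFI replicates (g/10 min) for two trial conditions
sn_3pct = sn_larger_the_better([22.07, 21.5, 22.6])
sn_0pct = sn_larger_the_better([17.78, 18.1, 17.4])
print(sn_3pct > sn_0pct)  # higher MFI condition scores higher
```

Ranking the per-factor average S/N across the L9 trials is what identifies the most influential parameter (here, melt temperature).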

  8. Progressive sampling-based Bayesian optimization for efficient and automatic machine learning model selection.

    Science.gov (United States)

    Zeng, Xueqiang; Luo, Gang

    2017-12-01

Machine learning is broadly used for clinical data analysis. Before training a model, a machine learning algorithm must be selected. Also, the values of one or more model parameters termed hyper-parameters must be set. Selecting algorithms and hyper-parameter values requires advanced machine learning knowledge and many labor-intensive manual iterations. To lower the bar to machine learning, miscellaneous automatic selection methods for algorithms and/or hyper-parameter values have been proposed. Existing automatic selection methods are inefficient on large data sets. This poses a challenge for using machine learning in the clinical big data era. To address the challenge, this paper presents progressive sampling-based Bayesian optimization, an efficient and automatic selection method for both algorithms and hyper-parameter values. We report an implementation of the method. We show that compared to a state-of-the-art automatic selection method, our method can significantly reduce search time, classification error rate, and standard deviation of error rate due to randomization. This is major progress towards enabling fast turnaround in identifying high-quality solutions required by many machine learning-based clinical data analysis tasks.
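The progressive-sampling idea can be sketched as below. This is a deliberate simplification: the paper couples progressive sampling with Bayesian optimization, whereas this toy loop merely prunes the worse half of candidates as the training sample grows, with a made-up error function standing in for model training and validation.

```python
# Toy progressive-sampling selection: evaluate candidates on small samples
# first, keep the better half, and re-evaluate survivors on larger samples.
import random

def progressive_selection(candidates, evaluate, start=100, full=10000):
    n = start
    while len(candidates) > 1 and n < full:
        scored = sorted(candidates, key=lambda c: evaluate(c, n))
        candidates = scored[:max(1, len(scored) // 2)]  # prune worst half
        n *= 2  # survivors earn a larger training sample
    return min(candidates, key=lambda c: evaluate(c, full))

# Hypothetical hyper-parameter candidates and a fake "validation error"
# whose noise shrinks with sample size and is smallest near c = 0.3.
random.seed(0)
def fake_error(c, n):
    return (c - 0.3) ** 2 + random.random() / n

best = progressive_selection([0.1, 0.2, 0.3, 0.5, 0.8], fake_error)
print(best)  # → 0.3
```

The efficiency gain comes from spending full-data training only on the few configurations that survive the cheap early rounds.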

  9. JACOS: AI-based simulation system for man-machine system behavior in NPP

    International Nuclear Information System (INIS)

    Yoshida, Kazuo; Yokobayashi, Masao; Tanabe, Fumiya; Komiya, Akitoshi

    2001-08-01

A prototype of a computer simulation system named JACOS (JAERI COgnitive Simulation system) has been developed at JAERI (Japan Atomic Energy Research Institute) to simulate the man-machine system behavior in which both the cognitive behavior of a human operator and the plant behavior affect each other. The objective of this system development is to provide man-machine system analysts with detailed information on the cognitive process of an operator and the plant behavior affected by operator's actions in accidental situations of a nuclear power plant. The simulation system consists of an operator model and a plant model which are coupled dynamically. The operator model simulates an operator's cognitive behavior in accidental situations based on the decision ladder model of Rasmussen, and is implemented using the AI-techniques of the distributed cooperative inference method with the so-called blackboard architecture. Rule-based behavior is simulated using knowledge representation with If-Then rules. Knowledge-based behavior is simulated using knowledge representation with MFM (Multilevel Flow Modeling) and a qualitative reasoning method. Cognitive characteristics of attentional narrowing, limitation of short-term memory, and knowledge recalling from long-term memory are also taken into account. The plant model of a 3-loop PWR is also developed using the best estimate thermal-hydraulic analysis code RELAP5/MOD2. This report is prepared as the User's Manual for JACOS. The first chapter of this report describes both operator and plant models in detail. The second chapter includes instructive descriptions for program installation, building of a knowledge base for the operator model, execution of simulation and analysis of simulation results. Examples of simulation with JACOS are shown in the third chapter. (author)

  10. A novel root-index based prioritized random access scheme for 5G cellular networks

    Directory of Open Access Journals (Sweden)

    Taehoon Kim

    2015-12-01

Full Text Available Cellular networks will play an important role in realizing the newly emerging Internet-of-Everything (IoE). One of the challenging issues is to support the quality of service (QoS) during the access phase, while accommodating a massive number of machine nodes. In this paper, we show a new paradigm of multiple access priorities in the random access (RA) procedure and propose a novel root-index based prioritized random access (RIPRA) scheme that implicitly embeds the access priority in the root index of the RA preambles. The performance evaluation shows that the proposed RIPRA scheme can successfully support differentiated performance for different access priority levels, even though there exists a massive number of machine nodes.

  11. A hybrid training approach for leaf area index estimation via Cubist and random forests machine-learning

    KAUST Repository

    McCabe, Matthew

    2017-12-06

With an increasing volume and dimensionality of Earth observation data, enhanced integration of machine-learning methodologies is needed to effectively analyze and utilize these information rich datasets. In machine-learning, a training dataset is required to establish explicit associations between a suite of explanatory ‘predictor’ variables and the target property. The specifics of this learning process can significantly influence model validity and portability, with a higher generalization level expected with an increasing number of observable conditions being reflected in the training dataset. Here we propose a hybrid training approach for leaf area index (LAI) estimation, which harnesses synergistic attributes of scattered in-situ measurements and systematically distributed physically based model inversion results to enhance the information content and spatial representativeness of the training data. To do this, a complementary training dataset of independent LAI was derived from a regularized model inversion of RapidEye surface reflectances and subsequently used to guide the development of LAI regression models via Cubist and random forests (RF) decision tree methods. The application of the hybrid training approach to a broad set of Landsat 8 vegetation index (VI) predictor variables resulted in significantly improved LAI prediction accuracies and spatial consistencies, relative to results relying on in-situ measurements alone for model training. In comparing the prediction capacity and portability of the two machine-learning algorithms, a pair of relatively simple multi-variate regression models established by Cubist performed best, with an overall relative mean absolute deviation (rMAD) of ∼11%, determined based on a stringent scene-specific cross-validation approach. In comparison, the portability of RF regression models was less effective (i.e., an overall rMAD of ∼15%), which was attributed partly to model saturation at high LAI in association

  12. A hybrid training approach for leaf area index estimation via Cubist and random forests machine-learning

    Science.gov (United States)

    Houborg, Rasmus; McCabe, Matthew F.

    2018-01-01

With an increasing volume and dimensionality of Earth observation data, enhanced integration of machine-learning methodologies is needed to effectively analyze and utilize these information rich datasets. In machine-learning, a training dataset is required to establish explicit associations between a suite of explanatory 'predictor' variables and the target property. The specifics of this learning process can significantly influence model validity and portability, with a higher generalization level expected with an increasing number of observable conditions being reflected in the training dataset. Here we propose a hybrid training approach for leaf area index (LAI) estimation, which harnesses synergistic attributes of scattered in-situ measurements and systematically distributed physically based model inversion results to enhance the information content and spatial representativeness of the training data. To do this, a complementary training dataset of independent LAI was derived from a regularized model inversion of RapidEye surface reflectances and subsequently used to guide the development of LAI regression models via Cubist and random forests (RF) decision tree methods. The application of the hybrid training approach to a broad set of Landsat 8 vegetation index (VI) predictor variables resulted in significantly improved LAI prediction accuracies and spatial consistencies, relative to results relying on in-situ measurements alone for model training. In comparing the prediction capacity and portability of the two machine-learning algorithms, a pair of relatively simple multi-variate regression models established by Cubist performed best, with an overall relative mean absolute deviation (rMAD) of ∼11%, determined based on a stringent scene-specific cross-validation approach. In comparison, the portability of RF regression models was less effective (i.e., an overall rMAD of ∼15%), which was attributed partly to model saturation at high LAI in association with
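The headline ∼11% figure is a relative mean absolute deviation (rMAD). A quick sketch of that metric, with invented LAI values rather than the study's data:

```python
# rMAD: mean absolute deviation between predicted and observed values,
# expressed as a percentage of the mean observed value.

def rmad(predicted, observed):
    mad = sum(abs(p - o) for p, o in zip(predicted, observed)) / len(observed)
    return 100 * mad / (sum(observed) / len(observed))

pred = [1.1, 2.0, 2.9, 4.2]  # hypothetical model LAI
obs  = [1.0, 2.2, 3.0, 4.0]  # hypothetical reference LAI
print(round(rmad(pred, obs), 1))
```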

  13. A hybrid training approach for leaf area index estimation via Cubist and random forests machine-learning

    KAUST Repository

McCabe, Matthew

    2017-01-01

With an increasing volume and dimensionality of Earth observation data, enhanced integration of machine-learning methodologies is needed to effectively analyze and utilize these information rich datasets. In machine-learning, a training dataset is required to establish explicit associations between a suite of explanatory ‘predictor’ variables and the target property. The specifics of this learning process can significantly influence model validity and portability, with a higher generalization level expected with an increasing number of observable conditions being reflected in the training dataset. Here we propose a hybrid training approach for leaf area index (LAI) estimation, which harnesses synergistic attributes of scattered in-situ measurements and systematically distributed physically based model inversion results to enhance the information content and spatial representativeness of the training data. To do this, a complementary training dataset of independent LAI was derived from a regularized model inversion of RapidEye surface reflectances and subsequently used to guide the development of LAI regression models via Cubist and random forests (RF) decision tree methods. The application of the hybrid training approach to a broad set of Landsat 8 vegetation index (VI) predictor variables resulted in significantly improved LAI prediction accuracies and spatial consistencies, relative to results relying on in-situ measurements alone for model training. In comparing the prediction capacity and portability of the two machine-learning algorithms, a pair of relatively simple multi-variate regression models established by Cubist performed best, with an overall relative mean absolute deviation (rMAD) of ∼11%, determined based on a stringent scene-specific cross-validation approach. In comparison, the portability of RF regression models was less effective (i.e., an overall rMAD of ∼15%), which was attributed partly to model saturation at high LAI in association

  14. Functional networks inference from rule-based machine learning models.

    Science.gov (United States)

    Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume

    2016-01-01

Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables, that often are different/complementary to the similarity-based methods. We propose a protocol to infer functional networks from machine learning models, called FuNeL. It assumes that genes used together within a rule-based machine learning model to classify the samples might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two different points of view. We analyse the enriched biological terms in the set of network nodes and the relationships between known disease-associated genes in the context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks in relation to similarity-based methods and demonstrates its potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process. The
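The core intuition can be sketched roughly as follows (a simplification, not the published FuNeL protocol): genes tested together by the same learned rule are connected, and edge weights count co-occurrences across rules. The rules and gene names below are invented for illustration.

```python
# Toy rule-co-occurrence network: link genes that appear together in the
# condition part of the same learned classification rule.
from itertools import combinations
from collections import Counter

rules = [  # each rule: the set of genes its conditions test
    {"TP53", "BRCA1", "EGFR"},
    {"TP53", "EGFR"},
    {"BRCA1", "PTEN"},
]

edges = Counter()
for genes in rules:
    for pair in combinations(sorted(genes), 2):
        edges[pair] += 1

print(edges[("EGFR", "TP53")])  # → 2 (co-used in two rules)
```

Higher-weight edges correspond to gene pairs the classifier repeatedly finds jointly informative, which is the signal FuNeL interprets as a candidate functional relationship.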

  15. Supervised machine learning algorithms to diagnose stress for vehicle drivers based on physiological sensor signals.

    Science.gov (United States)

    Barua, Shaibal; Begum, Shahina; Ahmed, Mobyen Uddin

    2015-01-01

    Machine learning algorithms play an important role in computer science research. Recent advances in sensor data collection in the clinical sciences have led to complex, heterogeneous data processing and analysis for patient diagnosis and prognosis. Diagnosis and treatment of patients based on manual analysis of these sensor data are difficult and time consuming. Therefore, the development of knowledge-based systems to support clinicians in decision-making is important. However, it is necessary to perform experimental work comparing the performance of different machine learning methods, to help select an appropriate method for data sets with specific characteristics. This paper compares the classification performance of three popular machine learning methods, i.e., case-based reasoning, neural networks and support vector machines, in diagnosing the stress of vehicle drivers using finger temperature and heart rate variability. The experimental results show that case-based reasoning outperforms the other two methods in terms of classification accuracy, achieving 80% and 86% accuracy in classifying stress using finger temperature and heart rate variability, respectively. By contrast, both the neural network and the support vector machine achieved less than 80% accuracy with both physiological signals.
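    As a rough sketch of the case-based reasoning approach compared in the paper, in its simplest retrieve-and-reuse form (a k-nearest-neighbour lookup over two physiological features); the data and cluster parameters below are synthetic illustrations, not the study's:

```python
import math
import random

def knn_classify(cases, query, k=3):
    """Case-based reasoning at its simplest: retrieve the k most similar
    past cases and reuse the majority label among them."""
    ranked = sorted(cases, key=lambda c: math.dist(c[0], query))
    labels = [label for _, label in ranked[:k]]
    return max(set(labels), key=labels.count)

random.seed(0)
# Synthetic (finger_temperature, hrv) pairs -- illustrative values only:
# stressed drivers tend toward lower finger temperature and lower HRV.
cases = [((random.gauss(28, 1), random.gauss(20, 3)), "stressed") for _ in range(30)]
cases += [((random.gauss(33, 1), random.gauss(45, 3)), "relaxed") for _ in range(30)]

print(knn_classify(cases, (28.5, 22)))   # → stressed
print(knn_classify(cases, (33.5, 44)))   # → relaxed
```

    Neural network and SVM classifiers would fit a global decision function instead of retrieving local cases, which is the trade-off the paper evaluates.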

  16. A Performance Survey on Stack-based and Register-based Virtual Machines

    OpenAIRE

    Fang, Ruijie; Liu, Siqi

    2016-01-01

    Virtual machines have been widely adapted for high-level programming language implementations and for providing a degree of platform neutrality. As the overall use and adaptation of virtual machines grow, the overall performance of virtual machines has become a widely-discussed topic. In this paper, we present a survey on the performance differences of the two most widely adapted types of virtual machines - the stack-based virtual machine and the register-based virtual machine - using various...
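    The two machine types surveyed can be contrasted with minimal interpreters. The opcode names below are hypothetical illustrations, not the instruction sets benchmarked in the paper:

```python
def run_stack_vm(program):
    """Stack-based VM: operands live on an implicit stack, so most
    instructions carry no operands (more instructions, each small)."""
    stack = []
    for op, *args in program:
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

def run_register_vm(program):
    """Register-based VM: each instruction names its registers explicitly
    (typically fewer instructions, each one wider)."""
    regs = {}
    for op, *args in program:
        if op == "LOAD":
            regs[args[0]] = args[1]
        elif op == "ADD":
            d, a, b = args
            regs[d] = regs[a] + regs[b]
        elif op == "MUL":
            d, a, b = args
            regs[d] = regs[a] * regs[b]
    return regs[0]

# (2 + 3) * 4 in both encodings:
stack_prog = [("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",)]
reg_prog = [("LOAD", 0, 2), ("LOAD", 1, 3), ("ADD", 0, 0, 1),
            ("LOAD", 1, 4), ("MUL", 0, 0, 1)]
print(run_stack_vm(stack_prog), run_register_vm(reg_prog))  # → 20 20
```

    The performance question the survey addresses is exactly this trade-off: dispatch count versus per-instruction decoding cost.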

  17. Examining the Association Between School Vending Machines and Children's Body Mass Index by Socioeconomic Status.

    Science.gov (United States)

    O'Hara, Jeffrey K; Haynes-Maslow, Lindsey

    2015-01-01

    To examine the association between vending machine availability in schools and body mass index (BMI) among subgroups of children based on gender, race/ethnicity, and socioeconomic status classifications. First-difference multivariate regressions were estimated using longitudinal fifth- and eighth-grade data from the Early Childhood Longitudinal Study. The specifications were disaggregated by gender, race/ethnicity, and family socioeconomic status classifications. Vending machine availability had a positive association (P < .10) with BMI among Hispanic male children and low-income Hispanic children. Living in an urban location (P < .05) and hours watching television (P < .05) were also positively associated with BMI for these subgroups. Supplemental Nutrition Assistance Program enrollment was negatively associated with BMI for low-income Hispanic students (P < .05). These findings were not statistically significant when using Bonferroni adjusted critical values. The results suggest that the school food environment could reinforce health disparities that exist for Hispanic male children and low-income Hispanic children. Copyright © 2015 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  18. JACOS: AI-based simulation system for man-machine system behavior in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Kazuo; Yokobayashi, Masao; Tanabe, Fumiya [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kawase, Katsumi [CSK Corp., Tokyo (Japan); Komiya, Akitoshi [Computer Associated Laboratory, Inc., Hitachinaka, Ibaraki (Japan)

    2001-08-01

    A prototype of a computer simulation system named JACOS (JAERI COgnitive Simulation system) has been developed at JAERI (Japan Atomic Energy Research Institute) to simulate man-machine system behavior in which the cognitive behavior of a human operator and the plant behavior affect each other. The objective of this system development is to provide man-machine system analysts with detailed information on the cognitive process of an operator and the plant behavior affected by the operator's actions in accidental situations at a nuclear power plant. The simulation system consists of an operator model and a plant model which are coupled dynamically. The operator model simulates an operator's cognitive behavior in accidental situations based on Rasmussen's decision ladder model, and is implemented using the AI technique of distributed cooperative inference with the so-called blackboard architecture. Rule-based behavior is simulated using knowledge representation with If-Then rules. Knowledge-based behavior is simulated using knowledge representation with MFM (Multilevel Flow Modeling) and a qualitative reasoning method. The cognitive characteristics of attentional narrowing, the limitation of short-term memory, and knowledge recall from long-term memory are also taken into account. The plant model of a 3-loop PWR has been developed using the best-estimate thermal-hydraulic analysis code RELAP5/MOD2. This report is prepared as a User's Manual for JACOS. The first chapter describes both the operator and plant models in detail. The second chapter gives instructions for program installation, building a knowledge base for the operator model, executing a simulation, and analysing simulation results. Examples of simulations with JACOS are shown in the third chapter. (author)
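    The If-Then style of rule-based operator behaviour mentioned above can be sketched with a tiny forward-chaining engine; the rules below are hypothetical examples, not entries from the JACOS knowledge base:

```python
def forward_chain(facts, rules):
    """Tiny forward-chaining engine for If-Then rules: repeatedly fire any
    rule whose conditions are all satisfied until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical operator knowledge (illustrative only):
rules = [
    ({"pressure_low", "temperature_high"}, "suspect_LOCA"),
    ({"suspect_LOCA"}, "start_safety_injection"),
]
derived = forward_chain({"pressure_low", "temperature_high"}, rules)
print(sorted(derived))
```

    Chaining one rule's conclusion into another's condition is what lets a flat rule base express multi-step diagnostic behaviour.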

  19. Development of Type 2 Diabetes Mellitus Phenotyping Framework Using Expert Knowledge and Machine Learning Approach.

    Science.gov (United States)

    Kagawa, Rina; Kawazoe, Yoshimasa; Ida, Yusuke; Shinohara, Emiko; Tanaka, Katsuya; Imai, Takeshi; Ohe, Kazuhiko

    2017-07-01

    Phenotyping is an automated technique that can be used to distinguish patients based on electronic health records. To improve the quality of medical care and advance type 2 diabetes mellitus (T2DM) research, the demand for T2DM phenotyping has been increasing. Some existing phenotyping algorithms are not sufficiently accurate for screening or identifying clinical research subjects. We propose a practical phenotyping framework using both expert knowledge and a machine learning approach to develop 2 phenotyping algorithms: one is for screening; the other is for identifying research subjects. We employ expert knowledge as rules to exclude obvious control patients and machine learning to increase accuracy for complicated patients. We developed phenotyping algorithms on the basis of our framework and performed binary classification to determine whether a patient has T2DM. To facilitate development of practical phenotyping algorithms, this study introduces new evaluation metrics: area under the precision-sensitivity curve (AUPS) with a high sensitivity and AUPS with a high positive predictive value. The proposed phenotyping algorithms based on our framework show higher performance than baseline algorithms. Our proposed framework can be used to develop 2 types of phenotyping algorithms depending on the tuning approach: one for screening, the other for identifying research subjects. We develop a novel phenotyping framework that can be easily implemented on the basis of proper evaluation metrics, which are in accordance with users' objectives. The phenotyping algorithms based on our framework are useful for extraction of T2DM patients in retrospective studies.
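    The two-layer idea (expert rules exclude obvious controls, machine learning handles the complicated remainder) can be sketched as follows; the field names and thresholds are illustrative assumptions, not the paper's actual rules:

```python
def phenotype(patient, classifier):
    """Hybrid phenotyping sketch: expert rules first screen out obvious
    control patients; only the remaining cases reach the learned model."""
    # Expert-knowledge exclusion rule (hypothetical thresholds):
    if not patient["diabetes_codes"] and patient["hba1c"] < 5.7:
        return 0                      # obvious control, model never consulted
    return classifier(patient)        # machine learning for the hard cases

# Stand-in for a trained classifier:
toy_model = lambda p: int(p["hba1c"] >= 6.5 or p["on_insulin"])

a = {"diabetes_codes": [], "hba1c": 5.2, "on_insulin": False}
b = {"diabetes_codes": ["E11"], "hba1c": 7.1, "on_insulin": True}
print(phenotype(a, toy_model), phenotype(b, toy_model))  # → 0 1
```

    Tuning the learned component toward sensitivity or positive predictive value yields the paper's two variants: one for screening, one for identifying research subjects.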

  20. Machine function based control code algebras

    NARCIS (Netherlands)

    Bergstra, J.A.

    Machine functions have been introduced by Earley and Sturgis in [6] in order to provide a mathematical foundation of the use of the T-diagrams proposed by Bratman in [5]. Machine functions describe the operation of a machine at a very abstract level. A theory of hardware and software based on

  1. Virk: An Active Learning-based System for Bootstrapping Knowledge Base Development in the Neurosciences

    Directory of Open Access Journals (Sweden)

    Kyle H. Ambert

    2013-12-01

    Full Text Available The frequency and volume of newly-published scientific literature are quickly making manual maintenance of publicly-available databases of primary data unrealistic and costly. Although machine learning can be useful for developing automated approaches to identifying scientific publications containing relevant information for a database, developing such tools necessitates manually annotating an unrealistic number of documents. One approach to this problem, active learning, builds classification models by iteratively identifying documents that provide the most information to a classifier. Although this approach has been shown to be effective for related problems, in the context of scientific database curation it falls short. We present Virk, an active learning system that, while being trained, simultaneously learns a classification model and identifies documents having information of interest for a knowledge base. Our approach uses a support vector machine classifier with input features derived from neuroscience-related publications from the primary literature. Using our approach, we were able to increase the size of the Neuron Registry, a knowledge base of neuron-related information, by 90% in 3 months. Using standard biocuration methods, it would have taken between 1 and 2 years to make the same number of contributions to the Neuron Registry. Here, we describe the system pipeline in detail, and evaluate its performance against other approaches to sampling in active learning.
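    The underlying active-learning loop (pool-based uncertainty sampling) can be sketched in a few lines; the toy 1-D "documents" and threshold classifier below are stand-ins for the SVM and text features Virk actually uses:

```python
def train(labeled):
    """Toy 1-D classifier: decision threshold halfway between class means."""
    pos = [x for x, y in labeled if y == 1]
    neg = [x for x, y in labeled if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

def most_uncertain(pool, threshold):
    """Uncertainty sampling: pick the point nearest the decision boundary."""
    return min(pool, key=lambda x: abs(x - threshold))

def oracle(x):
    """Stands in for the human curator who labels queried documents."""
    return int(x > 5.0)

labeled = [(0.0, 0), (10.0, 1)]          # small seed set
pool = [1.0, 4.9, 5.2, 9.0]              # unlabeled documents (toy scores)

for _ in range(2):                        # two annotation rounds
    t = train(labeled)
    x = most_uncertain(pool, t)
    pool.remove(x)
    labeled.append((x, oracle(x)))       # only informative items get labeled

print(sorted(x for x, _ in labeled))
```

    Note that both rounds query the borderline items (4.9 and 5.2) and never waste curator effort on the easy points far from the boundary.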

  2. Rule based systems for big data a machine learning approach

    CERN Document Server

    Liu, Han; Cocea, Mihaela

    2016-01-01

    The ideas introduced in this book explore the relationships among rule based systems, machine learning and big data. Rule based systems are seen as a special type of expert systems, which can be built by using expert knowledge or learning from real data. The book focuses on the development and evaluation of rule based systems in terms of accuracy, efficiency and interpretability. In particular, a unified framework for building rule based systems, which consists of the operations of rule generation, rule simplification and rule representation, is presented. Each of these operations is detailed using specific methods or techniques. In addition, this book also presents some ensemble learning frameworks for building ensemble rule based systems.

  3. DeepDive: Declarative Knowledge Base Construction.

    Science.gov (United States)

    De Sa, Christopher; Ratner, Alex; Ré, Christopher; Shin, Jaeho; Wang, Feiran; Wu, Sen; Zhang, Ce

    2016-03-01

    The dark data extraction or knowledge base construction (KBC) problem is to populate a SQL database with information from unstructured data sources including emails, webpages, and pdf reports. KBC is a long-standing problem in industry and research that encompasses problems of data extraction, cleaning, and integration. We describe DeepDive, a system that combines database and machine learning ideas to help develop KBC systems. The key idea in DeepDive is that statistical inference and machine learning are key tools to attack classical data problems in extraction, cleaning, and integration in a unified and more effective manner. DeepDive programs are declarative in that one cannot write probabilistic inference algorithms; instead, one interacts by defining features or rules about the domain. A key reason for this design choice is to enable domain experts to build their own KBC systems. We present the applications, abstractions, and techniques of DeepDive employed to accelerate construction of KBC systems.
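    A minimal flavour of the declarative style: the user writes rules (here simple regex patterns, a crude stand-in for DeepDive's feature and rule language), and the system turns matches into candidate facts rather than requiring hand-written inference code:

```python
import re

# User-supplied rules: pattern plus relation name (illustrative examples).
rules = [
    (re.compile(r"(\w+) was born in (\w+)"), "born_in"),
    (re.compile(r"(\w+) works at (\w+)"), "works_at"),
]

def extract(text):
    """Turn every rule match into a candidate (subject, relation, object)
    fact, ready to be loaded into a SQL knowledge base."""
    facts = []
    for pattern, relation in rules:
        for subj, obj in pattern.findall(text):
            facts.append((subj, relation, obj))
    return facts

doc = "Ada was born in London. Ada works at Babbage."
print(extract(doc))  # → [('Ada', 'born_in', 'London'), ('Ada', 'works_at', 'Babbage')]
```

    DeepDive then attaches weights to such rules and runs statistical inference over the candidates; this sketch shows only the declarative extraction step.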

  4. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...
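    The two-stage algorithm (candidate selection, then property-based ranking) can be sketched as follows; the stopword list and scoring weights are illustrative assumptions, far simpler than the thesis's statistical, semantic and encyclopedic features:

```python
from collections import Counter

STOPWORDS = {"the", "of", "a", "and", "to", "in", "is", "are", "this"}

def index_topics(text, k=3):
    """Two-stage sketch: (1) select candidate terms, (2) rank candidates by
    simple properties (frequency, earliness of first mention)."""
    words = [w.strip(".,").lower() for w in text.split()]
    candidates = [w for w in words if w not in STOPWORDS and len(w) > 3]
    freq = Counter(candidates)
    first = {w: candidates.index(w) for w in freq}
    # Frequent terms mentioned early in the document score highest:
    return sorted(freq, key=lambda w: freq[w] - 0.01 * first[w], reverse=True)[:k]

text = ("Topic indexing assigns topics to documents. "
        "Indexing helps retrieval. Topics guide readers.")
print(index_topics(text))  # → ['indexing', 'topics', 'topic']
```

    The real systems replace this scoring with a learned combination of features, but the select-then-rank shape is the same.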

  5. Machinability of nickel based alloys using electrical discharge machining process

    Science.gov (United States)

    Khan, M. Adam; Gokul, A. K.; Bharani Dharan, M. P.; Jeevakarthikeyan, R. V. S.; Uthayakumar, M.; Thirumalai Kumaran, S.; Duraiselvam, M.

    2018-04-01

    High temperature materials such as nickel based alloys and austenitic steels are frequently used for manufacturing critical aero engine turbine components. Literature on conventional and unconventional machining of steel materials has been abundant over the past three decades. However, machining studies on superalloys remain challenging due to their inherent properties, which make these materials difficult to cut with conventional processes. This research therefore focuses on an unconventional machining process for nickel alloys. Inconel 718 and Monel 400 are the two candidate materials used for the electrical discharge machining (EDM) process. The investigation involves preparing a blind hole using a copper electrode of 6 mm diameter. Electrical parameters are varied to produce the plasma spark for the diffusion process, and machining time is held constant so that the experimental results for both materials can be compared. The influence of process parameters on the tool wear mechanism and material removal is considered in the proposed experimental design. During machining, the tool tended to discharge more material due to the production of a high-energy plasma spark and the eddy current effect. The morphology of the machined surface was observed with a high-resolution FE-SEM. Fused electrode material was found as spherical clumps over the machined surface. Surface roughness was also measured from the surface profile using a profilometer. It is confirmed that there is no deviation and that precise roundness of the drilled hole is maintained.

  6. An intelligent human-machine system based on an ecological interface design concept

    International Nuclear Information System (INIS)

    Naito, N.

    1995-01-01

    It seems both necessary and promising to develop an intelligent human-machine system, considering the objective of the human-machine system and recent advances in cognitive engineering and artificial intelligence, together with the ever-increasing importance of human factor issues in nuclear power plant operation and maintenance. It should support human operators in their knowledge-based behaviour and allow them to cope with unanticipated abnormal events, including recovery from erroneous human actions. A top-down design approach has been adopted based on cognitive work analysis, and (1) an ecological interface, (2) a cognitive model-based advisor and (3) a robust automatic sequence controller have been established. These functions have been integrated into an experimental control room. A validation test was carried out with the participation of experienced operators and engineers. The results showed the usefulness of this system in supporting the operator's supervisory plant control tasks. ((orig.))

  7. Ontology-Based Knowledge Organization for the Radiograph Images Segmentation

    Directory of Open Access Journals (Sweden)

    MATEI, O.

    2008-04-01

    Full Text Available The quantity of thoracic radiographs in the medical field is ever-growing. An automated system for segmenting the images would help doctors enormously. Some approaches are knowledge-based; we therefore propose an ontology for this purpose. It is machine-oriented rather than human-oriented: all the structures visible on a thoracic image are described from a technical point of view.

  8. Preliminary Test of Upgraded Conventional Milling Machine into PC Based CNC Milling Machine

    International Nuclear Information System (INIS)

    Abdul Hafid

    2008-01-01

    A CNC (Computerized Numerical Control) milling machine poses a challenge for innovation in the field of machining. To obtain machining quality equivalent to that of a CNC milling machine, a conventional milling machine was upgraded into a PC-based CNC milling machine. Changes were made both mechanically and in the instrumentation: the original controls were replaced with servo drives, and proximity sensors were used. A computer program was constructed to send instructions to the milling machine. The program structure consists of a GUI model and a ladder diagram, implemented on a programming system called RTX software. The results of the upgrade are the computer program and the CNC instruction jobs it executes. This is a first step, and the work will be continued. With the upgraded milling machine, the user can work optimally and safely, with a reduced risk of accidents. (author)

  9. Component Pin Recognition Using Algorithms Based on Machine Learning

    Science.gov (United States)

    Xiao, Yang; Hu, Hong; Liu, Ze; Xu, Jiangchang

    2018-04-01

    The purpose of machine vision for a plug-in machine is to improve the machine’s stability and accuracy, and recognition of the component pin is an important part of the vision. This paper focuses on component pin recognition using three different techniques. The first technique involves traditional image processing using the core algorithm for binary large object (BLOB) analysis. The second technique uses the histogram of oriented gradients (HOG) to experimentally compare the effect of the support vector machine (SVM) and the adaptive boosting (AdaBoost) learning meta-algorithm classifiers. The third technique is the use of a deep learning method known as a convolutional neural network (CNN), which identifies the pin by comparing a sample to its training data. The main purpose of the research presented in this paper is to increase the knowledge of learning methods used in the plug-in machine industry in order to achieve better results.

  10. Advanced Electrical Machines and Machine-Based Systems for Electric and Hybrid Vehicles

    OpenAIRE

    Ming Cheng; Le Sun; Giuseppe Buja; Lihua Song

    2015-01-01

    The paper presents a number of advanced solutions on electric machines and machine-based systems for the powertrain of electric vehicles (EVs). Two types of systems are considered, namely the drive systems designated to the EV propulsion and the power split devices utilized in the popular series-parallel hybrid electric vehicle architecture. After reviewing the main requirements for the electric drive systems, the paper illustrates advanced electric machine topologies, including a stator perm...

  11. Diagnosing tuberculosis with a novel support vector machine-based artificial immune recognition system.

    Science.gov (United States)

    Saybani, Mahmoud Reza; Shamshirband, Shahaboddin; Golzari Hormozi, Shahram; Wah, Teh Ying; Aghabozorgi, Saeed; Pourhoseingholi, Mohamad Amin; Olariu, Teodora

    2015-04-01

    Tuberculosis (TB) is a major global health problem, which has been ranked as the second leading cause of death from an infectious disease worldwide. Diagnosis based on cultured specimens is the reference standard; however, results take weeks to process. Scientists are looking for early detection strategies, which remain the cornerstone of tuberculosis control. Consequently, there is a need to develop an expert system that helps medical professionals to accurately and quickly diagnose the disease. The Artificial Immune Recognition System (AIRS) has been used successfully for diagnosing various diseases. However, little effort has been undertaken to improve its classification accuracy. In order to increase the classification accuracy of AIRS, this study introduces a new hybrid system that incorporates a support vector machine into AIRS for diagnosing tuberculosis. Patient epicrisis reports obtained from the Pasteur laboratory of Iran were used as the benchmark data set, with a sample size of 175 (114 positive samples for TB and 60 samples in the negative group). The strategy of this study was to ensure representativeness; thus, it was important to have an adequate number of instances for both TB and non-TB cases. The classification performance was measured through 10-fold cross-validation, Root Mean Squared Error (RMSE), sensitivity and specificity, Youden's Index, and Area Under the Curve (AUC). Statistical analysis was done using the Waikato Environment for Knowledge Analysis (WEKA), a machine learning program for Windows. With an accuracy of 100%, sensitivity of 100%, specificity of 100%, Youden's Index of 1, Area Under the Curve of 1, and RMSE of 0, the proposed method was able to successfully classify tuberculosis patients. There has been much research aimed at diagnosing tuberculosis faster and more accurately. Our results describe a model for diagnosing tuberculosis with 100% sensitivity and 100% specificity. This model can be used as an additional tool for
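    The reported metrics follow directly from a confusion matrix; the sketch below (the function name is ours) reproduces the perfect-classification case using the abstract's sample counts of 114 TB-positive and 60 negative samples:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic metrics from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    youden = sensitivity + specificity - 1
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, youden, accuracy

# A perfect classifier on 114 positive / 60 negative samples:
print(diagnostic_metrics(tp=114, fp=0, tn=60, fn=0))  # → (1.0, 1.0, 1.0, 1.0)
```

    Youden's Index of 1 is only attainable when both sensitivity and specificity are 1, which is why the paper reports it alongside AUC.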

  12. The Nature Index: A General Framework for Synthesizing Knowledge on the State of Biodiversity

    Science.gov (United States)

    Certain, Grégoire; Skarpaas, Olav; Bjerke, Jarle-Werner; Framstad, Erik; Lindholm, Markus; Nilsen, Jan-Erik; Norderhaug, Ann; Oug, Eivind; Pedersen, Hans-Christian; Schartau, Ann-Kristin; van der Meeren, Gro I.; Aslaksen, Iulie; Engen, Steinar; Garnåsjordet, Per-Arild; Kvaløy, Pål; Lillegård, Magnar; Yoccoz, Nigel G.; Nybø, Signe

    2011-01-01

    The magnitude and urgency of the biodiversity crisis is widely recognized within scientific and political organizations. However, a lack of integrated measures for biodiversity has greatly constrained the national and international response to the biodiversity crisis. Thus, integrated biodiversity indexes will greatly facilitate information transfer from science toward other areas of human society. The Nature Index framework samples scientific information on biodiversity from a variety of sources, synthesizes this information, and then transmits it in a simplified form to environmental managers, policymakers, and the public. The Nature Index optimizes information use by incorporating expert judgment, monitoring-based estimates, and model-based estimates. The index relies on a network of scientific experts, each of whom is responsible for one or more biodiversity indicators. The resulting set of indicators is supposed to represent the best available knowledge on the state of biodiversity and ecosystems in any given area. The value of each indicator is scaled relative to a reference state, i.e., a predicted value assessed by each expert for a hypothetical undisturbed or sustainably managed ecosystem. Scaled indicator values can be aggregated or disaggregated over different axes representing spatiotemporal dimensions or thematic groups. A range of scaling models can be applied to allow for different ways of interpreting the reference states, e.g., optimal situations or minimum sustainable levels. Statistical testing for differences in space or time can be implemented using Monte-Carlo simulations. This study presents the Nature Index framework and details its implementation in Norway. The results suggest that the framework is a functional, efficient, and pragmatic approach for gathering and synthesizing scientific knowledge on the state of biodiversity in any marine or terrestrial ecosystem and has general applicability worldwide. PMID:21526118
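    The core scaling-and-aggregation step can be sketched as follows; the capped-ratio scaling and uniform weights are simplifying assumptions, just one of the several scaling models the framework allows:

```python
def scale(value, reference):
    """Scale an indicator relative to its expert-assessed reference state,
    capped at 1 so over-abundance does not inflate the index."""
    return min(value / reference, 1.0)

def nature_index(indicators, weights=None):
    """Aggregate scaled indicators into one index (weighted mean)."""
    scaled = [scale(v, r) for v, r in indicators]
    if weights is None:
        weights = [1.0] * len(scaled)
    return sum(w * s for w, s in zip(weights, scaled)) / sum(weights)

# Three hypothetical indicators as (observed, reference-state) pairs:
obs = [(40, 100), (90, 100), (120, 100)]
print(round(nature_index(obs), 2))  # → 0.77
```

    The same aggregation can be run over any subset of indicators, which is what allows the index to be disaggregated by region, ecosystem, or thematic group.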

  13. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single and multiple point cutting processes; grinding; component accuracy and metrology; shear stress in cutting; cutting temperature and analysis; chatter. They also address non-traditional machining, such as: electrical discharge machining; electrochemical machining; laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  14. Beyond the Keyword Barrier: Knowledge-Based Information Retrieval.

    Science.gov (United States)

    Mauldin, Michael; And Others

    1987-01-01

    Describes the inability of traditional subject index terms to represent relational information among concepts, and the development of frame based knowledge representation methods that provide relational semantic representations of documents and user queries. The discussion covers research in user interfaces and automatic document classifications,…

  15. Machine Vision-Based Measurement Systems for Fruit and Vegetable Quality Control in Postharvest.

    Science.gov (United States)

    Blasco, José; Munera, Sandra; Aleixos, Nuria; Cubero, Sergio; Molto, Enrique

    Individual items of any agricultural commodity are different from each other in terms of colour, shape or size. Furthermore, as they are living things, they change their quality attributes over time, thereby making the development of accurate automatic inspection machines a challenging task. Machine vision-based systems and new optical technologies make it feasible to create non-destructive control and monitoring tools for quality assessment to ensure adequate accomplishment of food standards. Such systems are much faster than any manual non-destructive examination of fruit and vegetable quality, thus allowing the whole production to be inspected with objective and repeatable criteria. Moreover, current technology makes it possible to inspect the fruit in spectral ranges beyond the sensitivity of the human eye, for instance in the ultraviolet and near-infrared regions. Machine vision-based applications require the use of multiple technologies and kinds of knowledge, ranging from those related to image acquisition (illumination, cameras, etc.) to the development of algorithms for spectral image analysis. Machine vision-based systems for inspecting fruit and vegetables are targeted towards different purposes, from in-line sorting into commercial categories to the detection of contaminants or the distribution of specific chemical compounds on the product's surface. This chapter summarises the current state of the art in these techniques, starting with systems based on colour images for the inspection of conventional colour, shape or external defects and then goes on to consider recent developments in spectral image analysis for internal quality assessment or contaminant detection.

  16. Knowledge-based Fragment Binding Prediction

    Science.gov (United States)

    Tang, Grace W.; Altman, Russ B.

    2014-01-01

    Target-based drug discovery must assess many drug-like compounds for potential activity. Focusing on low-molecular-weight compounds (fragments) can dramatically reduce the chemical search space. However, approaches for determining protein-fragment interactions have limitations. Experimental assays are time-consuming, expensive, and not always applicable. At the same time, computational approaches using physics-based methods have limited accuracy. With increasing high-resolution structural data for protein-ligand complexes, there is now an opportunity for data-driven approaches to fragment binding prediction. We present FragFEATURE, a machine learning approach to predict small molecule fragments preferred by a target protein structure. We first create a knowledge base of protein structural environments annotated with the small molecule substructures they bind. These substructures have low-molecular weight and serve as a proxy for fragments. FragFEATURE then compares the structural environments within a target protein to those in the knowledge base to retrieve statistically preferred fragments. It merges information across diverse ligands with shared substructures to generate predictions. Our results demonstrate FragFEATURE's ability to rediscover fragments corresponding to the ligand bound with 74% precision and 82% recall on average. For many protein targets, it identifies high scoring fragments that are substructures of known inhibitors. FragFEATURE thus predicts fragments that can serve as inputs to fragment-based drug design or serve as refinement criteria for creating target-specific compound libraries for experimental or computational screening. PMID:24762971

  17. A Novel Machine Learning Strategy Based on Two-Dimensional Numerical Models in Financial Engineering

    Directory of Open Access Journals (Sweden)

    Qingzhen Xu

    2013-01-01

    Full Text Available Machine learning is the most commonly used technique to address larger and more complex tasks by analyzing the most relevant information already present in databases. In order to better predict the future trend of the index, this paper proposes a two-dimensional numerical model for machine learning to simulate a major U.S. stock market index, and uses a nonlinear implicit finite-difference method to find numerical solutions of the two-dimensional simulation model. The proposed machine learning method uses partial differential equations to predict the stock market and can be extensively used to accelerate large-scale data processing on the history database. The experimental results show that the proposed algorithm reduces the prediction error and improves forecasting precision.
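    As a much-simplified stand-in for the paper's two-dimensional implicit scheme, the sketch below advances a one-dimensional diffusion equation with an explicit finite-difference step (the grid, coefficient, and initial "shock" are illustrative, not the paper's model):

```python
def diffusion_step(u, r):
    """One explicit finite-difference step for u_t = u_xx on a 1-D grid
    with fixed boundaries; r = dt/dx^2 must be <= 0.5 for stability.
    (A deliberately simplified, explicit analogue of the paper's
    nonlinear implicit 2-D scheme.)"""
    return [u[0]] + [
        u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

u = [0.0, 0.0, 1.0, 0.0, 0.0]   # an initial "shock" in the middle of the grid
for _ in range(3):
    u = diffusion_step(u, 0.25)
print([round(x, 3) for x in u])
```

    Each step smooths the profile symmetrically, which is the diffusive behaviour such PDE-based market models exploit; an implicit scheme would solve a linear system per step instead, trading cost for unconditional stability.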

  18. Index-based reactive power compensation scheme for voltage regulation

    Science.gov (United States)

    Dike, Damian Obioma

    2008-10-01

    Increasing demand for electrical power arising from deregulation, and the restrictions posed on the construction of new transmission lines by environmental, socioeconomic, and political issues, have led to higher grid loading. Consequently, voltage instability has become a major concern, and reactive power support is vital to enhance transmission grid performance. Improved reactive power support to a distressed grid is possible through the application of the relatively unfamiliar emerging technologies of "Flexible AC Transmission Systems (FACTS)" devices and "Distributed Energy Resources (DERs)." In addition to these infrastructure issues, a lack of situational awareness by system operators can cause major power outages, as evidenced by the widespread North American blackout of August 14, 2003. This and many other recent major outages have highlighted the inadequacies of existing power system indexes. In this work, a novel "index-based reactive compensation scheme" appropriate for both on-line and off-line computation of grid status has been developed. A new voltage stability index (Ls-index) suitable for long transmission lines was developed, simulated, and compared to the existing two-machine modeled L-index, showing the effect of long-distance power wheeling amongst regional transmission organizations. The dissertation further provides models for index-modulated voltage source converters (VSC) and index-based load flow analysis of both FACTS and microgrid-interconnected power systems, using the Newton-Raphson load flow model incorporating multiple FACTS devices. The developed package has been made user-friendly through an interactive graphical user interface and implemented on the IEEE 14-, 30-, and 300-bus systems. The results showed that reactive compensation has a system-wide effect, provided readily accessible system status indicators, ensured seamless DER interconnection through new islanding modes, and enhanced VSC utilization. These outcomes may contribute

  19. Passivity-Based Control of a Class of Blondel-Park Transformable Electric Machines

    Directory of Open Access Journals (Sweden)

    Per J. Nicklasson

    1997-10-01

    Full Text Available In this paper we study the viability of extending, to the general rotating electric machine's model, the passivity-based controller method that we have developed for induction motors. In this approach the passivity (energy dissipation) properties of the motor are taken advantage of at two different levels. First, we prove that the motor model can be decomposed as the feedback interconnection of two passive subsystems, which can essentially be identified with the electrical and mechanical dynamics. Then, we design a torque tracking controller that preserves passivity for the electrical subsystem, and leave the mechanical part as a "passive disturbance". In position or speed control applications this procedure naturally leads to the well known cascaded controller structure which is typically analyzed invoking time-scale separation assumptions. A key feature of the new cascaded control paradigm is that the latter arguments are obviated in the stability analysis. Our objective in this paper is to characterize a class of machines for which such a passivity-based controller solves the output feedback torque tracking problem. Roughly speaking, the class consists of machines whose nonactuated dynamics are well damped and whose electrical and mechanical dynamics can be suitably decoupled via a coordinate transformation. The first condition translates into the requirement of approximate knowledge of the rotor resistances to avoid the need of injecting high gain into the loop. The latter condition is known in the electric machines literature as Blondel-Park transformability, and in practical terms it requires that the air-gap magnetomotive force must be suitably approximated by the first harmonic in its Fourier expansion. These conditions, stemming from the construction of the machine, have a clear physical interpretation in terms of the couplings between its electrical, magnetic and mechanical dynamics, and are satisfied by a large number of practical

  20. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally-stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.
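
The core idea, deriving an implementation from a declarative, machine-understandable requirement, can be sketched in a few lines. This is an illustrative toy, not the prototype described above; the spec fields (`capacity`, `overflow_policy`) and the generator function are invented for the example.

```python
# A minimal sketch of transformational synthesis: a declarative buffer
# requirement (the "knowledge base" entry) is transformed into a working
# implementation. All names here are illustrative, not from the prototype.

def synthesize_buffer(spec):
    """Derive a bounded FIFO buffer class from a requirement spec."""
    capacity = spec["capacity"]
    policy = spec.get("overflow_policy", "reject")  # 'reject' or 'drop_oldest'

    class Buffer:
        def __init__(self):
            self.items = []

        def put(self, item):
            if len(self.items) >= capacity:
                if policy == "drop_oldest":
                    self.items.pop(0)      # discard oldest to make room
                else:
                    return False           # reject the new item
            self.items.append(item)
            return True

        def get(self):
            return self.items.pop(0) if self.items else None

    return Buffer

# Requirement captured in a domain-specific, declarative form:
spec = {"capacity": 2, "overflow_policy": "drop_oldest"}
Buf = synthesize_buffer(spec)
b = Buf()
for x in (1, 2, 3):
    b.put(x)
print(b.get(), b.get())  # 2 3 -- the oldest item, 1, was dropped
```

The human contribution here is the spec, not the implementation, which mirrors the paradigm's division of labor.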

  1. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software

    1997-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. 
Enhanced access mechanisms into on-line documentation are

  2. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P J [VTT Electronics, Oulu (Finland). Embedded Software

    1998-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. 
Enhanced access mechanisms into on-line documentation are

  3. Pattern recognition & machine learning

    CERN Document Server

    Anzai, Y

    1992-01-01

    This is the first text to provide a unified and self-contained introduction to visual pattern recognition and machine learning. It is useful as a general introduction to artificial intelligence and knowledge engineering, and no previous knowledge of pattern recognition or machine learning is necessary. It covers the basics of various pattern recognition and machine learning methods. Translated from Japanese, the book also features chapter exercises, keywords, and summaries.

  4. Training Knowledge Bots for Physics-Based Simulations Using Artificial Neural Networks

    Science.gov (United States)

    Samareh, Jamshid A.; Wong, Jay Ming

    2014-01-01

    Millions of complex physics-based simulations are required for the design of an aerospace vehicle. These simulations are usually performed by highly trained and skilled analysts, who execute, monitor, and steer each simulation. Analysts rely heavily on their broad experience, which may have taken 20-30 years to accumulate. In addition, the simulation software is complex in nature, requiring significant computational resources. Simulations of systems of systems become even more complex and are beyond human capacity to effectively learn their behavior. IBM has developed machines that can learn and compete successfully with a chess grandmaster and with the most successful Jeopardy contestants. These machines are capable of learning some complex problems much faster than humans can. In this paper, we propose using artificial neural networks to train knowledge bots to identify the idiosyncrasies of simulation software and recognize patterns that can lead to successful simulations. We examine the use of knowledge bots for applications of computational fluid dynamics (CFD), trajectory analysis, commercial finite-element analysis software, and slosh propellant dynamics. We will show that machine learning algorithms can be used to learn the idiosyncrasies of computational simulations and identify regions of instability without including any additional information about their mathematical form or applied discretization approaches.
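
The "identify regions of instability" idea reduces, in its simplest form, to training a classifier on labelled simulation runs. The toy below is not NASA's system: the two parameters and the instability rule used to label the data are invented, and a bare perceptron stands in for the paper's neural networks.

```python
# A toy illustration: train a tiny perceptron to flag parameter regions
# where a simulation is likely to go unstable. The "instability" rule used
# to label the synthetic data (param1 + param2 > 1.0) is invented.
import random

random.seed(0)

# Synthetic training data standing in for logged simulation outcomes:
points = [(random.random(), random.random()) for _ in range(200)]
data = [(p, 1 if p[0] + p[1] > 1.0 else 0) for p in points]

w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(50):                      # perceptron training epochs
    for (x1, x2), y in data:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = y - pred                   # -1, 0, or +1
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

def predict(x1, x2):
    """1 = likely unstable region, 0 = likely stable."""
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print(predict(0.9, 0.9), predict(0.1, 0.1))  # 1 0
```

A real knowledge bot would use many more features and a deeper network, but the data-to-classifier flow is the same.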

  5. Translating DVD Subtitles English-German, English-Japanese, Using Example-based Machine Translation

    DEFF Research Database (Denmark)

    Armstrong, Stephen; Caffrey, Colm; Flanagan, Marian

    2006-01-01

    Due to limited budgets and an ever-diminishing time-frame for the production of subtitles for movies released in cinema and DVD, there is a compelling case for a technology-based translation solution for subtitles. In this paper we describe how an Example-Based Machine Translation (EBMT) approach to the translation of English DVD subtitles into German and Japanese can aid the subtitler. Our research focuses on an EBMT tool that produces fully automated translations, which in turn can be edited if required. To our knowledge this is the first time that any EBMT approach has been used with DVD subtitle...

  6. A Review of Virtual Machine Attack Based on Xen

    Directory of Open Access Journals (Sweden)

    Ren xun-yi

    2016-01-01

    Full Text Available Virtualization technology, the foundation of cloud computing, is attracting more and more attention as cloud computing becomes widely used. This paper analyzes threats to virtual machine security and surveys attacks on Xen-based virtual machines in order to anticipate hidden security risks. It can serve as a reference for further research on the security of virtual machines.

  7. Predicting Mycobacterium tuberculosis Complex Clades Using Knowledge-Based Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Minoo Aminian

    2014-01-01

    Full Text Available We develop a novel approach for incorporating expert rules into Bayesian networks for classification of Mycobacterium tuberculosis complex (MTBC) clades. The proposed knowledge-based Bayesian network (KBBN) treats sets of expert rules as prior distributions on the classes. Unlike prior knowledge-based support vector machine approaches, which require rules expressed as polyhedral sets, KBBN directly incorporates the rules without any modification. KBBN uses data to refine rule-based classifiers when the rule set is incomplete or ambiguous. We develop a predictive KBBN model for 69 MTBC clades found in the SITVIT international collection. We validate the approach using two testbeds that model knowledge of the MTBC obtained from two different experts and large DNA fingerprint databases to predict MTBC genetic clades and sublineages. These models represent strains of MTBC using high-throughput biomarkers called spacer oligonucleotide types (spoligotypes), since these are routinely gathered from MTBC isolates of tuberculosis (TB) patients. Results show that incorporating rules can drastically increase classification accuracy if data alone are insufficient. The SITVIT KBBN is publicly available for use on the World Wide Web.
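
The "rules as priors, refined by data" mechanism can be sketched with pseudo-counts. This is an illustrative simplification of the KBBN idea, not the published model; the spacer patterns, clade names, and counts below are invented.

```python
# A minimal sketch of the KBBN idea: expert rules are encoded as prior
# pseudo-counts on the classes, and observed data then refines those
# counts. Features, classes, and numbers are invented, not SITVIT data.
from collections import defaultdict

# Expert rule "pattern 'A' strongly suggests clade1" becomes a prior
# pseudo-count; data can later overturn an incomplete or ambiguous rule.
prior = {("A", "clade1"): 5.0, ("A", "clade2"): 1.0,
         ("B", "clade1"): 1.0, ("B", "clade2"): 5.0}

counts = defaultdict(float, prior)
class_totals = defaultdict(float)
for (feat, cls), c in prior.items():
    class_totals[cls] += c

def observe(feat, cls):
    """Refine the rule-based prior with an observed labelled isolate."""
    counts[(feat, cls)] += 1.0
    class_totals[cls] += 1.0

def classify(feat):
    """Pick the class with the highest likelihood-style score."""
    return max(class_totals, key=lambda c: counts[(feat, c)] / class_totals[c])

print(classify("A"))          # clade1 -- the rules alone favour clade1
for _ in range(30):           # data showing 'A' in clade2 refines the rule
    observe("A", "clade2")
print(classify("A"))          # clade2 -- enough data overrides the prior
```

This captures the abstract's central claim in miniature: with sparse data the rules dominate, and with enough data the classifier is refined away from an incomplete rule.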

  8. Machine Shop Lathes.

    Science.gov (United States)

    Dunn, James

    This guide, the second in a series of five machine shop curriculum manuals, was designed for use in machine shop courses in Oklahoma. The purpose of the manual is to equip students with basic knowledge and skills that will enable them to enter the machine trade at the machine-operator level. The curriculum is designed so that it can be used in…

  9. Exploring the Knowledge Management Index as a Performance Diagnostic Tool

    Directory of Open Access Journals (Sweden)

    Jakov Crnkovic

    2005-04-01

    Full Text Available The knowledge management index (KMI) has been proposed as a parsimonious and useful tool to help organizations gauge their knowledge management (KM) capabilities. This may be the first step in understanding the difference between what an organization is currently doing and what it needs to do in order to maintain and improve its performance level. At the macro level, the index enables organizations to compare themselves with each other. At the micro level, it calls attention to areas needing improvement in current and future KM initiatives. In either case, the KMI provides a robust indicator and basis for business decision-making and organizational support and development. This paper presents a holistic approach to KM that relates key knowledge management processes (KMP) and critical success factors (CSF) needed to successfully implement it. By juxtaposing these processes and success factors, we create Belardo's matrix that will enable us to characterize an organization and estimate the KMI. At the macro level, we used realized KMI values and OP estimates to confirm the positive correlation between the KMI and OP. Additional findings include comparing the current and expected role of KM in organizations and a discussion of marginal values of rows (CSF) and columns (KM processes) of the proposed matrix.
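
The matrix-to-index step can be sketched numerically. The paper's exact scoring is not reproduced here; this is a hedged sketch in which rows are CSFs, columns are KM processes, each cell holds a 1-5 capability rating, and the index is a normalized (optionally weighted) average. All labels and numbers are invented.

```python
# Hypothetical Belardo-style matrix: rows are critical success factors
# (CSF), columns are KM processes (KMP), cells are 1-5 capability ratings.
csf = ["leadership", "culture", "technology"]
kmp = ["create", "share", "apply"]
ratings = [  # ratings[i][j]: how well CSF i supports process j
    [4, 3, 5],
    [2, 3, 2],
    [5, 4, 4],
]

def kmi(matrix, weights=None):
    """Average cell rating, normalized to 0-1; equal weights by default."""
    cells = [v for row in matrix for v in row]
    if weights is None:
        weights = [1.0] * len(cells)
    score = sum(v * w for v, w in zip(cells, weights)) / sum(weights)
    return score / 5.0   # normalize the 1-5 scale

# Marginal row means point to the weakest CSF (the micro-level diagnosis):
row_means = [sum(row) / len(row) for row in ratings]
print(round(kmi(ratings), 3), csf[row_means.index(min(row_means))])
```

Here the overall index supports macro-level comparison between organizations, while the row marginals flag "culture" as the area needing improvement.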

  10. The analysis phase in development of knowledge-based systems

    International Nuclear Information System (INIS)

    Brooking, A.G.

    1986-01-01

    Over the past twenty years computer scientists have realized that, in order to produce reliable software that is easily modifiable, a proven methodology is required. Unlike conventional systems, little is known about the life cycle of these knowledge-based systems. However, if their life cycle resembles that of conventional systems, it is not unreasonable to assume that analysis will come first. With respect to the analysis task there is an enormous difference in the types of analysis. Conventional systems analysis is predominantly concerned with what happens within the system. Typically, procedures will be noted in the way they relate to each other, and the way data moves and changes within the system. There is often an example, on paper or machine, that can be observed

  11. Sensitivity analysis of land unit suitability for conservation using a knowledge-based system.

    Science.gov (United States)

    Humphries, Hope C; Bourgeron, Patrick S; Reynolds, Keith M

    2010-08-01

    The availability of spatially continuous data layers can have a strong impact on selection of land units for conservation purposes. The suitability of ecological conditions for sustaining the targets of conservation is an important consideration in evaluating candidate conservation sites. We constructed two fuzzy logic-based knowledge bases to determine the conservation suitability of land units in the interior Columbia River basin using NetWeaver software in the Ecosystem Management Decision Support application framework. Our objective was to assess the sensitivity of suitability ratings, derived from evaluating the knowledge bases, to fuzzy logic function parameters and to the removal of data layers (land use condition, road density, disturbance regime change index, vegetation change index, land unit size, cover type size, and cover type change index). The amount and geographic distribution of suitable land polygons was most strongly altered by the removal of land use condition, road density, and land polygon size. Removal of land use condition changed suitability primarily on private or intensively-used public land. Removal of either road density or land polygon size most strongly affected suitability on higher-elevation US Forest Service land containing small-area biophysical environments. Data layers with the greatest influence differed in rank between the two knowledge bases. Our results reinforce the importance of including both biophysical and socio-economic attributes to determine the suitability of land units for conservation. The sensitivity tests provided information about knowledge base structuring and parameterization as well as prioritization for future data needs.
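
The evaluation style described, fuzzy membership per data layer, combination across layers, and sensitivity testing by layer removal, can be sketched compactly. This is a hedged illustration of the general approach, not NetWeaver or the study's actual knowledge bases; all thresholds and layer choices are invented.

```python
# A sketch of fuzzy-logic suitability rating over data layers: each layer
# maps to a membership value in [0, 1], layers combine with a fuzzy AND
# (minimum), and a sensitivity test drops one layer and re-evaluates.

def ramp_down(x, full, none):
    """Membership 1 below `full`, 0 above `none`, linear in between."""
    if x <= full:
        return 1.0
    if x >= none:
        return 0.0
    return (none - x) / (none - full)

def suitability(road_density=None, land_use_score=None, patch_km2=None):
    """Fuzzy AND over whichever data layers are available."""
    terms = []
    if road_density is not None:
        terms.append(ramp_down(road_density, 0.5, 3.0))      # km road per km^2
    if land_use_score is not None:
        terms.append(land_use_score)                          # already 0-1
    if patch_km2 is not None:
        terms.append(1.0 - ramp_down(patch_km2, 1.0, 10.0))   # bigger is better
    return min(terms) if terms else 0.0

full_rating = suitability(road_density=2.5, land_use_score=0.9, patch_km2=8.0)
no_roads = suitability(land_use_score=0.9, patch_km2=8.0)
print(round(full_rating, 2), round(no_roads, 2))  # 0.2 0.78
```

Dropping the road-density layer raises this unit's rating sharply, the same kind of effect the sensitivity analysis above measures across whole landscapes when influential layers such as road density or land use condition are removed.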

  12. Spoken language identification based on the enhanced self-adjusting extreme learning machine approach

    Science.gov (United States)

    Tiun, Sabrina; AL-Dhief, Fahad Taha; Sammour, Mahmoud A. M.

    2018-01-01

    Spoken Language Identification (LID) is the process of determining and classifying natural language from a given content and dataset. Typically, data must be processed to extract useful features to perform LID. Feature extraction for LID is, according to the literature, a mature process, and standard features have already been developed using Mel-Frequency Cepstral Coefficients (MFCC), Shifted Delta Cepstral (SDC), the Gaussian Mixture Model (GMM), and ending with the i-vector based framework. However, the process of learning based on the extracted features remains to be improved (i.e. optimised) to capture all embedded knowledge in the extracted features. The Extreme Learning Machine (ELM) is an effective learning model used to perform classification and regression analysis and is extremely useful for training a single hidden layer neural network. Nevertheless, the learning process of this model is not entirely effective (i.e. optimised) due to the random selection of weights within the input hidden layer. In this study, the ELM is selected as a learning model for LID based on standard feature extraction. One of the optimisation approaches of ELM, the Self-Adjusting Extreme Learning Machine (SA-ELM), is selected as the benchmark and improved by altering the selection phase of the optimisation process. The selection process is performed by incorporating both the Split-Ratio and K-Tournament methods; the improved SA-ELM is named the Enhanced Self-Adjusting Extreme Learning Machine (ESA-ELM). The results are generated based on LID with datasets created from eight different languages. The results of the study showed the superior performance of the Enhanced Self-Adjusting Extreme Learning Machine LID (ESA-ELM LID) compared with the SA-ELM LID, with ESA-ELM LID achieving an accuracy of 96.25%, as compared to the accuracy of SA-ELM LID of only 95.00%. PMID:29672546

  13. Spoken language identification based on the enhanced self-adjusting extreme learning machine approach.

    Science.gov (United States)

    Albadr, Musatafa Abbas Abbood; Tiun, Sabrina; Al-Dhief, Fahad Taha; Sammour, Mahmoud A M

    2018-01-01

    Spoken Language Identification (LID) is the process of determining and classifying natural language from a given content and dataset. Typically, data must be processed to extract useful features to perform LID. Feature extraction for LID is, according to the literature, a mature process, and standard features have already been developed using Mel-Frequency Cepstral Coefficients (MFCC), Shifted Delta Cepstral (SDC), the Gaussian Mixture Model (GMM), and ending with the i-vector based framework. However, the process of learning based on the extracted features remains to be improved (i.e. optimised) to capture all embedded knowledge in the extracted features. The Extreme Learning Machine (ELM) is an effective learning model used to perform classification and regression analysis and is extremely useful for training a single hidden layer neural network. Nevertheless, the learning process of this model is not entirely effective (i.e. optimised) due to the random selection of weights within the input hidden layer. In this study, the ELM is selected as a learning model for LID based on standard feature extraction. One of the optimisation approaches of ELM, the Self-Adjusting Extreme Learning Machine (SA-ELM), is selected as the benchmark and improved by altering the selection phase of the optimisation process. The selection process is performed by incorporating both the Split-Ratio and K-Tournament methods; the improved SA-ELM is named the Enhanced Self-Adjusting Extreme Learning Machine (ESA-ELM). The results are generated based on LID with datasets created from eight different languages. The results of the study showed the superior performance of the Enhanced Self-Adjusting Extreme Learning Machine LID (ESA-ELM LID) compared with the SA-ELM LID, with ESA-ELM LID achieving an accuracy of 96.25%, as compared to the accuracy of SA-ELM LID of only 95.00%.
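
The plain ELM that both abstracts build on is small enough to show in full: hidden-layer weights are drawn at random and fixed, and only the output weights are solved in closed form. This is a generic ELM sketch, not the papers' ESA-ELM; their contribution, replacing the random weight selection with an optimized one, is exactly the part this minimal version lacks.

```python
# A minimal Extreme Learning Machine: random fixed hidden weights, output
# weights solved with a pseudo-inverse (least squares) in one step.
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, hidden=40):
    W = rng.normal(size=(X.shape[1], hidden))   # random input weights (fixed)
    b = rng.normal(size=hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                      # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                # closed-form output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Tiny demo: learn XOR, a problem a single linear layer cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
model = elm_train(X, y)
print(np.round(elm_predict(model, X)).astype(int))  # [0 1 1 0]
```

Because `W` and `b` are never trained, ELM performance depends heavily on that random draw, which is the weakness the SA-ELM and ESA-ELM optimisation approaches target.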

  14. MySQL based selection of appropriate indexing technique in ...

    African Journals Online (AJOL)

    This paper deals with the selection of an appropriate indexing technique applied to a MySQL database for a health care system, and the related performance issues, using a multiclass support vector machine (SVM). The patient database is generally huge and contains a lot of variation. For the quick search or fast retrieval of the desired ...

  15. A Qualitative Study of Knowledge Exchange in an Indonesian Machine-Making Company

    Directory of Open Access Journals (Sweden)

    Indria Handoko

    2016-08-01

    Full Text Available In a supply chain, a company's ability to leverage knowledge that resides within the network of contracted and interacting firms can improve not only company performance but also the effectiveness of the supply chain as a whole. However, existing supply chain studies mostly discuss knowledge at the company level, and rarely at internal hierarchical levels. As a result, many things remain concealed, for example, how knowledge exchange between people across levels in a supply chain is influenced by supply chain governance. Moreover, most existing studies focus on a rigid hierarchical supply-chain mechanism, and hardly elaborate on how interactions occur in a less-rigid mechanism. This article attempts to address these gaps, discussing how a supplier company that deals with innovation generation activities acquires knowledge that resides in its supply chain network. A qualitative case study of an Indonesian machine-making company has been chosen to represent one of the supplier types in the automotive industry that deals with a less-rigid mechanism. A social capital perspective is applied to shed light on how interactions between actors in a supply chain network influence knowledge exchange. The study finds a positive relationship between social capital and knowledge exchange across levels and functions to help generate innovations, allowing the company to manage conflicting effect beliefs more effectively. The study also identifies a tendency of the company to regard intensive knowledge exchange as part of the organizational learning process.

  16. Knowledge base mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Suwa, M; Furukawa, K; Makinouchi, A; Mizoguchi, T; Mizoguchi, F; Yamasaki, H

    1982-01-01

    One of the principal goals of the Fifth Generation Computer System Project for the coming decade is to develop a methodology for building knowledge information processing systems which will provide people with intelligent agents. The key notion of the fifth generation computer system is knowledge used for problem solving. In this paper the authors describe the plan of R&D on knowledge base mechanisms. A knowledge representation system is to be designed to support knowledge acquisition for the knowledge information processing systems. The system will include a knowledge representation language, a knowledge base editor and a debugger. It is also expected to perform as a kind of meta-inference system. In order to develop the large scale knowledge base systems, a knowledge base mechanism based on the relational model is to be studied in the earlier stage of the project. Distributed problem solving is also one of the main issues of the project. 19 references.

  17. Improving the performance of DomainDiscovery of protein domain boundary assignment using inter-domain linker index

    Directory of Open Access Journals (Sweden)

    Zomaya Albert Y

    2006-12-01

    Full Text Available Abstract Background Knowledge of protein domain boundaries is critical for the characterisation and understanding of protein function. The ability to identify domains without the knowledge of the structure – by using sequence information only – is an essential step in many types of protein analyses. In this present study, we demonstrate that the performance of DomainDiscovery is improved significantly by including the inter-domain linker index value for domain identification from sequence-based information. Improved DomainDiscovery uses a Support Vector Machine (SVM) approach and a unique training dataset built on the principle of consensus among experts in defining domains in protein structure. The SVM was trained using a PSSM (Position-Specific Scoring Matrix), secondary structure, solvent accessibility information and the inter-domain linker index to detect possible domain boundaries for a target sequence. Results Improved DomainDiscovery is compared with other methods by benchmarking against a structurally non-redundant dataset and also CASP5 targets. Improved DomainDiscovery achieves 70% accuracy for domain boundary identification in multi-domain proteins. Conclusion Improved DomainDiscovery compares favourably to the performance of other methods and excels in the identification of domain boundaries for multi-domain proteins as a result of introducing a support vector machine with the benchmark_2 dataset.

  18. Classification of Strawberry Fruit Shape by Machine Learning

    Science.gov (United States)

    Ishikawa, T.; Hayashi, A.; Nagamatsu, S.; Kyutoku, Y.; Dan, I.; Wada, T.; Oku, K.; Saeki, Y.; Uto, T.; Tanabata, T.; Isobe, S.; Kochi, N.

    2018-05-01

    Shape is one of the most important traits of agricultural products due to its relationships with the quality, quantity, and value of the products. For strawberries, nine types of fruit shape were defined and classified by humans based on sample patterns of the nine types. In this study, we tested the classification of strawberry shapes by machine learning in order to increase the accuracy of the classification, and we introduce the concept of computerization into this field. Four types of descriptors were extracted from the digital images of strawberries: (1) the Measured Values (MVs) including the length of the contour line, the area, the fruit length and width, and the fruit width/length ratio; (2) the Ellipse Similarity Index (ESI); (3) Elliptic Fourier Descriptors (EFDs), and (4) Chain Code Subtraction (CCS). We used these descriptors for the classification test along with the random forest approach, and eight of the nine shape types were classified with combinations of MVs + CCS + EFDs. CCS is a descriptor that adds human knowledge to the chain codes, and it showed higher robustness in classification than the other descriptors. Our results suggest machine learning's high ability to classify fruit shapes accurately. We will attempt to increase the classification accuracy and apply the machine learning methods to other plant species.
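
Two of the descriptor families named above are simple enough to compute directly: Measured Values from a contour polygon, and a Freeman chain code from successive contour pixels. This is an illustrative computation, not the study's pipeline; the sample contour is invented, and CCS, ESI, and EFDs are not reproduced.

```python
# Illustrative shape descriptors: Measured Values via the shoelace formula
# and bounding box, plus an 8-direction Freeman chain code.

def measured_values(contour):
    """Area, fruit length/width, and width/length ratio from (x, y) points."""
    n = len(contour)
    area = abs(sum(contour[i][0] * contour[(i + 1) % n][1]
                   - contour[(i + 1) % n][0] * contour[i][1]
                   for i in range(n))) / 2.0
    xs = [p[0] for p in contour]
    ys = [p[1] for p in contour]
    width, length = max(xs) - min(xs), max(ys) - min(ys)
    return {"area": area, "length": length, "width": width,
            "ratio": width / length}

def chain_code(pixels):
    """Freeman chain code between successive 8-connected contour pixels."""
    dirs = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
            (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}
    return [dirs[(x2 - x1, y2 - y1)]
            for (x1, y1), (x2, y2) in zip(pixels, pixels[1:])]

# A 2x3 rectangular "fruit" outline (taller than wide, like a conic shape):
rect = [(0, 0), (2, 0), (2, 3), (0, 3)]
mv = measured_values(rect)
print(mv["area"], round(mv["ratio"], 2))   # 6.0 0.67
print(chain_code([(0, 0), (1, 0), (2, 0), (2, 1)]))  # [0, 0, 2]
```

Vectors of such descriptors, one row per fruit image, are what a random forest classifier would then be trained on.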

  19. Resident Space Object Characterization and Behavior Understanding via Machine Learning and Ontology-based Bayesian Networks

    Science.gov (United States)

    Furfaro, R.; Linares, R.; Gaylor, D.; Jah, M.; Walls, R.

    2016-09-01

    In this paper, we present an end-to-end approach that employs machine learning techniques and Ontology-based Bayesian Networks (BN) to characterize the behavior of resident space objects. State-of-the-Art machine learning architectures (e.g. Extreme Learning Machines, Convolutional Deep Networks) are trained on physical models to learn the Resident Space Object (RSO) features in the vectorized energy and momentum states and parameters. The mapping from measurements to vectorized energy and momentum states and parameters enables behavior characterization via clustering in the features space and subsequent RSO classification. Additionally, Space Object Behavioral Ontologies (SOBO) are employed to define and capture the domain knowledge-base (KB) and BNs are constructed from the SOBO in a semi-automatic fashion to execute probabilistic reasoning over conclusions drawn from trained classifiers and/or directly from processed data. Such an approach enables integrating machine learning classifiers and probabilistic reasoning to support higher-level decision making for space domain awareness applications. The innovation here is to use these methods (which have enjoyed great success in other domains) in synergy so that it enables a "from data to discovery" paradigm by facilitating the linkage and fusion of large and disparate sources of information via a Big Data Science and Analytics framework.
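
The "behavior characterization via clustering in the features space" step can be sketched with a plain k-means on a toy feature space. This is a hedged illustration only: the two invented features stand in for the vectorized energy and momentum quantities, and nothing of the paper's learned features, ontologies, or Bayesian networks is reproduced.

```python
# Toy clustering of objects in a 2-D feature space: a deterministic
# k-means (first-k initialization) separates two behavior groups.

def kmeans(points, k, iters=20):
    centers = points[:k]                 # deterministic init: first k points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)        # assign to the nearest center
        centers = [tuple(sum(v) / len(cl) for v in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Two well-separated groups of invented feature vectors:
pts = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15), (2.0, 2.1), (2.2, 1.9), (2.1, 2.0)]
centers, clusters = kmeans(pts, 2)
print([len(c) for c in clusters])  # [3, 3]
```

In the paper's pipeline, cluster membership in such a feature space feeds the subsequent RSO classification and the ontology-driven probabilistic reasoning.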

  20. Machine learning systems

    Energy Technology Data Exchange (ETDEWEB)

    Forsyth, R

    1984-05-01

    With the dramatic rise of expert systems has come a renewed interest in the fuel that drives them-knowledge. For it is specialist knowledge which gives expert systems their power. But extracting knowledge from human experts in symbolic form has proved arduous and labour-intensive. So the idea of machine learning is enjoying a renaissance. Machine learning is any automatic improvement in the performance of a computer system over time, as a result of experience. Thus a learning algorithm seeks to do one or more of the following: cover a wider range of problems, deliver more accurate solutions, obtain answers more cheaply, and simplify codified knowledge. 6 references.

  1. The Role of Learning Goals in Building a Knowledge Base for Elementary Mathematics Teacher Education

    Science.gov (United States)

    Jansen, Amanda; Bartell, Tonya; Berk, Dawn

    2009-01-01

    In this article, we describe features of learning goals that enable indexing knowledge for teacher education. Learning goals are the key enabler for building a knowledge base for teacher education; they define what counts as essential knowledge for prospective teachers. We argue that 2 characteristics of learning goals support knowledge-building…

  2. Al-Quran ontology based on knowledge themes | Ta'a | Journal of ...

    African Journals Online (AJOL)

    Islamic knowledge is gathered through understanding the Al-Quran. This requires an ontology which can capture the knowledge and present it in a machine-readable structure. However, current ontology approaches are irrelevant and inaccurate in producing true concepts of Al-Quran knowledge, because they use traditional ...

  3. Knowledge Representation and Ontologies

    Science.gov (United States)

    Grimm, Stephan

    Knowledge representation and reasoning aims at designing computer systems that reason about a machine-interpretable representation of the world. Knowledge-based systems have a computational model of some domain of interest in which symbols serve as surrogates for real world domain artefacts, such as physical objects, events, relationships, etc. [1]. The domain of interest can cover any part of the real world or any hypothetical system about which one desires to represent knowledge for com-putational purposes. A knowledge-based system maintains a knowledge base, which stores the symbols of the computational model in the form of statements about the domain, and it performs reasoning by manipulating these symbols. Applications can base their decisions on answers to domain-relevant questions posed to a knowledge base.
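
The description above, symbols as surrogates for domain artefacts, statements in a knowledge base, reasoning by symbol manipulation, can be made concrete in a few lines. The domain facts, the single transitivity rule, and all names below are invented for illustration.

```python
# A minimal knowledge-based system: statements are (subject, relation,
# object) triples, and reasoning is forward chaining over one rule
# ("is_a is transitive") until no new statements can be derived.

facts = {("pump7", "is_a", "centrifugal_pump"),
         ("centrifugal_pump", "is_a", "rotating_machine")}

def infer(kb):
    """Close the knowledge base under is_a transitivity."""
    kb = set(kb)
    changed = True
    while changed:
        changed = False
        new = {(a, "is_a", c)
               for (a, r1, b1) in kb if r1 == "is_a"
               for (b2, r2, c) in kb if r2 == "is_a" and b1 == b2}
        if not new <= kb:
            kb |= new
            changed = True
    return kb

def ask(kb, query):
    """Answer a domain-relevant question posed to the knowledge base."""
    return query in infer(kb)

print(ask(facts, ("pump7", "is_a", "rotating_machine")))  # True
```

The answer `True` was never stated explicitly; it is derived by manipulating the stored symbols, which is the essence of the reasoning described above.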

  4. Machinability of structural steels with calcium addition

    International Nuclear Information System (INIS)

    Pytel, S.; Zadecki, M.

    2003-01-01

    The machinability of plain carbon and low alloy structural steels with carbon content of 0.1-0.6% is briefly discussed in the first part of the paper. In the experimental part a dependence between the addition of calcium and changes in sulphide and oxide inclusion morphology is presented. The Volvo test for measurement of the machinability index B_i has been applied. Using multiple regression methods, two relationships between the machinability index B_i and the stereological parameters of non-metallic inclusions as well as the hardness of the steels have been calculated. The authors have reached the conclusion that, owing to the changes in inclusion chemical composition and geometry as a result of calcium addition, the machinability index of the steel can be higher. (author)

  5. An Interactive Web-based Learning System for Assisting Machining Technology Education

    Directory of Open Access Journals (Sweden)

    Min Jou

    2008-05-01

    Full Text Available The key technique of manufacturing methods is machining. The degree of machining technique directly affects the quality of the product. Therefore, the machining technique is of primary importance in promoting students' practical ability during the training process. Currently, practical training is carried out on the shop floor to develop students' practical skills, and much time and cost are spent teaching these techniques. In particular, computerized machines are continuously increasing in use, and educating engineers on computerized machines has become much more difficult than with traditional machines because of the extremely expensive cost of teaching. The quality and quantity of teaching cannot always be promoted in this respect, and traditional teaching methods cannot respond well to the needs of the future. Therefore, this research aims at the following topics: (1) propose teaching strategies for students to learn machining process planning through a web-based learning system; (2) establish on-line teaching material for the computer-aided manufacturing courses, including the CNC coding method and CNC simulation; (3) develop a virtual machining laboratory to bring machining practical training to the web-based learning system; (4) integrate multimedia and the virtual laboratory in the developed e-learning web-based system to enhance the effectiveness of machining education.

  6. PDA: A coupling of knowledge and memory for case-based reasoning

    Science.gov (United States)

    Bharwani, S.; Walls, J.; Blevins, E.

    1988-01-01

    Problem solving in most domains requires reference to past knowledge and experience, whether such knowledge is represented as rules, decision trees, networks or any variant of attributed graphs. Regardless of the representational form employed, designers of expert systems rarely make a distinction between the static and dynamic aspects of the system's knowledge base. The current paper clearly distinguishes between knowledge-based and memory-based reasoning: the former, in its purest sense, is characterized by a static knowledge base, resulting in a relatively brittle expert system, while the latter is dynamic and analogous to the functions of human memory, which learns from experience. The paper discusses the design of an advisory system which combines a knowledge base, consisting of domain vocabulary and default dependencies between concepts, with a dynamic conceptual memory which stores experiential knowledge in the form of cases. The case memory organizes past experience in the form of MOPs (memory organization packets) and sub-MOPs. Each MOP consists of a context frame and a set of indices. The context frame contains information about the features (norms) common to all the events and sub-MOPs indexed under it.

  7. Dictionary Based Machine Translation from Kannada to Telugu

    Science.gov (United States)

    Sindhu, D. V.; Sagar, B. M.

    2017-08-01

    Machine translation is the task of translating from one language to another. For languages with few linguistic resources, such as Kannada and Telugu, the dictionary-based approach is the most suitable. This paper focuses on dictionary-based machine translation from Kannada to Telugu. The proposed methodology uses a dictionary to translate word by word without much correlation of semantics between them. The dictionary-based machine translation process has the following sub-processes: morph analyzer, dictionary, transliteration, transfer grammar and morph generator. As part of this work a bilingual dictionary with 8000 entries is developed and a suffix mapping table at the tag level is built. The system is tested on children's stories. In the near future the system can be further improved by defining transfer grammar rules.
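
    The word-by-word pipeline described above (morph analysis into root and suffix, dictionary lookup, suffix mapping, morph generation) can be sketched with a toy translator. The dictionary entries and suffix mappings below are illustrative assumptions, not entries from the paper's 8000-word bilingual dictionary.

```python
# Toy dictionary-based Kannada-to-Telugu translation, word by word.
# All lexical entries here are assumed examples for illustration.

kannada_telugu_dict = {   # root-word bilingual dictionary
    "mane": "illu",       # house
    "huli": "puli",       # tiger
    "kadu": "adavi",      # forest
}
suffix_map = {            # tag-level suffix mapping table
    "alli": "lo",         # locative case marker
    "inda": "nundi",      # ablative case marker
}

def translate_word(word):
    """Morph analysis (root + suffix), dictionary lookup, morph generation."""
    for suffix, target_suffix in suffix_map.items():
        if word.endswith(suffix):
            root = word[: -len(suffix)]
            if root in kannada_telugu_dict:
                return kannada_telugu_dict[root] + target_suffix
    # Fall back to a plain lookup; unknown words would be transliterated
    # in the full pipeline (kept unchanged in this sketch).
    return kannada_telugu_dict.get(word, word)

def translate(sentence):
    return " ".join(translate_word(w) for w in sentence.split())

print(translate("huli kadualli"))  # "puli adavilo"
```

    Note how the sketch mirrors the paper's observation: no semantic correlation between words is modeled, which is exactly what transfer grammar rules would later improve.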

  8. A survey of machine readable data bases

    Science.gov (United States)

    Matlock, P.

    1981-01-01

    Forty-two of the machine readable data bases available to the technologist and researcher in the natural sciences and engineering are described and compared with the data bases and data base services offered by NASA.

  9. Leveraging knowledge engineering and machine learning for microbial bio-manufacturing.

    Science.gov (United States)

    Oyetunde, Tolutola; Bao, Forrest Sheng; Chen, Jiung-Wen; Martin, Hector Garcia; Tang, Yinjie J

    2018-05-03

    Genome scale modeling (GSM) predicts the performance of microbial workhorses and helps identify beneficial gene targets. GSM integrated with intracellular flux dynamics, omics, and thermodynamics has shown remarkable progress in both elucidating complex cellular phenomena and computational strain design (CSD). Nonetheless, these models still show high uncertainty due to a poor understanding of innate pathway regulations, metabolic burdens, and other factors (such as stress tolerance and metabolite channeling). Besides, the engineered hosts may have genetic mutations or non-genetic variations in bioreactor conditions and thus CSD rarely foresees fermentation rate and titer. Metabolic models play an important role in design-build-test-learn cycles for strain improvement, and machine learning (ML) may provide a viable complementary approach for driving strain design and deciphering cellular processes. In order to develop quality ML models, knowledge engineering leverages and standardizes the wealth of information in literature (e.g., genomic/phenomic data, synthetic biology strategies, and bioprocess variables). Data driven frameworks can offer new constraints for mechanistic models to describe cellular regulations, to design pathways, to search gene targets, and to estimate fermentation titer/rate/yield under specified growth conditions (e.g., mixing, nutrients, and O2). This review highlights the scope of information collections, database constructions, and machine learning techniques (such as deep learning and transfer learning), which may facilitate "Learn and Design" for strain development. Copyright © 2018. Published by Elsevier Inc.

  10. Sensorless Suitability Analysis of Hybrid PM Machines for Electric Vehicles

    DEFF Research Database (Denmark)

    Matzen, Torben Nørregaard; Rasmussen, Peter Omand

    2009-01-01

    Electrical machines for traction in electric vehicles are an essential component which attract attention with respect to machine design and control as a part of the emerging renewable industry. For the hybrid electric machine to replace the familiar behaviour of the combustion engine torque......, control seems necessary to implement. For hybrid permanent magnet (PM) machines torque control in an indirect fashion using dq-current control is frequently done. This approach requires knowledge about the machine shaft position which may be obtained sensorless. In this article a method based on accurate...

  11. Machine learning for identifying botnet network traffic

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2013-01-01

    . Due to promise of non-invasive and resilient detection, botnet detection based on network traffic analysis has drawn a special attention of the research community. Furthermore, many authors have turned their attention to the use of machine learning algorithms as the mean of inferring botnet......-related knowledge from the monitored traffic. This paper presents a review of contemporary botnet detection methods that use machine learning as a tool of identifying botnet-related traffic. The main goal of the paper is to provide a comprehensive overview on the field by summarizing current scientific efforts....... The contribution of the paper is three-fold. First, the paper provides a detailed insight on the existing detection methods by investigating which bot-related heuristic were assumed by the detection systems and how different machine learning techniques were adapted in order to capture botnet-related knowledge...

  12. Prediction Model of Machining Failure Trend Based on Large Data Analysis

    Science.gov (United States)

    Li, Jirong

    2017-12-01

    Mechanical machining is highly complex and strongly coupled, with many control factors, so the machining process is prone to failure. To improve the accuracy of fault detection for large mechanical equipment, this research builds a fault trend prediction model for machining based on fault data. Machining fault data are clustered with a genetic-algorithm-based K-means method, and features reflecting the correlation dimension of faults are extracted. The spectral characteristics of abnormal vibration in the machining of complex mechanical parts are analyzed, and features are extracted by multi-component spectral decomposition and Hilbert-based empirical mode decomposition. The decomposition results are used to establish the data base of an intelligent expert system, which is combined with big data analysis to realize fault trend prediction for machining. The simulation results show that the method predicts machining fault trends accurately and judges faults in the machining process reliably, so it has good application value for analysis and fault diagnosis in the machining process.

  13. e-Learning Content Design for Corrective Maintenance of Toshiba BMC 80.5 based on Knowledge Conversion using SECI Method: A Case Study in Aerospace Company

    Science.gov (United States)

    Permata Shabrina, Ayu; Pramuditya Soesanto, Rayinda; Kurniawati, Amelia; Teguh Kurniawan, Mochamad; Andrawina, Luciana

    2018-03-01

    Knowledge is a combination of experience, value, and information, based on intuition, that allows an organization to evaluate and combine new information. In an organization, knowledge is not only attached to documents but also embedded in routine value-creating activities; therefore knowledge is an important asset for the organization. X Corp is a company focused on manufacturing aerospace components. In carrying out the production process, the company is supported by various machines, one of which is the Toshiba BMC 80.5. The machine is used occasionally, and therefore maintenance activity is needed, especially corrective maintenance. Corrective maintenance is done to bring a broken-down machine back into operation. Corrective maintenance is performed by a maintenance operator who is close to retirement. The long-term experience of the maintenance operator needs to be captured by the organization and shared across the maintenance division. E-learning is one type of media that can support and assist knowledge sharing. The purpose of this research is to create e-learning content for best practice in corrective maintenance of the Toshiba BMC 80.5 by extracting knowledge and experience from the operator, based on knowledge conversion using the SECI method. The knowledge sources in this research are a maintenance supervisor and a senior maintenance engineer. From the evaluation of the e-learning content, it is known that the average test score of respondents who use the e-learning increases from 77.5 to 87.5.

  14. Automatic Test Pattern Generator for Fuzzing Based on Finite State Machine

    Directory of Open Access Journals (Sweden)

    Ming-Hung Wang

    2017-01-01

    Full Text Available With the rapid development of the Internet, several emerging technologies are adopted to construct fancy, interactive, and user-friendly websites. Among these technologies, HTML5 is a popular one and is widely used in establishing modern sites. However, the security issues in the new web technologies are also raised and are worthy of investigation. For vulnerability investigation, many previous studies used fuzzing and focused on generation-based approaches to produce test cases for fuzzing; however, these methods require a significant amount of knowledge and mental efforts to develop test patterns for generating test cases. To decrease the entry barrier of conducting fuzzing, in this study, we propose a test pattern generation algorithm based on the concept of finite state machines. We apply graph analysis techniques to extract paths from finite state machines and use these paths to construct test patterns automatically. According to the proposal, fuzzing can be completed through inputting a regular expression corresponding to the test target. To evaluate the performance of our proposal, we conduct an experiment in identifying vulnerabilities of the input attributes in HTML5. According to the results, our approach is not only efficient but also effective for identifying weak validators in HTML5.
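
    The core idea above, extracting paths from a finite state machine and turning each path's transition labels into a test pattern, can be sketched in a few lines. The toy FSM below (an HTML-attribute-like shape with one fuzzing payload) is an illustrative assumption, not the paper's model of HTML5 input attributes.

```python
# Sketch of FSM-based test pattern generation: enumerate every path from
# the start state to a final (empty-transition) state via depth-first
# search, and concatenate the transition labels along each path.

fsm = {  # state -> list of (input_symbol, next_state); assumed toy machine
    "start": [("<tag ", "attr")],
    "attr":  [("attr=", "value")],
    "value": [("'benign'", "end"), ("'<fuzz>'", "end")],
    "end":   [],
}

def extract_paths(fsm, state="start", prefix=()):
    """Yield label sequences for every start-to-final path in the FSM."""
    if not fsm[state]:          # no outgoing transitions: final state
        yield prefix
        return
    for symbol, nxt in fsm[state]:
        yield from extract_paths(fsm, nxt, prefix + (symbol,))

patterns = ["".join(p) for p in extract_paths(fsm)]
print(patterns)
# ["<tag attr='benign'", "<tag attr='<fuzz>'"]
```

    In the paper's workflow the FSM itself would be derived from a regular expression describing the test target; here it is written out by hand to keep the path-extraction step visible.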

  15. A novel hybrid model for air quality index forecasting based on two-phase decomposition technique and modified extreme learning machine.

    Science.gov (United States)

    Wang, Deyun; Wei, Shuai; Luo, Hongyuan; Yue, Chenqiang; Grunder, Olivier

    2017-02-15

    The randomness, non-stationarity and irregularity of air quality index (AQI) series make AQI forecasting difficult. To enhance forecast accuracy, a novel hybrid forecasting model combining two-phase decomposition technique and extreme learning machine (ELM) optimized by differential evolution (DE) algorithm is developed for AQI forecasting in this paper. In phase I, the complementary ensemble empirical mode decomposition (CEEMD) is utilized to decompose the AQI series into a set of intrinsic mode functions (IMFs) with different frequencies; in phase II, in order to further handle the high frequency IMFs which will increase the forecast difficulty, variational mode decomposition (VMD) is employed to decompose the high frequency IMFs into a number of variational modes (VMs). Then, the ELM model optimized by DE algorithm is applied to forecast all the IMFs and VMs. Finally, the forecast value of each high frequency IMF is obtained through adding up the forecast results of all corresponding VMs, and the forecast series of AQI is obtained by aggregating the forecast results of all IMFs. To verify and validate the proposed model, two daily AQI series from July 1, 2014 to June 30, 2016 collected from Beijing and Shanghai located in China are taken as the test cases to conduct the empirical study. The experimental results show that the proposed hybrid model based on two-phase decomposition technique is remarkably superior to all other considered models for its higher forecast accuracy. Copyright © 2016 Elsevier B.V. All rights reserved.
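
    The forecasting workhorse above, the extreme learning machine, is simple enough to sketch: the hidden-layer weights are random and fixed, and only the output weights are solved by least squares. The sketch below uses NumPy on a toy sine "index" series; the paper's CEEMD/VMD decomposition and differential-evolution tuning of the hidden parameters are omitted.

```python
# Minimal extreme learning machine (ELM) sketch: random fixed hidden layer,
# output weights fit in closed form with the pseudo-inverse.

import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden=20):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden-layer outputs
    beta = np.linalg.pinv(H) @ y                  # least-squares output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy usage: one-step-ahead forecasting of a smooth series from 3 lags.
series = np.sin(np.linspace(0, 20, 200))
X = np.column_stack([series[i:-3 + i] for i in range(3)])  # lagged inputs
y = series[3:]
model = elm_train(X, y)
print(float(np.mean((elm_predict(model, X) - y) ** 2)))  # small training MSE
```

    Because training reduces to one linear solve, ELM is cheap enough to be refit for every IMF and VM component, which is what makes the hybrid scheme above practical.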

  16. Intelligent indexing

    International Nuclear Information System (INIS)

    Farkas, J.

    1992-01-01

    In this paper we discuss the relevance of artificial intelligence to the automatic indexing of natural language text. We describe the use of domain-specific semantically-based thesauruses and address the problem of creating adequate knowledge bases for intelligent indexing systems. We also discuss the relevance of the Hilbert space ℓ² to the compact representation of documents and to the definition of the similarity of natural language texts. (author). 17 refs., 2 figs
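
    The ℓ² similarity idea above has a standard concrete form: represent each text as a term-frequency vector (a point in ℓ²) and compare documents by the inner product normalized by the norms, i.e. the cosine of the angle between them. The toy documents below are illustrative, not from the paper.

```python
# Document similarity in l2: term-frequency vectors compared by
# normalized inner product (cosine similarity).

import math
from collections import Counter

def similarity(doc_a, doc_b):
    a = Counter(doc_a.lower().split())
    b = Counter(doc_b.lower().split())
    inner = sum(a[t] * b[t] for t in a)                    # <a, b> in l2
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))       # ||a|| * ||b||
    return inner / norm if norm else 0.0

print(similarity("indexing natural language text",
                 "automatic indexing of natural language text"))
print(similarity("indexing natural language text", "plasmonic switch"))  # 0.0
```

    Whether the paper uses exactly this weighting is not stated in the abstract; the sketch only shows how an inner-product space yields a similarity measure for texts.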

  17. Intelligent indexing

    Energy Technology Data Exchange (ETDEWEB)

    Farkas, J

    1993-12-31

    In this paper we discuss the relevance of artificial intelligence to the automatic indexing of natural language text. We describe the use of domain-specific semantically-based thesauruses and address the problem of creating adequate knowledge bases for intelligent indexing systems. We also discuss the relevance of the Hilbert space ℓ² to the compact representation of documents and to the definition of the similarity of natural language texts. (author). 17 refs., 2 figs.

  18. Robot path planning using expert systems and machine vision

    Science.gov (United States)

    Malone, Denis E.; Friedrich, Werner E.

    1992-02-01

    This paper describes a system developed for the robotic processing of naturally variable products. In order to plan the robot motion path it was necessary to use a sensor system, in this case a machine vision system, to observe the variations occurring in workpieces and interpret them with a knowledge-based expert system. The knowledge base was acquired by carrying out an in-depth study of the product using examination procedures not available in the robotic workplace, and relates the nature of the required path to the information obtainable from the machine vision system. The practical application of this system to the processing of fish fillets is described and used to illustrate the techniques.

  19. Empirical Studies On Machine Learning Based Text Classification Algorithms

    OpenAIRE

    Shweta C. Dharmadhikari; Maya Ingle; Parag Kulkarni

    2011-01-01

    Automatic classification of text documents has become an important research issue nowadays. Proper classification of text documents requires information retrieval, machine learning and natural language processing (NLP) techniques. Our aim is to focus on important approaches to automatic text classification based on machine learning techniques, viz. supervised, unsupervised and semi-supervised. In this paper we present a review of various text classification approaches under the machine learning paradig...

  20. Quality prediction modeling for sintered ores based on mechanism models of sintering and extreme learning machine based error compensation

    Science.gov (United States)

    Tiebin, Wu; Yunlian, Liu; Xinjun, Li; Yi, Yu; Bin, Zhang

    2018-06-01

    Aiming at the difficulty in quality prediction of sintered ores, a hybrid prediction model is established based on mechanism models of sintering and time-weighted error compensation on the basis of the extreme learning machine (ELM). At first, mechanism models of drum index, total iron, and alkalinity are constructed according to the chemical reaction mechanism and conservation of matter in the sintering process. As the process is simplified in the mechanism models, these models are not able to describe high nonlinearity. Therefore, errors are inevitable. For this reason, the time-weighted ELM based error compensation model is established. Simulation results verify that the hybrid model has a high accuracy and can meet the requirement for industrial applications.

  1. A nanoplasmonic switch based on molecular machines

    KAUST Repository

    Zheng, Yue Bing

    2009-06-01

    We aim to develop a molecular-machine-driven nanoplasmonic switch for its use in future nanophotonic integrated circuits (ICs) that have applications in optical communication, information processing, biological and chemical sensing. Experimental data show that an Au nanodisk array, coated with rotaxane molecular machines, switches its localized surface plasmon resonances (LSPR) reversibly when it is exposed to chemical oxidants and reductants. Conversely, bare Au nanodisks and disks coated with mechanically inert control compounds, do not display the same switching behavior. Along with calculations based on time-dependent density functional theory (TDDFT), these observations suggest that the nanoscale movements within surface-bound "molecular machines" can be used as the active components in plasmonic devices. ©2009 IEEE.

  2. Research on machine learning framework based on random forest algorithm

    Science.gov (United States)

    Ren, Qiong; Cheng, Hui; Han, Hai

    2017-03-01

    With the continuous development of machine learning, industry and academia have released many machine learning frameworks based on distributed computing platforms, and they have been widely used. However, existing machine learning frameworks are limited by the machine learning algorithms themselves, for example by the choice of parameters, the interference of noise, and a high barrier to use. This paper introduces the research background of machine learning frameworks and, in combination with the random forest algorithm commonly used for classification in machine learning, puts forward the research objectives and content, proposes an improved adaptive random forest algorithm (referred to as ARF), and, on the basis of ARF, designs and implements a machine learning framework.
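
    The framework above builds on the standard random forest classifier. As a minimal baseline, the sketch below uses scikit-learn's `RandomForestClassifier` on a built-in dataset; the paper's adaptive ARF variant is not publicly packaged, so the plain classifier stands in for it here.

```python
# Baseline random forest classification with scikit-learn, as a stand-in
# for the adaptive ARF algorithm described in the abstract.

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_tr, y_tr)
print(forest.score(X_te, y_te))  # held-out accuracy, typically above 0.9
```

    The parameter sensitivity the abstract complains about shows up here as `n_estimators` and the tree-depth defaults; an adaptive variant would tune such choices automatically.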

  3. Role of Knowledge Based Communities in Knowledge Process

    Directory of Open Access Journals (Sweden)

    Sebastian Ion CEPTUREANU

    2015-12-01

    Full Text Available In the new economy, knowledge is an essential component of economic and social systems. The organizational focus has to be on building knowledge-based management, development of human resources and building intellectual capital capabilities. Knowledge-based management is defined, at company level, by economic processes that emphasize creation, selling, buying, learning, storing, developing, sharing and protection of knowledge as a decisive condition for profit and long-term sustainability of the company. Hence, knowledge is, concurrently, according to a majority of specialists, raw material, capital, product and an essential input. Knowledge-based communities are one of the main constituent elements of a framework for knowledge-based management. These are peer networks consisting of practitioners within an organization, supporting each other to perform better through the exchange and sharing of knowledge. Some large companies have contributed to or supported the establishment of numerous communities of practice, some of which may have several thousand members. They operate in different ways, are of different sizes, have different areas of interest and address knowledge at different levels of its maturity. This article examines the role of knowledge-based communities from the perspective of knowledge-based management, given that arrangements for organizational learning and for creating, sharing and using knowledge within organizations are becoming more heterogeneous and taking forms that are more difficult for managers and specialists to predict.

  4. Evaluation of Machine Learning and Rules-Based Approaches for Predicting Antimicrobial Resistance Profiles in Gram-negative Bacilli from Whole Genome Sequence Data.

    Science.gov (United States)

    Pesesky, Mitchell W; Hussain, Tahir; Wallace, Meghan; Patel, Sanket; Andleeb, Saadia; Burnham, Carey-Ann D; Dantas, Gautam

    2016-01-01

    The time-to-result for culture-based microorganism recovery and phenotypic antimicrobial susceptibility testing necessitates initial use of empiric (frequently broad-spectrum) antimicrobial therapy. If the empiric therapy is not optimal, this can lead to adverse patient outcomes and contribute to increasing antibiotic resistance in pathogens. New, more rapid technologies are emerging to meet this need. Many of these are based on identifying resistance genes, rather than directly assaying resistance phenotypes, and thus require interpretation to translate the genotype into treatment recommendations. These interpretations, like other parts of clinical diagnostic workflows, are likely to be increasingly automated in the future. We set out to evaluate the two major approaches that could be amenable to automation pipelines: rules-based methods and machine learning methods. The rules-based algorithm makes predictions based upon current, curated knowledge of Enterobacteriaceae resistance genes. The machine-learning algorithm predicts resistance and susceptibility based on a model built from a training set of variably resistant isolates. As our test set, we used whole genome sequence data from 78 clinical Enterobacteriaceae isolates, previously identified to represent a variety of phenotypes, from fully-susceptible to pan-resistant strains for the antibiotics tested. We tested three antibiotic resistance determinant databases for their utility in identifying the complete resistome for each isolate. The predictions of the rules-based and machine learning algorithms for these isolates were compared to results of phenotype-based diagnostics. The rules-based and machine-learning predictions achieved agreement with standard-of-care phenotypic diagnostics of 89.0% and 90.3%, respectively, across twelve antibiotic agents from six major antibiotic classes. Several sources of disagreement between the algorithms were identified. Novel variants of known resistance factors and
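
    The rules-based arm described above amounts to mapping detected resistance genes through a curated rule table to predicted phenotypes. The sketch below shows that shape; the gene names and rules are simplified illustrative assumptions, not the study's curated Enterobacteriaceae knowledge base.

```python
# Sketch of genotype-to-phenotype interpretation by curated rules:
# each detected resistance gene contributes a set of antibiotics
# predicted resistant ("R"); everything else is reported susceptible ("S").

RULES = {  # assumed, simplified rule table: gene -> resistant antibiotics
    "blaKPC":   {"meropenem", "ampicillin", "cefepime"},
    "blaTEM-1": {"ampicillin"},
    "tetA":     {"tetracycline"},
}

def predict_profile(detected_genes, antibiotics):
    resistant = set()
    for gene in detected_genes:
        resistant |= RULES.get(gene, set())
    return {ab: ("R" if ab in resistant else "S") for ab in antibiotics}

profile = predict_profile(
    {"blaTEM-1", "tetA"},
    ["ampicillin", "meropenem", "tetracycline"],
)
print(profile)  # {'ampicillin': 'R', 'meropenem': 'S', 'tetracycline': 'R'}
```

    The study's disagreements arise exactly where this scheme is weakest: novel variants of known genes fall outside the rule table, while a trained model may still generalize to them.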

  5. Evaluation of Machine Learning and Rules-Based Approaches for Predicting Antimicrobial Resistance Profiles in Gram-negative Bacilli from Whole Genome Sequence Data

    Directory of Open Access Journals (Sweden)

    Mitchell Pesesky

    2016-11-01

    Full Text Available The time-to-result for culture-based microorganism recovery and phenotypic antimicrobial susceptibility testing necessitates initial use of empiric (frequently broad-spectrum) antimicrobial therapy. If the empiric therapy is not optimal, this can lead to adverse patient outcomes and contribute to increasing antibiotic resistance in pathogens. New, more rapid technologies are emerging to meet this need. Many of these are based on identifying resistance genes, rather than directly assaying resistance phenotypes, and thus require interpretation to translate the genotype into treatment recommendations. These interpretations, like other parts of clinical diagnostic workflows, are likely to be increasingly automated in the future. We set out to evaluate the two major approaches that could be amenable to automation pipelines: rules-based methods and machine learning methods. The rules-based algorithm makes predictions based upon current, curated knowledge of Enterobacteriaceae resistance genes. The machine-learning algorithm predicts resistance and susceptibility based on a model built from a training set of variably resistant isolates. As our test set, we used whole genome sequence data from 78 clinical Enterobacteriaceae isolates, previously identified to represent a variety of phenotypes, from fully-susceptible to pan-resistant strains for the antibiotics tested. We tested three antibiotic resistance determinant databases for their utility in identifying the complete resistome for each isolate. The predictions of the rules-based and machine learning algorithms for these isolates were compared to results of phenotype-based diagnostics. The rules-based and machine-learning predictions achieved agreement with standard-of-care phenotypic diagnostics of 89.0% and 90.3%, respectively, across twelve antibiotic agents from six major antibiotic classes. Several sources of disagreement between the algorithms were identified. 
Novel variants of known resistance

  6. BEBP: An Poisoning Method Against Machine Learning Based IDSs

    OpenAIRE

    Li, Pan; Liu, Qiang; Zhao, Wentao; Wang, Dongxu; Wang, Siqi

    2018-01-01

    In the big data era, machine learning is one of the fundamental techniques in intrusion detection systems (IDSs). However, practical IDSs generally update their decision module by feeding in new data and then retraining the learning models periodically. Hence, attacks that compromise the data used for training or testing classifiers significantly challenge the detecting capability of machine learning-based IDSs. Poisoning attack, which is one of the most recognized security threats towards machine learning...

  7. A Wavelet Kernel-Based Primal Twin Support Vector Machine for Economic Development Prediction

    Directory of Open Access Journals (Sweden)

    Fang Su

    2013-01-01

    Full Text Available Economic development forecasting allows planners to choose the right strategies for the future. This study proposes an economic development prediction method based on the wavelet kernel-based primal twin support vector machine algorithm. As gross domestic product (GDP) is an important indicator to measure economic development, economic development prediction means GDP prediction in this study. The wavelet kernel-based primal twin support vector machine algorithm can solve two smaller sized quadratic programming problems instead of solving a large one as in the traditional support vector machine algorithm. Economic development data of Anhui province from 1992 to 2009 are used to study the prediction performance of the wavelet kernel-based primal twin support vector machine algorithm. The comparison of the mean error of economic development prediction between the wavelet kernel-based primal twin support vector machine and traditional support vector machine models, trained on the training samples with 3–5 dimensional input vectors respectively, is given in this paper. The testing results show that the economic development prediction accuracy of the wavelet kernel-based primal twin support vector machine model is better than that of the traditional support vector machine.
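
    The wavelet-kernel idea can be sketched with the common Morlet form K(x, z) = prod_i cos(1.75 (x_i - z_i)/a) exp(-((x_i - z_i)/a)^2 / 2), plugged into scikit-learn's SVR as a custom kernel. Using the standard (non-twin) SVM solver here is an assumption for illustration; the paper's primal twin formulation is not reproduced.

```python
# A Morlet wavelet kernel used as a custom (callable) kernel in SVR.
# The toy 1-D regression target stands in for the GDP series.

import numpy as np
from sklearn.svm import SVR

def wavelet_kernel(X, Z, a=1.0):
    # Pairwise scaled differences, shape (n_X, n_Z, n_features).
    diff = (X[:, None, :] - Z[None, :, :]) / a
    # Product over features of the 1-D Morlet wavelet kernel.
    return np.prod(np.cos(1.75 * diff) * np.exp(-diff ** 2 / 2), axis=2)

X = np.linspace(-3, 3, 60).reshape(-1, 1)
y = np.sin(X).ravel()

model = SVR(kernel=wavelet_kernel, C=10.0)
model.fit(X, y)
print(float(np.mean((model.predict(X) - y) ** 2)))  # small training error
```

    The Morlet kernel is admissible (its Fourier transform is nonnegative), so it can replace the RBF kernel directly; the twin-SVM speedup in the paper comes from the solver, not the kernel.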

  8. Online transfer learning with extreme learning machine

    Science.gov (United States)

    Yin, Haibo; Yang, Yun-an

    2017-05-01

    In this paper, we propose a new transfer learning algorithm for online training. The proposed algorithm, which is called Online Transfer Extreme Learning Machine (OTELM), is based on Online Sequential Extreme Learning Machine (OSELM) while it introduces Semi-Supervised Extreme Learning Machine (SSELM) to transfer knowledge from the source to the target domain. With the manifold regularization, SSELM picks out instances from the source domain that are less relevant to those in the target domain to initialize the online training, so as to improve the classification performance. Experimental results demonstrate that the proposed OTELM can effectively use instances in the source domain to enhance the learning performance.
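
    The OSELM backbone referred to above updates the output weights chunk by chunk with recursive least squares while the random hidden layer stays fixed. The sketch below verifies the key property: after an online update, the weights match batch least squares over all data seen so far. The toy data and network size are illustrative assumptions, and the semi-supervised transfer step of OTELM is not reproduced.

```python
# Online sequential ELM (OSELM) sketch: initial batch solve, then a
# recursive least-squares update for a new chunk of data.

import numpy as np

rng = np.random.default_rng(1)
n_hidden = 10
W = rng.normal(size=(2, n_hidden))   # fixed random input weights
b = rng.normal(size=n_hidden)        # fixed random biases

def hidden(X):
    return np.tanh(X @ W + b)

def make_chunk(n):
    X = rng.normal(size=(n, 2))
    return X, X[:, 0] - X[:, 1]      # toy target

# Initial batch: ordinary least squares on the hidden-layer outputs.
X0, y0 = make_chunk(30)
H0 = hidden(X0)
P = np.linalg.inv(H0.T @ H0)
beta = P @ H0.T @ y0

# Online chunk: rank-k recursive update, no retraining from scratch.
X1, y1 = make_chunk(20)
H1 = hidden(X1)
K = np.linalg.inv(np.eye(len(X1)) + H1 @ P @ H1.T)
P = P - P @ H1.T @ K @ H1 @ P
beta = beta + P @ H1.T @ (y1 - H1 @ beta)

# The online solution equals batch least squares over all data so far.
beta_batch = np.linalg.lstsq(np.vstack([H0, H1]),
                             np.concatenate([y0, y1]), rcond=None)[0]
print(np.allclose(beta, beta_batch))  # True
```

    This exactness is what makes the online setting attractive for transfer: source-domain instances can seed the initial solve, and target-domain chunks refine it incrementally.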

  9. Technical Note: Defining cyclotron-based clinical scanning proton machines in a FLUKA Monte Carlo system.

    Science.gov (United States)

    Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank

    2018-02-01

    Cyclotron-based pencil beam scanning (PBS) proton machines represent nowadays the majority and most affordable choice for proton therapy facilities, however, their representation in Monte Carlo (MC) codes is more complex than passively scattered proton system- or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum energy to the desired energy, resulting in a unique spot size, divergence, and energy spread depending on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using a cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the MC FLUKA using the experimental commissioning data. The code was then validated using experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water plans show that same doses are calculated by both programs inside the target areas, while penumbrae differences are found at the field edges. These differences are lower for the MC, with a γ(3%-3 mm) index never below 95%. Extensive explanations on how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with experimental data are lower for the MC than TPS, implying that the created FLUKA beam model is better able to describe the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists

  10. e-Learning Application for Machine Maintenance Process using Iterative Method in XYZ Company

    Science.gov (United States)

    Nurunisa, Suaidah; Kurniawati, Amelia; Pramuditya Soesanto, Rayinda; Yunan Kurnia Septo Hediyanto, Umar

    2016-02-01

    XYZ Company manufactures parts for airplanes; one of the machines categorized as a key facility in the company is the Millac 5H6P. As a key facility, the machine must be kept in good working order and peak condition, so periodic maintenance is needed. Data gathering showed a lack of competency among maintenance staff when maintaining machine types not assigned to them by the supervisor, indicating that the knowledge possessed by the maintenance staff is uneven. The purpose of this research is to create a knowledge-based e-learning application as a realization of the externalization step of the knowledge transfer process for machine maintenance. The application features are tailored to maintenance purposes using an e-learning framework for the maintenance process, and the content supports multimedia for learning purposes. QFD is used in this research to understand user needs. The application is built on Moodle, using an iterative method for the software development cycle and UML diagrams. The result of this research is an e-learning application serving as a knowledge-sharing medium for maintenance staff in the company. Testing showed that the application makes it easy for maintenance staff to understand the required competencies.

  11. Machine medical ethics

    CERN Document Server

    Pontier, Matthijs

    2015-01-01

    The essays in this book, written by researchers from both humanities and sciences, describe various theoretical and experimental approaches to adding medical ethics to a machine in medical settings. Medical machines are in close proximity with human beings, and getting closer: with patients who are in vulnerable states of health, who have disabilities of various kinds, with the very young or very old, and with medical professionals. In such contexts, machines are undertaking important medical tasks that require emotional sensitivity, knowledge of medical codes, human dignity, and privacy. As machine technology advances, ethical concerns become more urgent: should medical machines be programmed to follow a code of medical ethics? What theory or theories should constrain medical machine conduct? What design features are required? Should machines share responsibility with humans for the ethical consequences of medical actions? How ought clinical relationships involving machines to be modeled? Is a capacity for e...

  12. Machine Learning Methods to Predict Diabetes Complications.

    Science.gov (United States)

    Dagliati, Arianna; Marini, Simone; Sacchi, Lucia; Cogni, Giulia; Teliti, Marsida; Tibollo, Valentina; De Cata, Pasquale; Chiovato, Luca; Bellazzi, Riccardo

    2018-03-01

    One of the areas where Artificial Intelligence is having the most impact is machine learning, which develops algorithms able to learn patterns and decision rules from data. Machine learning algorithms have been embedded into data mining pipelines, which can combine them with classical statistical strategies to extract knowledge from data. Within the EU-funded MOSAIC project, a data mining pipeline has been used to derive a set of predictive models of type 2 diabetes mellitus (T2DM) complications based on electronic health record data of nearly one thousand patients. The pipeline comprises clinical center profiling, predictive model targeting, predictive model construction, and model validation. After dealing with missing data by means of random forest (RF) and applying suitable strategies to handle class imbalance, we used logistic regression with stepwise feature selection to predict the onset of retinopathy, neuropathy, or nephropathy at different time scenarios: at 3, 5, and 7 years from the first visit at the Hospital Center for Diabetes (not from the diagnosis). The considered variables are gender, age, time from diagnosis, body mass index (BMI), glycated hemoglobin (HbA1c), hypertension, and smoking habit. The final models, tailored to each complication, provided an accuracy of up to 0.838. Different variables were selected for each complication and time scenario, leading to specialized models that are easy to translate into clinical practice.
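    As a rough sketch of such a pipeline (not the MOSAIC implementation), scikit-learn's SequentialFeatureSelector can stand in for classical stepwise selection, with class_weight="balanced" as a simple stand-in for the imbalance-handling strategies; the seven features here are synthetic placeholders for the clinical variables:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic, imbalanced stand-in for the clinical data (7 candidate features)
X, y = make_classification(n_samples=1000, n_features=7, n_informative=4,
                           weights=[0.8, 0.2], random_state=0)

base = LogisticRegression(class_weight="balanced", max_iter=1000)
model = make_pipeline(
    StandardScaler(),
    # Forward stepwise selection of 4 of the 7 candidate features
    SequentialFeatureSelector(base, n_features_to_select=4,
                              direction="forward", cv=5),
    base,
)
acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
```

    The nested cross-validation keeps feature selection inside each fold, which avoids the optimistic bias of selecting features on the full data set first.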

  13. Automated valve fault detection based on acoustic emission parameters and support vector machine

    Directory of Open Access Journals (Sweden)

    Salah M. Ali

    2018-03-01

    Full Text Available Reciprocating compressors are one of the most used types of compressors, with wide applications in industry. The most common failures in reciprocating compressors are related to the valves. Therefore, a reliable condition monitoring method is required to avoid unplanned shutdowns in this category of machines. The acoustic emission (AE) technique is one of the most effective recent methods in the field of valve condition monitoring. However, a major challenge is the analysis of the AE signal, which typically depends on the experience and knowledge of technicians. This paper proposes an automated fault detection method using a support vector machine (SVM) and AE parameters in an attempt to reduce human intervention in the process. Experiments were conducted on a single-stage reciprocating air compressor, combining healthy and faulty valve conditions to acquire the AE signals. Valve functioning was identified through AE waveform analysis. An SVM fault detection model was subsequently devised and validated using training and testing samples, respectively. The results demonstrated an automatic valve fault detection model with accuracy exceeding 98%. It is believed that valve faults can be detected efficiently without human intervention by employing the proposed model for a single-stage reciprocating compressor. Keywords: Condition monitoring, Faults detection, Signal analysis, Acoustic emission, Support vector machine
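    A hedged sketch of the SVM-on-AE-parameters idea: the three features (RMS, peak amplitude, AE counts) and their distributions below are hypothetical stand-ins for parameters extracted from real AE signals, not the paper's data:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 300
# Hypothetical AE parameters per signal: [RMS, peak amplitude, AE counts]
healthy = np.column_stack([rng.normal(0.2, 0.05, n),
                           rng.normal(1.0, 0.2, n),
                           rng.normal(50, 10, n)])
faulty = np.column_stack([rng.normal(0.5, 0.1, n),
                          rng.normal(2.5, 0.4, n),
                          rng.normal(120, 20, n)])
X = np.vstack([healthy, faulty])
y = np.r_[np.zeros(n), np.ones(n)]   # 0 = healthy valve, 1 = faulty valve

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
# Scaling matters for RBF SVMs because AE features live on very different scales
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

    The held-out split mirrors the paper's train/test validation; with real AE data the feature extraction step (RMS, counts, energy, etc. from the raw waveform) would precede this classifier.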

  14. Pursuing Innovation: Benchmarking Milwaukee's Transition to a Knowledge-Based Economy. Metro Milwaukee Innovation Index 2010

    Science.gov (United States)

    Million, Laura; Dickman, Anneliese; Henken, Rob

    2010-01-01

    While the Milwaukee region's economic base is rooted in its manufacturing history, many believe that the region's future prosperity will be tied to its ability to successfully transition its economy into one that is based on knowledge and innovation. Indeed, fostering innovation has become the call to action for business and political leaders…

  15. KNOWLEDGE SOCIETY, GENERAL FRAMEWORK FOR KNOWLEDGE BASED ECONOMY

    Directory of Open Access Journals (Sweden)

    Dragos CRISTEA

    2011-03-01

    Full Text Available This paper presents the relation between the knowledge society and the knowledge-based economy. We identify the main pillars of the knowledge society and present their importance for the development of knowledge societies. Further, we present two perspectives on knowledge societies, the science perspective and the learning perspective, which directly affect knowledge-based economies. We conclude by identifying some important questions that must be answered regarding this new social paradigm.

  16. Knowledge-based utility

    International Nuclear Information System (INIS)

    Chwalowski, M.

    1997-01-01

    This presentation provides industry examples of successful marketing practices by companies facing deregulation and competition. The common thread through the examples is that the long-term survival of today's utility structure depends on the strategic role of knowledge. As opposed to regulated monopolies, which usually own huge physical assets and have very little intelligence about their customers, unregulated enterprises tend to be knowledge-based, characterized by a market value higher than their book value. A knowledge-based enterprise gathers data, creates information, and develops knowledge, leveraging it as a competitive weapon. It institutionalizes human knowledge as a corporate asset for repeated use through databases, computer networks, patents, billing, collection and customer services (BCCS), branded interfaces, and management capabilities. Activities for becoming knowledge-based were reviewed, such as replacing inventory and fixed assets with information about material usage to reduce expenditure and achieve more efficient operations, and focusing on integration and value-adding delivery capabilities.

  17. Adaptive Training and Collective Decision Support Based on Man-Machine Interface

    Science.gov (United States)

    2016-03-02

    Final Report: Adaptive Training and Collective Decision Support Based on Man-Machine Interface (U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211). Keywords: adaptive training, EEG, man-machine interface. The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an...

  18. Machine-based mapping of innovation portfolios

    NARCIS (Netherlands)

    de Visser, Matthias; Miao, Shengfa; Englebienne, Gwenn; Sools, Anna Maria; Visscher, Klaasjan

    2017-01-01

    Machine learning techniques show a great promise for improving innovation portfolio management. In this paper we experiment with different methods to classify innovation projects of a high-tech firm as either explorative or exploitative, and compare the results with a manual, theory-based mapping of

  19. Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform

    International Nuclear Information System (INIS)

    Brau-Avila, A; Valenzuela-Galvan, M; Herrera-Jimenez, V M; Santolaria, J; Aguilar, J J; Acero, R

    2017-01-01

    The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs. (paper)

  20. Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform

    Science.gov (United States)

    Brau-Avila, A.; Santolaria, J.; Acero, R.; Valenzuela-Galvan, M.; Herrera-Jimenez, V. M.; Aguilar, J. J.

    2017-03-01

    The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs.

  1. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operation procedures on the man-machine interfaces of a control room, provides a quantified assessment, and at the same time analyzes operator error rates by means of human error rate prediction techniques. Problems in the placement of man-machine interfaces and the arrangement of instruments in a control room can be detected from the simulation results. DIAS can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant

  2. New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases

    Science.gov (United States)

    Brescia, Massimo

    2012-11-01

    Data mining, or Knowledge Discovery in Databases (KDD), while being the main methodology for extracting the scientific information contained in Massive Data Sets (MDS), needs to tackle crucial problems, since it has to orchestrate complex challenges posed by transparent access to different computing environments, scalability of algorithms, and reusability of resources. To achieve a leap forward for the progress of e-science in the data avalanche era, the community needs to implement an infrastructure capable of performing data access, processing, and mining in a distributed but integrated context. The increasing complexity of modern technologies has led to a huge production of data, whose warehouse management, together with the need to optimize analysis and mining procedures, is changing the very concept of modern science. Classical data exploration, based on users' own local data storage and limited computing infrastructures, is no longer efficient in the case of MDS, spread worldwide over inhomogeneous data centres and requiring teraflop processing power. In this context, modern experimental and observational science requires a good understanding of computer science, network infrastructures, Data Mining, etc., i.e. of all those techniques which fall into the domain of so-called e-science (recently assessed also by the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object-oriented programming, distributed computing, and parallel programming need to become an essential part of the scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools based on Web 2.0 technologies and Machine Learning methodology: tools where almost all the complexity is hidden from the final user, but which are still flexible and able to

  3. Prototype-based models in machine learning

    NARCIS (Netherlands)

    Biehl, Michael; Hammer, Barbara; Villmann, Thomas

    2016-01-01

    An overview is given of prototype-based models in machine learning. In this framework, observations, i.e., data, are stored in terms of typical representatives. Together with a suitable measure of similarity, the systems can be employed in the context of unsupervised and supervised analysis of

  4. Induction machine bearing faults detection based on a multi-dimensional MUSIC algorithm and maximum likelihood estimation.

    Science.gov (United States)

    Elbouchikhi, Elhoussin; Choqueuse, Vincent; Benbouzid, Mohamed

    2016-07-01

    Condition monitoring of electric drives is of paramount importance, since it contributes to enhancing system reliability and availability. Moreover, knowledge about fault mode behavior is extremely important for improving system protection and fault-tolerant control. Fault detection and diagnosis in squirrel cage induction machines based on motor current signature analysis (MCSA) has been widely investigated. Several high-resolution spectral estimation techniques have been developed and used to detect induction machine abnormal operating conditions. This paper focuses on the application of MCSA to the detection of abnormal mechanical conditions that may lead to induction machine failure. In particular, it is devoted to the detection of single-point defects in bearings based on parametric spectral estimation. A multi-dimensional MUSIC (MD MUSIC) algorithm has been developed for bearing fault detection based on the bearing fault characteristic frequencies. This method has been used to estimate the fundamental frequency and the fault-related frequency. Then, an amplitude estimator for the fault characteristic frequencies has been proposed, and a fault indicator has been derived for fault severity measurement. The proposed bearing fault detection approach is assessed using simulated stator current data, obtained from a coupled electromagnetic circuits approach with air-gap eccentricity emulating bearing faults. Then, experimental data are used for validation purposes. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  5. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion, and the interpretation of such expressions is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognizing a specific expression, pain, from human faces. We employ an automatic face detector that detects faces in stored video frames using a skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural-network-based and eigenimage-based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.

  6. Machine Learning for Security

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Applied statistics, aka ‘Machine Learning’, offers a wealth of techniques for answering security questions. It’s a much hyped topic in the big data world, with many companies now providing machine learning as a service. This talk will demystify these techniques, explain the math, and demonstrate their application to security problems. The presentation will include how-to’s on classifying malware, looking into encrypted tunnels, and finding botnets in DNS data. About the speaker Josiah is a security researcher with HP TippingPoint DVLabs Research Group. He has over 15 years of professional software development experience. Josiah used to do AI, with work focused on graph theory, search, and deductive inference on large knowledge bases. As rules only get you so far, he moved from AI to using machine learning techniques identifying failure modes in email traffic. There followed digressions into clustered data storage and later integrated control systems. Current ...

  7. Towards Measuring the Abstractness of State Machines based on Mutation Testing

    Directory of Open Access Journals (Sweden)

    Thomas Baar

    2017-01-01

    Full Text Available Abstract. The notation of state machines is widely adopted as a formalism to describe the behaviour of systems. Usually, multiple state machine models can be developed for the very same software system. Some of these models might turn out to be equivalent, but, in many cases, different state machines describing the same system also differ in their level of abstraction. In this paper, we present an approach to actually measure the abstractness level of state machines w.r.t. a given implemented software system. A state machine is considered to be less abstract when it is conceptually closer to the implemented system. In our approach, this distance between state machine and implementation is measured by applying coverage criteria known from software mutation testing. Abstractness of state machines can be considered a new metric. As with other metrics, a known value for the abstractness of a given state machine allows one to assess its quality in terms of a simple number. In model-based software development projects, the abstractness metric can help to prevent model degradation, since it can actually measure the semantic distance from the behavioural specification of a system in the form of a state machine to the current implementation of the system. In contrast to other metrics for state machines, abstractness cannot be statically computed based on the state machine's structure, but requires executing both the state machine and the corresponding system implementation. The article is published in the author's wording.
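    The measuring idea, i.e. running mutants of the implementation against the state machine and counting how many the model distinguishes, can be made concrete with a toy turnstile example; the model, the mutation operator, and the kill criterion here are simplified assumptions, not the authors' tooling:

```python
import itertools

# Hypothetical state-machine model of a turnstile: (state, event) -> next state
MODEL = {
    ("locked", "coin"): "unlocked",
    ("locked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
    ("unlocked", "push"): "locked",
}

def make_impl(table):
    """Wrap a transition table as an executable implementation."""
    def step(state, event):
        return table[(state, event)]
    return step

def mutants(table):
    """First-order mutants: redirect one transition at a time."""
    for key, target in table.items():
        for other in ("locked", "unlocked"):
            if other != target:
                m = dict(table)
                m[key] = other
                yield m

def killed(model, impl, length=4):
    """A mutant is killed if some event sequence makes it diverge from the model."""
    for seq in itertools.product(("coin", "push"), repeat=length):
        s_model = s_impl = "locked"
        for ev in seq:
            s_model = model[(s_model, ev)]
            s_impl = impl(s_impl, ev)
            if s_model != s_impl:
                return True
    return False

all_mutants = list(mutants(MODEL))
score = sum(killed(MODEL, make_impl(m)) for m in all_mutants) / len(all_mutants)
```

    A mutation score of 1.0 means the model pins down the implementation's behaviour completely (minimal abstractness under this toy criterion); a more abstract model would leave some mutants alive, yielding a lower score.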

  8. An efficient flow-based botnet detection using supervised machine learning

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2014-01-01

    Botnet detection represents one of the most crucial prerequisites of successful botnet neutralization. This paper explores how accurate and timely detection can be achieved by using supervised machine learning as the tool of inferring about malicious botnet traffic. In order to do so, the paper...... introduces a novel flow-based detection system that relies on supervised machine learning for identifying botnet network traffic. For use in the system we consider eight highly regarded machine learning algorithms, indicating the best performing one. Furthermore, the paper evaluates how much traffic needs...... to accurately and timely detect botnet traffic using purely flow-based traffic analysis and supervised machine learning. Additionally, the results show that in order to achieve accurate detection traffic flows need to be monitored for only a limited time period and number of packets per flow. This indicates...

  9. Chatter and machine tools

    CERN Document Server

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  10. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722

  11. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.
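    The core geometric idea, one physical ball bar plus six indexed platform rotations yielding many virtual reference distances, can be sketched as follows; the pure z-axis rotations and the coordinates below are idealized assumptions standing in for the IMP's full calibrated model:

```python
import numpy as np

def rot_z(theta):
    """Rotation about the platform's indexing axis (idealized as z)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Ball bar endpoints measured once, in the platform frame at position 0 (mm);
# nominal gauge length is 300 mm
p1 = np.array([300.0, 100.0, 50.0])
p2 = np.array([600.0, 100.0, 50.0])

# Six indexed positions, 60 degrees apart: each rotation maps the fixed gauge
# into a new virtual pose, so one physical gauge yields six virtual gauges
virtual_points = []
for k in range(6):
    R = rot_z(np.deg2rad(60.0 * k))
    virtual_points.append((R @ p1, R @ p2))

# Virtual reference distances: all pairwise distances among the virtual points
pts = np.array([q for pair in virtual_points for q in pair])
dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
n_virtual = int((np.triu(dists, 1) > 0).sum())
```

    Because rotations preserve length, every virtual gauge keeps the 300 mm nominal distance, while the cross-position pairs supply many additional reference distances without moving a physical artifact.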

  12. Machine-Learning-Based Electronic Triage More Accurately Differentiates Patients With Respect to Clinical Outcomes Compared With the Emergency Severity Index.

    Science.gov (United States)

    Levin, Scott; Toerper, Matthew; Hamrock, Eric; Hinson, Jeremiah S; Barnes, Sean; Gardner, Heather; Dugas, Andrea; Linton, Bob; Kirsch, Tom; Kelen, Gabor

    2018-05-01

    Standards for emergency department (ED) triage in the United States rely heavily on subjective assessment and are limited in their ability to risk-stratify patients. This study seeks to evaluate an electronic triage system (e-triage) based on machine learning that predicts the likelihood of acute outcomes, enabling improved patient differentiation. A multisite, retrospective, cross-sectional study of 172,726 ED visits from urban and community EDs was conducted. E-triage is composed of a random forest model applied to triage data (vital signs, chief complaint, and active medical history) that predicts the need for critical care, an emergency procedure, and inpatient hospitalization in parallel and translates risk to triage level designations. Predicted outcomes and secondary outcomes of elevated troponin and lactate levels were evaluated and compared with the Emergency Severity Index (ESI). E-triage predictions had an area under the curve ranging from 0.73 to 0.92 and demonstrated equivalent or improved identification of clinical patient outcomes compared with ESI at both EDs. E-triage provided rationale for risk-based differentiation of the more than 65% of ED visits triaged to ESI level 3. Matching the ESI patient distribution for comparisons, e-triage identified more than 10% (14,326 patients) of ESI level 3 patients requiring up-triage who had substantially increased risk of critical care or an emergency procedure (1.7% ESI level 3 versus 6.2% up-triaged) and hospitalization (18.9% versus 45.4%) across EDs. E-triage more accurately classifies ESI level 3 patients and highlights opportunities to use predictive analytics to support triage decisionmaking. Further prospective validation is needed. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
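    A hedged sketch of the random-forest-on-triage-data approach: the vital-sign features and the synthetic outcome model below are invented for illustration and bear no relation to the study's data or model:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000
# Hypothetical triage features: heart rate, systolic BP, respiratory rate, age
hr = rng.normal(90, 20, n)
sbp = rng.normal(125, 25, n)
rr = rng.normal(18, 5, n)
age = rng.uniform(18, 95, n)
X = np.column_stack([hr, sbp, rr, age])

# Synthetic "hospitalization" outcome driven by abnormal vitals and age
logit = (0.03 * (hr - 90) - 0.02 * (sbp - 125)
         + 0.1 * (rr - 18) + 0.03 * (age - 50) - 2.0)
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_tr, y_tr)
# Discrimination measured as in the study: area under the ROC curve
auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
```

    In a real e-triage system the predicted probabilities would then be binned into triage level designations; here the sketch stops at the risk score itself.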

  13. Detection of Watermelon Seeds Exterior Quality based on Machine Vision

    OpenAIRE

    Xiai Chen; Ling Wang; Wenquan Chen; Yanfeng Gao

    2013-01-01

    To investigate the detection of watermelon seed exterior quality, a machine vision system based on a least squares support vector machine was developed. Appearance characteristics of watermelon seeds, including area, perimeter, roughness, minimum enclosing rectangle, and solidity, were calculated by image analysis after image preprocessing. Broken seeds, normal seeds, and high-quality seeds were distinguished by a least squares support vector machine optimized by a genetic algorithm. Compared to the grid...
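    A least squares SVM replaces the standard SVM's quadratic program with a single linear system. A minimal NumPy sketch (regression-form LS-SVM with an RBF kernel, toy two-feature "seed" data, and none of the genetic-algorithm tuning used in the paper) is:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """LS-SVM: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate([[0.0], y])
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return np.sign(rbf_kernel(X_new, X_train, sigma) @ alpha + b)

# Toy stand-in for two seed features (e.g. area and solidity), labels -1/+1
rng = np.random.default_rng(7)
X = np.vstack([rng.normal([0.0, 0.0], 0.5, (50, 2)),
               rng.normal([2.0, 2.0], 0.5, (50, 2))])
y = np.r_[-np.ones(50), np.ones(50)]
b, alpha = lssvm_train(X, y)
acc = (lssvm_predict(X, b, alpha, X) == y).mean()
```

    The genetic algorithm in the paper would search over gamma and sigma; here they are fixed by hand, and the three-class seed problem would need a one-vs-rest extension of this binary sketch.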

  14. Logical Characterisation of Concept Transformations from Human into Machine relying on Predicate Logic

    DEFF Research Database (Denmark)

    Badie, Farshad

    2016-01-01

    Providing more human-like concept learning in machines has always been one of the most significant goals of machine learning paradigms and of human-machine interaction techniques. This article attempts to provide a logical specification of conceptual mappings from humans’ minds into machines......’ knowledge bases. I will focus on the representation of the mappings (transformations) relying on First-Order Predicate Logic. Additionally, the structure of concepts in the common ground between humans and machines will be analysed. It seems quite necessary to pay attention to the philosophy...

  15. Machine Shop Grinding Machines.

    Science.gov (United States)

    Dunn, James

    This curriculum manual is one in a series of machine shop curriculum manuals intended for use in full-time secondary and postsecondary classes, as well as part-time adult classes. The curriculum can also be adapted to open-entry, open-exit programs. Its purpose is to equip students with basic knowledge and skills that will enable them to enter the…

  16. An IoT Knowledge Reengineering Framework for Semantic Knowledge Analytics for BI-Services

    Directory of Open Access Journals (Sweden)

    Nilamadhab Mishra

    2015-01-01

    Full Text Available In a progressive business intelligence (BI environment, IoT knowledge analytics are becoming an increasingly challenging problem because of rapid changes of knowledge context scenarios along with increasing data production scales with business requirements that ultimately transform a working knowledge base into a superseded state. Such a superseded knowledge base lacks adequate knowledge context scenarios, and the semantics, rules, frames, and ontology contents may not meet the latest requirements of contemporary BI-services. Thus, reengineering a superseded knowledge base into a renovated knowledge base system can yield greater business value and is more cost effective and feasible than standardising a new system for the same purpose. Thus, in this work, we propose an IoT knowledge reengineering framework (IKR framework for implementation in a neurofuzzy system to build, organise, and reuse knowledge to provide BI-services to the things (man, machines, places, and processes involved in business through the network of IoT objects. The analysis and discussion show that the IKR framework can be well suited to creating improved anticipation in IoT-driven BI-applications.

  17. The Sea Ice Index: A Resource for Cryospheric Knowledge Mobilization

    Science.gov (United States)

    Windnagel, A. K.; Fetterer, F. M.

    2017-12-01

    The Sea Ice Index is a popular source of information about Arctic and Antarctic sea ice data and trends created at the National Snow and Ice Data Center (NSIDC) in 2002. It has been used by cryospheric scientists, cross-discipline scientists, the press, policy makers, and the public for the past 15 years. The Index started as a prototype sea ice extent product in 2001 and was envisioned as a website that would meet a need for readily accessible, easy-to-use information on sea ice trends and anomalies, with products that would assist in monitoring and diagnosing the ice extent minima that were gaining increasing attention in the research community in the late 1990s. The goal was to easily share these valuable data with everyone who needed them, which is the essence of knowledge mobilization. As time has progressed, we have found new ways of disseminating the information carried by the data: providing simple pictures on a website, animating those images, creating Google Earth animations that show the data on a globe, providing simple text files of data values that do not require special software to read, writing a monthly blog about the data that has over 1.7 million readers annually, providing the data to NOAA's Science on a Sphere to be seen in museums and classrooms across 23 countries, and creating geo-registered images for use in geospatial software. The Index helps to bridge the gap between sea ice science and the public. Through NSIDC's User Services Office, we receive feedback on the Index and have endeavored to meet the changing needs of our stakeholder communities to best mobilize this knowledge in their direction. We have learned through trial by fire the best practices for delivering these data and data services. This tells the tale of managing an unassuming data set as it has journeyed from a simple product consisting of images of sea ice to one that is robust enough to be used in the IPCC Climate Change Report but easy enough to be understood by K-12

  18. Effects of cutting parameters on machinability characteristics of Ni-based superalloys: a review

    Directory of Open Access Journals (Sweden)

    Kaya Eren

    2017-12-01

    Full Text Available Nickel-based superalloys offer high strength, corrosion resistance, thermal stability and superb thermal fatigue properties. However, these same properties make them among the most difficult materials to machine. Although we are witnessing improved machining strategies with developing machining, tooling and inspection technologies, machining of nickel-based superalloys is still a challenging task due to in-process strains and post-process part quality demands.

  19. The knowledge base of journalism

    DEFF Research Database (Denmark)

    Svith, Flemming

    In this paper I propose the knowledge base as a fruitful way to apprehend journalism. With the claim that the majority of practice is anchored in knowledge – understood as 9 categories of rationales, forms and levels – this knowledge base appears as a contextual look at journalists’ knowledge......, and place. As an analytical framework, the knowledge base is limited to understand the practice of newspaper journalists, but, conversely, the knowledge base encompasses more general beginnings through the inclusion of overall structural relationships in the media and journalism and general theories...... on practice and knowledge. As the result of an abductive reasoning is a theory proposal, there is a need for more deductive approaches to test the validity of this knowledge base claim. It is thus relevant to investigate which rationales are included in the knowledge base of journalism, as the dimension does...

  20. Student Modeling and Machine Learning

    OpenAIRE

    Sison , Raymund; Shimura , Masamichi

    1998-01-01

    After identifying essential student modeling issues and machine learning approaches, this paper examines how machine learning techniques have been used to automate the construction of student models as well as the background knowledge necessary for student modeling. In the process, the paper sheds light on the difficulty, suitability and potential of using machine learning for student modeling processes, and, to a lesser extent, the potential of using student modeling techniques in machine le...

  1. A Qualitative Study of Knowledge Exchange in an Indonesian Machine-Making Company (P.75-92

    Directory of Open Access Journals (Sweden)

    Indria Handoko

    2017-02-01

    Full Text Available In a supply chain, a company's ability to leverage knowledge that resides within the network of contracted and interacting firms can improve not only company performance but also the effectiveness of the supply chain as a whole. However, existing supply chain studies mostly discuss knowledge at the company level, and rarely at internal-hierarchical levels. As a result, many things remain concealed, for example, how knowledge exchange between people across levels in a supply chain is influenced by the supply chain governance. Moreover, most existing studies focus on a rigid hierarchical supply-chain mechanism, and hardly elaborate on how interactions unfold in a less-rigid mechanism. This article attempts to address these gaps, discussing how a supplier company that deals with innovation generation activities acquires knowledge that resides in its supply chain network. A qualitative case study of an Indonesian machine-making company has been chosen to represent one of the supplier types in the automotive industry that deal with a less-rigid mechanism. A social capital perspective is applied to shed light on how interactions between actors in a supply chain network influence knowledge exchange. The study finds a positive relationship between social capital and knowledge exchange across levels and functions to help generate innovations, allowing the company to manage conflicting effect beliefs more effectively. The study also identifies a tendency of the company to regard intensive knowledge exchange as part of its organizational learning process. Keywords: social capital, knowledge exchange, supply chain, qualitative study

  2. Sensorless Control of Permanent Magnet Synchronous Machines

    DEFF Research Database (Denmark)

    Matzen, Torben N.

    Permanent magnet machines, with either surface mounted or embedded magnets on the rotor, are becoming more common due to the key advantages of higher energy conversion efficiency and higher torque density compared to the classical induction machine. Besides energy efficiency the permanent magnet...... the synchronous machine requires knowledge of the rotor shaft position due to the synchronous and undamped nature of the machine. The rotor position may be measured using a mechanical sensor, but the sensor reduces reliability and adds cost to the system and for this reason sensorless control methods started...... are dependent on the phase currents and rotor position. Based on the flux linkages the differential inductances are determined and used to establish the inductance saliency in terms of ratio and orientation. The orientation and its dependence on the current and rotor position are used to analyse the behaviour...

  3. Knowledge Acquisition Using Linguistic-Based Knowledge Analysis

    Science.gov (United States)

    Daniel L. Schmoldt

    1998-01-01

    Most knowledge-based system development efforts include acquiring knowledge from one or more sources. Difficulties associated with this knowledge acquisition task are readily acknowledged by most researchers. While a variety of knowledge acquisition methods have been reported, little has been done to organize those different methods and to suggest how to apply them...

  4. Passivity-Based Control of Electric Machines

    Energy Technology Data Exchange (ETDEWEB)

    Nicklasson, P.J.

    1996-12-31

    This doctoral thesis presents new results on the design and analysis of controllers for a class of electric machines. Nonlinear controllers are derived from a Lagrangian model representation using passivity techniques, and previous results on induction motors are improved and extended to Blondel-Park transformable machines. The relation to conventional techniques is discussed, and it is shown that the formalism introduced in this work facilitates analysis of conventional methods, so that open questions concerning these methods may be resolved. In addition, the thesis contains the following improvements of previously published results on the control of induction motors: (1) Improvement of a passivity-based speed/position controller, (2) Extension of passivity-based (observer-less and observer-based) controllers from regulation to tracking of rotor flux norm, (3) An extension of the classical indirect FOC (Field-Oriented Control) scheme to also include global rotor flux norm tracking, instead of only torque tracking and rotor flux norm regulation. The design is illustrated experimentally by applying the proposed control schemes to a squirrel-cage induction motor. The results show that the proposed methods have advantages over previous designs with respect to controller tuning, performance and robustness. 145 refs., 21 figs.

  5. Machinability of a Stainless Steel by Electrochemical Discharge Microdrilling

    International Nuclear Information System (INIS)

    Coteata, Margareta; Pop, Nicolae; Slatineanu, Laurentiu; Schulze, Hans-Peter; Besliu, Irina

    2011-01-01

    Due to the chemical elements included in their structure to ensure an increased resistance to environmental action, stainless steels are characterized by a low machinability when classical machining methods are applied. For this reason, non-traditional machining methods are sometimes applied, one of these being electrochemical discharge machining. To obtain microholes and to evaluate the machinability by electrochemical discharge microdrilling, test pieces of stainless steel were used for experimental research. The electrolyte was an aqueous solution of sodium silicate with different densities. A complete factorial plan was designed to highlight the influence of some input variables on the sizes of the considered machinability indexes (electrode tool wear, material removal rate, depth of the machined hole). Through mathematical processing of the experimental data, empirical functions were established both for stainless steel and for carbon steel. Graphical representations were used to provide a more suggestive view of the influence exerted by the considered input variables on the size of the machinability indexes.

  6. Competency Assessment in Family Medicine Residency: Observations, Knowledge-Based Examinations, and Advancement.

    Science.gov (United States)

    Mainous, Arch G; Fang, Bo; Peterson, Lars E

    2017-12-01

    The Family Medicine (FM) Milestones are competency-based assessments of residents in key dimensions relevant to practice in the specialty. Residency programs use the milestones in semiannual reviews of resident performance from the time of entry into the program to graduation. Using a national sample, we investigated the relationship of FM competency-based assessments to resident progress and the complementarity of milestones with knowledge-based assessments in FM residencies. We used midyear and end-of-year milestone ratings for all FM residents in Accreditation Council for Graduate Medical Education-accredited programs during academic years 2014-2015 and 2015-2016. The milestones contain 22 items across 6 competencies. We created a summative index across the milestones. The American Board of Family Medicine database provided resident demographics and in-training examination (ITE) scores, which we linked to the milestone data. The sample encompassed 6630 FM residents. The summative milestone index increased, on average, for each cohort (postgraduate year 1 [PGY-1] to PGY-2 and PGY-2 to PGY-3) at each assessment. The correlation between ITE scores and the milestone index excluding the medical knowledge milestone was r = .195. ITE scores and composite milestone assessments were higher for residents who advanced than for those who did not. Competency-based assessment using the milestones for FM residents appears to be a viable multidimensional tool to assess the successful progression of residents.

  7. Data preparation for municipal virtual assistant using machine learning

    OpenAIRE

    Jovan, Leon Noe

    2016-01-01

    The main goal of this master’s thesis was to develop a procedure that will automate the construction of the knowledge base for a virtual assistant that answers questions about municipalities in Slovenia. The aim of the procedure is to replace or facilitate manual preparation of the virtual assistant's knowledge base. Theoretical backgrounds of different machine learning fields, such as multilabel classification, text mining and learning from weakly labeled data were examined to gain a better ...

  8. Foundation: Transforming data bases into knowledge bases

    Science.gov (United States)

    Purves, R. B.; Carnes, James R.; Cutts, Dannie E.

    1987-01-01

    One approach to transforming information stored in relational data bases into knowledge based representations and back again is described. This system, called Foundation, allows knowledge bases to take advantage of vast amounts of pre-existing data. A benefit of this approach is inspection, and even population, of data bases through an intelligent knowledge-based front-end.

  9. IoT Security Techniques Based on Machine Learning

    OpenAIRE

    Xiao, Liang; Wan, Xiaoyue; Lu, Xiaozhen; Zhang, Yanyong; Wu, Di

    2018-01-01

    Internet of things (IoT) that integrate a variety of devices into networks to provide advanced and intelligent services have to protect user privacy and address attacks such as spoofing attacks, denial of service attacks, jamming and eavesdropping. In this article, we investigate the attack model for IoT systems, and review the IoT security solutions based on machine learning techniques including supervised learning, unsupervised learning and reinforcement learning. We focus on the machine le...

  10. Machine vision based quality inspection of flat glass products

    Science.gov (United States)

    Zauner, G.; Schagerl, M.

    2014-03-01

    This application paper presents a machine vision solution for the quality inspection of flat glass products. A contact image sensor (CIS) is used to generate digital images of the glass surfaces. The presented machine vision based quality inspection at the end of the production line aims to classify five different glass defect types. The defect images are usually characterized by very little 'image structure', i.e. homogeneous regions without distinct image texture. Additionally, these defect images usually consist of only a few pixels. At the same time, the appearance of certain defect classes can be very diverse (e.g. water drops). We used simple state-of-the-art image features such as histogram-based features (standard deviation, kurtosis, skewness), geometric features (form factor/elongation, eccentricity, Hu moments) and texture features (grey level run length matrix, co-occurrence matrix) to extract defect information. The main contribution of this work lies in the systematic evaluation of various machine learning algorithms to identify appropriate classification approaches for this specific class of images. The following machine learning algorithms were compared: decision tree (J48), random forest, JRip rules, naive Bayes, support vector machine (multi-class), neural network (multilayer perceptron) and k-nearest neighbour. We used a representative image database of 2300 defect images and applied cross validation for evaluation purposes.
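
The cross-validation protocol described above can be sketched in a few lines. This is a generic illustration, not the paper's code: the 1-nearest-neighbour classifier, the two-dimensional toy "defect features" and the cluster centres below are all invented stand-ins for the real feature vectors and classifiers compared in the study.

```python
import random

def kfold_accuracy(classify, data, labels, k=5, seed=0):
    """Estimate classifier accuracy with k-fold cross validation,
    mirroring the evaluation protocol described in the abstract."""
    idx = list(range(len(data)))
    random.Random(seed).shuffle(idx)
    correct = 0
    for f in range(k):
        test_idx = idx[f::k]
        held_out = set(test_idx)
        train = [data[i] for i in idx if i not in held_out]
        train_y = [labels[i] for i in idx if i not in held_out]
        for i in test_idx:
            if classify(train, train_y, data[i]) == labels[i]:
                correct += 1
    return correct / len(data)

def one_nn(train, train_y, x):
    """1-nearest-neighbour classifier on squared Euclidean distance."""
    dists = [(sum((a - b) ** 2 for a, b in zip(t, x)), y)
             for t, y in zip(train, train_y)]
    return min(dists)[1]

# Toy stand-in for defect feature vectors (e.g. std. deviation, form factor):
rng = random.Random(1)
data, labels = [], []
for cls, centre in [(0, (0.2, 0.3)), (1, (0.8, 0.7))]:
    for _ in range(50):
        data.append(tuple(c + rng.gauss(0, 0.1) for c in centre))
        labels.append(cls)

acc = kfold_accuracy(one_nn, data, labels)
```

Swapping `one_nn` for any other `classify(train, train_y, x)` function reproduces the paper's side-by-side comparison scheme.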

  11. Classification of follicular lymphoma images: a holistic approach with symbol-based machine learning methods.

    Science.gov (United States)

    Zorman, Milan; Sánchez de la Rosa, José Luis; Dinevski, Dejan

    2011-12-01

    It is not often that a symbol-based machine learning approach is used for image classification and recognition. In this paper we present such an approach, which we first used on follicular lymphoma images. Lymphoma is a broad term encompassing a variety of cancers of the lymphatic system. Lymphoma is differentiated by the type of cell that multiplies and by how the cancer presents itself. It is very important to get an exact diagnosis regarding lymphoma and to determine the treatments that will be most effective for the patient's condition. Our work was focused on the identification of lymphomas by finding follicles in microscopy images provided by the Laboratory of Pathology in the University Hospital of Tenerife, Spain. We divided our work into two stages: in the first stage we performed image pre-processing and feature extraction, and in the second stage we used different symbolic machine learning approaches for pixel classification. Symbolic machine learning approaches are often neglected when looking for image analysis tools. Although known for very appropriate knowledge representation, they are also claimed to lack computational power. The results we obtained are very promising and show that symbolic approaches can be successful in image analysis applications.

  12. Using Linguistic Knowledge in Statistical Machine Translation

    Science.gov (United States)

    2010-09-01

    ...a sociolinguistic phenomenon where the literary standard differs considerably from the vernacular varieties...

  13. Model-based formalization of medical knowledge for context-aware assistance in laparoscopic surgery

    Science.gov (United States)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes G.; Müller-Stich, Beat P.; Dillmann, Rüdiger; Speidel, Stefanie

    2014-03-01

    The increase of technological complexity in surgery has created a need for novel man-machine interaction techniques. Specifically, context-aware systems which automatically adapt themselves to the current circumstances in the OR have great potential in this regard. To create such systems, models of surgical procedures are vital, as they allow analyzing the current situation and assessing the context. For this purpose, we have developed a Surgical Process Model based on Description Logics. It incorporates general medical background knowledge as well as intraoperatively observed situational knowledge. The representation consists of three parts: the Background Knowledge Model, the Preoperative Process Model and the Integrated Intraoperative Process Model. All models depend on each other and create a concise view on the surgery. As a proof of concept, we applied the system to a specific intervention, the laparoscopic distal pancreatectomy.

  14. One knowledge base or many knowledge pools?

    DEFF Research Database (Denmark)

    Lundvall, Bengt-Åke

    It is increasingly realized that knowledge is the most important resource and that learning is the most important process in the economy. Sometimes this is expressed by coining the current era as characterised by a ‘knowledge based economy'. But this concept might be misleading by indicating...... that there is one common knowledge base on which economic activities can be built. In this paper we argue that it is more appropriate to see the economy as connecting to different ‘pools of knowledge'. The argument is built upon a conceptual framework where we make distinctions between private/public, local....../global, individual/collective and tacit/codified knowledge. The purpose is both ‘academic' and practical. Our analysis demonstrates the limits of a narrowly economic perspective on knowledge and we show that these distinctions have important implications both for innovation policy and for management of innovation....

  15. Experimental investigation of the tip based micro/nano machining

    Science.gov (United States)

    Guo, Z.; Tian, Y.; Liu, X.; Wang, F.; Zhou, C.; Zhang, D.

    2017-12-01

    Based on the self-developed three-dimensional micro/nano machining system, the effects of machining parameters and sample material on micro/nano machining are investigated. The micro/nano machining system is mainly composed of the probe system and the micro/nano positioning stage. The former is applied to control the normal load, and the latter is utilized to realize high-precision motion in the xy plane. A sample examination method is first introduced to check whether the sample is placed horizontally. The machining parameters include scratching direction, speed, cycles, normal load and feed. According to the experimental results, the scratching depth is significantly affected by the normal load in all four defined scratching directions but is rarely influenced by the scratching speed. Increasing the number of scratching cycles increases the scratching depth as well as smoothing the groove wall. In addition, the scratching tests on silicon and copper show that the harder material is removed more easily. In scratching with different feed amounts, the results indicate that the machined depth increases as the feed decreases. Further, a cubic polynomial is used to fit the experimental results and predict the scratching depth. With the selected machining parameters of scratching direction d3/d4, scratching speed 5 μm/s and feed 0.06 μm, several microstructures, including a stair, a sinusoidal groove, the Chinese character '田', the letters 'TJU' and a Chinese panda, have been fabricated on the silicon substrate.
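
The cubic-polynomial depth prediction mentioned above can be sketched as an ordinary least-squares fit. The feed/depth numbers below are invented for illustration (the paper's measured values are not reproduced), and the normal-equation solver is one straightforward way to fit a cubic, not necessarily the authors' procedure.

```python
def polyfit3(xs, ys):
    """Least-squares cubic fit y ≈ c0 + c1*x + c2*x^2 + c3*x^3,
    solved via the normal equations with Gaussian elimination."""
    n = 4
    # Normal equations A c = b with A[i][j] = sum x^(i+j), b[i] = sum y*x^i
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                       # forward elimination, partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n                         # back substitution
    for i in range(n - 1, -1, -1):
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                for j in range(i + 1, n))) / A[i][i]
    return coeffs

def evaluate(coeffs, x):
    return sum(c * x ** i for i, c in enumerate(coeffs))

# Hypothetical feed values (μm) and scratching depths (nm): depth grows as feed shrinks.
feeds = [0.06, 0.08, 0.10, 0.12, 0.14, 0.16]
depths = [120.0, 95.0, 78.0, 66.0, 58.0, 53.0]
coeffs = polyfit3(feeds, depths)
pred = evaluate(coeffs, 0.09)   # predicted depth at an unmeasured feed
```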

  16. Analytical Model-Based Design Optimization of a Transverse Flux Machine

    Energy Technology Data Exchange (ETDEWEB)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz; Husain, Iqbal; Muljadi, Eduard

    2017-02-16

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
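
The MEC-PSO coupling described above pairs a fast analytical objective with a particle swarm search over the three geometric variables. Below is a minimal, generic PSO sketch: the smooth surrogate objective, the variable bounds, and all parameter values are illustrative assumptions standing in for the paper's magnetic-equivalent-circuit model, not its actual design tool.

```python
import random

def pso(objective, bounds, n_particles=20, iters=60, seed=0):
    """Minimal particle swarm optimiser (minimisation) with inertia 0.7
    and cognitive/social coefficients 1.5, clamped to the search bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (g[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            v = objective(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < objective(g):
                    g = pos[i][:]
    return g, objective(g)

# Hypothetical smooth surrogate for (negative) torque density over
# (stator pole length, magnet length, rotor thickness), all in mm:
def surrogate(x):
    return sum((xi - t) ** 2 for xi, t in zip(x, (12.0, 6.0, 8.0)))

best, val = pso(surrogate, [(5, 20), (2, 10), (4, 12)])
```

In the paper's setting, `surrogate` would be replaced by the MEC evaluation of a candidate TFM geometry, which is what makes the search cheap relative to an FEA-in-the-loop optimiser.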

  17. Sensorless Characteristics of Hybrid PM Machines at Zero and Low Speed

    DEFF Research Database (Denmark)

    Matzen, Torben N.; Rasmussen, Peter Omand

    2009-01-01

    Sensorless methods for zero and low speed operation in drives with hybrid PM machines make use of the machine saliency to determine the rotor position in an indirect fashion. When integrating the position measurement in the electrical power supply to the machine, i.e. make the machine self......-sensing, the sensorless obtained position can be affected by the actual operation conditions of the machine e.g. the stator currents. This may deteriorate the machine self-sensing suitability using injection methods. In this paper an analysis method based on accurate knowledge of the machine flux linkages is proposed...... for analysing the suitability for sensorless control at zero and low speed. The method can be used to evaluate a particular machine design so the self-sensing characteristics for sensorless control of machine can be found. The characteristics can be obtained from finite element simulation data or experimental...

  18. Knowledge representation and knowledge base design for operator advisor system

    International Nuclear Information System (INIS)

    Hangos, K.M.; Sziano, T.; Tapolcai, L.

    1990-01-01

    The problems of knowledge representation, knowledge base handling and design are described for an Operator Advisor System in the Paks Nuclear Power Plant. The Operator Advisor System is to be implemented as part of the 5th and 6th units. The knowledge of the Operator Advisor System is described by a few elementary knowledge items (diagnostic event functions, fault graphs, action trees); weighted directed graphs have been found to be their common structure. List-type and relational representations of these graphs have been used for the on-line and off-line parts of the knowledge base, respectively. A uniform database design and handling approach has been proposed, which consists of a design system, a knowledge base editor and a knowledge base compiler.
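
The list-type representation of weighted directed graphs mentioned above can be illustrated with a plain adjacency list. The fault and symptom names and the weights below are invented for illustration and are not taken from the Paks system.

```python
# Hypothetical fault graph: each fault maps to (successor, weight) pairs,
# where the weight might encode, e.g., propagation likelihood.
graph = {
    "pump_trip":     [("flow_low", 0.9), ("pressure_drop", 0.7)],
    "flow_low":      [("temp_high", 0.8)],
    "pressure_drop": [("temp_high", 0.4)],
    "temp_high":     [],
}

def reachable(graph, start):
    """All symptoms reachable from a root fault (depth-first traversal)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(succ for succ, _weight in graph[node])
    return seen
```

A diagnostic front end would walk such a structure in the opposite direction as well, from observed symptoms back to candidate root faults.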

  19. Machine learning for network-based malware detection

    DEFF Research Database (Denmark)

    Stevanovic, Matija

    and based on different, mutually complementary, principles of traffic analysis. The proposed approaches rely on machine learning algorithms (MLAs) for automated and resource-efficient identification of the patterns of malicious network traffic. We evaluated the proposed methods through extensive evaluations...

  20. Machine learning-based methods for prediction of linear B-cell epitopes.

    Science.gov (United States)

    Wang, Hsin-Wei; Pai, Tun-Wen

    2014-01-01

    B-cell epitope prediction assists immunologists in designing peptide-based vaccines, diagnostic tests, disease prevention and treatment, and antibody production. In comparison with T-cell epitope prediction, the performance of variable-length B-cell epitope prediction is still unsatisfactory. Fortunately, thanks to increasingly available verified epitope databases, bioinformaticians can apply machine learning-based algorithms to all curated data to design improved prediction tools for biomedical researchers. Here, we have reviewed related epitope prediction papers, especially those on linear B-cell epitope prediction. It should be noted that combining selected propensity scales and statistics of epitope residues with machine learning-based tools has become a general way of constructing linear B-cell epitope prediction systems. Most comparison results also show that the kernel method of the support vector machine (SVM) classifier outperforms other machine learning-based approaches. Hence, in this chapter, besides reviewing recently published papers, we introduce the fundamentals of B-cell epitopes and SVM techniques. In addition, an example of a linear B-cell prediction system based on physicochemical features and amino acid combinations is illustrated in detail.
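
The "propensity scales plus SVM" recipe described above can be sketched end to end. Everything below is illustrative: the propensity values are invented (real systems use curated scales such as hydrophilicity scales, which are not reproduced here), the peptides are toy data, and a simple Pegasos-style linear SVM stands in for the kernel SVMs that published predictors typically take from mature libraries.

```python
import random

# Hypothetical per-residue propensity scale (hydrophilicity-like values):
SCALE = {'A': -0.5, 'R': 3.0, 'N': 0.2, 'D': 3.0, 'C': -1.0, 'E': 3.0,
         'G': 0.0, 'H': -0.5, 'I': -1.8, 'K': 3.0, 'L': -1.8, 'F': -2.5,
         'P': 0.0, 'S': 0.3, 'T': -0.4, 'W': -3.4, 'Y': -2.3, 'V': -1.5,
         'Q': 0.2, 'M': -1.3}

def features(peptide):
    """Window features: mean propensity and fraction of charged residues."""
    vals = [SCALE[a] for a in peptide]
    charged = sum(a in 'RKDE' for a in peptide) / len(peptide)
    return [sum(vals) / len(vals), charged]

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos-style subgradient training of a linear SVM (hinge loss
    plus L2 regularisation); labels y are +1 (epitope) or -1 (non-epitope)."""
    rng = random.Random(seed)
    w = [0.0] * (len(X[0]) + 1)   # last entry is the bias term
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(X)), len(X)):
            t += 1
            eta = 1.0 / (lam * t)
            xi = X[i] + [1.0]
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, xi))
            w = [wj * (1 - eta * lam) for wj in w]
            if margin < 1:
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, xi)]
    return w

def predict(w, x):
    s = sum(wj * xj for wj, xj in zip(w, x + [1.0]))
    return 1 if s >= 0 else -1

# Toy training peptides: hydrophilic/charged (epitope-like) vs hydrophobic.
pos = ["KDERKS", "RKDESN", "DKREQS"]
neg = ["ILVFWL", "LLIVFM", "FWYLIV"]
X = [features(p) for p in pos + neg]
y = [1] * len(pos) + [-1] * len(neg)
w = train_linear_svm(X, y)
```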

  1. Hippocampome.org: a knowledge base of neuron types in the rodent hippocampus.

    Science.gov (United States)

    Wheeler, Diek W; White, Charise M; Rees, Christopher L; Komendantov, Alexander O; Hamilton, David J; Ascoli, Giorgio A

    2015-09-24

    Hippocampome.org is a comprehensive knowledge base of neuron types in the rodent hippocampal formation (dentate gyrus, CA3, CA2, CA1, subiculum, and entorhinal cortex). Although the hippocampal literature is remarkably information-rich, neuron properties are often reported with incompletely defined and notoriously inconsistent terminology, creating a formidable challenge for data integration. Our extensive literature mining and data reconciliation identified 122 neuron types based on neurotransmitter, axonal and dendritic patterns, synaptic specificity, electrophysiology, and molecular biomarkers. All ∼3700 annotated properties are individually supported by specific evidence (∼14,000 pieces) in peer-reviewed publications. Systematic analysis of this unprecedented amount of machine-readable information reveals novel correlations among neuron types and properties, the potential connectivity of the full hippocampal circuitry, and outstanding knowledge gaps. User-friendly browsing and online querying of Hippocampome.org may aid design and interpretation of both experiments and simulations. This powerful, simple, and extensible neuron classification endeavor is unique in its detail, utility, and completeness.

  2. Combined prediction model for supply risk in nuclear power equipment manufacturing industry based on support vector machine and decision tree

    International Nuclear Information System (INIS)

    Shi Chunsheng; Meng Dapeng

    2011-01-01

    The prediction index for supply risk is developed based on factor identification in the nuclear power equipment manufacturing industry. The supply risk prediction model is established with the methods of support vector machine and decision tree, based on an investigation of 3 important nuclear power equipment manufacturing enterprises and 60 suppliers. A final case study demonstrates that the combined model is better than either single prediction model and confirms the feasibility and reliability of this model, which provides a method to evaluate the suppliers and measure the supply risk. (authors)

  3. Performance of machine learning methods for ligand-based virtual screening.

    Science.gov (United States)

    Plewczynski, Dariusz; Spieser, Stéphane A H; Koch, Uwe

    2009-05-01

    Computational screening of compound databases has become increasingly popular in pharmaceutical research. This review focuses on the evaluation of ligand-based virtual screening using active compounds as templates in the context of drug discovery. Ligand-based screening techniques rely on comparative molecular similarity analysis of compounds with known and unknown activity. We provide an overview of publications that have evaluated different machine learning methods, such as support vector machines, decision trees, ensemble methods (boosting, bagging and random forests), clustering methods, neural networks, naïve Bayes, data fusion methods and others.
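
The comparative molecular similarity analysis at the core of ligand-based screening is commonly computed as a Tanimoto coefficient over binary structural fingerprints. A minimal sketch follows; the on-bit sets and compound names are invented placeholders, and fingerprint generation itself (e.g. from circular fingerprints) is assumed to happen upstream.

```python
def tanimoto(fp1, fp2):
    """Tanimoto coefficient between two binary fingerprints, each given
    as the set of its on-bit indices: |A ∩ B| / |A ∪ B|."""
    union = len(fp1 | fp2)
    return len(fp1 & fp2) / union if union else 0.0

def rank_by_similarity(query, library):
    """Rank (name, fingerprint) pairs by similarity to a known active."""
    return sorted(library, key=lambda item: tanimoto(query, item[1]),
                  reverse=True)

# Hypothetical on-bit sets standing in for real structural fingerprints:
active = {1, 4, 9, 17, 23, 42}
library = [
    ("cmpd_A", {1, 4, 9, 17, 23, 40}),   # close analogue of the active
    ("cmpd_B", {2, 5, 11, 30}),          # unrelated scaffold
    ("cmpd_C", {1, 4, 9, 50, 61}),       # partial match
]
ranked = rank_by_similarity(active, library)
```

The machine learning methods surveyed in the review then learn to separate actives from inactives in exactly this kind of fingerprint/similarity feature space.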

  4. Development of AI (Artificial Intelligence)-based simulation system for man-machine system behavior in accidental situations of nuclear power plant

    International Nuclear Information System (INIS)

    Yoshida, Kazuo; Yokobayashi, Masao; Tanabe, Fumiya; Kawase, Katumi.

    1996-01-01

    A prototype version of a computer simulation system named JACOS (JAeri COgnitive Simulation system) has been developed at JAERI (Japan Atomic Energy Research Institute) to simulate man-machine system behavior in which the cognitive behavior of a human operator and the plant behavior affect each other. The objective of this system is to provide man-machine system analysts with detailed information on the cognitive process of an operator and on the plant behavior affected by the operator's actions in accidental situations of an NPP (nuclear power plant). The simulation system consists of an operator model and a plant model which are coupled dynamically. The operator model simulates an operator's cognitive behavior in accidental situations based on the decision ladder model of Rasmussen, and is implemented using the AI techniques of the distributed cooperative inference method with the so-called blackboard architecture. Rule-based behavior is simulated using knowledge representation with If-Then rules. Knowledge-based behavior is simulated using knowledge representation with MFM (Multilevel Flow Modeling) and a qualitative reasoning method. Cognitive characteristics of attentional narrowing, limitation of short-term memory, and knowledge recall from long-term memory are also modeled. The plant model of a 3-loop PWR was developed using the best-estimate thermal-hydraulic analysis code RELAP5/MOD2. Some simulations of incidents were performed to verify the operator model. It was found that the AI techniques used in the operator model are suitable for simulating the operator's cognitive behavior in an NPP accident. The effects of the modeled cognitive characteristics on the simulated cognitive behavior were also investigated. (author)

  5. Prediction of drug synergy in cancer using ensemble-based machine learning techniques

    Science.gov (United States)

    Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder

    2018-04-01

Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents. It can be developed as a pre-processing tool for therapeutic successes. Different drug-drug interactions can be examined via the drug synergy score, which requires efficient regression-based machine learning approaches to minimize the prediction errors. Numerous machine learning techniques such as neural networks, support vector machines, random forests, LASSO, Elastic Nets, etc., have been used in the past to meet the requirement mentioned above. However, these techniques individually do not provide significant accuracy in the drug synergy score. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques have been implemented on the drug synergy data. Based on the accuracy of each model, the four techniques with the highest accuracy are selected to develop the ensemble-based machine learning model. These models are Random Forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning (GFS.GCCL), the Adaptive-Network-Based Fuzzy Inference System (ANFIS) and the Dynamic Evolving Neural-Fuzzy Inference System (DENFIS). Ensembling is achieved by a biased weighted aggregation (i.e., giving more weight to the models with higher prediction scores) of the data predicted by the selected models. The proposed and existing machine learning techniques have been evaluated on drug synergy score data. The comparative analysis reveals that the proposed method outperforms the others in terms of accuracy, root mean square error and coefficient of correlation.
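
The biased weighted aggregation step described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the model names, validation scores, and synergy values below are invented for the example.

```python
import numpy as np

# Weighted ensemble: each model's prediction is weighted by its validation
# score, so more accurate models contribute more to the aggregate.

def weighted_ensemble(predictions, scores):
    """predictions: dict model -> array of predicted synergy scores;
    scores: dict model -> validation score (higher is better)."""
    names = list(predictions)
    w = np.array([scores[n] for n in names], dtype=float)
    w /= w.sum()                      # normalise weights to sum to 1
    preds = np.stack([predictions[n] for n in names])
    return w @ preds                  # weighted average per sample

# illustrative predictions from two of the four selected models
preds = {"rf": np.array([1.0, 2.0]), "anfis": np.array([3.0, 4.0])}
scores = {"rf": 0.9, "anfis": 0.1}
out = weighted_ensemble(preds, scores)
print(out)  # rf dominates: approximately [1.2, 2.2]
```

The same aggregation extends unchanged to the four selected models; only the `predictions` and `scores` dictionaries grow.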

  6. Fuzzy knowledge bases integration based on ontology

    OpenAIRE

    Ternovoy, Maksym; Shtogrina, Olena

    2012-01-01

The paper describes an approach for fuzzy knowledge base integration with the usage of an ontology. This approach is based on the use of a metadata base for the integration of different knowledge bases with a common ontology. The design process of the metadata base is described.

  7. Knowledge-based Telecom Industry

    OpenAIRE

    Vinje, Villeman; Nordkvelde, Marius

    2011-01-01

    BI Norwegian School of Management is conducting a national research project entitled “A knowledge-based Norway”. Thirteen major knowledge-based industries in Norway are being analyzed under the auspices of the project. This study assesses the underlying properties of a global knowledge hub to examine the extent to which the Norwegian telecom industry – which encompasses all telecom firms located in Norway regardless of ownership – constitutes a global knowledge hub. It commences with a ge...

  8. An Android malware detection system based on machine learning

    Science.gov (United States)

    Wen, Long; Yu, Haiyang

    2017-08-01

The Android smartphone, with its open source character and excellent performance, has attracted many users. However, the convenience of the Android platform has also motivated the development of malware. The traditional method, which detects malware based on signatures, is unable to detect unknown applications. The article proposes a machine learning-based lightweight system that is capable of identifying malware on Android devices. In this system we extract features based on static analysis and dynamic analysis; a new feature selection approach based on principal component analysis (PCA) and Relief is then presented to reduce the dimensionality of the features. After that, a classification model is constructed with a support vector machine (SVM). Experimental results show that our system provides an effective method for Android malware detection.
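
The reduce-then-classify pipeline the abstract describes can be sketched as follows. This is an illustrative sketch only: a nearest-centroid classifier stands in for the SVM, PCA is implemented directly via SVD, and the "app feature" values are synthetic.

```python
import numpy as np

def pca_fit_transform(X, k):
    """Project X onto its top-k principal components (dimensionality reduction)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def nearest_centroid(Ztr, ytr, Zte):
    """Assign each query to the class with the closest mean in reduced space."""
    labels = np.unique(ytr)
    centroids = np.stack([Ztr[ytr == c].mean(axis=0) for c in labels])
    d = ((Zte[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    return labels[d.argmin(axis=1)]

# toy 4-d "app features": benign apps (label 0) cluster near 0, malware near 10
X = np.array([[0., 1, 0, 1], [1, 0, 1, 0], [10, 11, 10, 11], [11, 10, 11, 10]])
y = np.array([0, 0, 1, 1])
Z = pca_fit_transform(X, 2)            # 4 features reduced to 2
pred = nearest_centroid(Z, y, Z)       # sanity check on the training set
print(pred)
```

In the paper's system the reduced features would instead feed an SVM; the PCA step is the part the sketch shares with the abstract.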

  9. Distance based control system for machine vision-based selective spraying

    NARCIS (Netherlands)

    Steward, B.L.; Tian, L.F.; Tang, L.

    2002-01-01

For effective operation of a selective sprayer with real-time local weed sensing, herbicides must be delivered accurately to weed targets in the field. With a machine vision-based selective spraying system, acquiring sequential images and switching nozzles on and off at the correct locations are

  10. Voice based gender classification using machine learning

    Science.gov (United States)

    Raahul, A.; Sapthagiri, R.; Pankaj, K.; Vijayarajan, V.

    2017-11-01

Gender identification is one of the major problems in speech analysis today. Gender can be traced from acoustic data such as pitch, median, and frequency. Machine learning gives promising results for classification problems in all research domains, and there are several performance metrics for evaluating algorithms in a given area. Our comparative model evaluates five different machine learning algorithms, on the basis of eight different metrics, for gender classification from acoustic data: Linear Discriminant Analysis (LDA), K-Nearest Neighbour (KNN), Classification and Regression Trees (CART), Random Forest (RF), and Support Vector Machine (SVM). The main criterion in evaluating any algorithm is its performance: in classification problems the misclassification rate must be low, which is to say that the accuracy rate must be high. The location and gender of a person have become very crucial in economic markets in the form of AdSense. With this comparative model, we assess the different ML algorithms and find the best fit for gender classification of acoustic data.
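
The comparative evaluation described above amounts to running several classifiers on the same split and ranking them by a common metric. A minimal sketch, with only two toy classifiers (1-nearest-neighbour and a majority-class baseline) standing in for the paper's five algorithms, and synthetic "pitch" features rather than real acoustic data:

```python
import numpy as np

def knn1(Xtr, ytr, Xte):
    """1-nearest-neighbour: label of the closest training sample."""
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    return ytr[d.argmin(axis=1)]

def majority(Xtr, ytr, Xte):
    """Baseline: always predict the most frequent training label."""
    vals, counts = np.unique(ytr, return_counts=True)
    return np.full(len(Xte), vals[counts.argmax()])

def accuracy(y, yhat):
    return float((y == yhat).mean())

# toy "acoustic" data: low mean pitch ~ class 0, high mean pitch ~ class 1
Xtr = np.array([[110.], [120.], [210.], [220.]]); ytr = np.array([0, 0, 1, 1])
Xte = np.array([[115.], [215.]]);                 yte = np.array([0, 1])

results = {name: accuracy(yte, clf(Xtr, ytr, Xte))
           for name, clf in [("1-NN", knn1), ("majority", majority)]}
best = max(results, key=results.get)
print(results, "best:", best)
```

The paper's five algorithms and eight metrics would slot into the same harness: one entry per classifier, one scoring function per metric.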

  11. Machine learning with R

    CERN Document Server

    Lantz, Brett

    2013-01-01

Written as a tutorial to explore and understand the power of R for machine learning, this practical guide covers all of the need-to-know topics in a very systematic way. For each machine learning approach, each step in the process is detailed, from preparing the data for analysis to evaluating the results. These steps will build the knowledge you need to apply them to your own data science tasks. Intended for those who want to learn how to use R's machine learning capabilities and gain insight from their data. Perhaps you already know a bit about machine learning, but have never used R; or

  12. Simulation and Community-Based Instruction of Vending Machines with Time Delay.

    Science.gov (United States)

    Browder, Diane M.; And Others

    1988-01-01

    The study evaluated the use of simulated instruction on vending machine use as an adjunct to community-based instruction with two moderately retarded children. Results showed concurrent acquisition of the vending machine skills across trained and untrained sites. (Author/DB)

  13. Information Society and Knowledge Economy - Essence and Key Relationships

    Directory of Open Access Journals (Sweden)

    Rafał Żelazny

    2015-04-01

Full Text Available This paper focuses on the essence of and relationships between the information society (IS) and knowledge economy (KE) concepts. The aim of this article is twofold. The first objective is to denominate the conceptual framework and relationships between the IS and KE conceptions. The second is to present dependencies between the indexes of IS and KE development level in selected countries. Firstly, based on the notional relations between information and knowledge, the relationships between the concepts of information society, knowledge economy and knowledge society (KS) are characterized. Secondly, using popular composite indexes evaluating the degree of IS and KE development, i.e. the Networked Readiness Index (NRI), ICT Development Index (IDI), Knowledge Economy Index (KEI) and Summary Innovation Index (SII), correlations between information society and knowledge economy were studied in 34 selected countries in 2012. The paper concludes by stating limits and implications for further research. This work contributes to the systematization and integration of knowledge about the mutually permeable conceptions of the information society and knowledge economy.

  14. Machine Learning Based Localization and Classification with Atomic Magnetometers

    Science.gov (United States)

    Deans, Cameron; Griffin, Lewis D.; Marmugi, Luca; Renzoni, Ferruccio

    2018-01-01

We demonstrate identification of the position, material, orientation, and shape of objects imaged by a 85Rb atomic magnetometer performing electromagnetic induction imaging supported by machine learning. Machine learning maximizes the information extracted from the images created by the magnetometer, demonstrating the use of hidden data. Localization 2.6 times better than the spatial resolution of the imaging system and successful classification up to 97% are obtained. This circumvents the need for solving the inverse problem and demonstrates the extension of machine learning to diffusive systems, such as low-frequency electrodynamics in media. Automated collection of task-relevant information from quantum-based electromagnetic imaging will have a relevant impact in fields from biomedicine to security.

  15. Machine Fault Detection Based on Filter Bank Similarity Features Using Acoustic and Vibration Analysis

    Directory of Open Access Journals (Sweden)

    Mauricio Holguín-Londoño

    2016-01-01

    Full Text Available Vibration and acoustic analysis actively support the nondestructive and noninvasive fault diagnostics of rotating machines at early stages. Nonetheless, the acoustic signal is less used because of its vulnerability to external interferences, hindering an efficient and robust analysis for condition monitoring (CM. This paper presents a novel methodology to characterize different failure signatures from rotating machines using either acoustic or vibration signals. Firstly, the signal is decomposed into several narrow-band spectral components applying different filter bank methods such as empirical mode decomposition, wavelet packet transform, and Fourier-based filtering. Secondly, a feature set is built using a proposed similarity measure termed cumulative spectral density index and used to estimate the mutual statistical dependence between each bandwidth-limited component and the raw signal. Finally, a classification scheme is carried out to distinguish the different types of faults. The methodology is tested in two laboratory experiments, including turbine blade degradation and rolling element bearing faults. The robustness of our approach is validated contaminating the signal with several levels of additive white Gaussian noise, obtaining high-performance outcomes that make the usage of vibration, acoustic, and vibroacoustic measurements in different applications comparable. As a result, the proposed fault detection based on filter bank similarity features is a promising methodology to implement in CM of rotating machinery, even using measurements with low signal-to-noise ratio.
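
The feature-construction step described above can be sketched as follows. This is an illustrative stand-in, not the paper's method: a simple FFT-mask filter bank replaces EMD or wavelet packets, plain correlation replaces the cumulative spectral density index, and the signal is synthetic.

```python
import numpy as np

def band_components(x, n_bands):
    """Split x into n_bands narrow-band spectral components that sum back to x."""
    X = np.fft.rfft(x)
    edges = np.linspace(0, len(X), n_bands + 1, dtype=int)
    comps = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        Xk = np.zeros_like(X)
        Xk[lo:hi] = X[lo:hi]           # keep only this band's bins
        comps.append(np.fft.irfft(Xk, n=len(x)))
    return comps

def similarity_features(x, n_bands=4):
    """Statistical dependence of each band-limited component on the raw signal."""
    return [float(np.corrcoef(c, x)[0, 1]) for c in band_components(x, n_bands)]

# synthetic "machine" signal: a 5 Hz tone (lowest band) plus weak noise
t = np.linspace(0, 1, 256, endpoint=False)
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 5 * t) + 0.01 * rng.standard_normal(t.size)
feats = similarity_features(x)         # the lowest band dominates this signal
```

The resulting per-band feature vector is what a downstream classifier would consume to separate fault signatures.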

  16. A knowledge base architecture for distributed knowledge agents

    Science.gov (United States)

    Riedesel, Joel; Walls, Bryan

    1990-01-01

    A tuple space based object oriented model for knowledge base representation and interpretation is presented. An architecture for managing distributed knowledge agents is then implemented within the model. The general model is based upon a database implementation of a tuple space. Objects are then defined as an additional layer upon the database. The tuple space may or may not be distributed depending upon the database implementation. A language for representing knowledge and inference strategy is defined whose implementation takes advantage of the tuple space. The general model may then be instantiated in many different forms, each of which may be a distinct knowledge agent. Knowledge agents may communicate using tuple space mechanisms as in the LINDA model as well as using more well known message passing mechanisms. An implementation of the model is presented describing strategies used to keep inference tractable without giving up expressivity. An example applied to a power management and distribution network for Space Station Freedom is given.
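
The LINDA-style tuple space the architecture builds on can be illustrated with a minimal in-memory sketch. This is not the paper's database-backed implementation; the operation names follow LINDA conventions (`out`, `rd`, `inp`), `None` acts as a wildcard, and the power-bus tuple is a hypothetical example.

```python
class TupleSpace:
    """Minimal tuple space: agents communicate by writing and matching tuples."""

    def __init__(self):
        self._tuples = []

    def out(self, tup):                 # write a tuple into the space
        self._tuples.append(tup)

    def _match(self, pattern, tup):
        return len(pattern) == len(tup) and all(
            p is None or p == v for p, v in zip(pattern, tup))

    def rd(self, pattern):              # non-destructive read of a match
        for t in self._tuples:
            if self._match(pattern, t):
                return t
        return None

    def inp(self, pattern):             # destructive take of a match
        t = self.rd(pattern)
        if t is not None:
            self._tuples.remove(t)
        return t

space = TupleSpace()
space.out(("sensor", "bus-4", 120.5))   # hypothetical power-bus reading
reading = space.rd(("sensor", "bus-4", None))
taken = space.inp(("sensor", None, None))
```

Distinct knowledge agents can share one such space for coordination while also using ordinary message passing, as the abstract notes.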

  17. Tattoo machines, needles and utilities.

    Science.gov (United States)

    Rosenkilde, Frank

    2015-01-01

Starting out as a professional tattooist back in 1977 in Copenhagen, Denmark, Frank Rosenkilde has personally experienced the remarkable development of tattoo machines, needles and utilities: all the way from home-made equipment to industrial products of substantially improved quality. Machines can be constructed like the traditional dual-coil and single-coil machines, or can be e-coil, rotary and hybrid machines, with the more convenient and precise rotary machines being the recent trend. This development has resulted in disposable needles and utilities. Newer machines are more easily kept clean and protected with foil to prevent cross-contamination and infections. The machines and the tattooists' knowledge and awareness about prevention of infection have developed hand-in-hand. For decades, Frank Rosenkilde has been collecting tattoo machines. Part of his collection is presented here, supplemented by his personal notes. © 2015 S. Karger AG, Basel.

  18. Towards a Standard-based Domain-specific Platform to Solve Machine Learning-based Problems

    Directory of Open Access Journals (Sweden)

    Vicente García-Díaz

    2015-12-01

Full Text Available Machine learning is one of the most important subfields of computer science and can be used to solve a variety of interesting artificial intelligence problems. There are different languages, frameworks and tools to define the data needed to solve machine learning-based problems. However, there is a great number of very diverse alternatives, which hinders the intercommunication, portability and re-usability of the definitions, designs or algorithms that any developer may create. In this paper, we take the first step towards a language and a development environment independent of the underlying technologies, allowing developers to design solutions for machine learning-based problems in a simple and fast way, automatically generating code for other technologies. This can be considered a transparent bridge among current technologies. We rely on the Model-Driven Engineering approach, focusing on the creation of models to abstract the definition of artifacts from the underlying technologies.

  19. Knowledge-Based Reinforcement Learning for Data Mining

    Science.gov (United States)

    Kudenko, Daniel; Grzes, Marek

Data Mining is the process of extracting patterns from data. Two general avenues of research in the intersecting areas of agents and data mining can be distinguished. The first approach is concerned with mining an agent’s observation data in order to extract patterns, categorize environment states, and/or make predictions of future states. In this setting, data is normally available as a batch, and the agent’s actions and goals are often independent of the data mining task. The data collection is mainly considered as a side effect of the agent’s activities. Machine learning techniques applied in such situations fall into the class of supervised learning. In contrast, the second scenario occurs where an agent is actively performing the data mining, and is responsible for the data collection itself. For example, a mobile network agent is acquiring and processing data (where the acquisition may incur a certain cost), or a mobile sensor agent is moving in a (perhaps hostile) environment, collecting and processing sensor readings. In these settings, the tasks of the agent and the data mining are highly intertwined and interdependent (or even identical). Supervised learning is not a suitable technique for these cases. Reinforcement Learning (RL) enables an agent to learn from experience (in the form of reward and punishment for explorative actions) and adapt to new situations, without a teacher. RL is an ideal learning technique for these data mining scenarios, because it fits the agent paradigm of continuous sensing and acting, and the RL agent is able to learn to make decisions on the sampling of the environment which provides the data. Nevertheless, RL still suffers from scalability problems, which have prevented its successful use in many complex real-world domains. The more complex the tasks, the longer it takes a reinforcement learning algorithm to converge to a good solution. For many real-world tasks, human expert knowledge is available. For example, human
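
The RL behaviour the abstract describes, learning from reward and punishment without a teacher, can be shown in its simplest tabular form: Q-learning on a toy 5-state chain where the agent must walk right to reach a reward. The environment, state count, and hyperparameters are all illustrative.

```python
import random

def q_learning(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a chain: actions 0=left, 1=right; reward 1 at the end."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy exploration: occasionally try a random action
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda act: Q[s][act])
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0
            # temporal-difference update toward reward plus discounted future value
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

Q = q_learning()
policy = [max((0, 1), key=lambda a: Q[s][a]) for s in range(4)]
print(policy)  # the learned policy moves right in every non-terminal state
```

In the data mining scenarios above, the "environment" would be the data source being sampled, and the reward would encode the value (minus the cost) of each acquisition.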

  20. Application of machine learning methodology for pet-based definition of lung cancer

    Science.gov (United States)

    Kerhet, A.; Small, C.; Quon, H.; Riauka, T.; Schrader, L.; Greiner, R.; Yee, D.; McEwan, A.; Roa, W.

    2010-01-01

We applied a learning methodology framework to assist in the threshold-based segmentation of non-small-cell lung cancer (NSCLC) tumours in positron-emission tomography–computed tomography (PET–CT) imaging for use in radiotherapy planning. Gated and standard free-breathing studies of two patients were independently analysed (four studies in total). Each study had a PET–CT and a treatment-planning CT image. The reference gross tumour volume (GTV) was identified by two experienced radiation oncologists who also determined reference standardized uptake value (SUV) thresholds that most closely approximated the GTV contour on each slice. A set of uptake distribution-related attributes was calculated for each PET slice. A machine learning algorithm was trained on a subset of the PET slices to cope with slice-to-slice variation in the optimal SUV threshold: that is, to predict the most appropriate SUV threshold from the calculated attributes for each slice. The algorithm's performance was evaluated using the remainder of the PET slices. A high degree of geometric similarity was achieved between the areas outlined by the predicted and the reference SUV thresholds (Jaccard index exceeding 0.82). No significant difference was found between the gated and the free-breathing results in the same patient. In this preliminary work, we demonstrated the potential applicability of a machine learning methodology as an auxiliary tool for radiation treatment planning in NSCLC. PMID:20179802
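
The evaluation step of this abstract, thresholding a slice and scoring the overlap with the reference contour by the Jaccard index, is easy to make concrete. The SUV values and thresholds below are synthetic, chosen only to illustrate the computation.

```python
import numpy as np

def jaccard(a, b):
    """Jaccard index of two binary masks: |A ∩ B| / |A ∪ B|."""
    a, b = a.astype(bool), b.astype(bool)
    union = (a | b).sum()
    return float((a & b).sum() / union) if union else 1.0

# toy 3x3 "PET slice" of SUV values
suv = np.array([[0.5, 2.1, 2.3],
                [0.4, 2.2, 0.6],
                [0.3, 0.2, 0.1]])
reference = suv > 2.0                  # stands in for the oncologist's contour
predicted = suv > 0.45                 # stands in for the learner's threshold
j = jaccard(reference, predicted)
print(j)  # → 0.6 (3 shared voxels out of 5 in the union)
```

In the study, per-slice thresholds were predicted by the trained model rather than fixed, and a Jaccard index above 0.82 was reported.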

  1. Constant Cutting Force Control for CNC Machining Using Dynamic Characteristic-Based Fuzzy Controller

    Directory of Open Access Journals (Sweden)

    Hengli Liu

    2015-01-01

Full Text Available This paper presents a dynamic characteristic-based fuzzy adaptive control algorithm (DCbFACA) to avoid the influence of rapidly changing cutting force on machining stability and precision. The cutting force is indirectly obtained in real time by monitoring and extracting the motorized spindle current, the feed speed is adjusted online by fuzzy logic, and the current is used as feedback to control the cutting force and keep the machining process stable. Unlike traditional fuzzy control methods that use experience-based control rules, and in view of the complex nonlinear characteristics of CNC machining, the power bond graph method is implemented to describe the dynamic characteristics of the process; the appropriate variation relations between current and feed speed are then derived, and the control rules are optimized and established on that basis. The numerical results indicate that DCbFACA can make the CNC machining process more stable and improve the machining precision.

  2. Logically automorphically equivalent knowledge bases

    OpenAIRE

    Aladova, Elena; Plotkin, Tatjana

    2017-01-01

Knowledge base theory provides an important example of a field where applications of universal algebra and algebraic logic look very natural, and their interaction with practical problems arising in computer science might be very productive. In this paper we study the equivalence problem for knowledge bases. Our interest is to find out how informational equivalence is related to the logical description of knowledge. Studying various equivalences of knowledge bases allows us to compare d...

  3. AC Loss Analysis of MgB2-Based Fully Superconducting Machines

    Science.gov (United States)

    Feddersen, M.; Haran, K. S.; Berg, F.

    2017-12-01

Superconducting electric machines have shown potential for a significant increase in power density, making them attractive for size- and weight-sensitive applications such as offshore wind generation, marine propulsion, and hybrid-electric aircraft propulsion. Superconductors exhibit no loss under dc conditions, though ac current and field produce considerable losses due to hysteresis, eddy currents, and coupling mechanisms. For this reason, many present machines are designed to be partially superconducting, meaning that the dc field components are superconducting while the ac armature coils are conventional conductors. Fully superconducting designs can provide increases in power density with significantly higher armature current; however, a good estimate of ac losses is required to determine feasibility under the machine's intended operating conditions. This paper aims to characterize the expected losses in a fully superconducting machine targeted towards aircraft, based on an actively-shielded, partially superconducting machine from prior work. Various factors are examined, such as magnet strength, operating frequency, and machine load, to produce a model for the loss in the superconducting components of the machine. This model is then used to optimize the design of the machine for minimal ac loss while maximizing power density. Important observations from the study are discussed.

  4. Sample-Based Extreme Learning Machine with Missing Data

    Directory of Open Access Journals (Sweden)

    Hang Gao

    2015-01-01

Full Text Available Extreme learning machine (ELM) has been extensively studied in the machine learning community during the last few decades due to its high efficiency and its unification of classification, regression, and so forth. Though bearing such merits, existing ELM algorithms cannot efficiently handle the issue of missing data, which is relatively common in practical applications. The problem of missing data is commonly handled by imputation (i.e., replacing missing values with substituted values according to available information). However, imputation methods are not always effective. In this paper, we propose a sample-based learning framework to address this issue. Based on this framework, we develop two sample-based ELM algorithms, for classification and regression, respectively. Comprehensive experiments have been conducted on synthetic data sets, UCI benchmark data sets, and a real-world fingerprint image data set. As indicated, without introducing extra computational complexity, the proposed algorithms achieve more accurate and stable learning than other state-of-the-art ones, especially in the case of higher missing ratios.
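
As background for the sample-based variant above, a plain extreme learning machine is compact enough to sketch: hidden weights are random and fixed, and only the output weights are solved by least squares. The data are synthetic and complete; the paper's missing-data handling is not reproduced here.

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    """Random tanh hidden layer; output weights solved by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                     # fixed random feature map
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# toy regression target: y = sin(pi * x) on [-1, 1]
X = np.linspace(-1, 1, 100).reshape(-1, 1)
y = np.sin(np.pi * X[:, 0])
model = elm_fit(X, y)
err = float(np.abs(elm_predict(model, X) - y).max())
```

The efficiency claim in the abstract comes from this structure: training reduces to one linear least-squares solve, with no iterative backpropagation.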

  5. Climate risk index for Italy

    Science.gov (United States)

    Mysiak, Jaroslav; Torresan, Silvia; Bosello, Francesco; Mistry, Malcolm; Amadio, Mattia; Marzi, Sepehr; Furlan, Elisa; Sperotto, Anna

    2018-06-01

We describe a climate risk index that has been developed to inform national climate adaptation planning in Italy and that is further elaborated in this paper. The index supports national authorities in designing adaptation policies and plans, guides the initial problem formulation phase, and identifies administrative areas with a higher propensity to be adversely affected by climate change. The index combines (i) climate change-amplified hazards; (ii) high-resolution indicators of exposure of chosen economic, social, natural and built- or manufactured capital (MC) assets; and (iii) vulnerability, which comprises both present sensitivity to climate-induced hazards and adaptive capacity. We use standardized anomalies of selected extreme climate indices derived from high-resolution regional climate model simulations of the EURO-CORDEX initiative as proxies of climate change-altered weather and climate-related hazards. The exposure and sensitivity assessment is based on indicators of manufactured, natural, social and economic capital assets exposed to and adversely affected by climate-related hazards. The MC refers to material goods or fixed assets which support the production process (e.g. industrial machines and buildings); Natural Capital comprises natural resources and processes (renewable and non-renewable) producing goods and services for well-being; Social Capital (SC) addresses factors at the individual (people's health, knowledge, skills) and collective (institutional) level (e.g. families, communities, organizations and schools); and Economic Capital (EC) includes owned and traded goods and services. The results of the climate risk analysis are used to rank the subnational administrative and statistical units according to the climate risk challenges, and possibly for financial resource allocation for climate adaptation. This article is part of the theme issue 'Advances in risk assessment for climate change adaptation policy'.
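
The index construction described above, standardized anomalies combined across hazard, exposure, and vulnerability and used to rank administrative units, can be sketched minimally. This is an illustrative simplification (an unweighted average of z-scores); the paper's actual indicator sets, weights, and aggregation are not reproduced, and all values below are synthetic.

```python
import numpy as np

def zscore(x):
    """Standardized anomaly of an indicator across units."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

def risk_index(hazard, exposure, vulnerability):
    """Average of the three standardized components, one score per unit."""
    return (zscore(hazard) + zscore(exposure) + zscore(vulnerability)) / 3.0

regions = ["A", "B", "C"]               # hypothetical administrative units
idx = risk_index([1.0, 2.0, 3.0],       # hazard proxy
                 [10, 30, 20],          # exposure indicator
                 [0.2, 0.9, 0.4])       # vulnerability indicator
ranking = [regions[i] for i in np.argsort(-idx)]
print(ranking)  # most at-risk unit first
```

The resulting ranking is the kind of output the abstract says can guide adaptation planning and resource allocation.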

  6. Investigation on Wire Electrochemical Micro Machining of Ni-based Metallic Glass

    International Nuclear Information System (INIS)

    Meng, Lingchao; Zeng, Yongbin; Zhu, Di

    2017-01-01

Highlights: • WECMM with nanosecond pulses is proposed for the first time for fabricating complex micro components based on metallic glasses. • An applicable electrolyte for WECMM of the Ni-based MG is discussed. • A significantly uniform machined surface is achieved in H2SO4 solution. • High machining efficiency and stability are obtained experimentally by modifying pulse waveforms and electrolyte compositions. • Complex microstructures of the Ni-based MG are fabricated by WECMM with optimized parameters. - Abstract: Metallic glasses (MGs) have been recognized as promising materials for realizing high-performance micro devices in micro electromechanical systems (MEMS) due to their excellent functional and structural characteristics. However, the applications of MGs are currently limited because of the difficulty of shaping them on the microscale. Wire electrochemical micro machining (WECMM) is increasingly recognized as a flexible and effective method to fabricate complex-shaped micro metal components, with many advantages relative to thermomechanical processing, which makes it well suited for micro shaping of MGs. We consider the example of a Ni-based MG, Ni72Cr19Si7B2, which has a typical passivation characteristic in 0.1 M H2SO4 solution. The transpassive process can be used for localized material removal when combined with the nanosecond pulsed WECMM technique. In the present work, the applicable electrolyte for WECMM of the Ni-based MG is discussed first. The voltage pulse waveform and electrolyte composition are then modified to improve machining efficiency and stability. Several complex microstructures, such as a micro curved cantilever beam, a micro gear, and a micro square helix, were machined with different optimized parameters.

  7. Extending the features of RBMK refuelling machine simulator with a training tool based on virtual reality

    International Nuclear Information System (INIS)

    Khoudiakov, M.; Slonimsky, V.; Mitrofanov, S.

    2004-01-01

The paper describes a continuation of the efforts of an international Russian-Norwegian joint team to improve operational safety during the refuelling process of an RBMK-type reactor by implementing a training simulator based on an innovative Virtual Reality (VR) approach. During the preceding 1st stage of the project, a display-based simulator was extended with VR models of the real Refuelling Machine (RM) and its environment in order to improve both the learning process and operational effectiveness. The simulator's challenge is to support the performance (operational activity) of RM operational staff, firstly by helping them to develop basic knowledge and skills, as well as to keep skilled staff in close touch with the complex machinery of the Refuelling Machine. During the 2nd stage of the joint project, the functional scope of the VR-simulator was greatly enhanced: firstly, by connecting to the RBMK-unit full-scope simulator, and, secondly, by including a training program and a simulator model upgrade. The present 3rd stage of the Project is primarily oriented towards the improvement of the training process for maintenance and operational personnel by means of the development of the Training Support Methodology and Courses (TSMC), based on Virtual Reality and an enlarged functionality of 3D and process modelling. The TSMC development is based on Russian and international regulatory bodies' requirements and recommendations. The design, development and creation of a specialised VR-based Training System for RM maintenance personnel are very important for the Russian RBMK plants. The main goal is to create a powerful, autonomous VR-based simulator for training technical maintenance personnel on the Refuelling Machine. VR-based training is expected to improve the effect of training compared to the current training based on traditional methods using printed documentation. The LNPP management and the Regulatory Bodies supported this goal. The VR-based Training System should

  8. vSphere virtual machine management

    CERN Document Server

    Fitzhugh, Rebecca

    2014-01-01

This book follows a step-by-step tutorial approach with some real-world scenarios that vSphere businesses will be required to overcome every day. It discusses creating and configuring virtual machines and also covers monitoring virtual machine performance and resource allocation options. This book is for VMware administrators who want to build their knowledge of virtual machine administration and configuration. It's assumed that you have some experience with virtualization administration and vSphere.

  9. Support vector machines optimization based theory, algorithms, and extensions

    CERN Document Server

    Deng, Naiyang; Zhang, Chunhua

    2013-01-01

Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions presents an accessible treatment of the two main components of support vector machines (SVMs): classification problems and regression problems. The book emphasizes the close connection between optimization theory and SVMs, since optimization is one of the pillars on which SVMs are built. The authors share insight on many of their research achievements. They give a precise interpretation of statistical learning theory for C-support vector classification. They also discuss regularized twi

  10. ADAM: ADaptive Autonomous Machine

    NARCIS (Netherlands)

    van Oosten, Daan C.; Nijenhuis, Lucas F.J.; Bakkers, André; Vervoort, Wiek

    1996-01-01

This paper describes a part of the development of an adaptive autonomous machine that is able to move in an unknown world, extract knowledge out of the perceived data, has the possibility to reason, and finally has the capability to exchange experiences and knowledge with other agents. The agent is

  11. A user-based usability assessment of raw machine translated technical instructions

    OpenAIRE

    Doherty, Stephen; O'Brien, Sharon

    2012-01-01

    Despite the growth of statistical machine translation (SMT) research and development in recent years, it remains somewhat out of reach for the translation community where programming expertise and knowledge of statistics tend not to be commonplace. While the concept of SMT is relatively straightforward, its implementation in functioning systems remains difficult for most, regardless of expertise. More recently, however, developments such as SmartMATE have emerged which aim to assist users in ...

  12. Osteoporosis risk prediction for bone mineral density assessment of postmenopausal women using machine learning.

    Science.gov (United States)

    Yoo, Tae Keun; Kim, Sung Kean; Kim, Deok Won; Choi, Joon Yul; Lee, Wan Hyung; Oh, Ein; Park, Eun-Cheol

    2013-11-01

    A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women compared to the ability of conventional clinical decision tools. We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Examination Surveys. The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests, artificial neural networks (ANN), and logistic regression (LR) based on simple surveys. The machine learning models were compared to four conventional clinical decision tools: osteoporosis self-assessment tool (OST), osteoporosis risk assessment instrument (ORAI), simple calculated osteoporosis risk estimation (SCORE), and osteoporosis index of risk (OSIRIS). SVM had significantly better area under the curve (AUC) of the receiver operating characteristic than ANN, LR, OST, ORAI, SCORE, and OSIRIS for the training set. SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0% at total hip, femoral neck, or lumbar spine for the testing set. The significant factors selected by SVM were age, height, weight, body mass index, duration of menopause, duration of breast feeding, estrogen therapy, hyperlipidemia, hypertension, osteoarthritis, and diabetes mellitus. Considering various predictors associated with low bone density, the machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.
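
    The model comparison above turns on ROC AUC. As a minimal, self-contained illustration (the labels and scores below are toy data, not the study's survey records), AUC can be computed directly as the probability that a randomly chosen positive case is scored above a randomly chosen negative one:

```python
def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive is scored higher than a randomly chosen
    negative (ties count as 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative")
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Illustrative only: 1 = low bone density, 0 = normal
labels = [1, 1, 1, 0, 0, 0, 0, 1]
scores = [0.9, 0.8, 0.35, 0.4, 0.2, 0.1, 0.45, 0.7]
print(round(roc_auc(labels, scores), 3))  # 0.875
```

    An AUC of 0.5 corresponds to random scoring and 1.0 to perfect separation, which is the scale on which the study's figure of 0.827 is read.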

  13. A least square support vector machine-based approach for contingency classification and ranking in a large power system

    Directory of Open Access Journals (Sweden)

    Bhanu Pratap Soni

    2016-12-01

    Full Text Available This paper proposes an effective supervised learning approach for static security assessment of a large power system. The approach employs a least square support vector machine (LS-SVM) to rank contingencies and predict the system severity level. The severity of a contingency is measured by two scalar performance indices (PIs): the line MVA performance index (PIMVA) and the voltage-reactive power performance index (PIVQ). The LS-SVM works in two steps: in Step I, both indices (PIMVA and PIVQ) are estimated under different operating scenarios; in Step II, contingency ranking is carried out based on the values of the PIs. The effectiveness of the proposed methodology is demonstrated on the IEEE 39-bus (New England) system. The approach can be a beneficial tool for fast and accurate security assessment and contingency analysis at an energy management center.
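
    As an illustration of the ranking step, a standard textbook form of the line MVA performance index (not necessarily the exact formulation used in the paper) sums (flow/limit)^(2n) over all monitored lines, so overloaded lines dominate the sum. The contingency data below are hypothetical:

```python
def pi_mva(line_flows_mva, line_limits_mva, n=2):
    """Line MVA performance index: sum of (flow / limit)^(2n).
    Overloaded lines (flow > limit) dominate the sum, so a higher
    PI indicates a more severe contingency."""
    return sum((s / smax) ** (2 * n)
               for s, smax in zip(line_flows_mva, line_limits_mva))

# Hypothetical post-contingency line flows (MVA) and limits for three cases
contingencies = {
    "line 3-4 outage": ([80.0, 115.0, 95.0], [100.0, 100.0, 100.0]),
    "line 5-6 outage": ([60.0, 70.0, 65.0], [100.0, 100.0, 100.0]),
    "line 1-2 outage": ([110.0, 105.0, 90.0], [100.0, 100.0, 100.0]),
}
ranking = sorted(contingencies,
                 key=lambda c: pi_mva(*contingencies[c]),
                 reverse=True)
print(ranking[0])  # most severe contingency first
```

    In the paper, the LS-SVM learns to predict these index values directly from the operating state, so the expensive post-contingency load flows are avoided at ranking time.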

  14. Knowledge based maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Sturm, A. [Hamburgische Electricitaets-Werke AG, Hamburg (Germany)]

    1998-12-31

    The establishment of maintenance strategies is of crucial significance for the reliability of a plant and the economic efficiency of maintenance measures. Knowledge about the condition of components and plants, from both the technical and the business management point of view, therefore becomes one of the fundamental questions and the key to efficient management and maintenance. A new way to determine the maintenance strategy can be called Knowledge Based Maintenance. A simple method is shown for determining strategies while taking the technical condition of the components of the production process into account to the greatest possible degree. Software with an algorithm for Knowledge Based Maintenance guides the user through this complex work to the determination of maintenance strategies for complex plant components. (orig.)

  16. Design of an ARM-based Automatic Rice-Selling Machine for Cafeterias

    Directory of Open Access Journals (Sweden)

    Zhiliang Kang

    2016-02-01

    Full Text Available To address the problems of low selling efficiency, poor sanitation, labor-intensive operation, and quick rice cooling in manual rice selling in cafeterias, especially in colleges and secondary schools, this paper presents an Advanced RISC Machines (ARM) microprocessor-based rice-selling machine for cafeterias. The machine consists of a funnel-shaped rice bin, a thermal insulation box, and a conveying and scattering mechanism. It exerts fuzzy control over stepper motor rpm, and the motor drives a conveyor belt with a scraper to scatter rice, deliver it, and keep it warm. Equipped with an external 4*4 keyboard, a point of sale (POS) machine, an ARM processor, and a pressure sensor, the machine also provides card-swiping payment and precise measurement through its card-swiping and weighing mechanisms. In addition, detection of the right amount of rice and an alarm function are achieved using an ultrasonic sensor and a beeper, respectively. The presence of the rice container at the rice outlet is detected by an optoelectronic switch. Results show that this rice-selling machine achieves precise measurement, quick card swiping, fast rice selling, stable operation, and good rice heat preservation. The mechanical design therefore enables the machine to achieve its goals.

  17. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)]

    1998-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, verification of their knowledge bases takes on an important role. The conventional Petri net approach, studied recently for knowledge base verification, is found to be inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  19. [Card-based age control mechanisms at tobacco vending machines. Effect and consequences].

    Science.gov (United States)

    Schneider, S; Meyer, C; Löber, S; Röhrig, S; Solle, D

    2010-02-01

    Until recently, 700,000 tobacco vending machines provided uncontrolled access to cigarettes for children and adolescents in Germany. On January 1, 2007, a card-based electronic locking device was attached to all tobacco vending machines to prevent the purchase of cigarettes by children and adolescents under 16. Starting in 2009, only persons older than 18 are able to buy cigarettes from tobacco vending machines. The aim of the present investigation (SToP Study: "Sources of Tobacco for Pupils" Study) was to assess changes in the number of tobacco vending machines after the introduction of these new technical devices (supplier's reaction). In addition, the ways smoking adolescents make purchases were assessed (consumer's reaction). We registered and mapped the total number of tobacco points of sale (tobacco POS) before and after the introduction of the card-based electronic locking device in two selected districts of the city of Cologne. Furthermore, pupils from local schools (response rate: 83%) were asked about their tobacco consumption and ways of purchase using a questionnaire. Results indicated that in the area investigated the total number of tobacco POSs decreased from 315 in 2005 to 277 in 2007. The rates of decrease were 48% for outdoor vending machines and 8% for indoor vending machines. Adolescents reported circumventing the card-based electronic locking devices (e.g., by using cards from older friends) and using other tobacco POSs (especially newspaper kiosks) or relying on their social network (mainly friends). The decreasing number of tobacco vending machines has not had a significant impact on cigarette acquisition by adolescent smokers as they tend to circumvent the newly introduced security measures.

  20. Distributed, cooperating knowledge-based systems

    Science.gov (United States)

    Truszkowski, Walt

    1991-01-01

    Some current research in the development and application of distributed, cooperating knowledge-based systems technology is addressed. The focus of the current research is the spacecraft ground operations environment. The underlying hypothesis is that, because of the increasing size, complexity, and cost of planned systems, conventional procedural approaches to the architecture of automated systems will give way to a more comprehensive knowledge-based approach. A hallmark of these future systems will be the integration of multiple knowledge-based agents which understand the operational goals of the system and cooperate with each other and the humans in the loop to attain the goals. The current work includes the development of a reference model for knowledge-base management, the development of a formal model of cooperating knowledge-based agents, the use of a testbed for prototyping and evaluating various knowledge-based concepts, and beginning work on the establishment of an object-oriented model of an intelligent end-to-end (spacecraft to user) system. An introductory discussion of these activities is presented, the major concepts and principles being investigated are highlighted, and their potential use in other application domains is indicated.

  1. Indonesia knowledge dissemination: a snapshot

    Science.gov (United States)

    Nasution, M. K. M.

    2018-03-01

    The educational progress of a country or educational institution is measured through its knowledge dissemination. Evidence of knowledge dissemination takes the form of the types of documents published, as recorded in indexes of scientific publications such as Scopus. This paper presents a simple picture of Indonesian knowledge dissemination by document type. Although the growth of knowledge dissemination does not follow the same pattern for every document type, the overall picture is much the same across types. However, maximum effort is still needed from PTN-bh to support Indonesian knowledge dissemination.

  2. Hybrid machining processes perspectives on machining and finishing

    CERN Document Server

    Gupta, Kapil; Laubscher, R F

    2016-01-01

    This book describes various hybrid machining and finishing processes. It gives a critical review of the past work based on them as well as the current trends and research directions. For each hybrid machining process presented, the authors list the method of material removal, machining system, process variables and applications. This book provides a deep understanding of the need, application and mechanism of hybrid machining processes.

  3. A reliability-based preventive maintenance methodology for the projection spot welding machine

    Directory of Open Access Journals (Sweden)

    Fayzimatov Ulugbek

    2018-06-01

    Full Text Available The effective operation of a projection spot welding (PSW) machine is closely related to the effectiveness of its maintenance. Timely maintenance can prevent failures and improve the reliability and maintainability of the machine. Therefore, establishing the maintenance frequency for the welding machine is one of the most important tasks for plant engineers. In this regard, reliability analysis of the welding machine can be used to establish preventive maintenance intervals (PMI) and to identify the critical parts of the system. In this reliability and maintainability study, an analysis of the PSW machine was carried out. The failure and repair data for the analysis were obtained from an automobile manufacturing company located in Uzbekistan. The machine was divided into three main sub-systems: electrical, pneumatic and hydraulic. Different distribution functions were tested for each sub-system and their parameters tabulated. Based on the estimated parameters of the analyzed distributions, PMIs for the PSW machine's sub-systems at different reliability levels were calculated. Finally, preventive measures for enhancing the reliability of the PSW machine sub-systems are suggested.

  4. CrN-based wear resistant hard coatings for machining and forming tools

    Energy Technology Data Exchange (ETDEWEB)

    Yang, S; Cooke, K E; Teer, D G [Teer Coatings Ltd, West Stone House, Berry Hill Industrial Estate, Droitwich, Worcestershire WR9 9AS (United Kingdom); Li, X [School of Metallurgy and Materials, University of Birmingham, Birmingham B15 2TT (United Kingdom); McIntosh, F [Rolls-Royce plc, Inchinnan, Renfrewshire PA4 9AF, Scotland (United Kingdom)

    2009-05-21

    Highly wear resistant multicomponent or multilayer hard coatings, based on CrN but incorporating other metals, have been developed using closed field unbalanced magnetron sputter ion plating technology. They are exploited in coated machining and forming tools for cutting and forming a wide range of materials in various application environments. These coatings are characterized by desirable properties including good adhesion, high hardness, high toughness, high wear resistance, high thermal stability and high capability for machining steel. The coatings appear to show almost universal working characteristics under operating conditions of low and high temperature, low and high machining speed, machining of ordinary and difficult-to-machine materials, and machining under lubricated conditions, under minimum lubricant quantity, or even dry. These coatings can be used for cutting and for forming tools, for conventional (macro-) machining tools as well as for micromachining tools, either as a single coating or in combination with an advanced, self-lubricating topcoat.

  5. Optimizing block-based maintenance under random machine usage

    NARCIS (Netherlands)

    de Jonge, Bram; Jakobsons, Edgars

    Existing studies on maintenance optimization generally assume that machines are either used continuously, or that times until failure do not depend on the actual usage. In practice, however, these assumptions are often not realistic. In this paper, we consider block-based maintenance optimization

  6. Fault detection and reliability, knowledge based and other approaches

    International Nuclear Information System (INIS)

    Singh, M.G.; Hindi, K.S.; Tzafestas, S.G.

    1987-01-01

    These proceedings are split up into four major parts in order to reflect the most significant aspects of reliability and fault detection as viewed at present. The first part deals with knowledge-based systems and comprises eleven contributions from leading experts in the field. The emphasis here is primarily on the use of artificial intelligence, expert systems and other knowledge-based systems for fault detection and reliability. The second part is devoted to fault detection of technological systems and comprises thirteen contributions dealing with applications of fault detection techniques to various technological systems such as gas networks, electric power systems, nuclear reactors and assembly cells. The third part of the proceedings, which consists of seven contributions, treats robust, fault tolerant and intelligent controllers and covers methodological issues as well as several applications ranging from nuclear power plants to industrial robots to steel grinding. The fourth part treats fault tolerant digital techniques and comprises five contributions. Two papers, one on reactor noise analysis, the other on reactor control system design, are indexed separately. (author)

  7. Multiphysics simulation by design for electrical machines, power electronics and drives

    CERN Document Server

    Rosu, Marius; Lin, Dingsheng; Ionel, Dan M; Popescu, Mircea; Blaabjerg, Frede; Rallabandi, Vandana; Staton, David

    2018-01-01

    This book combines the knowledge of experts from both academia and the software industry to present theories of multiphysics simulation by design for electrical machines, power electronics, and drives. The comprehensive design approach described within supports new applications required by technologies sustaining high drive efficiency. The highlighted framework considers the electric machine at the heart of the entire electric drive. The book also emphasizes the simulation by design concept--a concept that frames the entire highlighted design methodology, which is described and illustrated by various advanced simulation technologies. Multiphysics Simulation by Design for Electrical Machines, Power Electronics and Drives begins with the basics of electrical machine design and manufacturing tolerances. It also discusses fundamental aspects of the state of the art design process and includes examples from industrial practice. It explains FEM-based analysis techniques for electrical machine design--providing deta...

  8. Effect of heat treatments on machinability of gold alloy with age-hardenability at intraoral temperature.

    Science.gov (United States)

    Watanabe, I; Baba, N; Watanabe, E; Atsuta, M; Okabe, T

    2004-01-01

    This study investigated the effect of heat treatment on the machinability of a cast gold alloy with age-hardenability at intraoral temperature, using a handpiece engine with SiC wheels and an air-turbine handpiece with carbide burs and diamond points. Cast gold alloy specimens underwent various heat treatments [as-cast (AC); solution treatment (ST); high-temperature aging (HA); intraoral aging (IA)] before machinability testing. The machinability test was conducted at a constant machining force of 0.784 N. The three circumferential speeds used for the handpiece engine were 500, 1,000 and 1,500 m/min. The machinability index (M-index) was determined as the amount of metal removed by machining (volume loss, mm³). The results were analyzed by ANOVA and Scheffé's test. When an air-turbine handpiece was used, there was no difference in the M-index of the gold alloy among the heat treatments. The air-turbine carbide burs showed significantly (p < 0.05) higher machinability of the gold alloy than the diamond points. The heat treatments had a small effect on the M-index of the gold alloy machined with a SiC wheel for a handpiece engine.

  9. Community-based knowledge translation: unexplored opportunities

    Directory of Open Access Journals (Sweden)

    Armstrong Rebecca

    2011-06-01

    Full Text Available Abstract Background Knowledge translation is an interactive process of knowledge exchange between health researchers and knowledge users. Given that the health system is broad in scope, it is important to reflect on how definitions and applications of knowledge translation might differ by setting and focus. Community-based organizations and their practitioners share common characteristics related to their setting, the evidence used in this setting, and anticipated outcomes that are not, in our experience, satisfactorily reflected in current knowledge translation approaches, frameworks, or tools. Discussion Community-based organizations face a distinctive set of challenges and concerns related to engaging in the knowledge translation process, suggesting a unique perspective on knowledge translation in these settings. Specifically, community-based organizations tend to value the process of working in collaboration with multi-sector stakeholders in order to achieve an outcome. A feature of such community-based collaborations is the way in which 'evidence' is conceptualized or defined by these partners, which may in turn influence the degree to which generalizable research evidence in particular is relevant and useful when balanced against more contextually-informed knowledge, such as tacit knowledge. Related to the issues of evidence and context is the desire for local information. For knowledge translation researchers, developing processes to assist community-based organizations to adapt research findings to local circumstances may be the most helpful way to advance decision making in this area. A final characteristic shared by community-based organizations is involvement in advocacy activities, a function that has been virtually ignored in traditional knowledge translation approaches. Summary This commentary is intended to stimulate further discussion in the area of community-based knowledge translation. Knowledge translation, and exchange

  10. Runtime Optimizations for Tree-Based Machine Learning Models

    NARCIS (Netherlands)

    N. Asadi; J.J.P. Lin (Jimmy); A.P. de Vries (Arjen)

    2014-01-01

    Tree-based models have proven to be an effective solution for web ranking as well as other machine learning problems in diverse domains. This paper focuses on optimizing the runtime performance of applying such models to make predictions, specifically using gradient-boosted regression

  11. Short-term traffic flow prediction model using particle swarm optimization–based combined kernel function-least squares support vector machine combined with chaos theory

    Directory of Open Access Journals (Sweden)

    Qiang Shang

    2016-08-01

    Full Text Available Short-term traffic flow prediction is an important part of intelligent transportation systems research and applications. For further improving the accuracy of short-time traffic flow prediction, a novel hybrid prediction model (multivariate phase space reconstruction–combined kernel function-least squares support vector machine based on multivariate phase space reconstruction and combined kernel function-least squares support vector machine is proposed. The C-C method is used to determine the optimal time delay and the optimal embedding dimension of traffic variables’ (flow, speed, and occupancy time series for phase space reconstruction. The G-P method is selected to calculate the correlation dimension of attractor which is an important index for judging chaotic characteristics of the traffic variables’ series. The optimal input form of combined kernel function-least squares support vector machine model is determined by multivariate phase space reconstruction, and the model’s parameters are optimized by particle swarm optimization algorithm. Finally, case validation is carried out using the measured data of an expressway in Xiamen, China. The experimental results suggest that the new proposed model yields better predictions compared with similar models (combined kernel function-least squares support vector machine, multivariate phase space reconstruction–generalized kernel function-least squares support vector machine, and phase space reconstruction–combined kernel function-least squares support vector machine, which indicates that the new proposed model exhibits stronger prediction ability and robustness.
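
    The phase space reconstruction step the abstract refers to is, at its core, time-delay embedding. A minimal sketch follows (toy series; the paper's C-C method for choosing the delay and embedding dimension is not shown, the values below are picked by hand):

```python
def delay_embed(series, dim, tau):
    """Time-delay (Takens) embedding: map a scalar series x_t to
    vectors (x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau})."""
    span = (dim - 1) * tau
    return [tuple(series[t + k * tau] for k in range(dim))
            for t in range(len(series) - span)]

flow = [10, 12, 15, 14, 13, 16, 18, 17]  # toy traffic-flow series
vectors = delay_embed(flow, dim=3, tau=2)
print(vectors[0])  # (10, 15, 13)
```

    In the proposed model, such embedded vectors (built from flow, speed, and occupancy jointly) form the input to the combined-kernel LS-SVM predictor.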

  12. Self-perception and knowledge of evidence based medicine by physicians.

    Science.gov (United States)

    Aguirre-Raya, Karen A; Castilla-Peón, María F; Barajas-Nava, Leticia A; Torres-Rodríguez, Violeta; Muñoz-Hernández, Onofre; Garduño-Espinosa, Juan

    2016-06-29

    The influence, legitimacy and application of Evidence Based Medicine (EBM) in the world is growing as a tool that integrates the best available evidence into decision making in patient care. Our goal was to identify the relationship between self-perception of the relevance of Evidence Based Medicine (EBM) and the degree of basic knowledge of this discipline in a group of physicians. A survey was carried out in a third-level public hospital in Mexico City. Self-perception was measured by means of a structured scale, and the degree of knowledge through the parameter or "rubrics" methodology. A total of 320 questionnaires were given to 55 medical students (17 %); 45 pre-graduate medical interns (14 %); 118 medical residents (37 %) and 102 appointed physicians of different specialties (32 %). Self-perception of EBM: the majority of those surveyed (n = 274, 86 %) declared that they were very or moderately familiar with EBM. The great majority (n = 270, 84 %) believe that EBM is very important in clinical practice and 197 physicians (61 %) said that they implement it always or usually. The global index of self-perception was 75 %. Knowledge of EBM: regarding the definition of EBM, seven of those surveyed (2 %) included 3 of the 4 characteristics of the definition, 82 (26 %) mentioned only two characteristics, 152 (48 %) mentioned only one characteristic and 79 (25 %) did not include any characteristic of EBM. Regarding the phases of the EBM process, the majority of those surveyed (n = 218, 68 %) did not include the steps that characterize the practice of EBM, of which 79 participants (25 %) mentioned elements not related to it. The global index of knowledge was 19 %. The majority of the surveyed physicians have a high self-perception of the relevance of EBM. In spite of this, the majority of them did not know the characteristics that define EBM and the phases of the process for its practice. A major discrepancy was found between self-perception and the

  13. Machine Translation Using Constraint-Based Synchronous Grammar

    Institute of Scientific and Technical Information of China (English)

    WONG Fai; DONG Mingchui; HU Dongcheng

    2006-01-01

    A synchronous grammar based on the formalism of context-free grammar was developed by generalizing the first component of the production, which models the source text. Unlike other synchronous grammars, this grammar allows multiple target productions to be associated with a single production rule, which can be used to guide a parser to infer different possible translational equivalences for a recognized input string according to the feature constraints of symbols in the pattern. An extended generalized LR algorithm was adapted to the parsing of the proposed formalism to analyze the syntactic structure of a language. The grammar was used as the basis for building a machine translation system for Portuguese-to-Chinese translation. The empirical results show that the grammar is more expressive when modeling the translational equivalences of parallel texts for machine translation and grammar-rewriting applications.

  14. Knowledge-based software design for Defense-in-Depth risk monitor system and application for AP1000

    International Nuclear Information System (INIS)

    Ma Zhanguo; Yoshikawa, Hidekazu; Yang Ming; Nakagawa, Takashi

    2017-01-01

    As part of the new risk monitor system, the software for the plant Defense-in-Depth (DiD) risk monitor was designed based on state transitions and finite-state machines, and the knowledge-based software was then developed by an object-oriented method utilizing the Unified Modeling Language (UML). The developed plant DiD risk monitor software currently provides two main functions: a knowledge-base editor, used to model the system in a hierarchical manner, and an interaction simulator, which simulates the interactions between the different actors in the model. In this paper, a model that plays out its behavior is called an Actor and is modeled at the top level. The passive-safety AP1000 power plant was studied, and the small-break loss-of-coolant accident (SBLOCA) design basis accident transient was modeled using the plant DiD risk monitor software. Furthermore, the simulation result is shown for the interactions between the actors defined in the plant DiD risk monitor system as the PLANT actor, OPERATOR actor, and SUPERVISOR actor. This paper shows that it is feasible to model the nuclear power plant knowledge base using this software modeling technique. The software can build a large knowledge base for a nuclear power plant with little effort. (author)
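
    As a rough illustration of the state-transition idea underlying such software, a finite-state machine can be written as a transition table keyed by (state, event) pairs. The states and events below are invented for illustration and are not the paper's actual AP1000 SBLOCA model:

```python
# Illustrative finite-state machine: a hypothetical plant-state model.
# Keys are (current_state, event); values are the next state.
TRANSITIONS = {
    ("normal", "small_break_loca"): "accident",
    ("accident", "passive_injection"): "mitigated",
    ("mitigated", "long_term_cooling"): "safe_shutdown",
}

def step(state, event):
    """Advance the machine; unknown (state, event) pairs leave it unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "normal"
for event in ["small_break_loca", "passive_injection", "long_term_cooling"]:
    state = step(state, event)
print(state)  # safe_shutdown
```

    In the paper's design, each Actor (PLANT, OPERATOR, SUPERVISOR) would carry its own such state machine, and the interaction simulator passes events between them.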

  15. Timer-based data acquisitioning of creep testing machines

    International Nuclear Information System (INIS)

    Rana, M.A.; Farooq, M.A.; Ali, L.

    1998-01-01

    The duration of a creep test may be short- or long-term, extending over several years. Continuous operation of a computer for automatic data acquisition from creep testing machines is therefore wasteful. Timer-based data acquisition for the machines already interfaced with IBM-PC/AT and compatibles has been streamlined for economical use of the computer. A locally designed and fabricated timer has been introduced into the system for this purpose. The timer switches on the computer at pre-scheduled intervals to capture creep data in real time. The periodically captured data are logged on the hard disk for analysis and report generation. (author)

  16. Clone-based Data Index in Cloud Storage Systems

    Directory of Open Access Journals (Sweden)

    He Jing

    2016-01-01

    Full Text Available Storage systems have been challenged by the development of cloud computing. Traditional data indexes cannot satisfy the requirements of cloud computing because of their huge index volumes and the demand for quick response times. Meanwhile, because of the increasing size of data indexes and their dynamic characteristics, previous approaches, which rebuild the index or fully back it up before the data changes, cannot satisfy the needs of today's big-data indexes. To solve these problems, we propose a double-layer index structure that overcomes the throughput limitation of a single-point server. Then, a clone-based B+ tree structure is proposed to achieve high performance and adapt to dynamic environments. The experimental results show that our clone-based solution has high efficiency.
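
    The clone-based update idea can be sketched with path copying: an update clones only the nodes on the search path and shares every untouched subtree with the old version, so old snapshots of the index stay readable without a full backup. A plain binary search tree stands in here for the paper's B+ tree (illustrative sketch only):

```python
class Node:
    """Immutable-style BST node; updates clone the search path only."""
    __slots__ = ("key", "value", "left", "right")
    def __init__(self, key, value, left=None, right=None):
        self.key, self.value, self.left, self.right = key, value, left, right

def insert(node, key, value):
    """Return a new root that shares untouched subtrees with the old one."""
    if node is None:
        return Node(key, value)
    if key < node.key:
        return Node(node.key, node.value, insert(node.left, key, value), node.right)
    if key > node.key:
        return Node(node.key, node.value, node.left, insert(node.right, key, value))
    return Node(key, value, node.left, node.right)  # overwrite existing key

def lookup(node, key):
    while node is not None:
        if key == node.key:
            return node.value
        node = node.left if key < node.key else node.right
    return None

v1 = None
for k in [5, 2, 8]:
    v1 = insert(v1, k, "v%d" % k)
v2 = insert(v1, 8, "updated")  # clone: the old version stays readable
print(lookup(v1, 8), lookup(v2, 8))  # v8 updated
```

    Note that `v2.left` is the very same object as `v1.left`: only the path to the updated key was copied, which is what makes cloning cheap compared with rebuilding or fully backing up the index.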

  17. High-precision diode-laser-based temperature measurement for air refractive index compensation

    International Nuclear Information System (INIS)

    Hieta, Tuomas; Merimaa, Mikko; Vainio, Markku; Seppae, Jeremias; Lassila, Antti

    2011-01-01

    We present a laser-based system to measure the refractive index of air over a long path length. In optical distance measurements, it is essential to know the refractive index of air with high accuracy. Commonly, the refractive index of air is calculated from the properties of the ambient air using either the Ciddor or the Edlén equations, where the dominant uncertainty component is in most cases the air temperature. The method developed in this work uses direct absorption spectroscopy of oxygen to measure the average temperature of air, and of water vapor to measure relative humidity. The method allows measurement of temperature and humidity over the same beam path as in optical distance measurement, providing spatially well-matched data. Indoor and outdoor measurements demonstrate the effectiveness of the method. In particular, we demonstrate effective compensation of the refractive index of air in an interferometric length measurement at a time-variant and spatially nonhomogeneous temperature over a long time period. Further, we demonstrated 7 mK RMS noise over a 67 m path length using a 120 s sample time. To our knowledge, this is the best temperature precision reported for a spectroscopic temperature measurement.

  18. Support vector machine in machine condition monitoring and fault diagnosis

    Science.gov (United States)

    Widodo, Achmad; Yang, Bo-Suk

    2007-08-01

    Recently, the issue of machine condition monitoring and fault diagnosis as part of maintenance systems has become global because of the potential advantages to be gained from reduced maintenance costs, improved productivity and increased machine availability. This paper presents a survey of machine condition monitoring and fault diagnosis using the support vector machine (SVM). It attempts to summarize and review recent research and developments of SVM in machine condition monitoring and diagnosis. Numerous methods have been developed based on intelligent systems such as artificial neural networks, fuzzy expert systems, condition-based reasoning, random forests, etc. However, the use of SVM for machine condition monitoring and fault diagnosis is still rare. SVM has excellent generalization performance, so it can produce high classification accuracy for machine condition monitoring and diagnosis. Up to 2006, the use of SVM in machine condition monitoring and fault diagnosis tended to develop towards expertise orientation and problem-oriented domains. Finally, the ability to continually adapt and obtain novel ideas for machine condition monitoring and fault diagnosis using SVM remains a subject for future work.
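    As a minimal illustration of the SVM-based condition classification the survey covers, the following sketch trains an RBF-kernel SVM on synthetic data; the feature names and values (stand-ins for vibration features such as RMS and kurtosis) are invented, not data from any surveyed study.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-ins for vibration features: healthy machines cluster
# low, faulty machines cluster high in this made-up 2-D feature space.
healthy = rng.normal(loc=[0.5, 3.0], scale=0.2, size=(50, 2))
faulty = rng.normal(loc=[1.5, 6.0], scale=0.2, size=(50, 2))
X = np.vstack([healthy, faulty])
y = np.array([0] * 50 + [1] * 50)  # 0 = healthy, 1 = faulty

clf = SVC(kernel="rbf", C=1.0)  # non-linear kernel, as the survey emphasizes
clf.fit(X, y)
print(clf.predict([[0.55, 3.1], [1.6, 5.8]]))  # -> [0 1]
```

    In practice the features would come from signal processing of accelerometer data, and the kernel and regularization parameters would be tuned by cross-validation.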

  19. Big data analytics for early detection of breast cancer based on machine learning

    Science.gov (United States)

    Ivanova, Desislava

    2017-12-01

    This paper presents the concept of, and modern advances in, personalized medicine that relies on technology, and reviews the existing tools for early detection of breast cancer. Breast cancer types and their distribution worldwide are discussed. Time is spent explaining the importance of identifying normality and specifying the main classes of breast cancer, benign or malignant. The main purpose of the paper is to propose a conceptual model for early detection of breast cancer based on machine learning for processing and analysis of medical big data and further knowledge discovery for personalized treatment. The proposed conceptual model is realized using a naive Bayes classifier. The software is written in the Python programming language, and the Wisconsin breast cancer database is used for the experiments. Finally, the experimental results are presented and discussed.
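    The pipeline the abstract describes, a naive Bayes classifier written in Python and evaluated on the Wisconsin breast cancer data, can be reproduced in outline with scikit-learn; the train/test split and classifier settings here are assumptions, not the paper's exact setup.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# Wisconsin diagnostic breast cancer data: benign vs malignant.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Gaussian naive Bayes: features assumed conditionally independent
# and normally distributed within each class.
model = GaussianNB().fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"accuracy: {acc:.3f}")
```

    Naive Bayes needs no hyperparameter tuning, which makes it a common baseline for this dataset before moving to heavier classifiers.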

  20. Applications of artificial intelligence 1993: Knowledge-based systems in aerospace and industry; Proceedings of the Meeting, Orlando, FL, Apr. 13-15, 1993

    Science.gov (United States)

    Fayyad, Usama M. (Editor); Uthurusamy, Ramasamy (Editor)

    1993-01-01

    The present volume on applications of artificial intelligence with regard to knowledge-based systems in aerospace and industry discusses machine learning and clustering, expert systems and optimization techniques, monitoring and diagnosis, and automated design and expert systems. Attention is given to the integration of AI reasoning systems and hardware description languages, case-based reasoning, knowledge retrieval and training systems, and scheduling and planning. Topics addressed include the preprocessing of remotely sensed data for efficient analysis and classification, autonomous agents as air combat simulation adversaries, intelligent data presentation for real-time spacecraft monitoring, and an integrated reasoner for diagnosis in satellite control. Also discussed are a knowledge-based system for the design of heat exchangers, reuse of design information for model-based diagnosis, automatic compilation of expert systems, and a case-based approach to handling aircraft malfunctions.

  1. Machine Learning an algorithmic perspective

    CERN Document Server

    Marsland, Stephen

    2009-01-01

    Traditional books on machine learning can be divided into two groups: those aimed at advanced undergraduates or early postgraduates with reasonable mathematical knowledge, and those that are primers on how to code algorithms. The field is ready for a text that not only demonstrates how to use the algorithms that make up machine learning methods, but also provides the background needed to understand how and why these algorithms work. Machine Learning: An Algorithmic Perspective is that text. Theory backed up by practical examples: the book covers neural networks, graphical models, reinforcement learning

  2. Artificial emotional model based on finite state machine

    Institute of Scientific and Technical Information of China (English)

    MENG Qing-mei; WU Wei-guo

    2008-01-01

    Based on basic emotion theory, an artificial emotion model built on a finite state machine (FSM) is presented. In the finite-state-machine model of emotion, the emotional space includes the basic emotional space and multiple emotional spaces. The emotion-switching diagram was defined, and the transition function was developed using a Markov chain and a linear interpolation algorithm. The simulation model was built using the Stateflow and Simulink toolboxes on the Matlab platform, and it includes three subsystems: input, emotion and behavior. In the emotion subsystem, the responses of different personalities to external stimuli are described by defining a personal space. The model takes states from an emotional space and updates its state depending on its current state and the state of its input (also a state-emotion). The simulation model realizes the process of switching the emotion from the neutral state to other basic emotions, and the simulation result is shown to correspond to the emotion-switching laws of human beings.
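    A minimal sketch of such a Markov-chain emotion switch is given below; the states, stimuli and transition probabilities are invented for illustration and do not reproduce the paper's transition function or personality spaces.

```python
import random

# Hypothetical Markov transitions: (current emotion, stimulus) -> a
# probability distribution over next emotions.
TRANSITIONS = {
    ("neutral", "praise"): {"happy": 0.8, "neutral": 0.2},
    ("neutral", "insult"): {"angry": 0.7, "sad": 0.3},
    ("happy", "insult"): {"neutral": 0.5, "angry": 0.5},
}

def step(state, stimulus, rng=random):
    """Sample the next emotion; unknown (state, stimulus) pairs stay put."""
    dist = TRANSITIONS.get((state, stimulus), {state: 1.0})
    r, cum = rng.random(), 0.0
    for nxt, p in dist.items():
        cum += p
        if r < cum:
            return nxt
    return state

random.seed(1)
state = "neutral"
for stimulus in ["praise", "insult"]:
    state = step(state, stimulus)
    print(stimulus, "->", state)
```

    The paper's version additionally interpolates between emotional spaces per personality; this sketch only shows the stochastic state-switching core.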

  3. Transit Station Congestion Index Research Based on Pedestrian Simulation and Gray Clustering Evaluation

    Directory of Open Access Journals (Sweden)

    Shu-wei Wang

    2013-01-01

    Congestion in a transit station leads to low transfer efficiency as well as hidden dangers, and effective management of congestion helps to reduce both the efficiency loss and the risk. However, because microscopic pedestrian density is difficult to acquire, existing research lacks quantitative indicators that reflect the degree of congestion. This paper aims to solve this problem. First, the platform, stairs, transfer tunnel, automatic fare collection (AFC) machines, and security check machines were chosen as key traffic facilities through a large amount of field investigation; these key facilities can be used to reflect the passenger density of a whole station. Second, the pedestrian-density change law of each key traffic facility was analyzed using pedestrian simulation, and a load-degree calculation method was then defined for each facility. Taking pedestrian density as the basic data and gray clustering evaluation as the algorithm, an index called the Transit Station Congestion Index (TSCI) was constructed to reflect the congestion degree of transit stations. Finally, an evaluation demonstration was carried out on five typical transfer stations in Beijing, and the evaluation results show that the TSCI can objectively reflect the congestion degree of transit stations.

  4. The development of a novel knowledge-based weaning algorithm using pulmonary parameters: a simulation study.

    Science.gov (United States)

    Guler, Hasan; Kilic, Ugur

    2018-03-01

    Weaning is important for patients and for clinicians, who have to determine the correct weaning time so that patients do not become dependent on the ventilator. Several predictors have already been developed, such as the rapid shallow breathing index (RSBI), the pressure time index (PTI), and the Jabour weaning index, but many important dimensions of weaning are sometimes ignored by these predictors. This is an attempt to develop a knowledge-based weaning process via fuzzy logic that eliminates the disadvantages of the present predictors. Sixteen vital parameters listed in the published literature were used to determine the weaning decisions in the developed system. Since too many individual parameters are involved, related parameters were grouped together to determine acid-base balance, adequate oxygenation, adequate pulmonary function, hemodynamic stability, and the psychological status of the patients. To test the performance of the developed algorithm, 20 clinical scenarios were generated using Monte Carlo simulation and the Gaussian distribution method. The developed knowledge-based algorithm and the RSBI predictor were applied to the generated scenarios, and a clinician evaluated each clinical scenario independently. Student's t test was used to show the statistical differences between the developed weaning algorithm, RSBI, and the clinician's evaluation. According to the results obtained, there were no statistical differences between the proposed methods and the clinician's evaluations.
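    Of the predictors named above, the RSBI has a particularly simple form: respiratory rate divided by tidal volume, with values below roughly 105 breaths/min/L conventionally taken to favour weaning readiness. A one-function sketch (the example inputs are invented):

```python
def rsbi(resp_rate_bpm: float, tidal_volume_l: float) -> float:
    """Rapid shallow breathing index: respiratory rate (breaths/min)
    divided by tidal volume (litres). Values below ~105 breaths/min/L
    are conventionally taken to favour weaning readiness."""
    return resp_rate_bpm / tidal_volume_l

print(rsbi(20, 0.4))   # -> 50.0, favours weaning
print(rsbi(35, 0.25))  # -> 140.0, weaning likely to fail
```

    The paper's point is precisely that a single ratio like this ignores oxygenation, hemodynamics and psychological status, which is what the fuzzy knowledge-based system adds.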

  5. Machine health prognostics using the Bayesian-inference-based probabilistic indication and high-order particle filtering framework

    Science.gov (United States)

    Yu, Jianbo

    2015-12-01

    Prognostics is an efficient way to achieve zero-downtime performance, maximum productivity and proactive maintenance of machines. Prognostics aims to assess and predict the time evolution of machine health degradation so that machine failures can be predicted and prevented. A novel prognostics system is developed based on a data-model-fusion scheme using a Bayesian-inference-based self-organizing map (SOM) and an integration of logistic regression (LR) and high-order particle filtering (HOPF). In this prognostics system, a baseline SOM is constructed to model the data-distribution space of a healthy machine under the assumption that predictable fault patterns are not available. A Bayesian-inference-based probability (BIP) derived from the baseline SOM is developed as a quantitative indicator of machine health degradation. BIP offers a failure probability for the monitored machine, with an intuitive interpretation related to the health degradation state. Based on the historic BIPs, the constructed LR and its modeling noise constitute a high-order Markov process (HOMP) describing machine health propagation. HOPF is used to solve the HOMP estimation in order to predict the evolution of machine health in the form of a probability density function (PDF). An online model-update scheme is developed to quickly adapt the Markov process to changes in machine health dynamics. Experimental results on a bearing test bed illustrate the potential of the proposed system as an effective and simple tool for machine health prognostics.

  6. English to Sanskrit Machine Translation Using Transfer Based approach

    Science.gov (United States)

    Pathak, Ganesh R.; Godse, Sachin P.

    2010-11-01

    Translation is one of the needs of a global society, for communicating the thoughts and ideas of one country to another. Translation is the process of interpreting the meaning of a text and subsequently producing an equivalent text that communicates the same message in another language. In this paper we give detailed information on how to convert source-language text into target-language text using a transfer-based approach to machine translation, and we implement an English-to-Sanskrit machine translator using this approach. English is a global language used for business and communication, but a large part of India's population does not use or understand English. Sanskrit is an ancient language of India, and most Indian languages are derived from it, so Sanskrit can act as an intermediate language for multilingual translation.

  7. Pre-use anesthesia machine check; certified anesthesia technician based quality improvement audit.

    Science.gov (United States)

    Al Suhaibani, Mazen; Al Malki, Assaf; Al Dosary, Saad; Al Barmawi, Hanan; Pogoku, Mahdhav

    2014-01-01

    The aim of the following study is to assure the quality of providing work-ready anesthesia machines in the multiple operating theatres of a modern tertiary medical center in Riyadh, and thereby to maintain a high-quality environment for workers and patients in surgical operating rooms. A technician-based audit used key performance indicators to assure the inspection and pass-testing of machine worthiness daily, between cases, and, in case of unexpected failure, quick replacement by another ready-to-use anesthetic machine. The anesthetic machines in all operating rooms are inspected daily and continuously, passed as ready by technicians, and verified by a consultant or assistant consultant anesthesiologist. The daily records of each machine were collected and inspected by the quality improvement committee for descriptive analysis, reporting the degree of staff compliance with daily inspection as "met" items, machines replaced during use, and overall compliance. Descriptive statistics were prepared using Microsoft Excel 2003 tables and graphs of sums and percentages of the items studied in this audit. The audit found a high compliance percentage and a low rate of machine replacement, indicating that unexpected machine failures were handled by a quick machine switch. The authors conclude that regular inspection and the self-check runs recommended by the manufacturers can help avert the hazard of anesthesia machine failure during an operation, and that quick replacement of an anesthesia machine, when unexpectedly needed, contributes to highly assured operative utilization of the man-machine interface in modern surgical operating rooms.

  8. Association Rule-based Predictive Model for Machine Failure in Industrial Internet of Things

    Science.gov (United States)

    Kwon, Jung-Hyok; Lee, Sol-Bee; Park, Jaehoon; Kim, Eui-Jik

    2017-09-01

    This paper proposes an association-rule-based predictive model for machine failure in the industrial Internet of things (IIoT), which can accurately predict machine failure in a real manufacturing environment by investigating the relationship between the cause and type of machine failure. To develop the predictive model, we consider three major steps: 1) binarization, 2) rule creation, and 3) visualization. The binarization step translates item values in a dataset into ones and zeros; the rule-creation step then creates association rules as IF-THEN structures using the lattice model and the Apriori algorithm. Finally, the created rules are visualized in various ways for users' understanding. An experimental implementation was conducted using R Studio version 3.3.2. The results show that the proposed predictive model realistically predicts machine failure based on association rules.
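    The rule-creation step can be illustrated with a tiny Apriori-style pass over binarized records; the dataset, thresholds and item names below are invented for illustration, not the paper's data.

```python
from itertools import combinations

# Binarized machine-event records (presence of a condition), mimicking
# the paper's binarization step; the records themselves are made up.
records = [
    {"overheat", "vibration", "failure"},
    {"overheat", "failure"},
    {"vibration"},
    {"overheat", "vibration", "failure"},
    {"overheat"},
]

def support(itemset):
    """Fraction of records containing every item in the itemset."""
    return sum(itemset <= r for r in records) / len(records)

# Apriori-style pass: keep frequent antecedents, then emit IF-THEN
# rules predicting "failure" whose confidence clears a threshold.
items = {i for r in records for i in r}
min_support, min_conf = 0.4, 0.8
rules = []
for k in (1, 2):
    for antecedent in combinations(sorted(items - {"failure"}), k):
        a = set(antecedent)
        if support(a) >= min_support:
            conf = support(a | {"failure"}) / support(a)
            if conf >= min_conf:
                rules.append((antecedent, conf))

for antecedent, conf in rules:
    print(f"IF {' AND '.join(antecedent)} THEN failure (conf={conf:.2f})")
```

    On these records only the pair (overheat, vibration) survives both thresholds, so a single rule is printed; a full Apriori implementation would also prune candidate itemsets level by level rather than enumerating fixed sizes.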

  9. Non-stationary signal analysis based on general parameterized time-frequency transform and its application in the feature extraction of a rotary machine

    Science.gov (United States)

    Zhou, Peng; Peng, Zhike; Chen, Shiqian; Yang, Yang; Zhang, Wenming

    2018-06-01

    With the development of large rotary machines for faster and more integrated performance, their condition monitoring and fault diagnosis are becoming more challenging. Since the time-frequency (TF) pattern of the vibration signal from a rotary machine often contains condition information and fault features, methods based on TF analysis have been widely used to solve these two problems in the industrial community. This article introduces an effective non-stationary signal analysis method based on the general parameterized time-frequency transform (GPTFT). The GPTFT is achieved by inserting a rotation operator and a shift operator into the short-time Fourier transform, and it can produce a highly concentrated TF pattern with a general kernel. A multi-component instantaneous frequency (IF) extraction method is proposed based on it: the IF of every component is estimated by defining a spectrum concentration index (SCI), and this estimation process is iterated until all components are extracted. Tests on three simulation examples and a real vibration signal demonstrate the effectiveness and superiority of our method.

  10. Knowledge Based Economy Assessment

    OpenAIRE

    Madalina Cristina Tocan

    2012-01-01

    The importance of the knowledge-based economy (KBE) in the XXI century is evident. This article analyzes the reflection of knowledge in the economy. The main focus is the analysis of the characteristics of knowledge expression in the economy and the construction of a structure for KBE expression, which allows the mechanism of a functioning knowledge economy to be understood. The authors highlight the possibility of assessing the penetration level of KBE, which could manifest itself through the exist...

  11. A Bayesian least squares support vector machines based framework for fault diagnosis and failure prognosis

    Science.gov (United States)

    Khawaja, Taimoor Saleem

    A high-belief, low-overhead Prognostics and Health Management (PHM) system is desired for online real-time monitoring of complex non-linear systems operating in complex (possibly non-Gaussian) noise environments. This thesis presents a Bayesian least squares support vector machine (LS-SVM) based framework for fault diagnosis and failure prognosis in nonlinear non-Gaussian systems. The methodology assumes the availability of real-time process measurements, the definition of a set of fault indicators, and the existence of empirical knowledge (or historical data) characterizing both nominal and abnormal operating conditions. An efficient yet powerful LS-SVM algorithm, set within a Bayesian inference framework, not only allows the development of real-time algorithms for diagnosis and prognosis but also provides a solid theoretical framework for addressing key concepts related to classification for diagnosis and regression modeling for prognosis. SVMs are founded on the principle of structural risk minimization (SRM), which tends to find a good trade-off between low empirical risk and small capacity. The key features of SVMs are the use of non-linear kernels, the absence of local minima, the sparseness of the solution, and the capacity control obtained by optimizing the margin. The Bayesian inference framework linked with LS-SVMs allows a probabilistic interpretation of the results for diagnosis and prognosis, and additional levels of inference provide the much-coveted features of adaptability and tunability of the modeling parameters. The two main modules considered in this research are fault diagnosis and failure prognosis. With the goal of designing an efficient and reliable fault diagnosis scheme, a novel anomaly detector is suggested based on LS-SVMs. The proposed scheme uses only baseline data to construct a 1-class LS-SVM which, when presented with online data, is able to distinguish between normal behavior
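    The 1-class scheme described above, training on baseline data only and flagging departures from it, can be sketched with scikit-learn's one-class SVM as a stand-in for the thesis's LS-SVM formulation; the data are synthetic.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Baseline (nominal) data only, as in the described 1-class scheme;
# the two features are synthetic stand-ins for process measurements.
baseline = rng.normal(loc=0.0, scale=1.0, size=(200, 2))

# nu bounds the fraction of baseline points treated as outliers.
detector = OneClassSVM(kernel="rbf", nu=0.05).fit(baseline)

online = np.array([[0.1, -0.2],   # near the baseline cloud
                   [6.0, 6.0]])   # far outside it
print(detector.predict(online))   # +1 = normal, -1 = anomaly
```

    The Bayesian LS-SVM of the thesis would additionally attach a posterior probability to each decision rather than a hard ±1 label.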

  12. Knowledge-light adaptation approaches in case-based reasoning for radiotherapy treatment planning.

    Science.gov (United States)

    Petrovic, Sanja; Khussainova, Gulmira; Jagannathan, Rupa

    2016-03-01

    Radiotherapy treatment planning aims at delivering a sufficient radiation dose to cancerous tumour cells while sparing healthy organs in the tumour-surrounding area. It is a time-consuming trial-and-error process that requires the expertise of a group of medical experts, including oncologists and medical physicists, and can take from 2-3 hours to a few days. Our objective is to improve the performance of our previously built case-based reasoning (CBR) system for brain tumour radiotherapy treatment planning. In this system, a treatment plan for a new patient is retrieved from a case base containing patient cases treated in the past and their treatment plans. However, this system does not perform any adaptation, which is needed to account for differences between the new and retrieved cases. Generally, the adaptation phase is considered to be intrinsically knowledge-intensive and domain-dependent; an adaptation therefore often requires a large amount of domain-specific knowledge, which can be difficult to acquire and often is not readily available. In this study, we investigate approaches to adaptation that do not require much domain knowledge, referred to as knowledge-light adaptation. We developed two adaptation approaches: adaptation based on machine-learning tools, and adaptation-guided retrieval. They were used to adapt the beam number and beam angles suggested in the retrieved case. Two machine-learning tools, neural networks and the naive Bayes classifier, were used in the adaptation to learn how the difference in attribute values between the retrieved and new cases affects the output of these two cases. Adaptation-guided retrieval takes into consideration not only the similarity between the new and retrieved cases, but also how to adapt the retrieved case. The research was carried out in collaboration with medical physicists at the Nottingham University Hospitals NHS Trust, City Hospital Campus, UK. 
All experiments were performed using real-world brain cancer

  13. Knowledge Base Editor (SharpKBE)

    Science.gov (United States)

    Tikidjian, Raffi; James, Mark; Mackey, Ryan

    2007-01-01

    The SharpKBE software provides a graphical user interface environment for domain experts to build and manage knowledge base systems. Knowledge bases can be exported/translated to various target languages automatically, including customizable target languages.

  14. Understanding images using knowledge based approach

    International Nuclear Information System (INIS)

    Tascini, G.

    1985-01-01

    This paper presents an approach to image understanding focusing on low-level image processing and proposes a rule-based approach as part of a larger knowledge-based system. The overall system has a hierarchical structure that comprises several knowledge-based layers. The main idea is to confine the domain-independent knowledge to the lower levels and to reserve the higher levels for the domain-dependent knowledge, that is, for interpretation

  15. Studies on learning by detecting impasse and by resulting it for building large scale knowledge base for autonomous plant

    International Nuclear Information System (INIS)

    Sawaragi, Tetsuo

    1997-03-01

    Acquiring knowledge from human experts exhaustively is extremely difficult, and even if it were possible, maintaining such a large knowledge base for real-time operation is not an easy task. An autonomous system with incomplete knowledge faces many problems that contradict the system's current beliefs or are novel or unknown to it. Experienced humans can cope with such novelty thanks to their ability to generalize and to draw analogical inferences from a repertoire of precedents, even when confronted with new problems. Moreover, by experiencing such breakdowns and impasses, they can acquire novel knowledge through proactive attempts to interpret the problem at hand and by updating their beliefs and the contents and organization of their prior knowledge. We call this style of learning impasse-driven learning, meaning that learning occurs when motivated by contradiction and impasse. Related studies on this style of learning have been conducted in the machine learning field of artificial intelligence as well as in cognitive science. In this paper, we first summarize an outline of machine learning methodologies and then detail impasse-driven learning. We discuss it from two different learning perspectives: deductive and analogical learning on the one hand, and inductive conceptual learning (i.e., concept formation or generalization-based memory) on the other. The former mainly discusses how the learning system updates its prior beliefs and knowledge so that it can explain away the current contradiction using meta-cognitive heuristics. The latter attempts to assimilate a contradicting problem into its prior memory structure by dynamically reorganizing a collection of precedents. 
We present those methodologies, and finally we introduce a case study of concept formation for plant anomalies and its usage for

  16. PROPOSAL FOR AN ERGONOMIC CONFORMITY INDEX FOR EVALUATION OF HARVESTERS AND FORWARDERS

    Directory of Open Access Journals (Sweden)

    Felipe Leitão da Cunha Marzano

    2017-11-01

    Context: In mechanized forestry work, the ergonomic conditions of the workplace affect the operator's health, performance and productivity. Originality: Comparing different forest machines becomes complex when several ergonomic factors must be analyzed simultaneously. Several methods of ergonomic analysis exist; however, a more complete methodology is needed that considers several ergonomic factors and produces an index representing the ergonomic condition of the machine. Objective: To propose a methodology for determining an Ergonomic Conformity Index (ECI) to evaluate harvesters and forwarders of different brands. Methodology: The ECI was initially based on four relevant ergonomic factors: noise, vibration, thermal environment and air quality. These factors were evaluated on four harvesters and two forwarders in eucalyptus timber-harvesting operations. For each factor, a score was given according to its compliance with the established parameters, and the ECI was obtained as the average of the scores given to each factor. The index ranges from zero to one, so that lower values indicate worse ergonomic conditions. Results: All the analyzed machines had continuous noise between 75.0 and 82.6 dB(A) and whole-body vibration between 0.27 and 0.70 m s-2. HV1 and HV2 presented a thermal environment in accordance with the established criteria; the other machines showed deficiencies in this regard. All the machines presented non-conformities in air quality, except HV2. The ECIs of harvesters HV1, HV2, HV3 and HV4 were 0.83, 0.88, 0.71 and 0.63, respectively, and the ECIs of forwarders FW1 and FW2 were 0.58 and 0.79. Conclusion: The ECI allowed an evaluation of and comparison between the analyzed forest machines. The machine with the highest ECI had only one non-conformity, related to noise inside the cab, while the machine with the lowest ECI presented non-conformities for all the factors.
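    The index construction reduces to a simple average of per-factor compliance scores in [0, 1]; a sketch follows, where the example scores are hypothetical and do not reproduce the paper's scoring rubric.

```python
def eci(scores):
    """Ergonomic Conformity Index as described: the average of the
    per-factor compliance scores (each in [0, 1]); lower values
    indicate worse ergonomic conditions."""
    return sum(scores.values()) / len(scores)

# Hypothetical per-factor scores for one machine (not a machine from
# the paper): partial conformity on noise, full on the other factors.
machine = {"noise": 0.5, "vibration": 1.0, "thermal": 1.0, "air_quality": 1.0}
print(round(eci(machine), 2))  # -> 0.88
```

    An equally weighted average is the simplest choice; a refinement would weight factors by their health impact, which the abstract does not specify.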

  17. The role of soft computing in intelligent machines.

    Science.gov (United States)

    de Silva, Clarence W

    2003-08-15

    An intelligent machine relies on computational intelligence in generating its intelligent behaviour. This requires a knowledge system in which the representation and processing of knowledge are central functions. Approximation is a 'soft' concept, and the capability to approximate for the purposes of comparison, pattern recognition, reasoning, and decision making is a manifestation of intelligence. This paper examines the use of soft computing in intelligent machines. Soft computing is an important branch of computational intelligence, in which fuzzy logic, probability theory, neural networks, and genetic algorithms are synergistically used to mimic the reasoning and decision making of a human. The paper explores several important characteristics and capabilities of machines that exhibit intelligent behaviour, and presents a general structure for an intelligent machine, giving particular emphasis to its primary components, such as sensors, actuators, controllers, and the communication backbone, and their interaction. The role of soft computing within the overall system is discussed. Common techniques and approaches that are useful in the development of an intelligent machine are introduced, and the main steps in the development of an intelligent machine for practical use are given. An industrial machine that employs the concepts of soft computing in its operation is presented, and one aspect of intelligent tuning, which is incorporated into the machine, is illustrated.

  18. Machine learning-based dual-energy CT parametric mapping.

    Science.gov (United States)

    Su, Kuan-Hao; Kuo, Jung-Wen; Jordan, David W; Van Hedent, Steven; Klahr, Paul; Wei, Zhouping; Al Helo, Rose; Liang, Fan; Qian, Pengjiang; Pereira, Gisele C; Rassouli, Negin; Gilkeson, Robert C; Traughber, Bryan J; Cheng, Chee-Wai; Muzic, Raymond F

    2018-05-22

    The aim is to develop and evaluate machine learning methods for generating quantitative parametric maps of effective atomic number (Zeff), relative electron density (ρe), mean excitation energy (Ix), and relative stopping power (RSP) from clinical dual-energy CT data. The maps could be used for material identification and radiation dose calculation. The machine learning methods of historical centroid (HC), random forest (RF), and artificial neural networks (ANN) were used to learn the relationship between dual-energy CT input data and ideal output parametric maps calculated for phantoms from the known compositions of 13 tissue substitutes. After training and model-selection steps, the machine learning predictors were used to generate parametric maps from independent phantom and patient input data. Precision and accuracy were evaluated using the ideal maps. This process was repeated for a range of exposure doses, and performance was compared to that of the clinically used dual-energy, physics-based method, which served as the reference. The machine learning methods generated more accurate and precise parametric maps than those obtained using the reference method. Their performance advantage was particularly evident when using data from the lowest exposure, one-fifth of a typical clinical abdomen CT acquisition. The RF method achieved the greatest accuracy; in comparison, the ANN method was only 1% less accurate but had much better computational efficiency, being able to produce parametric maps in 15 seconds. Machine learning methods outperformed the reference method in terms of accuracy and noise tolerance when generating parametric maps, encouraging further exploration of the techniques. Among the methods evaluated, ANN is the most suitable for clinical use due to its combination of accuracy, excellent low-noise performance, and computational efficiency.
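    The regression setup, learning a mapping from paired low/high-energy CT numbers to a parametric value, can be sketched with a random forest on synthetic data; the feature ranges and the linear "ground truth" below are invented stand-ins, not the phantom calibration used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Synthetic stand-in for dual-energy CT data: low- and high-kVp CT
# numbers with measurement noise, mapped to a made-up parameter
# (imagine relative electron density); the real training used the
# known compositions of 13 tissue substitutes.
hu_low = rng.uniform(-100, 200, 500)
hu_high = rng.uniform(-100, 200, 500)
target = 1.0 + 0.002 * hu_low + 0.001 * hu_high  # hypothetical ground truth
X = np.column_stack([hu_low, hu_high]) + rng.normal(0, 2, (500, 2))

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, target)
print(round(float(model.predict([[50.0, 40.0]])[0]), 2))
```

    The noise tolerance the paper reports comes from the ensemble averaging; the same sketch with an MLP regressor would correspond to the ANN variant.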

  19. International Workshop on Advanced Dynamics and Model Based Control of Structures and Machines

    CERN Document Server

    Belyaev, Alexander; Krommer, Michael

    2017-01-01

    The papers in this volume present and discuss the frontiers in the mechanics of controlled machines and structures. They are based on papers presented at the International Workshop on Advanced Dynamics and Model Based Control of Structures and Machines held in Vienna in September 2015. The workshop continues a series of international workshops held in Linz (2008) and St. Petersburg (2010).

  20. Analysis of Russia's biofuel knowledge base: A comparison with Germany and China

    International Nuclear Information System (INIS)

    Kang, Jin-Su; Kholod, Tetyana; Downing, Stephen

    2015-01-01

    This study assesses the evolutionary trajectory of the knowledge base of Russian biofuel technology compared to that of Germany, one of the successful leaders in adopting renewable energy, and China, an aggressive latecomer in promoting renewable energy. A total of 1797 patents filed in Russia, 8282 in Germany and 20,549 in China were retrieved from the European Patent Office database through 2012. We identify four collectively representative measures of a knowledge base (size, growth, cumulativeness, and interdependence), which are observable from biofuel patent citations. Furthermore, we define the exploratory–exploitative index, which enables us to identify the nature of learning embedded in the knowledge base structure. Our citation network analysis of the biofuel knowledge base trajectory by country, in conjunction with policy milestones, shows that Russia's biofuel knowledge base lacks both the increasing technological specialization of that in Germany and the accelerated growth rate of that in China. The German biofuel citation network shows a well-established knowledge base with increasing connectivity, while China's has grown exceptionally fast but with a sparseness of citations reflecting limited connections to preceding, foundational technologies. We conclude by addressing policy implications as well as limitations of the study and potential topics to explore in future research. -- Highlights: •Biofuel knowledge base (KB) of Russia is compared to those of Germany and China. •Citation network analysis measures KB size, growth, cumulativeness, and interdependence. •Russia's KB lacks the increasing technological specialization of the German KB. •Russia's KB lacks the accelerated growth rate of the Chinese KB. •Russia's KB evolution reflects the poor institutional framework.

  1. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature, a chirp in the local spatial frequency of interference fringes of an interference pattern is reduced by mathematical manipulation of the recorded light intensity...

  2. Trip Travel Time Forecasting Based on Selective Forgetting Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Zhiming Gui

    2014-01-01

    Travel time estimation on road networks is a valuable traffic metric. In this paper, we propose a machine learning based method for trip travel time estimation in road networks. The method uses historical trip information extracted from taxi trace data as the training data. An optimized online sequential extreme learning machine, the selective forgetting extreme learning machine, is adopted to make the prediction. Its selective forgetting learning ability enables the prediction algorithm to adapt well to changes in trip conditions. Experimental results using real-life taxi trace data show that the forecasting model provides an effective and practical way for travel time forecasting.
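
    The batch form of the extreme learning machine at the core of this approach can be sketched in a few dozen lines: a random tanh hidden layer whose output weights are obtained by ridge-regularized least squares. The selective-forgetting, online-sequential machinery of the paper is omitted, and the toy target function is invented for the demo.

```python
import math
import random

random.seed(0)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def train_elm(xs, ys, hidden=25, ridge=1e-6):
    """Batch ELM: random tanh hidden layer; output weights found by
    ridge-regularized least squares via the normal equations."""
    params = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(hidden)]
    H = [[math.tanh(w * x + b) for w, b in params] + [1.0] for x in xs]
    n = hidden + 1
    A = [[sum(row[i] * row[j] for row in H) + (ridge if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    rhs = [sum(row[i] * y for row, y in zip(H, ys)) for i in range(n)]
    return params, solve(A, rhs)

def predict(model, x):
    params, beta = model
    h = [math.tanh(w * x + b) for w, b in params] + [1.0]
    return sum(hi * bi for hi, bi in zip(h, beta))

# Fit a toy "travel time vs. departure time" curve and check the in-sample fit.
xs = [i / 10 for i in range(40)]
ys = [math.sin(x) + 0.5 * x for x in xs]
model = train_elm(xs, ys)
err = max(abs(predict(model, x) - y) for x, y in zip(xs, ys))
```

    The random hidden weights are never trained, which is what makes ELM fitting a single linear solve.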

  3. Machine Translation for Academic Purposes

    Science.gov (United States)

    Lin, Grace Hui-chin; Chien, Paul Shih Chieh

    2009-01-01

    With the trend toward globalization and the explosion of knowledge in the new millennium, multilingual translation has become a noteworthy issue. For the purpose of acquiring knowledge in academic fields, Machine Translation (MT) deserves attention not only academically but also practically. MT should be introduced to translation learners because it is a…

  4. Integrated human-machine intelligence in space systems

    Science.gov (United States)

    Boy, Guy A.

    1992-01-01

    The integration of human and machine intelligence in space systems is outlined with respect to the contributions of artificial intelligence. The current state-of-the-art in intelligent assistant systems (IASs) is reviewed, and the requirements of some real-world applications of the technologies are discussed. A concept of integrated human-machine intelligence is examined in the contexts of: (1) interactive systems that tolerate human errors; (2) systems for the relief of workloads; and (3) interactive systems for solving problems in abnormal situations. Key issues in the development of IASs include the compatibility of the systems with astronauts in terms of inputs/outputs, processing, real-time AI, and knowledge-based system validation. Real-world applications are suggested such as the diagnosis, planning, and control of engineered systems.

  5. Molecular active plasmonics: controlling plasmon resonances with molecular machines

    KAUST Repository

    Zheng, Yue Bing

    2009-08-26

    The paper studies the molecular-level active control of localized surface plasmon resonances (LSPRs) of Au nanodisk arrays with molecular machines. Two types of molecular machines - azobenzene and rotaxane - have been demonstrated to enable the reversible tuning of the LSPRs via the controlled mechanical movements. Azobenzene molecules have the property of trans-cis photoisomerization and enable the photo-induced nematic (N)-isotropic (I) phase transition of the liquid crystals (LCs) that contain the molecules as dopant. The phase transition of the azobenzene-doped LCs causes the refractive-index difference of the LCs, resulting in the reversible peak shift of the LSPRs of the embedded Au nanodisks due to the sensitivity of the LSPRs to the disks' surroundings' refractive index. Au nanodisk array, coated with rotaxanes, switches its LSPRs reversibly when it is exposed to chemical oxidants and reductants alternatively. The correlation between the peak shift of the LSPRs and the chemically driven mechanical movement of rotaxanes is supported by control experiments and a time-dependent density functional theory (TDDFT)-based, microscopic model.

  6. Molecular active plasmonics: controlling plasmon resonances with molecular machines

    KAUST Repository

    Zheng, Yue Bing; Yang, Ying-Wei; Jensen, Lasse; Fang, Lei; Juluri, Bala Krishna; Flood, Amar H.; Weiss, Paul S.; Stoddart, J. Fraser; Huang, Tony Jun

    2009-01-01

    The paper studies the molecular-level active control of localized surface plasmon resonances (LSPRs) of Au nanodisk arrays with molecular machines. Two types of molecular machines - azobenzene and rotaxane - have been demonstrated to enable the reversible tuning of the LSPRs via the controlled mechanical movements. Azobenzene molecules have the property of trans-cis photoisomerization and enable the photo-induced nematic (N)-isotropic (I) phase transition of the liquid crystals (LCs) that contain the molecules as dopant. The phase transition of the azobenzene-doped LCs causes the refractive-index difference of the LCs, resulting in the reversible peak shift of the LSPRs of the embedded Au nanodisks due to the sensitivity of the LSPRs to the disks' surroundings' refractive index. Au nanodisk array, coated with rotaxanes, switches its LSPRs reversibly when it is exposed to chemical oxidants and reductants alternatively. The correlation between the peak shift of the LSPRs and the chemically driven mechanical movement of rotaxanes is supported by control experiments and a time-dependent density functional theory (TDDFT)-based, microscopic model.

  7. Reusing Design Knowledge Based on Design Cases and Knowledge Map

    Science.gov (United States)

    Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi

    2013-01-01

    Design knowledge was reused in innovative design work to support designers with product design knowledge and to help designers who lack rich experience improve their design capability and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…

  8. Automated knowledge-base refinement

    Science.gov (United States)

    Mooney, Raymond J.

    1994-01-01

    Over the last several years, we have developed several systems for automatically refining incomplete and incorrect knowledge bases. These systems are given an imperfect rule base and a set of training examples and minimally modify the knowledge base to make it consistent with the examples. One of our most recent systems, FORTE, revises first-order Horn-clause knowledge bases. This system can be viewed as automatically debugging Prolog programs based on examples of correct and incorrect I/O pairs. In fact, we have already used the system to debug simple Prolog programs written by students in a programming language course. FORTE has also been used to automatically induce and revise qualitative models of several continuous dynamic devices from qualitative behavior traces. For example, it has been used to induce and revise a qualitative model of a portion of the Reaction Control System (RCS) of the NASA Space Shuttle. By fitting a correct model of this portion of the RCS to simulated qualitative data from a faulty system, FORTE was also able to correctly diagnose simple faults in this system.
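
    The flavor of such refinement can be shown with a toy propositional sketch (FORTE itself revises first-order Horn clauses): an overly general rule that covers a negative example is specialized by adding a condition shared by all positive examples but absent from that negative. The attribute data below is invented for illustration.

```python
def covers(rule, example):
    """A rule covers an example if every (attribute, value) condition matches."""
    return all(example.get(a) == v for a, v in rule.items())

def specialize(rule, positives, negative):
    """Minimal refinement step: add one condition that all positives
    share but the covered negative example does not."""
    for attr, val in positives[0].items():
        if (all(p.get(attr) == val for p in positives)
                and negative.get(attr) != val):
            refined = dict(rule)
            refined[attr] = val
            return refined
    return rule  # no discriminating condition found

positives = [{"shape": "round", "color": "red"},
             {"shape": "round", "color": "green"}]
negatives = [{"shape": "square", "color": "red"}]

rule = {}  # overly general: covers everything, including the negative
if any(covers(rule, n) for n in negatives):
    rule = specialize(rule, positives, negatives[0])
```

    FORTE additionally searches over generalization operators (dropping conditions, adding clauses) and keeps the revision that minimally changes the knowledge base.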

  9. Evidence of end-effector based gait machines in gait rehabilitation after CNS lesion.

    Science.gov (United States)

    Hesse, S; Schattat, N; Mehrholz, J; Werner, C

    2013-01-01

    A task-specific repetitive approach in gait rehabilitation after CNS lesion is well accepted nowadays. To ease the therapists' and patients' physical effort, the past two decades have seen the introduction of gait machines to intensify the amount of gait practice. Two principles have emerged, an exoskeleton-based and an end-effector-based approach. Both systems share the harness and the body weight support. With the end-effector-based devices, the patients' feet are positioned on two foot plates, whose movements simulate stance and swing phase. This article provides an overview of the end-effector-based machines' effectiveness regarding the restoration of gait. For the electromechanical gait trainer GT I, a meta-analysis identified nine randomized controlled trials (RCTs) in stroke subjects (n = 568), which were analyzed to detect differences between end-effector-based locomotion + physiotherapy and physiotherapy alone. Patients practising with the machine achieved superior gait ability (210 out of 319 patients, 65.8%, vs. 96 out of 249 patients, 38.6%; Z = 2.29, p = 0.020), owing to the larger training intensity. Only single RCTs have been reported for other devices and etiologies. The introduction of end-effector-based gait machines has opened a new successful chapter in gait rehabilitation after CNS lesion.

  10. Repurposing mainstream CNC machine tools for laser-based additive manufacturing

    Science.gov (United States)

    Jones, Jason B.

    2016-04-01

    The advent of laser technology has been a key enabler for industrial 3D printing, known as Additive Manufacturing (AM). Despite its commercial success and unique technical capabilities, laser-based AM systems are not yet able to produce parts with the same accuracy and surface finish as CNC machining. To enable the geometry and material freedoms afforded by AM, yet achieve the precision and productivity of CNC machining, hybrid combinations of these two processes have started to gain traction. To achieve the benefits of combined processing, laser technology has been integrated into mainstream CNC machines - effectively repurposing them as hybrid manufacturing platforms. This paper reviews how this engineering challenge has prompted beam delivery innovations to allow automated changeover between laser processing and machining, using standard CNC tool changers. Handling laser-processing heads using the tool changer also enables automated change over between different types of laser processing heads, further expanding the breadth of laser processing flexibility in a hybrid CNC. This paper highlights the development, challenges and future impact of hybrid CNCs on laser processing.

  11. Earth-moving equipment as base machines in forest work. Final report of an NSR project

    Energy Technology Data Exchange (ETDEWEB)

    Johansson, Jerry [ed.

    1997-12-31

    Excavators have been used for forest draining for a long time in the Nordic countries. Only during the 1980s were they introduced as base machines for other forest operations, such as mounding, processing, harvesting, and road construction and maintenance. Backhoe loaders were introduced in forestry at a somewhat later stage and to a smaller degree. The number of such base machines in forestry is so far small and is increasing very slowly. The NSR project `Earth moving equipment as base machines in forest work` started in 1993 and ended in 1995. The objective of the project was to obtain an overall picture of this type of machine up to the point where the logs are at the landing site, ready for transportation to the industry. The project was to cover as many aspects as possible. In order to obtain this picture, the main project was divided into sub-projects. The sub-projects separately described in this volume are (1) Excavators in ditching operations and site preparation, (2) Backhoe loaders in harvesting operations, (3) Excavators in wood cutting operations, (4) Tracked excavators in forestry operations, (5) Crawler versus wheeled base machines for single-grip harvesters, and (6) Soil changes - A comparison between a wheeled and a tracked forest machine.

  12. Predictive Power of Machine Learning for Optimizing Solar Water Heater Performance: The Potential Application of High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Hao Li

    2017-01-01

    Predicting the performance of a solar water heater (SWH) is challenging due to the complexity of the system. Fortunately, knowledge-based machine learning can provide a fast and precise prediction method for SWH performance. With the predictive power of machine learning models, we can further address a more challenging question: how to cost-effectively design a high-performance SWH? Here, we summarize our recent studies and propose a general framework for SWH design using a machine learning-based high-throughput screening (HTS) method. Design of a water-in-glass evacuated tube solar water heater (WGET-SWH) is selected as a case study to show the potential application of machine learning-based HTS to the design and optimization of solar energy systems.
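
    The HTS loop can be sketched as: fit a cheap surrogate to measured designs, then score many random candidate designs with it and keep the best. Everything below — the efficiency function, the design variables, and the k-nearest-neighbor surrogate — is a hypothetical stand-in for the paper's trained models.

```python
import random

random.seed(1)

def true_efficiency(tilt, tubes):
    """Hypothetical ground-truth SWH heat-collection score (demo only):
    best near a 45-degree tilt, improving with tube count."""
    return 1.0 - 0.0004 * (tilt - 45.0) ** 2 + 0.01 * tubes

def knn_predict(train, tilt, tubes, k=3):
    """Surrogate model: average score of the k nearest measured designs."""
    ranked = sorted(train, key=lambda rec: (rec[0] - tilt) ** 2 + (rec[1] - tubes) ** 2)
    return sum(y for _, _, y in ranked[:k]) / k

# "Measured" designs used to train the surrogate.
train = [(t, n, true_efficiency(t, n))
         for t in range(0, 91, 5) for n in (20, 30, 40)]

# High-throughput screening: score many random candidates with the cheap
# surrogate and keep the best-predicted design.
candidates = [(random.uniform(0, 90), random.choice((20, 30, 40)))
              for _ in range(500)]
best = max(candidates, key=lambda c: knn_predict(train, *c))
```

    In practice the surrogate would be the trained ANN or SVM model, and the candidate pool would enumerate realistic WGET-SWH configurations.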

  13. XML-Based SHINE Knowledge Base Interchange Language

    Science.gov (United States)

    James, Mark; Mackey, Ryan; Tikidjian, Raffi

    2008-01-01

    The SHINE Knowledge Base Interchange Language software has been designed to more efficiently send new knowledge bases to spacecraft that have been embedded with the Spacecraft Health Inference Engine (SHINE) tool. The intention of the behavioral model is to capture most of the information generally associated with a spacecraft functional model, while specifically addressing the needs of execution within SHINE and Livingstone. As such, it has some constructs that are based on one or the other.
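
    An XML-based knowledge-base interchange file can be consumed with a standard parser. The element names below are invented for illustration; the actual SHINE interchange schema is not reproduced here.

```python
import xml.etree.ElementTree as ET

# Hypothetical interchange snippet; tag and attribute names are assumptions,
# not the real SHINE format.
DOC = """
<knowledgebase name="propulsion-monitor">
  <rule id="r1">
    <if>fuel_pressure &lt; 200</if>
    <then>flag low_pressure</then>
  </rule>
  <rule id="r2">
    <if>temp &gt; 900</if>
    <then>flag overheat</then>
  </rule>
</knowledgebase>
"""

root = ET.fromstring(DOC)
# Flatten each <rule> element into a plain dict for the inference engine.
rules = [{"id": r.get("id"),
          "if": r.findtext("if"),
          "then": r.findtext("then")} for r in root.findall("rule")]
```

    Entity references such as `&lt;` are decoded by the parser, so rule conditions round-trip cleanly through the interchange file.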

  14. Current trends on knowledge-based systems

    CERN Document Server

    Valencia-García, Rafael

    2017-01-01

    This book presents innovative and high-quality research on the implementation of conceptual frameworks, strategies, techniques, methodologies, informatics platforms and models for developing advanced knowledge-based systems and their application in different fields, including Agriculture, Education, Automotive, Electrical Industry, Business Services, Food Manufacturing, Energy Services, Medicine and others. Knowledge-based technologies employ artificial intelligence methods to heuristically address problems that cannot be solved by means of formal techniques. These technologies draw on standard and novel approaches from various disciplines within Computer Science, including Knowledge Engineering, Natural Language Processing, Decision Support Systems, Artificial Intelligence, Databases, Software Engineering, etc. As a combination of different fields of Artificial Intelligence, the area of Knowledge-Based Systems applies knowledge representation, case-based reasoning, neural networks, Semantic Web and TICs used...

  15. Case-based reasoning: The marriage of knowledge base and data base

    Science.gov (United States)

    Pulaski, Kirt; Casadaban, Cyprian

    1988-01-01

    The coupling of data and knowledge has a synergistic effect when building an intelligent data base. The goal is to integrate the data and knowledge almost to the point of indistinguishability, permitting them to be used interchangeably. Examples given in this paper suggest that Case-Based Reasoning is a more integrated way to link data and knowledge than pure rule-based reasoning.
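
    The retrieve-and-reuse core of case-based reasoning is compact: find the stored case whose problem description best matches the query, then reuse its solution. The diagnostic case base below is made up for illustration.

```python
def retrieve(case_base, query, sim):
    """Retrieve the stored case most similar to the query problem."""
    return max(case_base, key=lambda case: sim(case["problem"], query))

def attribute_overlap(a, b):
    """Similarity = number of matching attribute values."""
    return sum(1 for k in a if a[k] == b.get(k))

case_base = [
    {"problem": {"fault": "no_start", "battery": "low"}, "solution": "charge battery"},
    {"problem": {"fault": "no_start", "battery": "ok"},  "solution": "check starter"},
    {"problem": {"fault": "overheat", "coolant": "low"}, "solution": "refill coolant"},
]

query = {"fault": "no_start", "battery": "low"}
best = retrieve(case_base, query, attribute_overlap)
```

    The cases play the role of the data base and the similarity measure the role of the knowledge base, which is the integration the paper argues for.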

  16. All-to-All Communication on the Connection Machine CM-200

    Directory of Open Access Journals (Sweden)

    Kapil K. Mathur

    1995-01-01

    Detailed algorithms for all-to-all broadcast and reduction are given for arrays mapped by binary or binary-reflected Gray code encoding to the processing nodes of binary cube networks. Algorithms are also given for the local computation of the array indices for the communicated data, thereby reducing the demand for the communications bandwidth. For the Connection Machine system CM-200, Hamiltonian cycle-based all-to-all communication algorithms yield a performance that is a factor of 2 to 10 higher than the performance offered by algorithms based on trees, butterfly networks, or the Connection Machine router. The peak data rate achieved for all-to-all broadcast on a 2,048-node Connection Machine system CM-200 is 5.4 Gbyte/s. The index order of the data in local memory depends on implementation details of the algorithms, but it is well defined. If a linear ordering is desired, then including the time for local data reordering reduces the effective peak data rate to 2.5 Gbyte/s.
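
    The recursive-doubling pattern underlying all-to-all broadcast on a binary cube can be simulated directly: in round k each node exchanges everything it holds with its neighbor across dimension k, so after d rounds every node holds all 2^d items. This sketch shows only the exchange pattern, not the Gray-code data placement or Hamiltonian-cycle scheduling the paper analyzes.

```python
def all_to_all_broadcast(dim):
    """Simulate all-to-all broadcast on a binary d-cube.
    Node i starts with item i; in round k it merges its accumulated
    set with that of neighbor i XOR 2^k."""
    n = 1 << dim
    data = [{i} for i in range(n)]
    for k in range(dim):
        data = [data[i] | data[i ^ (1 << k)] for i in range(n)]
    return data

result = all_to_all_broadcast(4)  # a 16-node hypercube
```

    Each node sends and receives in every round, which is why hypercube all-to-all completes in d = log2(n) exchange steps.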

  17. Research on Key Technologies of Unit-Based CNC Machine Tool Assembly Design

    Directory of Open Access Journals (Sweden)

    Zhongqi Sheng

    2014-01-01

    Assembly accounts for the greatest workload and time consumption in the product design and manufacturing process. CNC machine tools are key basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connections between parts, semantic information on transmission, and geometric constraint information were quantified as an assembly connection strength that depicts the level of assembly difficulty. Transmissibility based on trust relationships was applied to the assembly connection strength. Assembly unit partition based on assembly connection strength was conducted, and interferential assembly units were identified and revised. Assembly sequence planning and optimization of parts within each assembly unit and between assembly units was conducted using a genetic algorithm. Taking a certain type of high-speed CNC turning center as an example, this paper explored assembly modeling, assembly unit partition, and assembly sequence planning and optimization, and realized an optimized assembly sequence for the headstock of a CNC machine tool.
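
    A genetic algorithm for assembly sequence planning can be sketched with a permutation encoding, order crossover, and swap mutation, minimizing precedence violations. The parts and precedence constraints below are invented for illustration; the paper's fitness additionally weights assembly connection strength.

```python
import random

random.seed(2)

PARTS = ["bed", "headstock", "spindle", "chuck", "turret", "tailstock"]
# Hypothetical precedence constraints: (a, b) means a must be assembled before b.
PRECEDENCE = [("bed", "headstock"), ("headstock", "spindle"),
              ("spindle", "chuck"), ("bed", "turret"), ("bed", "tailstock")]

def violations(seq):
    """Fitness: number of violated precedence constraints (0 is feasible)."""
    pos = {p: i for i, p in enumerate(seq)}
    return sum(1 for a, b in PRECEDENCE if pos[a] > pos[b])

def crossover(p1, p2):
    """Order crossover: copy a slice from p1, fill the rest in p2's order."""
    i, j = sorted(random.sample(range(len(p1)), 2))
    mid = p1[i:j]
    rest = [g for g in p2 if g not in mid]
    return rest[:i] + mid + rest[i:]

def mutate(seq):
    i, j = random.sample(range(len(seq)), 2)
    seq[i], seq[j] = seq[j], seq[i]

pop = [random.sample(PARTS, len(PARTS)) for _ in range(30)]
for _ in range(100):
    pop.sort(key=violations)
    survivors = pop[:10]              # elitist selection
    children = []
    while len(children) < 20:
        child = crossover(*random.sample(survivors, 2))
        if random.random() < 0.3:
            mutate(child)
        children.append(child)
    pop = survivors + children
best = min(pop, key=violations)
```

    With a feasible precedence graph the GA converges to a violation-free sequence; a production planner would then rank feasible sequences by assembly cost.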

  18. Machine vision systems using machine learning for industrial product inspection

    Science.gov (United States)

    Lu, Yi; Chen, Tie Q.; Chen, Jie; Zhang, Jian; Tisler, Anthony

    2002-02-01

    Machine vision inspection requires efficient processing time and accurate results. In this paper, we present a machine vision inspection architecture, SMV (Smart Machine Vision). SMV decomposes a machine vision inspection problem into two stages, Learning Inspection Features (LIF) and On-Line Inspection (OLI). The LIF is designed to learn visual inspection features from design data and/or from inspection products. During the OLI stage, the inspection system uses the knowledge learnt by the LIF component to inspect the visual features of products. In this paper we present two machine vision inspection systems developed under the SMV architecture for two different types of products, Printed Circuit Board (PCB) and Vacuum Fluorescent Display (VFD) boards. In the VFD board inspection system, the LIF component learns inspection features from a VFD board and its display patterns. In the PCB board inspection system, the LIF learns the inspection features from the CAD file of a PCB board. In both systems, the LIF component also incorporates interactive learning to make the inspection system more powerful and efficient. The VFD system has been deployed successfully in three different manufacturing companies, and the PCB inspection system is in the process of being deployed in a manufacturing plant.
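
    A minimal LIF/OLI-style pipeline, assuming 1-D "images": the learning stage averages known-good samples into a template, and the on-line stage flags any board whose pixels deviate beyond a tolerance. Real systems learn far richer features from CAD data or sample boards.

```python
def learn_template(good_samples):
    """LIF-style stage: learn a pixel-wise mean template from known-good boards."""
    n = len(good_samples)
    return [sum(col) / n for col in zip(*good_samples)]

def inspect(template, sample, tol=30):
    """OLI-style stage: pass the board only if every pixel is within tol
    of the learned template."""
    return all(abs(p - t) <= tol for p, t in zip(sample, template))

good = [[100, 102, 98, 200], [101, 99, 100, 198], [99, 100, 101, 202]]
template = learn_template(good)

ok_board = [100, 101, 99, 199]
bad_board = [100, 101, 99, 120]   # a dim region where a bright pad should be
```

    Splitting learning from inspection keeps the on-line stage a cheap comparison, which is what makes production-rate throughput feasible.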

  19. Bayesian networks modeling for thermal error of numerical control machine tools

    Institute of Scientific and Technical Information of China (English)

    Xin-hua YAO; Jian-zhong FU; Zi-chen CHEN

    2008-01-01

    The interaction between the heat source location, its intensity, thermal expansion coefficient, the machine system configuration and the running environment creates complex thermal behavior of a machine tool, and also makes thermal error prediction difficult. To address this issue, a novel prediction method for machine tool thermal error based on Bayesian networks (BNs) was presented. The method described causal relationships of factors inducing thermal deformation by graph theory and estimated the thermal error by Bayesian statistical techniques. Due to the effective combination of domain knowledge and sampled data, the BN method could adapt to the change of running state of the machine, and obtain satisfactory prediction accuracy. Experiments on spindle thermal deformation were conducted to evaluate the modeling performance. Experimental results indicate that the BN method performs far better than the least squares (LS) analysis in terms of modeling estimation accuracy.
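
    For a single cause-effect edge, the Bayesian estimation step reduces to Bayes' rule. A two-node toy network with invented probabilities illustrates how an observed large thermal error updates belief about a high spindle heat load:

```python
# Two-node toy network: Heat (spindle heat load) -> Error (thermal error).
# All probabilities are illustrative, not measured values.
P_HEAT = {"high": 0.3, "low": 0.7}
P_ERROR_GIVEN_HEAT = {("large", "high"): 0.8, ("large", "low"): 0.1}

def posterior_heat_high(error="large"):
    """Bayes' rule: P(Heat=high | Error=large)."""
    joint_high = P_HEAT["high"] * P_ERROR_GIVEN_HEAT[(error, "high")]
    joint_low = P_HEAT["low"] * P_ERROR_GIVEN_HEAT[(error, "low")]
    return joint_high / (joint_high + joint_low)
```

    A full BN for thermal error chains many such conditional tables over heat sources, ambient conditions and machine state, with inference by message passing rather than direct enumeration.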

  20. QFD Based Benchmarking Logic Using TOPSIS and Suitability Index

    Directory of Open Access Journals (Sweden)

    Jaeho Cho

    2015-01-01

    Users' satisfaction with quality is key to the successful completion of a project in relation to decision-making issues in building design solutions. This study proposed QFD (quality function deployment)-based benchmarking logic for market products in building envelope solutions. The benchmarking logic is composed of QFD-TOPSIS and QFD-SI. The QFD-TOPSIS assessment model is able to evaluate users' preferences on building envelope solutions distributed in the market and allows quick acquisition of knowledge. TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) provides performance improvement criteria that help define users' target performance criteria. SI (Suitability Index) allows analysis of the suitability of a building envelope solution based on users' required performance criteria. In Stage 1 of the case study, QFD-TOPSIS was used to benchmark the performance criteria of market envelope products. In Stage 2, a QFD-SI assessment was performed after setting user performance targets. The results of this study contribute to confirming the feasibility of QFD-based benchmarking in the field of Building Envelope Performance Assessment (BEPA).
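
    The TOPSIS step follows its standard recipe: vector-normalize the decision matrix, weight it, form ideal and anti-ideal solutions, and rank alternatives by relative closeness. A sketch with three hypothetical envelope products (the criteria, weights and scores are invented):

```python
import math

def topsis(matrix, weights, benefit):
    """TOPSIS: rank alternatives by closeness to the ideal solution.
    matrix[i][j] = score of alternative i on criterion j;
    benefit[j] = True if larger is better on criterion j."""
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*V))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*V))]
    scores = []
    for row in V:
        d_pos = math.dist(row, ideal)   # distance to ideal solution
        d_neg = math.dist(row, anti)    # distance to anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical envelope products: (thermal R-value, daylighting, cost).
matrix = [[5.0, 0.7, 120.0],
          [4.0, 0.6, 100.0],
          [3.0, 0.5, 150.0]]
scores = topsis(matrix, weights=[0.5, 0.2, 0.3], benefit=[True, True, False])
```

    The closeness scores give the benchmark ranking; the QFD layer supplies the weights from user requirements.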

  1. Body Mass Index, Nutrient Intakes, Health Behaviours and Nutrition Knowledge: A Quantile Regression Application in Taiwan

    Science.gov (United States)

    Chen, Shih-Neng; Tseng, Jauling

    2010-01-01

    Objective: To assess various marginal effects of nutrient intakes, health behaviours and nutrition knowledge on the entire distribution of body mass index (BMI) across individuals. Design: Quantitative and distributional study. Setting: Taiwan. Methods: This study applies Becker's (1965) model of health production to construct an individual's BMI…
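
    Quantile regression itself is easy to sketch: a linear model fitted by subgradient descent on the pinball (check) loss, where tau = 0.5 recovers the conditional median. Synthetic data stands in for the BMI survey:

```python
import random

random.seed(3)

def quantile_regression(xs, ys, tau, lr=0.01, epochs=2000):
    """Linear quantile regression via subgradient descent on the
    pinball loss; tau = 0.5 estimates the conditional median."""
    a, b = 0.0, 0.0                        # intercept, slope
    n = len(xs)
    for _ in range(epochs):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            r = y - (a + b * x)
            g = tau if r > 0 else tau - 1.0  # pinball subgradient (ascent form)
            ga += g
            gb += g * x
        a += lr * ga / n
        b += lr * gb / n
    return a, b

# Synthetic "covariate -> outcome" data with median line y = 1 + 2x.
xs = [random.uniform(0, 10) for _ in range(300)]
ys = [1.0 + 2.0 * x + random.gauss(0, 1) for x in xs]
a, b = quantile_regression(xs, ys, tau=0.5)
```

    Refitting with tau = 0.1 and tau = 0.9 traces out marginal effects across the outcome distribution, which is the paper's use of the method for BMI.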

  2. Utvärdering av Amazon Machine Learning för taggsystem

    OpenAIRE

    Madosh, Farzana; Lundsten, Erik

    2017-01-01

    How companies deal with machine learning is currently a much-discussed topic, as it can facilitate manual corporate work by training computers to recognize patterns and thus automate working procedures. However, this requires resources and knowledge in the field. As a result, various companies like Amazon and Google provide machine learning services without requiring the user to have deep knowledge of the area. This study evaluates the Amazon Machine Learning program for a tag system with da...

  3. Validation Of Critical Knowledge-Based Systems

    Science.gov (United States)

    Duke, Eugene L.

    1992-01-01

    Report discusses approach to verification and validation of knowledge-based systems. Also known as "expert systems". Concerned mainly with development of methodologies for verification of knowledge-based systems critical to flight-research systems; e.g., fault-tolerant control systems for advanced aircraft. Subject matter also has relevance to knowledge-based systems controlling medical life-support equipment or commuter railroad systems.

  4. High-precision diode-laser-based temperature measurement for air refractive index compensation.

    Science.gov (United States)

    Hieta, Tuomas; Merimaa, Mikko; Vainio, Markku; Seppä, Jeremias; Lassila, Antti

    2011-11-01

    We present a laser-based system to measure the refractive index of air over a long path length. In optical distance measurements, it is essential to know the refractive index of air with high accuracy. Commonly, the refractive index of air is calculated from the properties of the ambient air using either Ciddor or Edlén equations, where the dominant uncertainty component is in most cases the air temperature. The method developed in this work utilizes direct absorption spectroscopy of oxygen to measure the average temperature of air and of water vapor to measure relative humidity. The method allows measurement of temperature and humidity over the same beam path as in optical distance measurement, providing spatially well-matching data. Indoor and outdoor measurements demonstrate the effectiveness of the method. In particular, we demonstrate an effective compensation of the refractive index of air in an interferometric length measurement at a time-variant and spatially nonhomogeneous temperature over a long time period. Further, we were able to demonstrate 7 mK RMS noise over a 67 m path length using a 120 s sample time. To our knowledge, this is the best temperature precision reported for a spectroscopic temperature measurement. © 2011 Optical Society of America
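
    The dependence of the refractive index of air on ambient conditions can be illustrated with a commonly quoted simplified form of the Edlén equation (pressure in kPa, temperature in °C, relative humidity in %). The coefficients are taken from standard interferometry references, not from this paper, and metrology-grade work should use the full Edlén or Ciddor equations:

```python
def air_refractive_index(p_kpa, t_celsius, rh_percent):
    """Simplified engineering approximation to the Edlen equation for the
    refractive index of air at visible wavelengths. Adequate near standard
    conditions; not a substitute for the full Edlen/Ciddor formulations."""
    return (1.0
            + 7.86e-4 * p_kpa / (273.0 + t_celsius)
            - 1.5e-11 * rh_percent * (t_celsius ** 2 + 160.0))

n20 = air_refractive_index(101.325, 20.0, 50.0)
n21 = air_refractive_index(101.325, 21.0, 50.0)
sensitivity = n21 - n20   # change in n per +1 K of air temperature
```

    The computed sensitivity of roughly -1 × 10⁻⁶ per kelvin is why air temperature dominates the uncertainty budget, and why spectroscopic path-averaged temperature measurement is attractive.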

  5. Knowledge management: another management fad?

    Directory of Open Access Journals (Sweden)

    Leonard J. Ponzi

    2002-01-01

    Knowledge management is the subject of a growing body of literature. While capturing the interest of practitioners and scholars in the mid-1990s, knowledge management remains a broadly defined concept with faddish characteristics. Based on annual counts of articles retrieved from Science Citation Index, Social Science Citation Index, and ABI Inform referring to three previously recognized management fads, this paper introduces empirical evidence suggesting that a typical management movement generally reveals itself as a fad in approximately five years. In applying this approach and assumption to the case of knowledge management, the findings suggest that knowledge management is at least living longer than typical fads and perhaps is in the process of establishing itself as a new aspect of management. To further the understanding of knowledge management's development, its interdisciplinary activity and breadth are reported and briefly discussed.

  6. Performance Evaluation of Machine Learning Methods for Leaf Area Index Retrieval from Time-Series MODIS Reflectance Data

    Science.gov (United States)

    Wang, Tongtong; Xiao, Zhiqiang; Liu, Zhigang

    2017-01-01

    Leaf area index (LAI) is an important biophysical parameter and the retrieval of LAI from remote sensing data is the only feasible method for generating LAI products at regional and global scales. However, most LAI retrieval methods use satellite observations at a specific time to retrieve LAI. Because of the impacts of clouds and aerosols, the LAI products generated by these methods are spatially incomplete and temporally discontinuous, and thus they cannot meet the needs of practical applications. To generate high-quality LAI products, four machine learning algorithms, including back-propagation neural network (BPNN), radial basis function networks (RBFNs), general regression neural networks (GRNNs), and multi-output support vector regression (MSVR) are proposed to retrieve LAI from time-series Moderate Resolution Imaging Spectroradiometer (MODIS) reflectance data in this study and performance of these machine learning algorithms is evaluated. The results demonstrated that GRNNs, RBFNs, and MSVR exhibited low sensitivity to training sample size, whereas BPNN had high sensitivity. The four algorithms performed slightly better with red, near infrared (NIR), and short wave infrared (SWIR) bands than red and NIR bands, and the results were significantly better than those obtained using single band reflectance data (red or NIR). Regardless of band composition, GRNNs performed better than the other three methods. Among the four algorithms, BPNN required the least training time, whereas MSVR needed the most for any sample size. PMID:28045443
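
    A GRNN is the simplest of the four to sketch: its prediction is a Gaussian-kernel weighted average of the training targets (the Nadaraya-Watson form). The toy 1-D mapping below stands in for the reflectance-to-LAI regression:

```python
import math

def grnn_predict(train_x, train_y, x, sigma=0.1):
    """General regression neural network: the prediction is a
    Gaussian-kernel weighted average of the training targets."""
    weights = [math.exp(-((x - xi) ** 2) / (2.0 * sigma ** 2)) for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

# Toy stand-in for a reflectance -> LAI mapping: learn y = x^2 on [0, 1].
train_x = [i / 10 for i in range(11)]
train_y = [x ** 2 for x in train_x]
pred = grnn_predict(train_x, train_y, 0.55)
```

    Because there is no iterative training, GRNN accuracy depends only on the kernel width sigma and sample coverage, consistent with the low sample-size sensitivity reported above.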

  7. Knowledge-based-ness and synthesis of indigenous knowledge with climate in traditional architecture: Evidence from Naeen city

    Directory of Open Access Journals (Sweden)

    Ali Zangi Abadi

    2014-01-01

    Application of indigenous knowledge is an example of knowledge-based-ness, regarded as a main agenda item in much current urban and climate planning as well as geographical studies. It is one of the interesting topics in research on the role of climate in human dwellings and the immediate environment. Indeed, building ecology emphasizes the ability to combine climate and environmental factors and render them into spatial qualities and structural comfort. This paper investigates the application of indigenous knowledge and the correspondence between the traditional texture of Naeen city and its climate. In this regard, the climatic conditions in Naeen were studied in terms of temperature, humidity, wind and the tourist comfort index (TCI) of climatourism. The data were collected from the Isfahan Meteorology Website over the period 1985-2005. Subsequently, the traditional texture of Naeen, construction materials and building styles were studied, considering the special architectural conditions and compatible materials. Eventually, the correspondence between architectural styles and climatic conditions was studied. The results show that traditional architecture based on indigenous knowledge was consistent with climatic conditions. In this regard, such materials as mud bricks and thatch with suitable heat capacity were used, along with wind catchers, high walls around houses, dome construction and the southward orientation of houses, consistent with the east-west wind direction. In addition, the concentration of the traditional texture of the city, wall thickness, corridors, long hallways, courtyard ponds and roofed alleys are evidence of the effect of climatic conditions on urban texture in order to provide comfort in different seasons.

  8. Acquisition and understanding of process knowledge using problem solving methods

    CERN Document Server

    Gómez-Pérez, JM

    2010-01-01

    The development of knowledge-based systems is usually approached through the combined skills of knowledge engineers (KEs) and subject matter experts (SMEs). One of the most critical steps in this activity aims at transferring knowledge from SMEs to formal, machine-readable representations, which allow systems to reason with such knowledge. However, this is a costly and error prone task. Alleviating the knowledge acquisition bottleneck requires enabling SMEs with the means to produce the desired knowledge representations without the help of KEs. This is especially difficult in the case of compl

  9. Real-time application of knowledge-based systems

    Science.gov (United States)

    Brumbaugh, Randal W.; Duke, Eugene L.

    1989-01-01

    The Rapid Prototyping Facility (RPF) was developed to meet a need for a facility which allows flight systems concepts to be prototyped in a manner which allows for real-time flight test experience with a prototype system. This need was focused during the development and demonstration of the expert system flight status monitor (ESFSM). The ESFSM was a prototype system developed on a LISP machine, but lack of a method for progressive testing and problem identification led to an impractical system. The RPF concept was developed, and the ATMS designed to exercise its capabilities. The ATMS Phase 1 demonstration provided a practical vehicle for testing the RPF, as well as a useful tool. ATMS Phase 2 development continues. A dedicated F-18 is expected to be assigned for facility use in late 1988, with RAV modifications. A knowledge-based autopilot is being developed using the RPF. This is a system which provides elementary autopilot functions and is intended as a vehicle for testing expert system verification and validation methods. An expert system propulsion monitor is being prototyped. This system provides real-time assistance to an engineer monitoring a propulsion system during a flight.

  10. Application for vibration monitoring of aspheric surface machining based on wireless sensor networks

    Science.gov (United States)

    Han, Chun Guang; Guo, Yin Biao; Jiang, Chen

    2010-05-01

    Any tiny vibration of machine tool parts will have a great influence on workpiece surface quality in the ultra-precision machining of aspheric surfaces. At present, the major way of decreasing the influence of vibration is machining compensation technology. It is therefore important for machining compensation control to acquire and transmit these vibration signals effectively. This paper presents a vibration monitoring system for aspheric surface machining machine tools based on wireless sensor networks (WSN). Some key issues of wireless sensor networks for the vibration monitoring of aspheric surface machining are discussed. The reliability of data transmission, the network communication protocol and the synchronization mechanism of the wireless sensor networks are studied for the vibration monitoring system. The proposed system achieves multi-sensor vibration monitoring covering the grinding wheel, the workpiece and the workbench spindle. Wireless transmission of vibration signals is achieved by combining vibration sensor nodes with the wireless network. In this paper, these vibration sensor nodes are developed. An experimental platform is constructed that employs wireless sensor networks for the vibration monitoring system in order to test the acquisition and wireless transmission of vibration signals. The test results show that the proposed system can transmit vibration data effectively and reliably, and meets the monitoring requirements of aspheric surface machining machine tools.

  11. The Abstract Machine Model for Transaction-based System Control

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.

    2003-01-31

    Recent work applying statistical mechanics to economic modeling has demonstrated the effectiveness of using thermodynamic theory to address the complexities of large-scale economic systems. Transaction-based control systems depend on the conjecture that when control of thermodynamic systems is based on price-mediated strategies (e.g., auctions, markets), the optimal allocation of resources in a market-based control system results in an emergent optimal control of the thermodynamic system. This paper proposes an abstract machine model as the necessary precursor for demonstrating this conjecture and establishes the dynamic laws as the basis for a special theory of emergence applied to the global behavior and control of complex adaptive systems. The abstract machine in a large system amounts to the analog of a particle in thermodynamic theory. These machines permit the establishment of a theory of dynamic control of complex system behavior based on statistical mechanics. Thus we may be better able to engineer a few simple control laws for a very small number of device types which, when deployed in very large numbers and operated as a system of many interacting markets, yield the stable and optimal control of the thermodynamic system.
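    The price-mediated allocation the conjecture relies on is easier to picture with a concrete mechanism. The sketch below is a generic uniform-price double auction in Python, not the authors' model: each device submits a (price, quantity) bid or offer, and the market clears until the marginal bid is unwilling to pay the marginal offer.

```python
def clearing_price(bids, offers):
    """Uniform-price double auction sketch: sort demand bids descending
    and supply offers ascending, then match quantities until the next
    bid price falls below the next offer price. Returns (price, qty),
    where price is the midpoint of the last matched bid/offer pair."""
    bids = sorted(bids, key=lambda b: -b[0])     # (price, qty) to buy
    offers = sorted(offers, key=lambda o: o[0])  # (price, qty) to sell
    qty, price = 0.0, None
    bi = oi = 0
    b_rem = o_rem = 0.0
    while True:
        if b_rem == 0:
            if bi >= len(bids):
                break
            b_price, b_rem = bids[bi]; bi += 1
        if o_rem == 0:
            if oi >= len(offers):
                break
            o_price, o_rem = offers[oi]; oi += 1
        if b_price < o_price:
            break
        traded = min(b_rem, o_rem)
        qty += traded
        b_rem -= traded; o_rem -= traded
        price = (b_price + o_price) / 2.0
    return price, qty
```

    In a transaction-based control setting, each abstract machine would generate such bids from its local state, and the clearing price would coordinate the aggregate behavior.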

  12. Exchanging Description Logic Knowledge Bases

    NARCIS (Netherlands)

    Arenas, M.; Botoeva, E.; Calvanese, D.; Ryzhikov, V.; Sherkhonov, E.

    2012-01-01

    In this paper, we study the problem of exchanging knowledge between a source and a target knowledge base (KB), connected through mappings. Differently from the traditional database exchange setting, which considers only the exchange of data, we are interested in exchanging implicit knowledge. As

  13. Performance Analysis of Millimeter-Wave Multi-hop Machine-to-Machine Networks Based on Hop Distance Statistics

    Directory of Open Access Journals (Sweden)

    Haejoon Jung

    2018-01-01

    Full Text Available As an intrinsic part of the Internet of Things (IoT) ecosystem, machine-to-machine (M2M) communications are expected to provide ubiquitous connectivity between machines. Millimeter-wave (mmWave) communication is another promising technology for the future communication systems to alleviate the pressure of scarce spectrum resources. For this reason, in this paper, we consider multi-hop M2M communications, where a machine-type communication (MTC) device with the limited transmit power relays to help other devices using mmWave. To be specific, we focus on hop distance statistics and their impacts on system performances in multi-hop wireless networks (MWNs) with directional antenna arrays in mmWave for M2M communications. Different from microwave systems, in mmWave communications, wireless channel suffers from blockage by obstacles that heavily attenuate line-of-sight signals, which may result in limited per-hop progress in MWNs. We consider two routing strategies aiming at different types of applications and derive the probability distributions of their hop distances. Moreover, we provide their baseline statistics assuming the blockage-free scenario to quantify the impact of blockages. Based on the hop distance analysis, we propose a method to estimate the end-to-end performances (e.g., outage probability, hop count, and transmit energy) of the mmWave MWNs, which provides important insights into mmWave MWN design without time-consuming and repetitive end-to-end simulation.

  14. Performance Analysis of Millimeter-Wave Multi-hop Machine-to-Machine Networks Based on Hop Distance Statistics.

    Science.gov (United States)

    Jung, Haejoon; Lee, In-Ho

    2018-01-12

    As an intrinsic part of the Internet of Things (IoT) ecosystem, machine-to-machine (M2M) communications are expected to provide ubiquitous connectivity between machines. Millimeter-wave (mmWave) communication is another promising technology for the future communication systems to alleviate the pressure of scarce spectrum resources. For this reason, in this paper, we consider multi-hop M2M communications, where a machine-type communication (MTC) device with the limited transmit power relays to help other devices using mmWave. To be specific, we focus on hop distance statistics and their impacts on system performances in multi-hop wireless networks (MWNs) with directional antenna arrays in mmWave for M2M communications. Different from microwave systems, in mmWave communications, wireless channel suffers from blockage by obstacles that heavily attenuate line-of-sight signals, which may result in limited per-hop progress in MWNs. We consider two routing strategies aiming at different types of applications and derive the probability distributions of their hop distances. Moreover, we provide their baseline statistics assuming the blockage-free scenario to quantify the impact of blockages. Based on the hop distance analysis, we propose a method to estimate the end-to-end performances (e.g., outage probability, hop count, and transmit energy) of the mmWave MWNs, which provides important insights into mmWave MWN design without time-consuming and repetitive end-to-end simulation.
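    Under the simplifying assumption that hops fail independently (the paper derives the actual hop-distance distributions; this is only the independence approximation), the end-to-end outage and a rough hop-count estimate follow directly. The sketch below is generic, not the authors' estimator.

```python
import math

def end_to_end_outage(p_hop, hops):
    """If each hop independently fails with probability p_hop, the
    route survives only if every hop succeeds."""
    return 1.0 - (1.0 - p_hop) ** hops

def rough_hop_count(distance_m, mean_hop_progress_m):
    """Blockage limits per-hop progress, so a smaller mean progress
    implies more hops for the same end-to-end distance."""
    return math.ceil(distance_m / mean_hop_progress_m)
```

    This captures the paper's qualitative point: blockage shrinks per-hop progress, which raises the hop count, and each extra hop compounds the end-to-end outage probability.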

  15. Region-Based Color Image Indexing and Retrieval

    DEFF Research Database (Denmark)

    Kompatsiaris, Ioannis; Triantafyllou, Evangelia; Strintzis, Michael G.

    2001-01-01

    In this paper a region-based color image indexing and retrieval algorithm is presented. As a basis for the indexing, a novel K-Means segmentation algorithm is used, modified so as to take into account the coherence of the regions. A new color distance is also defined for this algorithm. Based on ....... Experimental results demonstrate the performance of the algorithm. The development of an intelligent image content-based search engine for the World Wide Web is also presented, as a direct application of the presented algorithm....
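    The modified K-Means with region coherence is not reproduced in the excerpt, but the plain K-Means it builds on can be sketched in a few lines. Below, pixels are RGB triples and the first k points seed the centers (a simple deterministic choice for illustration; the paper's version additionally accounts for spatial coherence and a custom color distance).

```python
def kmeans(points, k, iters=10):
    """Plain K-Means on color triples: assign each point to the nearest
    center by squared Euclidean distance, then move each center to its
    cluster mean."""
    centers = [list(p) for p in points[:k]]
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = [sum(v) / len(cl) for v in zip(*cl)]
    return centers, clusters

# Six pixels from two well-separated color regions (hypothetical values).
pixels = [(255, 0, 0), (0, 0, 255), (250, 5, 5),
          (5, 5, 250), (245, 0, 10), (10, 0, 245)]
centers, clusters = kmeans(pixels, k=2)
```

    For indexing, each resulting region would then be summarized by its center color and used as a retrieval descriptor.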

  16. Querying Natural Logic Knowledge Bases

    DEFF Research Database (Denmark)

    Andreasen, Troels; Bulskov, Henrik; Jensen, Per Anker

    2017-01-01

    This paper describes the principles of a system applying natural logic as a knowledge base language. Natural logics are regimented fragments of natural language employing high level inference rules. We advocate the use of natural logic for knowledge bases dealing with querying of classes...... in ontologies and class-relationships such as are common in life-science descriptions. The paper adopts a version of natural logic with recursive restrictive clauses such as relative clauses and adnominal prepositional phrases. It includes passive as well as active voice sentences. We outline a prototype...... for partial translation of natural language into natural logic, featuring further querying and conceptual path finding in natural logic knowledge bases....

  17. Fairer machine learning in the real world: Mitigating discrimination without collecting sensitive data

    Directory of Open Access Journals (Sweden)

    Michael Veale

    2017-11-01

    Full Text Available Decisions based on algorithmic, machine learning models can be unfair, reproducing biases in historical data used to train them. While computational techniques are emerging to address aspects of these concerns through communities such as discrimination-aware data mining (DADM) and fairness, accountability and transparency machine learning (FATML), their practical implementation faces real-world challenges. For legal, institutional or commercial reasons, organisations might not hold the data on sensitive attributes such as gender, ethnicity, sexuality or disability needed to diagnose and mitigate emergent indirect discrimination-by-proxy, such as redlining. Such organisations might also lack the knowledge and capacity to identify and manage fairness issues that are emergent properties of complex sociotechnical systems. This paper presents and discusses three potential approaches to deal with such knowledge and information deficits in the context of fairer machine learning. Trusted third parties could selectively store data necessary for performing discrimination discovery and incorporating fairness constraints into model-building in a privacy-preserving manner. Collaborative online platforms would allow diverse organisations to record, share and access contextual and experiential knowledge to promote fairness in machine learning systems. Finally, unsupervised learning and pedagogically interpretable algorithms might allow fairness hypotheses to be built for further selective testing and exploration. Real-world fairness challenges in machine learning are not abstract, constrained optimisation problems, but are institutionally and contextually grounded. Computational fairness tools are useful, but must be researched and developed in and with the messy contexts that will shape their deployment, rather than just for imagined situations. Not doing so risks real, near-term algorithmic harm.
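    A standard discrimination-discovery metric makes the data problem concrete: computing it requires exactly the sensitive attribute the paper notes organisations may not hold, which motivates the trusted-third-party proposal. The sketch below is one common metric (the disparate impact ratio, with the conventional "four-fifths" threshold), not a technique from the paper itself.

```python
def disparate_impact(outcomes, groups, protected, favourable=1):
    """Ratio of favourable-outcome rates for the protected group versus
    everyone else; values below 0.8 are conventionally flagged under
    the 'four-fifths rule'. Requires per-person sensitive attributes."""
    prot = [o for o, g in zip(outcomes, groups) if g == protected]
    rest = [o for o, g in zip(outcomes, groups) if g != protected]
    rate = lambda xs: sum(1 for o in xs if o == favourable) / len(xs)
    return rate(prot) / rate(rest)
```

    A trusted third party holding `groups` could return only this ratio to the model-building organisation, keeping the raw sensitive attributes private.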

  18. Prediction of Machine Tool Condition Using Support Vector Machine

    International Nuclear Information System (INIS)

    Wang Peigong; Meng Qingfeng; Zhao Jian; Li Junjie; Wang Xiufeng

    2011-01-01

    Condition monitoring and prediction of CNC machine tools are investigated in this paper. Considering that sample data for CNC machine tools are often limited, a condition prediction method for CNC machine tools based on support vector machines (SVMs) is proposed, and one-step and multi-step condition prediction models are constructed. The support vector machine prediction models are used to predict the trends in the working condition of a certain type of CNC worm wheel and gear grinding machine using vibration signal sequences collected during machining. The relationship between different eigenvalues of the CNC vibration signal and machining quality is also discussed. The test results show that the trend of the vibration signal peak-to-peak value in the surface normal direction is most relevant to the trend of the surface roughness value. In trend prediction of the working condition, the support vector machine has higher prediction accuracy in both short-term (one-step) and long-term (multi-step) prediction than the autoregressive (AR) model and the RBF neural network. Experimental results show that it is feasible to apply support vector machines to CNC machine tool condition prediction.
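    The one-step and multi-step prediction models differ only in how the vibration sequence is framed as supervised (window, target) pairs before training the SVM. A generic sliding-window sketch (illustrative, not the authors' code):

```python
def make_supervised(series, window, horizon=1):
    """Turn a signal sequence into (input window, target) pairs for
    one-step (horizon=1) or multi-step (horizon>1) prediction: each
    input is `window` consecutive values, the target lies `horizon`
    steps past the window's end."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i:i + window])
        y.append(series[i + window + horizon - 1])
    return X, y
```

    With small sample sizes, as the abstract notes, each extra horizon step shrinks the number of usable pairs, which is part of why an SVM's small-sample behavior matters here.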

  19. A New Type of Tea Baking Machine Based on Pro/E Design

    Science.gov (United States)

    Lin, Xin-Ying; Wang, Wei

    2017-11-01

    In this paper, the production process of wulong tea is discussed, mainly the effect of baking on tea quality. Suitable baking temperatures for different teas are introduced. Based on Pro/E, a new type of baking machine suitable for wulong tea baking was designed. The working principle, mechanical structure and constant-temperature timing intelligent control system of the baking machine are expounded. Finally, the characteristics and innovations of the new baking machine are discussed. The mechanical structure of this baking machine is simpler and more reasonable, and it reuses heat from the inlet and outlet, making it more energy-saving and environmentally friendly. The temperature control part adopts fuzzy PID control, which can improve the accuracy and response speed of temperature control and reduce the dependence of the baking operation on skilled experience.

  20. Investigation of knowledge structure of nuclear data evaluation code

    International Nuclear Information System (INIS)

    Uenaka, Junji; Kambayashi, Shaw

    1988-08-01

    In this report, the results of an investigation of the knowledge structure in a nuclear data evaluation code are described. This investigation is related to natural language processing and knowledge bases within the research theme of the Human Acts Simulation Program (HASP), begun at the Computing Center of JAERI in 1987. By using a machine translation system, an attempt has been made to extract deep knowledge from Japanese sentences which are equivalent to the FORTRAN program CASTHY for nuclear data evaluation. With the knowledge extraction method used by the authors, the verification of knowledge is more difficult than with the prototyping method of ordinary AI techniques. In the early stage of building up a knowledge base system, it seems effective to extract and examine knowledge fragments of limited objects. (author)

  1. Medical Dataset Classification: A Machine Learning Paradigm Integrating Particle Swarm Optimization with Extreme Learning Machine Classifier

    Directory of Open Access Journals (Sweden)

    C. V. Subbulakshmi

    2015-01-01

    Full Text Available Medical data classification is a prime data mining problem that has been discussed for a decade and has attracted several researchers around the world. Most classifiers are designed to learn from the data itself using a training process, because complete expert knowledge to determine classifier parameters is impracticable. This paper proposes a hybrid methodology based on the machine learning paradigm. This paradigm integrates the successful exploration mechanism called the self-regulated learning capability of the particle swarm optimization (PSO) algorithm with the extreme learning machine (ELM) classifier. As a recent off-line learning method, ELM is a single-hidden-layer feedforward neural network (FFNN), proved to be an excellent classifier with a large number of hidden layer neurons. In this research, PSO is used to determine the optimum set of parameters for the ELM, thus reducing the number of hidden layer neurons, and it further improves the network generalization performance. The proposed method is evaluated on five benchmark datasets of the UCI Machine Learning Repository for handling medical dataset classification. Simulation results show that the proposed approach is able to achieve good generalization performance, compared to the results of other classifiers.
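    The PSO component is the easiest part of the hybrid to sketch. Below is a plain global-best PSO minimizing a test function in pure Python; the paper's self-regulated variant and the coupling to ELM parameter selection are omitted, and the constants `w`, `c1`, `c2` are conventional defaults, not the authors' settings.

```python
import random

def pso(f, dim, n_particles=20, iters=100, lo=-5.0, hi=5.0, seed=1):
    """Global-best PSO: each particle's velocity blends inertia, a pull
    toward its personal best, and a pull toward the swarm's best."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Minimize the 2-D sphere function as a stand-in for ELM validation error.
best, val = pso(lambda p: sum(x * x for x in p), dim=2)
```

    In the paper's setting, `f` would instead evaluate ELM classification error for a candidate parameter set, so the swarm searches the ELM configuration space.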

  2. Combining human and machine intelligence to derive agents' behavioral rules for groundwater irrigation

    Science.gov (United States)

    Hu, Yao; Quinn, Christopher J.; Cai, Ximing; Garfinkle, Noah W.

    2017-11-01

    For agent-based modeling, the major challenges in deriving agents' behavioral rules arise from agents' bounded rationality and data scarcity. This study proposes a "gray box" approach to address the challenge by incorporating expert domain knowledge (i.e., human intelligence) with machine learning techniques (i.e., machine intelligence). Specifically, we propose using directed information graphs (DIGs), boosted regression trees (BRTs), and domain knowledge to infer causal factors and identify behavioral rules from data. A case study is conducted to investigate farmers' pumping behavior in the Midwest, U.S.A. Results show that four factors identified by the DIG algorithm (corn price, underlying groundwater level, monthly mean temperature, and precipitation) have the main causal influences on agents' decisions about monthly groundwater irrigation depth. The agent-based model is then developed based on the behavioral rules represented by three DIGs and modeled by BRTs, and coupled with a physically-based groundwater model to investigate the impacts of agents' pumping behavior on the underlying groundwater system in the context of coupled human and environmental systems.

  3. Learning Algorithm of Boltzmann Machine Based on Spatial Monte Carlo Integration Method

    Directory of Open Access Journals (Sweden)

    Muneki Yasuda

    2018-04-01

    Full Text Available The machine learning techniques for Markov random fields are fundamental in various fields involving pattern recognition, image processing, sparse modeling, and earth science, and the Boltzmann machine is one of the most important models in Markov random fields. However, the inference and learning problems in the Boltzmann machine are NP-hard. The investigation of an effective learning algorithm for the Boltzmann machine is one of the most important challenges in the field of statistical machine learning. In this paper, we study Boltzmann machine learning based on the (first-order) spatial Monte Carlo integration method, referred to as the 1-SMCI learning method, which was proposed in the author’s previous paper. In the first part of this paper, we compare the method with the maximum pseudo-likelihood estimation (MPLE) method using theoretical and numerical approaches, and show that the 1-SMCI learning method is more effective than MPLE. In the latter part, we compare the 1-SMCI learning method with other effective methods, ratio matching and minimum probability flow, using a numerical experiment, and show that the 1-SMCI learning method outperforms them.
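    The intractability that motivates MPLE and the SMCI methods is easy to see: the exact Gibbs distribution requires a sum over all 2^n spin states. A brute-force sketch for a tiny model (illustrative only; this enumeration is exactly what becomes infeasible at scale):

```python
import itertools
import math

def boltzmann_probs(J, h):
    """Exact Gibbs distribution p(s) proportional to
    exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j) over s in {-1,+1}^n,
    by enumerating all 2^n states to compute the partition function Z."""
    n = len(h)
    states = list(itertools.product([-1, 1], repeat=n))
    energies = []
    for s in states:
        e = sum(h[i] * s[i] for i in range(n))
        e += sum(J[i][j] * s[i] * s[j]
                 for i in range(n) for j in range(i + 1, n))
        energies.append(e)
    Z = sum(math.exp(e) for e in energies)
    return {s: math.exp(e) / Z for s, e in zip(states, energies)}
```

    With h = 0 the distribution is symmetric under a global spin flip, which gives a quick sanity check on the enumeration.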

  4. Refractive index sensors based on the fused tapered special multi-mode fiber

    Science.gov (United States)

    Fu, Xing-hu; Xiu, Yan-li; Liu, Qin; Xie, Hai-yang; Yang, Chuan-qing; Zhang, Shun-yang; Fu, Guang-wei; Bi, Wei-hong

    2016-01-01

    In this paper, a novel refractive index (RI) sensor is proposed based on a fused tapered special multi-mode fiber (SMMF). Firstly, a section of SMMF is spliced between two single-mode fibers (SMFs). Then, the SMMF is processed by a fused tapering machine, and a tapered fiber structure is fabricated. Finally, a fused tapered SMMF sensor is obtained for measuring external RI. The RI sensing mechanism of the tapered SMMF sensor is analyzed in detail. For different fused tapering lengths, the experimental results show that the RI sensitivity can be up to 444.51781 nm/RIU in the RI range of 1.3349 to 1.3470. The RI sensitivity increases with the fused tapering length. Moreover, the sensor has many advantages, including high sensitivity, compact structure, fast response and a wide application range. So it can be used to measure solution concentration in the fields of biochemistry, health care and food processing.

  5. Fuzzy-based multi-kernel spherical support vector machine for ...

    Indian Academy of Sciences (India)

    In the proposed classifier, we design a new multi-kernel function based on the fuzzy triangular membership function. Finally, a newly developed multi-kernel function is incorporated into the spherical support vector machine to enhance the performance significantly. The experimental results are evaluated and performance is ...

  6. Accurate prediction of stability changes in protein mutants by combining machine learning with structure based computational mutagenesis.

    Science.gov (United States)

    Masso, Majid; Vaisman, Iosif I

    2008-09-15

    Accurate predictive models for the impact of single amino acid substitutions on protein stability provide insight into protein structure and function. Such models are also valuable for the design and engineering of new proteins. Previously described methods have utilized properties of protein sequence or structure to predict the free energy change of mutants due to thermal (DeltaDeltaG) and denaturant (DeltaDeltaG(H2O)) denaturations, as well as mutant thermal stability (DeltaT(m)), through the application of either computational energy-based approaches or machine learning techniques. However, accuracy associated with applying these methods separately is frequently far from optimal. We detail a computational mutagenesis technique based on a four-body, knowledge-based, statistical contact potential. For any mutation due to a single amino acid replacement in a protein, the method provides an empirical normalized measure of the ensuing environmental perturbation occurring at every residue position. A feature vector is generated for the mutant by considering perturbations at the mutated position and its six ordered nearest neighbors in the 3-dimensional (3D) protein structure. These predictors of stability change are evaluated by applying machine learning tools to large training sets of mutants derived from diverse proteins that have been experimentally studied and described. Predictive models based on our combined approach are either comparable to, or in many cases significantly outperform, previously published results. A web server with supporting documentation is available at http://proteins.gmu.edu/automute.
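    The feature vector is anchored on the mutated residue's six nearest neighbors in 3D, ordered by distance. That neighborhood selection can be sketched as a one-liner over residue coordinates (illustrative; real inputs would be, e.g., C-alpha coordinates from a structure file, and the perturbation scores come from the four-body potential, not shown here).

```python
import math

def nearest_neighbors(coords, idx, k=6):
    """Indices of the k residues closest to residue idx, ordered by
    Euclidean distance in 3D. Ties break by index order."""
    others = [(math.dist(coords[idx], c), i)
              for i, c in enumerate(coords) if i != idx]
    return [i for _, i in sorted(others)[:k]]

# Hypothetical coordinates for four residues along a line.
coords = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0), (5.0, 0.0, 0.0)]
```

    The mutant's feature vector then concatenates the perturbation score at `idx` with the scores at these k ordered positions.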

  7. Nuclear Knowledge Innovations Assimilation: The Impact of Organizational Knowledge Frames and Triple Helix Dynamics of Knowledge Base

    International Nuclear Information System (INIS)

    Hossain, M. D.; Sultana, T.

    2016-01-01

    Full text: Previous research did not investigate the impact of the TH dynamics of knowledge innovations on nuclear knowledge innovation adoption/assimilation in the organizational context. Hence, the recommendation of R&D policy reformulation seems too broad. These gaps are the prime motivators for the research. In the organizational context, we posit that TH dynamics of knowledge base innovation serve as complements to managers’ knowledge frames related to a technology innovation. We examine interactions between three knowledge frames—integration frame, opportunism frame, and policy knowledge frame—and two TH dynamics of knowledge innovations—bilateral TH dynamics of knowledge innovations and trilateral TH dynamics of knowledge innovations—and their relationship with the assimilation of nuclear knowledge innovations. We aim to research the dynamics of the knowledge base of innovations involving TH collaborations (university, industry and government) in Bangladesh as a new-build nuclear project. As a result, we can find out the impact of TH collaborations on organizational nuclear knowledge innovation management as well as core institutional problems of the knowledge base of innovation systems in terms of R&D policy. Finally, the findings identify a lack of production of nuclear knowledge innovations and provide concrete recommendations for R&D policy reformulation. (author)

  8. Machine learning derived risk prediction of anorexia nervosa.

    Science.gov (United States)

    Guo, Yiran; Wei, Zhi; Keating, Brendan J; Hakonarson, Hakon

    2016-01-20

    Anorexia nervosa (AN) is a complex psychiatric disease with a moderate to strong genetic contribution. In addition to conventional genome wide association (GWA) studies, researchers have been using machine learning methods in conjunction with genomic data to predict risk of diseases in which genetics play an important role. In this study, we collected whole genome genotyping data on 3940 AN cases and 9266 controls from the Genetic Consortium for Anorexia Nervosa (GCAN), the Wellcome Trust Case Control Consortium 3 (WTCCC3), Price Foundation Collaborative Group and the Children's Hospital of Philadelphia (CHOP), and applied machine learning methods for predicting AN disease risk. The prediction performance is measured by area under the receiver operating characteristic curve (AUC), indicating how well the model distinguishes cases from unaffected control subjects. A logistic regression model with the lasso penalty generated an AUC of 0.693, while Support Vector Machines and Gradient Boosted Trees reached AUCs of 0.691 and 0.623, respectively. Using different sample sizes, our results suggest that larger datasets are required to optimize the machine learning models and achieve higher AUC values. To our knowledge, this is the first attempt to assess AN risk based on genome wide genotype level data. Future integration of genomic, environmental and family-based information is likely to improve the AN risk evaluation process, eventually benefitting AN patients and families in the clinical setting.
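    The AUC reported here is the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen control. A minimal sketch of that Mann-Whitney formulation (illustrative; real evaluations typically use a library's trapezoidal ROC estimate, which is equivalent):

```python
def auc(labels, scores):
    """AUC as the fraction of (case, control) pairs in which the case
    outranks the control; ties count as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    On this reading, the reported 0.693 means the lasso model ranks a random case above a random control about 69% of the time.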

  9. Component based modelling of piezoelectric ultrasonic actuators for machining applications

    International Nuclear Information System (INIS)

    Saleem, A; Ahmed, N; Salah, M; Silberschmidt, V V

    2013-01-01

    Ultrasonically Assisted Machining (UAM) is an emerging technology that has been utilized to improve the surface finish in machining processes such as turning, milling, and drilling. In this context, piezoelectric ultrasonic transducers are used to vibrate the cutting tip at a predetermined amplitude and frequency while machining. However, modelling and simulation of these transducers is a tedious and difficult task, due to the inherent nonlinearities associated with smart materials. Therefore, this paper presents a component-based model of ultrasonic transducers that mimics the nonlinear behaviour of such a system. The system is decomposed into components, a mathematical model of each component is created, and the whole system model is assembled by aggregating the basic components' models. System parameters are identified using a finite element technique, and the resulting model is simulated in Matlab/SIMULINK. Various operating conditions are tested to demonstrate the system performance.

  10. Incremental Knowledge Acquisition for WSD: A Rough Set and IL based Method

    Directory of Open Access Journals (Sweden)

    Xu Huang

    2015-07-01

    Full Text Available Word sense disambiguation (WSD) is one of the trickiest tasks in natural language processing (NLP), as it needs to take into full account all the complexities of language. Because WSD involves discovering semantic structures in unstructured text, automatic knowledge acquisition of word senses is profoundly difficult. To acquire knowledge about Chinese multi-sense verbs, we introduce an incremental machine learning method which combines the rough set method and instance-based learning. First, the context of a multi-sense verb is extracted into a table, and its sense is annotated by a skilled human and stored in the same table. In this way, a decision table is formed, and rules can then be extracted within the framework of attribute-value reduction of rough sets. Instances not entailed by any rule are treated as outliers. When new instances are added to the decision table, only the newly added instances and the outliers need to be learned further; thus incremental learning is achieved. Experiments show that the scale of the decision table can be reduced dramatically by this method without performance decline.
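    The attribute-value reduction step can be illustrated with a toy decision table. The sketch below brute-forces the smallest condition-attribute subset that still determines the sense label consistently; all attribute names and rows are hypothetical, and real rough-set reducers avoid this exponential search.

```python
from itertools import combinations

def is_consistent(rows, attrs):
    """A subset of condition attributes is consistent if no two rows
    agree on those attributes but disagree on the decision (sense)."""
    seen = {}
    for row in rows:
        key = tuple(row["cond"][a] for a in attrs)
        if key in seen and seen[key] != row["dec"]:
            return False
        seen.setdefault(key, row["dec"])
    return True

def minimal_reduct(rows, all_attrs):
    """Smallest attribute subset preserving decision consistency,
    found by exhaustive search over subsets of increasing size."""
    for r in range(1, len(all_attrs) + 1):
        for subset in combinations(all_attrs, r):
            if is_consistent(rows, subset):
                return subset
    return tuple(all_attrs)

# Hypothetical contexts of an ambiguous verb; the object alone
# determines the sense here, so {"obj"} is a reduct.
rows = [
    {"cond": {"subj": "he", "obj": "ball"}, "dec": "throw"},
    {"cond": {"subj": "he", "obj": "party"}, "dec": "host"},
    {"cond": {"subj": "she", "obj": "ball"}, "dec": "throw"},
]
```

    Rules are then read off the reduct's value combinations, and any instance not covered by a rule is held aside as an outlier for the incremental step.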

  11. Prediction of Banking Systemic Risk Based on Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Shouwei Li

    2013-01-01

    Full Text Available Banking systemic risk is a complex nonlinear phenomenon, and the recent financial crisis has shed light on the importance of safeguarding financial stability. According to the complex nonlinear characteristics of banking systemic risk, in this paper we apply support vector machines (SVMs) to the prediction of banking systemic risk in an attempt to suggest a new model with better explanatory power and stability. We conduct a case study of an SVM-based prediction model for Chinese banking systemic risk, and the experimental results show that the support vector machine is an efficient method in such a case.
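    As a concrete anchor for "applying SVM to prediction", here is a minimal linear SVM trained by Pegasos-style subgradient descent on the hinge loss, in pure Python on toy data. The paper's model would use a kernel SVM on financial indicators; this sketch only shows the decision function being learned.

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos-style training: for each sample, shrink w toward zero
    (the regularizer) and, if the margin is below 1, add a scaled copy
    of the misclassified-or-marginal sample (the hinge subgradient)."""
    rng = random.Random(seed)
    n = len(X)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        idx = list(range(n))
        rng.shuffle(idx)
        for i in idx:
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            if margin < 1:
                w = [(1 - eta * lam) * wj + eta * y[i] * xj
                     for wj, xj in zip(w, X[i])]
            else:
                w = [(1 - eta * lam) * wj for wj in w]
    return w

# Linearly separable toy data, symmetric through the origin (no bias term).
X = [[2, 2], [3, 3], [-2, -2], [-3, -3]]
y = [1, 1, -1, -1]
w = train_linear_svm(X, y)
```

    Class labels here are +1/-1 (e.g., "risk event" vs. "no event"); a prediction is the sign of the dot product of `w` with a feature vector.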

  12. End user interface and knowledge base editing system of CSPAR: a knowledge-based consultation system for preventive maintenance in nuclear plants

    International Nuclear Information System (INIS)

    Sinohara, Yasusi; Terano, Takao; Nishiyama, Takuya

    1988-01-01

    Consultation System for Prevention of Abnormal-event Recurrence (CSPAR) is a knowledge-based system that analyzes events of the same kind as a given fault reported at a nuclear power plant and gives users information on effective measures for preventing them. This report discusses the interfaces of CSPAR for both end users and knowledge-base editors. The interfaces are highly interactive and multi-window oriented. The features are as follows: (1) The end-user interface has a Japanese language processing facility, which enables users to consult CSPAR with various synonyms and related terms for knowledge-base handling; (2) The knowledge-base editing system is used by knowledge-base editors for maintaining the knowledge on both plant equipment and abnormal event sequences. It has facilities for easy maintenance of knowledge bases, including a graphic-oriented browser, a knowledge-base retriever, and a knowledge-base checker. (author)

  13. Learning Activity Packets for Grinding Machines. Unit I--Grinding Machines.

    Science.gov (United States)

    Oklahoma State Board of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

    This learning activity packet (LAP) is one of three that accompany the curriculum guide on grinding machines. It outlines the study activities and performance tasks for the first unit of this curriculum guide. Its purpose is to aid the student in attaining a working knowledge of this area of training and in achieving a skilled or moderately…

  14. Improving the reliability of stator insulation system in rotating machines

    International Nuclear Information System (INIS)

    Gupta, G.K.; Sedding, H.G.; Culbert, I.M.

    1997-01-01

Reliable performance of rotating machines, especially generators and primary heat transport pump motors, is critical to the efficient operation of nuclear stations. A significant number of premature machine failures have been attributed to stator insulation problems. Ontario Hydro has attempted to assure the long-term reliability of the insulation system in critical rotating machines through proper specifications and quality assurance tests for new machines and periodic on-line and off-line diagnostic tests on machines in service. The experience gained over the last twenty years is presented in this paper. Functional specifications have been developed for the insulation system in critical rotating machines based on engineering considerations and past experience. These specifications include insulation stress, insulation resistance and polarization index, partial discharge levels, dissipation factor and tip-up, and AC and DC hipot tests. Voltage endurance tests are specified for the groundwall insulation system of full-size production coils and bars. For machines with multi-turn coils, turn insulation strength for fast-fronted surges is specified and verified through tests on all coils in the factory and on samples of finished coils in the laboratory. Periodic on-line and off-line diagnostic tests are performed to assess the condition of the stator insulation system in machines in service. Partial discharges are measured on-line using several techniques to detect any excessive degradation of the insulation system in critical machines. Novel sensors have been developed and installed in several machines to facilitate measurement of partial discharges on operating machines. Several off-line tests are performed either to confirm problems indicated by the on-line tests or to assess the insulation system in machines which cannot be easily tested on-line. Experience with these tests, including their capabilities and limitations, is presented. (author)

  15. Maximizing the knowledge base: Knowledge Base+ and the Global Open Knowledgebase

    Directory of Open Access Journals (Sweden)

    Liam Earney

    2013-11-01

Full Text Available The motivation for the two projects discussed in this article is the simple premise that the current inaccuracies of data in the library supply chain are detrimental to the user experience, limit the ability of institutions to effectively manage their collections, and are increasingly unsustainable to resolve at the institutional level. Two projects, Knowledge Base+ (KB+) in the UK and Global Open Knowledgebase (GOKb) in the USA, are working in cooperation with a range of other partners and adopting a community-centric approach to address these issues and broaden the scope and utility of knowledge bases more generally. The belief is that only through collaboration at a wide range of levels and on a number of fronts can these challenges be overcome.

  16. Ultra-precision machining induced phase decomposition at surface of Zn-Al based alloy

    International Nuclear Information System (INIS)

    To, S.; Zhu, Y.H.; Lee, W.B.

    2006-01-01

The microstructural changes and phase transformation of an ultra-precision machined Zn-Al based alloy were examined using X-ray diffraction and back-scattered electron microscopy techniques. Decomposition of the Zn-rich η phase and the related changes in crystal orientation were detected at the surface of the ultra-precision machined alloy specimen. The effects of the machining parameters, such as cutting speed and depth of cut, on the phase decomposition are discussed in comparison with the tensile- and rolling-induced microstructural changes and phase decomposition

  17. A Novel Bearing Fault Diagnosis Method Based on Gaussian Restricted Boltzmann Machine

    Directory of Open Access Journals (Sweden)

    Xiao-hui He

    2016-01-01

Full Text Available To realize effective bearing fault diagnosis, this paper presents a novel bearing fault diagnosis method based on the Gaussian restricted Boltzmann machine (Gaussian RBM). Vibration signals are first resampled to the same equivalent speed. Subsequently, the envelope spectra of the resampled data are used directly as the feature vectors to represent the bearing fault types. Finally, in order to deal with the high-dimensional feature vectors based on the envelope spectrum, a classifier model based on the Gaussian RBM is applied. The Gaussian RBM has the ability to provide a closed-form representation of the distribution underlying the training data, and it is very convenient for modeling high-dimensional real-valued data. Experiments on 10 different data sets verify the performance of the proposed method. The superiority of the Gaussian RBM classifier is also confirmed by comparison with other classifiers, such as the extreme learning machine, support vector machine, and deep belief network. The robustness of the proposed method is also studied in this paper. It can be concluded that the proposed method can realize bearing fault diagnosis accurately and effectively.
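
The envelope-spectrum feature extraction step can be sketched with SciPy; the Gaussian RBM classifier itself is not shown here. The fault signature below is simulated as a 3 kHz resonance amplitude-modulated at a hypothetical 100 Hz bearing fault frequency (the paper's real vibration data are not available).

```python
# Sketch of envelope-spectrum feature extraction (preprocessing only; the
# Gaussian RBM classifier is not shown). Simulated: a 3 kHz resonance
# amplitude-modulated at an assumed 100 Hz fault frequency.
import numpy as np
from scipy.signal import hilbert

fs = 12_000                       # sampling rate, Hz
t = np.arange(fs) / fs            # 1 second of signal
fault_freq = 100.0                # assumed bearing fault frequency, Hz
x = (1.0 + 0.5 * np.cos(2 * np.pi * fault_freq * t)) * np.sin(2 * np.pi * 3000 * t)

envelope = np.abs(hilbert(x))     # amplitude envelope via the analytic signal
envelope -= envelope.mean()       # drop the DC component

spectrum = np.abs(np.fft.rfft(envelope))
freqs = np.fft.rfftfreq(len(envelope), d=1 / fs)
peak_freq = freqs[np.argmax(spectrum)]   # dominant line sits at the fault frequency
```

The resulting spectrum vector is the kind of high-dimensional real-valued feature the paper feeds to the Gaussian RBM.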

  18. Sociopathic Knowledge Bases: Correct Knowledge Can Be Harmful Even Given Unlimited Computation

    Science.gov (United States)

    1989-08-01

Sociopathic Knowledge Bases: Correct Knowledge Can Be Harmful Even Given Unlimited Computation, by David C. Wilkins and Yong... Probabilistic rules are shown to be sociopathic, so this problem is very widespread. Sociopathicity has important consequences for rule induction

  19. Scientific Journal Indexing

    Directory of Open Access Journals (Sweden)

    Getulio Teixeira Batista

    2007-08-01

Full Text Available The visibility of online publishing compared to offline publishing is quite impressive. Lawrence (2001) computed the percentage increase across 1,494 venues containing at least five offline and five online articles. The results showed an average of 336% more citations to online articles than to offline articles published in the same venue. If articles published in the same venue are of similar quality, then online articles are more highly cited because of their easier access. Thomson Scientific, traditionally concerned with printed journals, announced on November 28, 2005, the launch of the Web Citation Index™, a multidisciplinary citation index of scholarly content from institutional and subject-based repositories (http://scientific.thomson.com/press/2005/8298416/). The Web Citation Index, from the abstracting and indexing (A&I) field, connects pre-print articles, institutional repositories and open access (OA) journals (Chillingworth, 2005). Basically, all research funds are government-granted, taxpayer-supported funds, and therefore results should be made freely available to the community. Free online availability facilitates access to research findings, maximizes interaction among research groups, and optimizes the efficiency of research efforts and funds. Therefore, Ambi-Água is committed to providing free access to its articles. An important aspect of Ambi-Água is the publication and management system of this journal. It uses the Electronic System for Journal Publishing (SEER - http://www.ibict.br/secao.php?cat=SEER). This system was translated and customized by the Brazilian Institute for Science and Technology Information (IBICT) based on the software developed by the Public Knowledge Project (Open Journal Systems) of the British Columbia University (http://pkp.sfu.ca/ojs/). The big advantage of using this system is that it is compatible with the OAI-PMH protocol for metadata harvesting, which greatly promotes published articles
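
The OAI-PMH harvesting mentioned in the abstract works over plain HTTP GET requests with standard parameters. A minimal sketch of building a `ListRecords` request follows; the endpoint URL is a placeholder, while `verb` and `metadataPrefix` are standard OAI-PMH 2.0 parameters.

```python
# Sketch of constructing an OAI-PMH 2.0 harvesting request. The base URL is a
# placeholder; verb=ListRecords and metadataPrefix=oai_dc are standard protocol
# parameters (oai_dc is the Dublin Core format every repository must support).
from typing import Optional
from urllib.parse import urlencode

def list_records_url(base_url: str, metadata_prefix: str = "oai_dc",
                     set_spec: Optional[str] = None) -> str:
    """Build a ListRecords request URL for an OAI-PMH repository."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec  # optional selective harvesting by set
    return f"{base_url}?{urlencode(params)}"

url = list_records_url("https://example.org/oai")  # placeholder endpoint
```

A harvester would fetch this URL, parse the XML response, and follow `resumptionToken` elements to page through the full record set.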

  20. A systematic review on barriers, facilities, knowledge and attitude toward evidence-based medicine in Iran

    Directory of Open Access Journals (Sweden)

    Morteza Ghojazadeh

    2015-03-01

Full Text Available Introduction: Evidence-based medicine (EBM) is the ability and skill to use and integrate the best up-to-date evidence. The aim of this study was a systematic review of the barriers, facilitators, knowledge and attitude regarding EBM in Iran. Methods: In this study, database and manual searches were used with keywords such as "evidence-based, EBM, evidence-based nursing, evidence-based practice, evidence-based care, evidence-based activities, evidence-based education" and their combination with the keywords barrier, facilitator, attitude, awareness, perspective, knowledge, practice and Iran. The databases of SID (Scientific Information Database), Magiran, MEDLIB, PubMed, Google Scholar, IranMedex and CINAHL (Cumulative Index to Nursing and Allied Health Literature) were used for data collection. Results: Finally, 28 papers were included in this study. The lack of facilities, time and skill in research methodology were the most important barriers to EBM. The most and least important facilitating factors were, respectively, creating ample opportunity and detecting needs and problems. The degree of familiarity with the terminology of evidence-based practice was low (44.2%). Textbooks were considered the most significant source of information. The levels of awareness, knowledge, and evidence-based performance were less than 50.0%. Conclusion: There are many barriers to the use of EBM, and healthcare providers, despite their positive attitude toward EBM, had a low level of knowledge in the EBM setting. Given the importance of EBM, proper planning and effective interventions are necessary to remove the barriers and increase the knowledge of healthcare providers.

  1. Development of diagnosis and maintenance support system for nuclear power plants with flexible inference function and knowledge base edition support function

    International Nuclear Information System (INIS)

    Fujii, Makoto; Seki, Eiji; Tai, Ichiro; Morioka, Toshihiko

    1988-01-01

For reliable and efficient diagnosis and inspection of nuclear power plant equipment, the 'Diagnosis and Maintenance Support System' has been developed. This system has functions to assist operators or engineers in observing and evaluating equipment conditions based on experts' knowledge. These functions are carried out through dialogue between the system and users. The system has two subsystems: a diagnosis subsystem and a knowledge base edition support subsystem. To achieve the functions of the diagnosis subsystem, a new method of knowledge processing for equipment diagnosis is adopted. This method is based on the concept of 'Cause Generation and Checking'. Knowledge for diagnosis is represented with modularized production rules, and each rule module consists of four different types of rules with a hierarchical structure. With this approach, the system is equipped with sufficient performance not only in its diagnosis function but also in its flexible man-machine interface. A knowledge base edition support subsystem (Graphical Rule Editor) is provided for this system. This editor has functions to display and edit the contents of the knowledge base as tree structures on the graphic display. With these functions, the efficiency of constructing the expert system is highly improved. By applying this system to the maintenance support of a neutron monitoring system, it is proved that this system has satisfactory performance as a diagnosis and maintenance support system. (author)

  2. LIS Professionals as Knowledge Engineers.

    Science.gov (United States)

    Poulter, Alan; And Others

    1994-01-01

    Considers the role of library and information science professionals as knowledge engineers. Highlights include knowledge acquisition, including personal experience, interviews, protocol analysis, observation, multidimensional sorting, printed sources, and machine learning; knowledge representation, including production rules and semantic nets;…

  3. Virtual screening by a new Clustering-based Weighted Similarity Extreme Learning Machine approach.

    Science.gov (United States)

    Pasupa, Kitsuchart; Kudisthalert, Wasu

    2018-01-01

Machine learning techniques are becoming popular in virtual screening tasks. One of the powerful machine learning algorithms is the Extreme Learning Machine (ELM), which has been applied to many applications and has recently been applied to virtual screening. We propose the Weighted Similarity ELM (WS-ELM), which is based on a single-layer feed-forward neural network in conjunction with 16 different similarity coefficients as activation functions in the hidden layer. It is known that the performance of the conventional ELM is not robust due to random weight selection in the hidden layer. Thus, we propose a Clustering-based WS-ELM (CWS-ELM) that deterministically assigns weights by utilising clustering algorithms, i.e. k-means clustering and support vector clustering. The experiments were conducted on one of the most challenging datasets, the Maximum Unbiased Validation Dataset, which contains 17 activity classes carefully selected from PubChem. The proposed algorithms were then compared with other machine learning techniques such as support vector machine, random forest, and similarity searching. The results show that CWS-ELM in conjunction with support vector clustering yields the best performance when utilised together with the Sokal/Sneath(1) coefficient. Furthermore, the ECFP_6 fingerprint presents the best results in our framework compared to the other types of fingerprints, namely ECFP_4, FCFP_4, and FCFP_6.
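
The conventional ELM that the WS-ELM and CWS-ELM variants build on can be sketched in a few lines of NumPy: random input-to-hidden weights, then a closed-form least-squares solve for the output weights. The similarity-coefficient activations and clustering-based weight assignment of the paper are not reproduced; this is only the baseline, shown on a toy XOR task.

```python
# Sketch of a conventional Extreme Learning Machine (the baseline the paper's
# WS-ELM/CWS-ELM variants modify): random hidden layer, closed-form output weights.
import numpy as np

rng = np.random.default_rng(42)

# Toy binary task (XOR), targets in {-1, +1}.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))   # random input-to-hidden weights
b = rng.normal(size=n_hidden)                 # random hidden biases

def hidden(X):
    """Sigmoid hidden-layer activations."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Output weights via Moore-Penrose pseudoinverse (the ELM closed-form solve).
beta = np.linalg.pinv(hidden(X)) @ y

pred = np.sign(hidden(X) @ beta)
```

The paper's point is precisely that the random choice of `W` and `b` makes this baseline unstable, which the clustering-based weight assignment is meant to fix.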

  4. Health Care Leadership: Managing Knowledge Bases as Stakeholders.

    Science.gov (United States)

    Rotarius, Timothy

    Communities are composed of many organizations. These organizations naturally form clusters based on common patterns of knowledge, skills, and abilities of the individual organizations. Each of these spontaneous clusters represents a distinct knowledge base. The health care knowledge base is shown to be the natural leader of any community. Using the Central Florida region's 5 knowledge bases as an example, each knowledge base is categorized as a distinct type of stakeholder, and then a specific stakeholder management strategy is discussed to facilitate managing both the cooperative potential and the threatening potential of each "knowledge base" stakeholder.

  5. A Semantic-Based Indexing for Indoor Moving Objects

    OpenAIRE

    Tingting Ben; Xiaolin Qin; Ning Wang

    2014-01-01

The increasing availability of indoor positioning, driven by techniques like RFID, Bluetooth, and smart phones, enables a variety of indoor location-based services (LBSs). Efficient queries based on semantic constraints in indoor spaces play an important role in supporting and boosting LBSs. However, existing indoor index techniques cannot support these semantic-constraint-based queries. To solve this problem, this paper addresses the challenge of indexing moving objects in indoor spaces,...

  6. Determination of continuous complex refractive index dispersion of biotissue based on internal reflection

    Science.gov (United States)

    Deng, Zhichao; Wang, Jin; Ye, Qing; Sun, Tengqian; Zhou, Wenyuan; Mei, Jianchun; Zhang, Chunping; Tian, Jianguo

    2016-01-01

The complex refractive index dispersion (CRID), which contains the information on the refractive index dispersion and extinction coefficient spectra, is an important optical parameter of biotissue. However, it is hard to perform CRID measurements on biotissues due to their high scattering property. Continuous CRID measurement based on internal reflection (CCRIDM-IR) is introduced. Using a lab-made apparatus, internal reflectance spectra of biotissue samples at multiple incident angles were detected, from which the continuous CRIDs were calculated based on the Fresnel formula. Results showed that in the 400- to 750-nm range, hemoglobin solution has complicated dispersion and extinction coefficient spectra, while other biotissues have normal dispersion properties, and their extinction coefficients do not vary much with wavelength. The normal dispersion can be accurately described by several coefficients of dispersion equations (the Cauchy, Cornu, and Conrady equations). To our knowledge, this is the first time that the continuous CRID of scattering biotissue over a continuous spectral region has been measured, and we hereby have proven that CCRIDM-IR is a good method for continuous CRID research of biotissue.
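
The Fresnel-formula physics behind internal-reflection refractometry can be sketched directly: for light incident from inside a high-index prism onto the sample, reflectance rises toward unity at the critical angle, and the location of that transition encodes the sample's refractive index. The indices below are illustrative only; the paper fits full complex indices to measured spectra.

```python
# Sketch of the Fresnel physics behind internal-reflection refractometry.
# Indices are illustrative; the paper fits complex indices to measured spectra.
import numpy as np

def internal_reflectance_s(n_prism, n_sample, theta_i):
    """s-polarized power reflectance at the prism/sample interface,
    for light incident from inside the prism at angle theta_i (radians)."""
    cos_i = np.cos(theta_i)
    # Snell's law; the sqrt goes complex past the critical angle,
    # which is exactly the total-internal-reflection regime.
    cos_t = np.sqrt(1 - (n_prism / n_sample * np.sin(theta_i)) ** 2 + 0j)
    r = (n_prism * cos_i - n_sample * cos_t) / (n_prism * cos_i + n_sample * cos_t)
    return np.abs(r) ** 2

n_prism, n_sample = 1.72, 1.36            # e.g. a dense prism and a tissue-like index
theta_c = np.arcsin(n_sample / n_prism)   # critical angle
R_above = internal_reflectance_s(n_prism, n_sample, theta_c + 0.05)  # total reflection
R_below = internal_reflectance_s(n_prism, n_sample, theta_c - 0.20)  # partial reflection
```

Measuring the reflectance curve versus incident angle and inverting this relation (with absorption included via a complex `n_sample`) is the basic idea of the CCRIDM-IR method.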

  7. PRISMA database machine: A distributed, main-memory approach

    NARCIS (Netherlands)

    Schmidt, J.W.; Apers, Peter M.G.; Ceri, S.; Kersten, Martin L.; Oerlemans, Hans C.M.; Missikoff, M.

    1988-01-01

    The PRISMA project is a large-scale research effort in the design and implementation of a highly parallel machine for data and knowledge processing. The PRISMA database machine is a distributed, main-memory database management system implemented in an object-oriented language that runs on top of a

  8. PCA-based polling strategy in machine learning framework for coronary artery disease risk assessment in intravascular ultrasound: A link between carotid and coronary grayscale plaque morphology.

    Science.gov (United States)

    Araki, Tadashi; Ikeda, Nobutaka; Shukla, Devarshi; Jain, Pankaj K; Londhe, Narendra D; Shrivastava, Vimal K; Banchhor, Sumit K; Saba, Luca; Nicolaides, Andrew; Shafique, Shoaib; Laird, John R; Suri, Jasjit S

    2016-05-01

Percutaneous coronary interventional procedures need advance planning prior to stenting or an endarterectomy. Cardiologists use intravascular ultrasound (IVUS) for screening, risk assessment and stratification of coronary artery disease (CAD). We hypothesize that plaque components are vulnerable to rupture due to plaque progression. Currently, there are no standard grayscale IVUS tools for risk assessment of plaque rupture. This paper presents a novel strategy for risk stratification based on plaque morphology embedded with principal component analysis (PCA) for plaque feature dimensionality reduction and a dominant feature selection technique. The risk assessment utilizes 56 grayscale coronary features in a machine learning framework while linking information from carotid and coronary plaque burdens due to their common genetic makeup. This system consists of a machine learning paradigm which uses a support vector machine (SVM) combined with PCA for optimal and dominant coronary artery morphological feature extraction. The carotid artery proven intima-media thickness (cIMT) biomarker is adopted as a gold standard during the training phase of the machine learning system. For the performance evaluation, a K-fold cross validation protocol is adopted with 20 trials per fold. For choosing the dominant features out of the 56 grayscale features, a polling strategy of PCA is adopted where the original value of the features is unaltered. Different protocols are designed for establishing the stability and reliability criteria of the coronary risk assessment system (cRAS). Using the PCA-based machine learning paradigm and cross-validation protocol, a classification accuracy of 98.43% (AUC 0.98) with K=10 folds using an SVM radial basis function (RBF) kernel was achieved. A reliability index of 97.32% and a machine learning stability criterion of 5% were met for the cRAS. This is the first computer-aided design (CADx) system of its kind that is able to demonstrate the ability of coronary
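
The overall pipeline shape, PCA dimensionality reduction feeding an RBF-kernel SVM scored with 10-fold cross-validation, can be sketched with scikit-learn. The 56 grayscale IVUS features and cIMT labels are not available here, so the sketch runs on synthetic stand-in data of the same dimensionality.

```python
# Sketch of the pipeline shape: PCA reduction into an RBF-kernel SVM, scored
# with 10-fold cross-validation. Synthetic stand-in data (56 features, as in
# the paper) replace the real IVUS features, which are not available here.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# 200 "patients" with 56 features and two risk classes.
X, y = make_classification(n_samples=200, n_features=56, n_informative=10,
                           class_sep=2.0, random_state=0)

model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y,
                         cv=KFold(n_splits=10, shuffle=True, random_state=0))
mean_acc = scores.mean()
```

Wrapping the scaler, PCA, and SVM in one pipeline ensures the reduction is re-fitted inside each fold, avoiding leakage from the held-out fold into the feature selection.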

  9. Support vector machine based battery model for electric vehicles

    International Nuclear Information System (INIS)

    Wang Junping; Chen Quanshi; Cao Binggang

    2006-01-01

The support vector machine (SVM) is a novel type of learning machine based on statistical learning theory that can map a nonlinear function successfully. As a battery is a nonlinear system, it is difficult to establish the relationship between the load voltage and the current under different temperatures and states of charge (SOC). The SVM is used to model the battery's nonlinear dynamics in this paper. Tests are performed on an 80 Ah Ni/MH battery pack with the Federal Urban Driving Schedule (FUDS) cycle to set up the SVM model. Compared with the Nernst and Shepherd combined model, the SVM model can simulate the battery dynamics better with small amounts of experimental data. The maximum relative error is 3.61%
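
A battery model of this kind is a regression task, so it can be sketched with support vector regression in scikit-learn. The data below are synthetic (a hypothetical voltage surface over SOC and current); the paper's 80 Ah Ni/MH pack and FUDS measurements are not reproduced.

```python
# Sketch of a support-vector battery model: support vector regression mapping
# (SOC, current) to load voltage. The voltage surface below is hypothetical;
# the paper's FUDS cycle data are not reproduced.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
soc = rng.uniform(0.1, 1.0, 300)        # state of charge, fraction
current = rng.uniform(-2.0, 2.0, 300)   # load current (discharge positive)

# Hypothetical nonlinear per-cell voltage surface plus measurement noise.
voltage = (1.25 + 0.15 * soc - 0.05 * current + 0.05 * np.sin(3 * soc)
           + rng.normal(scale=0.005, size=300))

X = np.column_stack([soc, current])
scaler = StandardScaler().fit(X)
model = SVR(kernel="rbf", C=10.0, epsilon=0.005).fit(scaler.transform(X), voltage)

pred = model.predict(scaler.transform(X))
mae = np.abs(pred - voltage).mean()     # training-set mean absolute error
```

With measured charge/discharge data in place of the synthetic surface, temperature would be added as a third input and the fit validated on a held-out drive cycle.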

  10. Knowledge-based competitiveness indices and its connection with energy indices

    Directory of Open Access Journals (Sweden)

    Katić Andrea V.

    2016-01-01

Full Text Available The knowledge-based economy has become a major trend in international society in the 21st century. However, today's strategies place a greater emphasis on sustainability than in the past, while continuing to emphasize the importance of education and its connection with the labour market. There has been a re-orientation, in which resources, eco-efficiency and innovation have become major elements for achieving national objectives and a relevant level of competitiveness. This article deals with 30 indices which define the competitiveness of a specific economy and involve knowledge parameters. They are classified into four main categories and one special category, and are then analysed regarding the participation of Serbia and their availability. The main focus of this paper is to give a detailed analysis of energy indices as a special category of knowledge indices. It has been shown that Serbia, in many cases, was not included in the study analyses or that there was insufficient information about Serbia's position. This article shows that only a part of the presented indices includes Serbia. It is concluded that a new, revised model is needed that will include more exact indicators.

  11. Knowledge based Entrepreneurship

    DEFF Research Database (Denmark)

    Heebøll, John

This book is dedicated to enterprising people with a technical or scientific background who consider commercializing ideas and inventions within their field of expertise via a new business activity or a new company. It aims at distilling experiences from many successful and not-so-successful start-up ventures from the Technical University of Denmark, 1988–2008, into practical, portable knowledge that can be used by future knowledge-based entrepreneurs to set up new companies efficiently or to stay away from doing so; to do what's needed and avoid the pitfalls.

  12. Using Machine Learning for Land Suitability Classification

    African Journals Online (AJOL)

    User

West African Journal of Applied Ecology, vol. ... evidence for the utility of machine learning methods in land suitability classification, especially MCS methods. ... Artificial intelligence tools. ... Numerical values of the index for the various classes.

  13. Design of Composite Structures Using Knowledge-Based and Case Based Reasoning

    Science.gov (United States)

    Lambright, Jonathan Paul

    1996-01-01

A method of using knowledge-based and case-based reasoning to assist designers during conceptual design tasks of composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and ultimately product lead time. The hypothesis of this work is that the design process of composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that the inclusion of downstream life-cycle issues into the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A knowledge-based reasoning system was developed using the CLIPS (C Language Integrated Production System) environment, and a case-based reasoning system was developed using the Design Memory Utility For Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge-based and case-based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well-defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs, and warnings of potential problems and pitfalls. The case base complemented the knowledge base and extended the problem-solving capability beyond the existence of

  14. Cultural and Rhetorical Bases for communicating knowledge in web based communities

    DEFF Research Database (Denmark)

    Kampf, Constance; Kommers, Piet

    2008-01-01

How can we extend learner-centred theories for educational technology to include, for instance, the cultural and rhetorical backgrounds which influence participants in online communities as they engage in knowledge communication? Topics include: communicating knowledge via web-based communities; the intersection of culture and rhetoric in web-based communication; rhetoric and discourse in the process of communicating knowledge via technology; heuristics for knowledge communication from teaching in online forums; and connections between identity and knowledge communication. This call for papers invites papers focused on theoretical frameworks or empirical research which highlight the cultural and/or rhetorical aspects of communicating knowledge in web-based communities. We are looking for work that brings together methods and perspectives across disciplines...

  15. Knowledge management method for knowledge based BWR Core Operation Management System

    Energy Technology Data Exchange (ETDEWEB)

    Wada, Yutaka; Fukuzaki, Takaharu; Kobayashi, Yasuhiro

    1989-03-01

A knowledge management method is proposed to support an expert whose knowledge is stored in a knowledge base in the BWR Core Operation Management System. When alterations to the operation plans are motivated by the expert after evaluating them, the method attempts to find the knowledge which must be modified and to give the expert guidance. In this way, the resultant operation plans are improved by modifying the values of referenced data. Using dependencies among data, which are defined and referred to during inference, the data to be modified are retrieved. In generating modification guidance, data reference and definition procedures are classified by syntactic analysis of the knowledge. The modified data values are calculated with a sensitivity between the increment in the data to be modified and the resultant increment in the performance of the operation plans. The efficiency of knowledge management by the proposed method, when applied to a knowledge-based system including 500 pieces of knowledge for BWR control rod programming, is higher than that of interactive use of existing general-purpose editors. (author).

  16. Knowledge management method for knowledge based BWR Core Operation Management System

    International Nuclear Information System (INIS)

    Wada, Yutaka; Fukuzaki, Takaharu; Kobayashi, Yasuhiro

    1989-01-01

A knowledge management method is proposed to support an expert whose knowledge is stored in a knowledge base in the BWR Core Operation Management System. When alterations to the operation plans are motivated by the expert after evaluating them, the method attempts to find the knowledge which must be modified and to give the expert guidance. In this way, the resultant operation plans are improved by modifying the values of referenced data. Using dependencies among data, which are defined and referred to during inference, the data to be modified are retrieved. In generating modification guidance, data reference and definition procedures are classified by syntactic analysis of the knowledge. The modified data values are calculated with a sensitivity between the increment in the data to be modified and the resultant increment in the performance of the operation plans. The efficiency of knowledge management by the proposed method, when applied to a knowledge-based system including 500 pieces of knowledge for BWR control rod programming, is higher than that of interactive use of existing general-purpose editors. (author)

  17. An intelligent man-machine system for future nuclear power plants

    International Nuclear Information System (INIS)

    Takizawa, Yoji; Hattori, Yoshiaki; Itoh, Juichiro; Fukumoto, Akira

    1994-01-01

    The objective of the development of an intelligent man-machine system for future nuclear power plants is enhancement of operational reliability by applying recent advances in cognitive science, artificial intelligence, and computer technologies. To realize this objective, the intelligent man-machine system, aiming to support a knowledge-based decision making process in an operator's supervisory plant control tasks, consists of three main functions, i.e., a cognitive model-based advisor, a robust automatic sequence controller, and an ecological interface. These three functions have been integrated into a console-type nuclear power plant monitoring and control system as a validation test bed. The validation tests in which experienced operator crews participated were carried out in 1991 and 1992. The test results show the usefulness of the support functions and the validity of the system design approach

  18. Contribution Index Based on Green Building Certification Systems

    Directory of Open Access Journals (Sweden)

    Yuting Sun

    2015-05-01

Full Text Available Green Building Certification Systems (GBCS) are carried out in many countries due to the rising awareness of the importance of sustainability in the building industry. The intention is to motivate participants to construct and operate buildings sustainably; however, no method has yet been developed to investigate the motivation of the participants. Based on the GBCS, this paper proposes the contribution index as a standard global method to analyze the performance of participants in the green building industry. Three contribution indices, namely the Frequency Contribution Index (FCI), Intensity Contribution Index (ICI) and Comprehensive Contribution Index (CCI), each concerning a different category of participant, have been formulated. Three further analyses based on the indices were undertaken to investigate features of the industry. A case study of Singapore was conducted to show how the contribution index can be used to extract industry patterns and trends and to assess the participants' performance in the green building industry. Interviews with experts provide suggested applications and support for the findings.

  19. Automated knowledge base development from CAD/CAE databases

    Science.gov (United States)

    Wright, R. Glenn; Blanchard, Mary

    1988-01-01

    Knowledge base development requires a substantial investment of time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development processes through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge-based systems that are inherently free of error.

  20. Development of the Informing Relatives Inventory (IRI): Assessing Index Patients' Knowledge, Motivation and Self-Efficacy Regarding the Disclosure of Hereditary Cancer Risk Information to Relatives.

    Science.gov (United States)

    de Geus, Eveline; Aalfs, Cora M; Menko, Fred H; Sijmons, Rolf H; Verdam, Mathilde G E; de Haes, Hanneke C J M; Smets, Ellen M A

    2015-08-01

    Despite the use of genetic services, counselees do not always share hereditary cancer information with at-risk relatives. Reasons for not informing relatives may be categorized as a lack of knowledge, motivation, and/or self-efficacy. This study aims to develop and test the psychometric properties of the Informing Relatives Inventory, a battery of instruments intended to measure counselees' knowledge, motivation, and self-efficacy regarding the disclosure of hereditary cancer risk information to at-risk relatives. Guided by the proposed conceptual framework, existing instruments were selected and new instruments were developed. We tested the instruments' acceptability, dimensionality, reliability, and criterion-related validity in consecutive index patients visiting the Clinical Genetics department with questions regarding hereditary breast and/or ovarian cancer or colon cancer. Data from 211 index patients were included (response rate = 62%). The Informing Relatives Inventory (IRI) assesses three barriers to disclosure representing seven domains. Instruments assessing index patients' (positive) motivation and self-efficacy were acceptable and reliable and suggested good criterion-related validity. The psychometric properties of instruments assessing index patients' knowledge were disputable: these items were only moderately accepted by index patients, and their criterion-related validity was weaker. This study presents a first conceptual framework and associated inventory (IRI) that improves insight into index patients' barriers regarding the disclosure of genetic cancer information to at-risk relatives. Instruments assessing (positive) motivation and self-efficacy proved to be reliable measurements. Measuring index patients' knowledge appeared to be more challenging. Further research is necessary to ensure IRI's dimensionality and sensitivity to change.

  1. Automatic vetting of planet candidates from ground based surveys: Machine learning with NGTS

    Science.gov (United States)

    Armstrong, David J.; Günther, Maximilian N.; McCormac, James; Smith, Alexis M. S.; Bayliss, Daniel; Bouchy, François; Burleigh, Matthew R.; Casewell, Sarah; Eigmüller, Philipp; Gillen, Edward; Goad, Michael R.; Hodgkin, Simon T.; Jenkins, James S.; Louden, Tom; Metrailler, Lionel; Pollacco, Don; Poppenhaeger, Katja; Queloz, Didier; Raynard, Liam; Rauer, Heike; Udry, Stéphane; Walker, Simon R.; Watson, Christopher A.; West, Richard G.; Wheatley, Peter J.

    2018-05-01

    State-of-the-art exoplanet transit surveys are producing ever-increasing quantities of data. Making the best use of this resource, whether in detecting interesting planetary systems or in determining accurate planetary population statistics, requires new automated methods. Here we describe a machine learning algorithm that forms an integral part of the pipeline for the NGTS transit survey, demonstrating the efficacy of machine learning in selecting planetary candidates from multi-night ground-based survey data. Our method uses a combination of random forests and self-organising maps to rank planetary candidates, achieving an AUC score of 97.6% in ranking 12368 injected planets against 27496 false positives in the NGTS data. We build on past examples by using injected transit signals to form a training set, a necessary development for applying similar methods to upcoming surveys. We also make the autovet code used to implement the algorithm publicly accessible. autovet is designed to perform machine-learned vetting of planetary candidates, and can utilise a variety of methods. The apparent robustness of machine learning techniques, whether on space-based or the qualitatively different ground-based data, highlights their importance to future surveys such as TESS and PLATO and the need to better understand their advantages and pitfalls in an exoplanetary context.
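    The ranking stage described above (a classifier trained on injected transits versus known false positives, scored by AUC) can be sketched as follows. This is an illustrative reconstruction, not the autovet pipeline itself: the features, the nearest-centroid scorer standing in for the random forest, and all numbers are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins for per-candidate features (transit depth, SNR, shape ...)
X_inj = rng.normal(1.0, 1.0, size=(500, 5))   # injected transit signals (label 1)
X_fp = rng.normal(0.0, 1.0, size=(500, 5))    # known false positives (label 0)
X = np.vstack([X_inj, X_fp])
y = np.r_[np.ones(500), np.zeros(500)]

# Nearest-centroid score as a simple stand-in for the random-forest probability
c1, c0 = X_inj.mean(axis=0), X_fp.mean(axis=0)
scores = np.linalg.norm(X - c0, axis=1) - np.linalg.norm(X - c1, axis=1)

def auc(y, s):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    order = np.argsort(s)
    ranks = np.empty(len(s))
    ranks[order] = np.arange(1, len(s) + 1)
    n1 = int(y.sum())
    n0 = len(y) - n1
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

ranking = np.argsort(scores)[::-1]   # most planet-like candidates first
auc_score = auc(y, scores)
```

    Ranking candidates by a continuous score, rather than hard-classifying them, is what makes the AUC the natural figure of merit here: it measures how often a randomly chosen injected planet outranks a randomly chosen false positive.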

  2. Detecting Abnormal Word Utterances in Children With Autism Spectrum Disorders: Machine-Learning-Based Voice Analysis Versus Speech Therapists.

    Science.gov (United States)

    Nakai, Yasushi; Takiguchi, Tetsuya; Matsui, Gakuyo; Yamaoka, Noriko; Takada, Satoshi

    2017-10-01

    Abnormal prosody is often evident in the voice intonations of individuals with autism spectrum disorders. We compared a machine-learning-based voice analysis with human hearing judgments made by 10 speech therapists for classifying children with autism spectrum disorders (n = 30) and typical development (n = 51). Using stimuli limited to single-word utterances, machine-learning-based voice analysis was superior to speech therapist judgments. There was a significantly higher true-positive than false-negative rate for machine-learning-based voice analysis but not for speech therapists. Results are discussed in terms of some artificiality of clinician judgments based on single-word utterances, and the objectivity that machine-learning-based voice analysis adds to judging abnormal prosody.

  3. A Practical Approach to Constructing a Knowledge Graph for Cybersecurity

    Directory of Open Access Journals (Sweden)

    Yan Jia

    2018-02-01

    Full Text Available Cyberattack forms are complex and varied, and detecting and predicting dynamic types of attack are always challenging tasks. Research on knowledge graphs is becoming increasingly mature in many fields, and some scholars have recently combined the concept of the knowledge graph with cybersecurity in order to construct a cybersecurity knowledge base. This paper presents a cybersecurity knowledge base and deduction rules based on a quintuple model. Using machine learning, we extract entities and build an ontology to obtain a cybersecurity knowledge base. New rules are then deduced by calculating formulas and using the path-ranking algorithm. The Stanford named entity recognizer (NER) is also used to train an extractor to extract useful information. Experimental results show that the Stanford NER provides many features and that the useGazettes parameter may be used to train a recognizer in the cybersecurity domain in preparation for future work. Keywords: Cybersecurity, Knowledge graph, Knowledge deduction

  4. A Novel Extreme Learning Machine Classification Model for e-Nose Application Based on the Multiple Kernel Approach.

    Science.gov (United States)

    Jian, Yulin; Huang, Daoyu; Yan, Jia; Lu, Kun; Huang, Ying; Wen, Tailai; Zeng, Tanyue; Zhong, Shijie; Xie, Qilong

    2017-06-19

    A novel classification model, named the quantum-behaved particle swarm optimization (QPSO)-based weighted multiple kernel extreme learning machine (QWMK-ELM), is proposed in this paper. Experimental validation is carried out with two different electronic nose (e-nose) datasets. Unlike existing multiple kernel extreme learning machine (MK-ELM) algorithms, the combination coefficients of the base kernels are regarded as external parameters of the single-hidden-layer feedforward neural networks (SLFNs). The combination coefficients of the base kernels, the model parameters of each base kernel, and the regularization parameter are optimized by QPSO simultaneously before implementing the kernel extreme learning machine (KELM) with the composite kernel function. Four types of common single kernel functions (Gaussian kernel, polynomial kernel, sigmoid kernel, and wavelet kernel) are utilized to constitute different composite kernel functions. Moreover, the method is also compared with other existing classification methods: extreme learning machine (ELM), kernel extreme learning machine (KELM), k-nearest neighbors (KNN), support vector machine (SVM), multi-layer perceptron (MLP), radial basis function neural network (RBFNN), and probabilistic neural network (PNN). The results demonstrate that the proposed QWMK-ELM outperforms the aforementioned methods, not only in precision but also in efficiency, for gas classification.
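    The KELM step with a weighted composite kernel can be sketched in a few lines. This is a generic weighted-multiple-kernel KELM on invented data with fixed kernel weights; in the paper, those weights, the kernel parameters, and the regularization parameter C would all be tuned by QPSO.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy two-class "e-nose" data: rows are samples, columns are sensor features
X = rng.normal(size=(80, 6))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])          # labels in {-1, +1}
T = y.reshape(-1, 1)

def gaussian_kernel(A, B, s=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * s ** 2))

def poly_kernel(A, B, deg=2):
    return (A @ B.T + 1.0) ** deg

# Weighted composite kernel; the weights would be optimized by QPSO
w = [0.7, 0.3]
K = w[0] * gaussian_kernel(X, X) + w[1] * poly_kernel(X, X)

C = 10.0                                       # regularization parameter
# KELM output weights: beta = (I/C + K)^-1 T, a single linear solve
beta = np.linalg.solve(np.eye(len(X)) / C + K, T)

def predict(X_new):
    K_new = w[0] * gaussian_kernel(X_new, X) + w[1] * poly_kernel(X_new, X)
    return np.sign(K_new @ beta).ravel()

train_acc = (predict(X) == y).mean()
```

    The appeal of KELM over iterative training is visible here: once the composite kernel matrix is formed, the output weights come from one regularized linear solve.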

  5. Knowledge-based development in Singapore and Malaysia

    OpenAIRE

    Menkhoff, Thomas; Gerke, Solvay; Evers, Hans-Dieter; Chay, Yue Wah

    2009-01-01

    This paper addresses the question of how knowledge is used to benefit the economic development of Singapore and Malaysia. Both countries have followed strict science policies to establish knowledge governance regimes for a knowledge-based economy. On the basis of empirical studies in both countries, we show how ethnic and religious diversity impact the ability to develop an epistemic culture of knowledge sharing and, ultimately, an innovative knowledge-based economy.

  6. Knowledge base rule partitioning design for CLIPS

    Science.gov (United States)

    Mainardi, Joseph D.; Szatkowski, G. P.

    1990-01-01

    This paper describes a knowledge base (KB) partitioning approach to solving the problem of real-time performance when the CLIPS AI shell contains large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The main objective of the Expert System advanced development project (ADP-2302) is to provide robust systems responding to new data frames at 0.1 to 1.0 second intervals. Intelligent system control must be performed within the specified real-time window in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time; this reduced complexity improves performance without undue impact during load and unload cycles. The second objective is to produce highly reliable intelligent systems, which requires simple and automated approaches to the KB verification and validation (V&V) task. Partitioning the KB reduces overall rule-interaction complexity, and reduced interaction simplifies the necessary V&V testing by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading the rules that directly apply to a given condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple scheme has been designed to control the addition and deletion of 'blocks' of rules and facts, while allowing real-time operations to run freely. Timing tests of real-time performance for specific machines under real-time operating systems have not been completed, but are planned as part of the analysis process to validate the design.

  7. Extreme Learning Machine and Moving Least Square Regression Based Solar Panel Vision Inspection

    Directory of Open Access Journals (Sweden)

    Heng Liu

    2017-01-01

    Full Text Available In recent years, learning-based machine intelligence has attracted considerable attention across science and engineering. Particularly in the field of automatic industrial inspection, machine-learning-based vision inspection plays an increasingly important role in defect identification and feature extraction. By learning from image samples, many features of industrial objects, such as shapes, positions, and orientation angles, can be obtained and then used to determine whether a defect is present. However, robustness and speed are not easily achieved in such inspection. In this work, for solar panel vision inspection, we present an extreme learning machine (ELM) and moving least square regression based approach to identify solder joint defects and detect the panel position. Firstly, histogram peaks distribution (HPD) and fractional calculus are applied for image preprocessing. Then an ELM-based identification of defective solder joints is discussed in detail. Finally, the moving least square regression (MLSR) algorithm is introduced for solar panel position determination. Experimental results and comparisons show that the proposed ELM- and MLSR-based inspection method is efficient not only in detection accuracy but also in processing speed.
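    The MLSR step, used above for position determination, admits a compact sketch: for each query point, a low-order polynomial is fitted by least squares with weights that decay with distance from the query. The 1-D data below are invented for illustration; they merely stand in for noisy measured edge positions.

```python
import numpy as np

rng = np.random.default_rng(3)
# Noisy 1-D samples standing in for measured panel-edge positions
x = np.linspace(0.0, 1.0, 60)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.05, size=60)

def mlsr(xq, x, y, h=0.15, deg=2):
    """Moving least squares: local polynomial fit with Gaussian weights."""
    w = np.exp(-((x - xq) / h) ** 2)           # weights decay away from xq
    A = np.vander(x - xq, deg + 1)             # local polynomial basis
    coef = np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (w * y))
    return coef[-1]                            # constant term = value at xq

y_smooth = np.array([mlsr(t, x, y) for t in x])
max_err = np.abs(y_smooth - np.sin(2 * np.pi * x)).max()
```

    Because each query point gets its own weighted fit, MLSR tracks a curved edge while suppressing pixel-level measurement noise.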

  8. Knowledge-based engineering of a PLC controlled telescope

    Science.gov (United States)

    Pessemier, Wim; Raskin, Gert; Saey, Philippe; Van Winckel, Hans; Deconinck, Geert

    2016-08-01

    As the new control system of the Mercator Telescope is being finalized, we can review some technologies and design methodologies that are advantageous, despite their relative uncommonness in astronomical instrumentation. A particularity of the Mercator Telescope is that it is controlled by a single high-end soft-PLC (Programmable Logic Controller). Using only off-the-shelf components, our distributed embedded system controls all subsystems of the telescope, such as the pneumatic primary mirror support, the hydrostatic bearing, the telescope axes, the dome, and the safety system. We show how real-time application logic can be written conveniently in typical PLC languages (IEC 61131-3) and in C++ (to implement the pointing kernel) using the commercial TwinCAT 3 programming environment. This software processes the inputs and outputs of the distributed system in real time via an observatory-wide EtherCAT network, which is synchronized with high precision to an IEEE 1588 (PTP, Precision Time Protocol) time reference clock. Taking full advantage of the ability of soft-PLCs to run both real-time and non-real-time software, the same device also hosts the most important user interfaces (HMIs or Human Machine Interfaces) and communication servers (OPC UA for process data, FTP for XML configuration data, and VNC for remote control). To manage the complexity of the system and to streamline the development process, we show how most of the software, electronics, and systems engineering aspects of the control system have been modeled as a set of scripts written in a Domain Specific Language (DSL). When executed, these scripts populate a Knowledge Base (KB) which can be queried to retrieve specific information. By feeding the results of those queries to a template system, we were able to generate very detailed "browsable" web-based documentation about the system, but also PLC software code, Python client code, model verification reports, etc. The aim of this paper is to

  9. Effective nationwide school-based participatory extramural program on adolescent body mass index, health knowledge and behaviors.

    Science.gov (United States)

    Heo, Moonseong; Jimenez, Camille C; Lim, Jean; Isasi, Carmen R; Blank, Arthur E; Lounsbury, David W; Fredericks, Lynn; Bouchard, Michelle; Faith, Myles S; Wylie-Rosett, Judith

    2018-01-16

    Adolescent obesity is a major public health concern. Open to all high school students regardless of weight status, HealthCorps is a nationwide program offering a comprehensive high school-based participatory educational program to indirectly address obesity. We tested the hypothesis that the HealthCorps program would decrease BMI z-scores among overweight or obese students and reduce obesity rates, and we evaluated its effects on health knowledge and behaviors. HealthCorps aimed to improve student knowledge and behaviors regarding nutrition quality, physical activity, sleep, breakfast intake, and mental resilience. Participating students received weekly or bi-weekly classroom lessons from HealthCorps coordinators, either for a semester or a year, in addition to various during- and after-school health-promoting activities and mentorship. Self-reported height and weight were collected along with questionnaires assessing knowledge and behaviors during the 2013-2014 academic year among 14 HealthCorps-participating New York City high schools. This quasi-experimental two-arm pre-post trial included 611 HealthCorps and 221 comparison-arm students in the analytic sample. Sex-specific analyses stratified by weight status were adjusted for age and Hispanic ethnicity, with clustering effects of schools and students taken into account. HealthCorps female overweight/obese and obese students had a significant decrease in BMI z-scores (post-pre delta BMI z-score = -0.16, 95% CI (-0.26, -0.05), p = 0.004 for the former; -0.23, 95% CI (-0.44, -0.03), p = 0.028 for the latter), whereas their comparison-arm counterparts did not. The HealthCorps students, but not the comparison students, had a significant increase in all knowledge domains except the breakfast realm, and reported a greater number of significant behavior changes, including fruit and vegetable intake and physical activity. The HealthCorps program was associated with reduced BMI z-score in overweight/obese and obese

  10. Memory Based Machine Intelligence Techniques in VLSI hardware

    OpenAIRE

    James, Alex Pappachen

    2012-01-01

    We briefly introduce memory-based approaches to emulating machine intelligence in VLSI hardware, describing their challenges and advantages. Implementation of artificial intelligence techniques in VLSI hardware is a practical and difficult problem. Deep architectures, hierarchical temporal memories, and memory networks are some of the contemporary approaches in this area of research. The techniques attempt to emulate low-level intelligence tasks and aim at providing scalable solutions to high ...

  11. Evaluation of Hindi to Punjabi Machine Translation System

    OpenAIRE

    Goyal, Vishal; Lehal, Gurpreet Singh

    2009-01-01

    Machine translation in India is relatively young. The earliest efforts date from the late 80s and early 90s. The success of every system is judged from its experimental evaluation results. A number of machine translation systems have been under development, but to the best of the authors' knowledge, no high-quality system has been completed that can be used in real applications. Recently, Punjabi University, Patiala, India has developed a Punjabi to Hindi machine translation system with high accur...

  12. Identification of Forested Landslides Using LiDar Data, Object-based Image Analysis, and Machine Learning Algorithms

    Directory of Open Access Journals (Sweden)

    Xianju Li

    2015-07-01

    Full Text Available For the identification of forested landslides, most studies focus on knowledge-based and pixel-based analysis (PBA) of LiDar data, while few studies have examined (semi-)automated methods and object-based image analysis (OBIA). Moreover, most of them are focused on soil-covered areas with gentle hillslopes. In bedrock-covered mountains with steep and rugged terrain, landslides are so difficult to identify that there is currently no research on whether combining semi-automated methods and OBIA with only LiDar derivatives could be more effective. In this study, a semi-automatic object-based landslide identification approach was developed and implemented in a forested area, the Three Gorges of China. Comparisons of OBIA and PBA, of two different machine learning algorithms, and of their respective sensitivity to feature selection (FS) were first investigated. Based on the classification result, the landslide inventory was finally obtained according to (1) inclusion of holes encircled by the landslide body; (2) removal of isolated segments; and (3) delineation of closed envelope curves for landslide objects by a manual digitizing operation. The proposed method achieved the following: (1) filter features of surface roughness were applied for the first time in calculating object features, and proved useful; (2) FS improved classification accuracy and reduced the number of features; (3) the random forest algorithm achieved higher accuracy and was less sensitive to FS than a support vector machine; (4) compared to PBA, OBIA was more sensitive to FS, remarkably reduced computing time, and depicted more contiguous terrain segments; (5) based on the classification result, with an overall accuracy of 89.11% ± 0.03%, the obtained inventory map was consistent with the reference landslide inventory map, with a position mismatch value of 9%. The outlined approach would be helpful for forested landslide identification in steep and rugged terrain.
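    A surface-roughness filter feature of the kind mentioned above is commonly computed as the standard deviation of elevation within a moving window over a LiDar-derived DEM. The sketch below uses that common definition on an invented elevation grid, since the paper's exact filter is not reproduced in the abstract.

```python
import numpy as np

rng = np.random.default_rng(4)
# Invented 50 x 50 elevation grid standing in for a LiDar-derived DEM
dem = rng.normal(0.0, 1.0, size=(50, 50)).cumsum(axis=0).cumsum(axis=1)

def roughness(dem, win=3):
    """Surface roughness: std. dev. of elevation in a win x win moving window."""
    r = win // 2
    pad = np.pad(dem, r, mode="edge")          # replicate edges at the border
    out = np.empty_like(dem, dtype=float)
    for i in range(dem.shape[0]):
        for j in range(dem.shape[1]):
            out[i, j] = pad[i:i + win, j:j + win].std()
    return out

rough = roughness(dem)                          # one roughness value per cell
```

    In an OBIA workflow, such per-cell values would then be aggregated per segment (mean, max, etc.) to form object features for the classifier.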

  13. Machine Learning-based Virtual Screening and Its Applications to Alzheimer's Drug Discovery: A Review.

    Science.gov (United States)

    Carpenter, Kristy A; Huang, Xudong

    2018-06-07

    Virtual Screening (VS) has emerged as an important tool in the drug development process, as it conducts efficient in silico searches over millions of compounds, ultimately increasing yields of potential drug leads. As a subset of Artificial Intelligence (AI), Machine Learning (ML) is a powerful way of conducting VS for drug leads. ML for VS generally involves assembling a filtered training set of compounds, comprised of known actives and inactives. After training, the model is validated and, if sufficiently accurate, used on previously unseen databases to screen for novel compounds with desired drug target binding activity. This study aims to review ML-based methods used for VS and their applications to Alzheimer's disease (AD) drug discovery. To update the current knowledge on ML for VS, we review thorough backgrounds, explanations, and VS applications of the following ML techniques: Naïve Bayes (NB), k-Nearest Neighbors (kNN), Support Vector Machines (SVM), Random Forests (RF), and Artificial Neural Networks (ANN). All techniques have found success in VS, but the future of VS is likely to lean more heavily toward the use of neural networks, and more specifically Convolutional Neural Networks (CNN), a subset of ANN that utilize convolution. We additionally conceptualize a workflow for conducting ML-based VS for potential therapeutics for AD, a complex neurodegenerative disease with no known cure or prevention. This serves both as an example of how to apply the concepts introduced earlier in the review and as a potential workflow for future implementation. Different ML techniques are powerful tools for VS, albeit each with its advantages and disadvantages, and ML-based VS can be applied to AD drug development.
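    As a concrete instance of the workflow above (train on known actives and inactives, then score unseen compounds), here is a minimal Naïve Bayes screen over binary molecular fingerprints. The fingerprints are synthetic: the first 16 bits are invented "pharmacophore" bits enriched among actives, and none of this reflects any real compound library.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic 64-bit fingerprints; actives enrich the first 16 bits
p_active = np.r_[np.full(16, 0.9), np.full(48, 0.3)]
actives = (rng.random((100, 64)) < p_active).astype(int)
inactives = (rng.random((100, 64)) < 0.3).astype(int)

def bernoulli_nb_fit(X1, X0, alpha=1.0):
    """Per-bit Bernoulli likelihoods with Laplace smoothing."""
    p1 = (X1.sum(0) + alpha) / (len(X1) + 2 * alpha)   # P(bit=1 | active)
    p0 = (X0.sum(0) + alpha) / (len(X0) + 2 * alpha)   # P(bit=1 | inactive)
    return p1, p0

def score(X, p1, p0):
    """Log-likelihood ratio per compound: positive means 'more likely active'."""
    return (X * np.log(p1 / p0) + (1 - X) * np.log((1 - p1) / (1 - p0))).sum(1)

p1, p0 = bernoulli_nb_fit(actives, inactives)
hit_rate = (score(actives, p1, p0) > 0).mean()
false_alarm = (score(inactives, p1, p0) > 0).mean()
```

    In practice the ranked scores, not the hard threshold, drive VS: the top fraction of a screened library is forwarded to docking or assay.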

  14. Linked data querying through FCA-based schema indexing

    OpenAIRE

    Brosius, Dominik; Staab, Steffen

    2016-01-01

    The efficiency of SPARQL query evaluation against Linked Open Data may benefit from schema-based indexing. However, many data items come with incomplete schema information or lack schema descriptions entirely. In this position paper, we outline an approach to indexing linked data graphs based on schemata induced through Formal Concept Analysis. We show how to map queries onto RDF graphs based on such derived schema information. We sketch next steps for realizing and optimizing the sugges...
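    The schema-induction step via Formal Concept Analysis can be illustrated on a toy object-attribute context, with subjects as objects and their observed RDF properties as attributes. The data and property names below are invented; real inputs would come from the linked data graph.

```python
from itertools import combinations

# Toy context: subject -> set of schema properties observed in the data
context = {
    "s1": {"name", "email"},
    "s2": {"name", "email", "phone"},
    "s3": {"name", "homepage"},
}

def extent(attrs):
    """All subjects carrying every attribute in attrs."""
    return {s for s, a in context.items() if attrs <= a}

def intent(subjects):
    """All attributes shared by every subject in subjects."""
    sets = [context[s] for s in subjects]
    return set.intersection(*sets) if sets else set()

# Formal concepts: (extent, intent) pairs closed under the two derivations
concepts = set()
for r in range(len(context) + 1):
    for combo in combinations(context, r):
        e = extent(intent(set(combo)))
        concepts.add((frozenset(e), frozenset(intent(e))))
```

    Each concept's intent acts as an induced schema (a class of co-occurring properties), and its extent is the set of data items indexed under that schema.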

  15. Automated Bug Assignment: Ensemble-based Machine Learning in Large Scale Industrial Contexts

    OpenAIRE

    Jonsson, Leif; Borg, Markus; Broman, David; Sandahl, Kristian; Eldh, Sigrid; Runeson, Per

    2016-01-01

    Bug report assignment is an important part of software maintenance. In particular, incorrect assignment of bug reports to development teams can be very expensive in large software development projects. Several studies propose automated bug assignment techniques using machine learning in open source software contexts, but no study exists for large-scale proprietary projects in industry. The goal of this study is to evaluate automated bug assignment techniques that are based on machine learni...

  16. Light-operated machines based on threaded molecular structures.

    Science.gov (United States)

    Credi, Alberto; Silvi, Serena; Venturi, Margherita

    2014-01-01

    Rotaxanes and related species represent the most common implementation of the concept of artificial molecular machines, because the supramolecular nature of the interactions between the components and their interlocked architecture allow precise control over the position and movement of the molecular units. The use of light to power artificial molecular machines is particularly valuable because it can play the dual role of "writing" and "reading" the system. Moreover, light-driven machines can operate without the accumulation of waste products, and photons are ideal inputs for enabling autonomous operation mechanisms. In appropriately designed molecular machines, light can be used to control not only the stability of the system, which affects the relative position of the molecular components, but also the kinetics of the mechanical processes, thereby enabling control over the direction of the movements. This step forward is necessary in order to make the leap from molecular machines to molecular motors.

  17. Web based machine status display for INDUS-1 And INDUS-2

    International Nuclear Information System (INIS)

    Srivastava, B.S.K.; Fatnani, P.

    2003-01-01

    The web-based machine status display for Indus-1 and Indus-2 is designed to provide the on-line status of Indus-1 and Indus-2 to clients located at various places on the CAT premises. Presently, this system provides Indus-1 machine status (e.g. beam current, integrated current, beam lifetime, etc.) to users working in the Indus-1 building, but using web browsers the same information can be accessed throughout the CAT network. This system is a part of the Indus-1 Control System web site, which is partially constructed and still under development. (author)

  18. Energy Analysis of Contention Tree-Based Access Protocols in Dense Machine-to-Machine Area Networks

    Directory of Open Access Journals (Sweden)

    Francisco Vázquez-Gallego

    2015-01-01

    Full Text Available Machine-to-Machine (M2M) area networks aim at connecting an M2M gateway with a large number of energy-constrained devices that must operate autonomously for years. Therefore, attaining high energy efficiency is essential in the deployment of M2M networks. In this paper, we consider a dense M2M area network composed of hundreds or thousands of devices that periodically transmit data upon request from a gateway or coordinator. We theoretically analyse the devices' energy consumption using two Medium Access Control (MAC) protocols which are based on a tree-splitting algorithm to resolve collisions among devices: the Contention Tree Algorithm (CTA) and Distributed Queuing (DQ) access. We have carried out computer-based simulations to validate the accuracy of the theoretical models and to compare the energy performance of DQ, CTA, and Frame Slotted-ALOHA (FSA) in M2M area networks with devices compliant with the IEEE 802.15.4 physical layer. Results show that the performance of DQ is totally independent of the number of contending devices, and that it can reduce the energy consumed per device by more than 35% with respect to CTA and by more than 80% with respect to FSA.
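    The tree-splitting idea behind CTA is easy to simulate: colliding devices flip a fair coin to split into two subsets, and each subset is resolved recursively; every collision, success, or idle event consumes one slot. The simulation below is a generic illustration of the mechanism, not the paper's energy model, and the device count and run count are invented.

```python
import random

def cta_slots(n, rng):
    """Slots needed to resolve n simultaneously contending devices."""
    if n <= 1:
        return 1                    # an idle slot or a successful transmission
    # Collision slot: every device flips a coin to join the left/right subset
    left = sum(rng.random() < 0.5 for _ in range(n))
    return 1 + cta_slots(left, rng) + cta_slots(n - left, rng)

rng = random.Random(0)
runs = [cta_slots(100, rng) for _ in range(50)]
slots_per_device = sum(runs) / (50 * 100)   # roughly 2.9 for large n
```

    Since idle listening and collision slots dominate a device's radio-on time, the slot count per device is a reasonable first proxy for the energy comparison the paper makes analytically.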

  19. RBMK full scope simulator gets virtual refuelling machine

    International Nuclear Information System (INIS)

    Khoudiakov, M.; Slonimsky, V.; Mitrofanov, S.

    2006-01-01

    The paper describes the continuation of efforts by an international Russian-Norwegian joint team to drastically increase operational safety during the refuelling process of an RBMK-type reactor by implementing a training simulator based on an innovative Virtual Reality (VR) approach. During the preceding stage of the project, a display-based simulator was extended with VR models of the real Refuelling Machine (RM) and its environment in order to improve both the learning process and operational effectiveness. The simulator's challenge is, first, to support the operational activity of RM staff, and also to play a major part in developing basic knowledge and skills and in keeping skilled staff in close touch with the complex machinery of the Refuelling Machine. At the given second stage, the functional scope of the VR simulator was greatly enhanced: firstly, by connecting it to the RBMK-unit full-scope simulator, and, secondly, by a training program and simulator model upgrade. (author)

  20. Atomic Force Microscopy Based Cell Shape Index

    Science.gov (United States)

    Adia-Nimuwa, Usienemfon; Mujdat Tiryaki, Volkan; Hartz, Steven; Xie, Kan; Ayres, Virginia

    2013-03-01

    Stellation is a measure of cell physiology and pathology for several cell groups, including neural, liver, and pancreatic cells. In the present work, we compare the results of a conventional two-dimensional shape index study of both atomic force microscopy (AFM) and fluorescence microscopy images with the results obtained using a new three-dimensional AFM-based shape index similar to a sphericity index. The stellation of astrocytes is investigated on nanofibrillar scaffolds composed of electrospun polyamide nanofibers, which have demonstrated promise for central nervous system (CNS) repair. Recent work by our group has given us the ability to clearly segment the cells from nanofibrillar scaffolds in AFM images. The clear-featured AFM images indicated that the astrocyte processes were longer than previously identified at 24 h. It was furthermore shown that cell spreading could vary significantly as a function of environmental parameters, and that AFM images could record these variations. The new three-dimensional AFM-based shape index incorporates the new information: longer stellate processes and cell spreading. The support of NSF PHY-095776 is acknowledged.
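    For reference, the classical sphericity index that the new three-dimensional measure is said to resemble compares the surface area of a volume-equivalent sphere to the cell's actual surface area (Wadell's definition; the paper's exact index is not given in the abstract):

```latex
\Psi = \frac{\pi^{1/3}\,(6 V_p)^{2/3}}{A_p}
```

    where $V_p$ is the cell volume and $A_p$ its surface area; $\Psi = 1$ for a perfect sphere and decreases as the shape becomes more stellate.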

  1. Integrating source-language context into phrase-based statistical machine translation

    NARCIS (Netherlands)

    Haque, R.; Kumar Naskar, S.; Bosch, A.P.J. van den; Way, A.

    2011-01-01

    The translation features typically used in Phrase-Based Statistical Machine Translation (PB-SMT) model dependencies between the source and target phrases, but not among the phrases in the source language themselves. A swathe of research has demonstrated that integrating source context modelling

  2. Machining of γ-TiAl

    Energy Technology Data Exchange (ETDEWEB)

    Aust, E.; Niemann, H.-R. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Werkstofforschung

    1999-09-01

    Knowledge of the machining parameters for titanium aluminides of the type γ-TiAl is essential for the acceptance and application of this new heat-resistant light-weight material for high-performance components in automobile and aircraft engines. This work evaluates drilling, turning, sawing, milling, electroerosion, grinding, and high-pressure water-jetting of primary castings. The results indicate that there is potential for each machining process, but a high quality of surface finish can only be achieved by some of the processes. (orig.)

  3. Programming and machining of complex parts based on CATIA solid modeling

    Science.gov (United States)

    Zhu, Xiurong

    2017-09-01

    Complex parts were designed, programmed, and simulated using CATIA solid modeling, illustrating the importance of programming and process planning in the field of CNC machining. During part design, the working principle was first analyzed in depth, after which the dimensions and their interconnected dimension chains were laid out; the final part dimensions were then calculated by back-stepping and a variety of other methods. The part material was chosen after careful study and repeated testing, with 6061 aluminum alloy selected in the end. Machining planning must take comprehensive account of the various factors at the actual processing site, and the simulation should be based on the actual process rather than on shape alone. The approach can serve as a reference for machining similar parts.

  4. Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis

    Science.gov (United States)

    Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang

    2017-07-01

    Traditional reliability evaluation of machine center components overlooks failure propagation, so the component reliability model exhibits deviation and the evaluation result is low. To rectify these problems, a new reliability evaluation method based on cascading failure analysis and failure-influence-degree assessment is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure influence degrees of the system components are assessed using the adjacency matrix and its transposition, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center, with the following results: 1) the reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) the difference between the comprehensive and inherent reliability of a system component is positively correlated with its failure influence degree, which provides a theoretical basis for reliability allocation of machine center systems.
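
The influence-assessment step described above can be sketched as a plain power-iteration PageRank over a directed cascading-failure graph. The toy graph, the damping factor, and the reading of "influence" (a component that receives propagated failures from many others scores high) are illustrative assumptions, not the paper's actual model or data.

```python
import numpy as np

def influence_scores(adj, d=0.85, tol=1e-12):
    """PageRank-style failure-influence scores from a directed graph.

    adj[i, j] = 1 means a failure of component i propagates to component j.
    """
    n = adj.shape[0]
    out = adj.sum(axis=1).astype(float)
    out[out == 0] = 1.0                    # dangling components: avoid div by zero
    M = (adj / out[:, None]).T             # column-stochastic transition matrix
    r = np.full(n, 1.0 / n)
    while True:
        r_next = (1 - d) / n + d * M @ r   # damped power iteration
        if np.abs(r_next - r).sum() < tol:
            return r_next
        r = r_next

# Toy cascade graph: failures of 0 reach 1 and 2; failures of 1 reach 2.
adj = np.array([[0, 1, 1],
                [0, 0, 1],
                [0, 0, 0]])
scores = influence_scores(adj)
```

Component 2, which can be reached by cascades from both other components, ends up with the highest score.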

  5. Wireless sEMG-Based Body-Machine Interface for Assistive Technology Devices.

    Science.gov (United States)

    Fall, Cheikh Latyr; Gagnon-Turcotte, Gabriel; Dube, Jean-Francois; Gagne, Jean Simon; Delisle, Yanick; Campeau-Lecours, Alexandre; Gosselin, Clement; Gosselin, Benoit

    2017-07-01

    Assistive technology (AT) tools and appliances are ever more widely used and developed worldwide to improve the autonomy of people living with disabilities and to ease interaction with their environment. This paper describes an intuitive, wireless, surface electromyography (sEMG) based body-machine interface for AT tools. Spinal cord injuries at the C5-C8 levels affect patients' control of the arms, forearms, hands, and fingers, so using classical AT control interfaces (keypads, joysticks, etc.) is often difficult or impossible. The proposed system reads the AT users' residual functional capacities through their sEMG activity and converts them into appropriate commands using a threshold-based control algorithm. It has proven suitable as a control alternative for assistive devices and has been tested with the JACO arm, an articulated assistive device whose purpose is to help people living with upper-body disabilities in their daily life activities. The wireless prototype, whose architecture is based on a 3-channel sEMG measurement system and a 915-MHz wireless transceiver built around a low-power microcontroller, uses low-cost off-the-shelf commercial components. The embedded controller is compared with JACO's regular joystick-based interface, using combinations of forearm, pectoral, masseter, and trapezius muscles. The measured index-of-performance values are 0.88, 0.51, and 0.41 bits/s, respectively, for correlation coefficients with Fitts' model of 0.75, 0.85, and 0.67. These results demonstrate that the proposed controller offers an attractive alternative to conventional interfaces, such as joystick devices, for upper-body disabled people using ATs such as JACO.
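
A threshold-based mapping from an sEMG envelope to binary commands, of the kind described above, can be sketched as follows. The use of hysteresis (separate on/off thresholds) and the toy envelope values are assumptions for illustration, not details taken from the paper.

```python
def emg_to_commands(envelope, on_thresh=0.5, off_thresh=0.3):
    """Turn a rectified, smoothed sEMG envelope into on/off commands.

    Hysteresis (on_thresh > off_thresh) avoids command chattering when
    the muscle activity hovers near a single threshold.
    """
    active = False
    commands = []
    for level in envelope:
        if not active and level >= on_thresh:
            active = True            # contraction detected
        elif active and level < off_thresh:
            active = False           # contraction released
        commands.append(active)
    return commands

# Toy envelope: rest, strong contraction, slow release, rest.
states = emg_to_commands([0.1, 0.6, 0.4, 0.2, 0.1])
```

Note how the 0.4 sample stays "on" even though it is below the activation threshold, which is the point of the hysteresis band.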

  6. Comparison of Different Machine Learning Approaches for Monthly Satellite-Based Soil Moisture Downscaling over Northeast China

    Directory of Open Access Journals (Sweden)

    Yangxiaoyue Liu

    2017-12-01

    Full Text Available Although numerous satellite-based soil moisture (SM) products can provide spatiotemporally continuous worldwide datasets, they can hardly be employed in characterizing fine-grained regional land surface processes, owing to their coarse spatial resolution. In this study, we propose a machine-learning-based method to enhance SM spatial accuracy and improve the availability of SM data. Four machine learning algorithms, including classification and regression trees (CART), K-nearest neighbors (KNN), Bayesian (BAYE), and random forests (RF), were implemented to downscale the monthly European Space Agency Climate Change Initiative (ESA CCI) SM product from 25-km to 1-km spatial resolution. In the regression, land surface temperature (daytime, nighttime, and diurnal fluctuation), normalized difference vegetation index, surface reflectance (red, blue, NIR, and MIR bands), and a digital elevation model were taken as explanatory variables to produce fine spatial resolution SM. We chose Northeast China as the study area and acquired the corresponding SM data from 2003 to 2012 in unfrozen seasons. The reconstructed SM datasets were validated against in-situ measurements. The results showed that the RF-downscaled results had superior matching performance to both ESA CCI SM and in-situ measurements, and responded positively to precipitation variation. Additionally, RF was less affected by its parameters, which reveals its robustness. Both CART and KNN ranked second: compared to KNN, CART correlated more closely with the validation data, but KNN showed preferable precision. BAYE ranked last, with significantly abnormal regression values.
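
The downscaling workflow described above (fit predictors to coarse SM, then apply the fitted model to fine-resolution predictors) can be sketched with scikit-learn's RandomForestRegressor. The feature columns, value ranges, and the linear toy relation below are stand-in assumptions, not the ESA CCI data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic coarse-resolution cells: [LST day, LST night, NDVI, elevation],
# all scaled to [0, 1] as stand-ins for the predictors named in the abstract.
X_coarse = rng.uniform(size=(400, 4))
sm_coarse = (0.35 * X_coarse[:, 2] - 0.20 * X_coarse[:, 0] + 0.30
             + rng.normal(0.0, 0.02, 400))      # toy "CCI soil moisture"

# Step 1: learn the predictor -> soil moisture relation at 25-km scale.
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_coarse, sm_coarse)

# Step 2: apply the fitted model to 1-km predictors to downscale.
X_fine = rng.uniform(size=(25, 4))
sm_fine = rf.predict(X_fine)
```

The same two-step pattern applies unchanged if RF is swapped for CART, KNN, or a Bayesian regressor, which is what makes the four-way comparison in the study straightforward.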

  7. An Algebraic Approach to Knowledge Bases Informational Equivalence

    OpenAIRE

    Plotkin, B.; Plotkin, T.

    2003-01-01

    In this paper we study the notion of knowledge from the positions of universal algebra and algebraic logic. We consider first order knowledge which is based on first order logic. We define categories of knowledge and knowledge bases. These notions are defined for the fixed subject of knowledge. The key notion of informational equivalence of two knowledge bases is introduced. We use the idea of equivalence of categories in this definition. We prove that for finite models there is a clear way t...

  8. Knowledge based word-concept model estimation and refinement for biomedical text mining.

    Science.gov (United States)

    Jimeno Yepes, Antonio; Berlanga, Rafael

    2015-02-01

    Text mining of scientific literature has been essential for setting up large public biomedical databases, which are widely used by the research community. In the biomedical domain, the existence of a large number of terminological resources and knowledge bases (KB) has enabled a myriad of machine learning methods for different text mining related tasks. Unfortunately, KBs have not been devised for text mining tasks but for human interpretation, so the performance of KB-based methods is usually lower than that of supervised machine learning methods. The disadvantage of supervised methods, though, is that they require labeled training data and are therefore not useful for large-scale biomedical text mining systems; KB-based methods do not have this limitation. In this paper, we describe a novel method to generate word-concept probabilities from a KB, which can serve as a basis for several text mining tasks. This method takes into account not only the underlying patterns within the descriptions contained in the KB but also those in texts available from large unlabeled corpora such as MEDLINE. The parameters of the model have been estimated without training data. Patterns from MEDLINE have been built using MetaMap for entity recognition and related using co-occurrences. The word-concept probabilities were evaluated on the task of word sense disambiguation (WSD). The results showed that our method obtained a higher degree of accuracy than other state-of-the-art approaches when evaluated on the MSH WSD data set. We also evaluated our method on the task of document ranking using MEDLINE citations. These results also showed an increase in performance over existing baseline retrieval approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
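
At its simplest, a word-concept probability table of the kind described above is a normalization of (word, concept) co-occurrence counts. The counts and the concept identifiers in this sketch are made-up placeholders, not MetaMap output or the paper's actual estimation procedure.

```python
from collections import Counter, defaultdict

def word_concept_probs(cooccurrence_counts):
    """Estimate P(word | concept) by normalizing co-occurrence counts,
    e.g. counts harvested from concept-annotated MEDLINE abstracts."""
    totals = Counter()
    per_concept = defaultdict(Counter)
    for (word, concept), n in cooccurrence_counts.items():
        per_concept[concept][word] += n
        totals[concept] += n
    return {concept: {word: n / totals[concept] for word, n in words.items()}
            for concept, words in per_concept.items()}

# Hypothetical counts for two senses of "cold" (concept IDs are placeholders).
counts = {("cold", "CONCEPT_common_cold"): 6,
          ("virus", "CONCEPT_common_cold"): 2,
          ("cold", "CONCEPT_low_temperature"): 4}
probs = word_concept_probs(counts)
```

For WSD, the disambiguator would score each candidate concept of an ambiguous word by combining these conditional probabilities over the words in the surrounding context.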

  9. Machine performance assessment and enhancement for a hexapod machine

    Energy Technology Data Exchange (ETDEWEB)

    Mou, J.I. [Arizona State Univ., Tempe, AZ (United States); King, C. [Sandia National Labs., Livermore, CA (United States). Integrated Manufacturing Systems Center

    1998-03-19

    The focus of this study is to develop a sensor fused process modeling and control methodology to model, assess, and then enhance the performance of a hexapod machine for precision product realization. Deterministic modeling technique was used to derive models for machine performance assessment and enhancement. Sensor fusion methodology was adopted to identify the parameters of the derived models. Empirical models and computational algorithms were also derived and implemented to model, assess, and then enhance the machine performance. The developed sensor fusion algorithms can be implemented on a PC-based open architecture controller to receive information from various sensors, assess the status of the process, determine the proper action, and deliver the command to actuators for task execution. This will enhance a hexapod machine's capability to produce workpieces within the imposed dimensional tolerances.

  10. SAD-Based Stereo Vision Machine on a System-on-Programmable-Chip (SoPC)

    Science.gov (United States)

    Zhang, Xiang; Chen, Zhangwei

    2013-01-01

    This paper proposes a novel solution for a stereo vision machine based on the System-on-Programmable-Chip (SoPC) architecture. The SoPC technology provides great convenience for accessing many hardware devices such as DDRII, SSRAM, Flash, etc., through IP reuse. The system hardware is implemented in a single FPGA chip involving a 32-bit Nios II microprocessor, a configurable soft IP core in charge of managing the image buffer and users' configuration data. The Sum of Absolute Differences (SAD) algorithm is used for dense disparity map computation. The circuits of the algorithmic module are modeled with the Matlab-based DSP Builder. With a set of configuration interfaces, the machine can process stereo pair images of many different sizes, up to a maximum of 512 K pixels. The machine is designed for real-time stereo vision applications and offers good performance and high efficiency. With a hardware FPGA clock of 90 MHz, 23 frames of 640 × 480 disparity maps can be obtained per second with a 5 × 5 matching window and a maximum of 64 disparity pixels. PMID:23459385
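
The SAD matching that the FPGA pipeline computes in hardware can be sketched as a software model in a few lines of NumPy/SciPy. The window size, disparity range, and synthetic image pair below are assumptions chosen to keep the example small, not the machine's 640 × 480 / 64-disparity configuration.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sad_disparity(left, right, window=5, max_disp=8):
    """Dense disparity map by SAD block matching along epipolar rows."""
    h, w = left.shape
    costs = np.empty((max_disp, h, w))
    for d in range(max_disp):
        diff = np.full((h, w), 1e6)        # large cost where no overlap exists
        diff[:, d:] = np.abs(left[:, d:].astype(float)
                             - right[:, :w - d].astype(float))
        # Windowed mean of |L - R| ranks candidates identically to windowed SAD.
        costs[d] = uniform_filter(diff, size=window, mode="nearest")
    return costs.argmin(axis=0)            # best disparity per pixel

# Synthetic pair: the right view is the left view shifted by 4 pixels.
rng = np.random.default_rng(0)
left = rng.uniform(0, 255, size=(40, 60))
right = np.empty_like(left)
right[:, :56] = left[:, 4:]
right[:, 56:] = left[:, 56:]               # filler at the border
disp = sad_disparity(left, right)
```

Away from the image borders, the recovered disparity equals the known 4-pixel shift.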

  11. Learning Algorithms for Audio and Video Processing: Independent Component Analysis and Support Vector Machine Based Approaches

    National Research Council Canada - National Science Library

    Qi, Yuan

    2000-01-01

    In this thesis, we propose two new machine learning schemes, a subband-based Independent Component Analysis scheme and a hybrid Independent Component Analysis/Support Vector Machine scheme, and apply...

  12. Comparative analysis of partial imaging performance parameters of home and imported X-ray machines

    International Nuclear Information System (INIS)

    Cao Yunxi; Wang Xianyun; Liu Huiqin; Guo Yongxin

    2002-01-01

    Objective: To compare and analyze the performance indexes and the imaging quality of the home and imported X-ray machines through testing their partial imaging performance parameters. Methods: By separate sampling from 10 home and 10 imported X-ray machines, the parameters including tube current, time of exposure, machine total exposure, and repeatability were tested, and the imaging performance was evaluated according to the national standard. Results: All the performance indexes met the standard of GB4505-84. The first sampling tests showed the maximum changing coefficient of imaging performance repeatability of the home X-ray machines was Δmax1 = 0.025, while that of the imported X-ray machine was Δmax1 = 0.016. In the second sampling tests, the maximum changing coefficients of the two were Δmax2 = 0.048 and Δmax2 = 0.022, respectively. Conclusion: The 2 years' follow-up tests indicate that there is no significant difference between the above-mentioned parameters of the elaborately adjusted home X-ray machines and imported ones, but the home X-ray machines are no better than the imported X-ray machines in stability and consistency

  13. Pressure Prediction of Coal Slurry Transportation Pipeline Based on Particle Swarm Optimization Kernel Function Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Xue-cun Yang

    2015-01-01

    Full Text Available For the coal slurry pipeline blockage prediction problem, analysis of the actual scene shows that pressure prediction at each measuring point is the prerequisite for pipeline blockage prediction. The kernel function of the support vector machine is introduced into the extreme learning machine, the parameters are optimized by the particle swarm algorithm, and a blockage prediction method based on a particle swarm optimization kernel function extreme learning machine (PSOKELM) is put forward. Actual test data from the HuangLing coal gangue power plant are used for simulation experiments and compared with a support vector machine prediction model optimized by the particle swarm algorithm (PSOSVM) and a kernel function extreme learning machine prediction model (KELM). The results prove that the mean square error (MSE) for the prediction model based on PSOKELM is 0.0038 and the correlation coefficient is 0.9955, which is superior to the PSOSVM model in speed and accuracy and superior to the KELM model in accuracy.
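
The kernel extreme learning machine at the core of PSOKELM has a closed-form ridge solution, which the sketch below shows with an RBF kernel on a toy 1-D regression task. The regularization C and kernel width gamma are fixed by hand here, where the paper tunes them by particle swarm optimization (the PSO loop and the real pressure data are omitted).

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """Gaussian kernel matrix between row-vector sets A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def kelm_fit(X, y, C=1e4, gamma=1.0):
    """Closed-form KELM training: beta = (K + I/C)^-1 y."""
    K = rbf_kernel(X, X, gamma)
    beta = np.linalg.solve(K + np.eye(len(X)) / C, y)
    return X, beta, gamma

def kelm_predict(model, Xq):
    X, beta, gamma = model
    return rbf_kernel(Xq, X, gamma) @ beta

# Toy stand-in for a pipeline-pressure series: a smooth 1-D signal.
X = np.linspace(0.0, 3.0, 60).reshape(-1, 1)
y = np.sin(2.0 * X[:, 0])
model = kelm_fit(X, y)
mse = float(np.mean((kelm_predict(model, X) - y) ** 2))
```

Because training is a single linear solve rather than an iterative fit, wrapping this in a PSO search over (C, gamma) stays cheap, which is the appeal of the PSOKELM combination.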

  14. Coldness production and heat revalorization: particular machines; Production de froid et revalorisation de la chaleur: machines particulieres

    Energy Technology Data Exchange (ETDEWEB)

    Feidt, M. [Universite Henri Poincare - Nancy-1, 54 - Nancy (France)

    2003-10-01

    The machines presented in this article are not the common reverse-cycle machines. They use systems based on different physical principles, which has consequences for the analysis of their cycles: 1 - permanent-gas machines (thermal separators, pulse gas tubes, thermoacoustic machines); 2 - phase-change machines (mechanical vapor compression machines, absorption machines, ejection machines, adsorption machines); 3 - thermoelectric machines (thermoelectric effects, thermodynamic model of a thermoelectric machine). (J.S.)

  15. Research into Financial Position of Listed Companies following Classification via Extreme Learning Machine Based upon DE Optimization

    OpenAIRE

    Fu Yu; Mu Jiong; Duan Xu Liang

    2016-01-01

    By means of an extreme learning machine model based upon DE optimization, this article centers on the optimization approach of such a model and its application to classifying the financial position of listed companies. It proves by comparison that the improved DE-optimized extreme learning machine algorithm outperforms the traditional extreme learning machine algorithm. Meanwhile, this article also intends to introduce certain research...

  16. Machine Learning wins the Higgs Challenge

    CERN Multimedia

    Abha Eli Phoboo

    2014-01-01

    The winner of the four-month-long Higgs Machine Learning Challenge, launched on 12 May, is Gábor Melis from Hungary, followed closely by Tim Salimans from the Netherlands and Pierre Courtiol from France. The challenge explored the potential of advanced machine learning methods to improve the significance of the Higgs discovery.   Winners of the Higgs Machine Learning Challenge: Gábor Melis and Tim Salimans (top row), Tianqi Chen and Tong He (bottom row). Participants in the Higgs Machine Learning Challenge were tasked with developing an algorithm to improve the detection of Higgs boson signal events decaying into two tau particles in a sample of simulated ATLAS data* that contains few signal events and a majority of non-Higgs boson “background” events. No knowledge of particle physics was required for the challenge, but skills in machine learning - the training of computers to recognise patterns in data - were essential. The Challenge, hosted by Ka...

  17. Architecture Knowledge for Evaluating Scalable Databases

    Science.gov (United States)

    2015-01-16

    Architecture Knowledge for Evaluating Scalable Databases. Author: Nurgaliev... [Report documentation form and feature-comparison table residue; recoverable fragments: client language bindings (Scala, Erlang, Javascript), cursor-based queries (Supported/Not Supported), JOIN queries (Supported/Not Supported), complex data types (lists, maps, sets).] ...is therefore needed, using technology such as machine learning to extract content from product documentation. The terminology used in the database

  18. A machine-learning approach for computation of fractional flow reserve from coronary computed tomography.

    Science.gov (United States)

    Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin

    2016-07-01

    Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994, P < 0.001). Lesion-specific ischemia was detected by the machine-learning algorithm with a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%; the correlation with invasively measured FFR was 0.729 (P < 0.001), with no significant difference between the machine-learning-based and CFD-based assessment of FFR. Average execution time went down from 196.3 ± 78.5 s for the CFD model to ∼2.4 ± 0.44 s for the machine-learning model on a workstation with a 3.4-GHz Intel i7 8-core processor. Copyright © 2016 the American Physiological Society.

  19. Medical subdomain classification of clinical notes using a machine learning-based natural language processing approach.

    Science.gov (United States)

    Weng, Wei-Hung; Wagholikar, Kavishwar B; McCray, Alexa T; Szolovits, Peter; Chueh, Henry C

    2017-12-01

    The medical subdomain of a clinical note, such as cardiology or neurology, is useful content-derived metadata for developing machine learning downstream applications. To classify the medical subdomain of a note accurately, we constructed a machine learning-based natural language processing (NLP) pipeline and developed medical subdomain classifiers based on the content of the note. We constructed the pipeline using the clinical NLP system clinical Text Analysis and Knowledge Extraction System (cTAKES), the Unified Medical Language System (UMLS) Metathesaurus, the Semantic Network, and learning algorithms to extract features from two datasets - clinical notes from the Integrating Data for Analysis, Anonymization, and Sharing (iDASH) data repository (n = 431) and Massachusetts General Hospital (MGH) (n = 91,237) - and built medical subdomain classifiers with different combinations of data representation methods and supervised learning algorithms. We evaluated the performance of the classifiers and their portability across the two datasets. The convolutional recurrent neural network with neural word embeddings yielded the best performance, with areas under the receiver operating characteristic curve (AUC) of 0.975 and 0.991 and F1 scores of 0.845 and 0.870 on the iDASH and MGH datasets, respectively. Considering better clinical interpretability, a linear support vector machine-trained classifier using hybrid bag-of-words and clinically relevant UMLS concepts as the feature representation, with term frequency-inverse document frequency (tf-idf) weighting, outperformed the other shallow learning classifiers, with AUC of 0.957 and 0.964 and F1 scores of 0.932 and 0.934, respectively. Classifiers trained on one dataset and applied to the other reached an F1 score of at least 0.7 for half of the medical subdomains studied. Our study shows that a supervised machine learning-based NLP approach is useful for developing medical subdomain classifiers.
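
The interpretable shallow classifier described (tf-idf weighted bag-of-words fed to a linear SVM) reduces to a short scikit-learn pipeline. The four toy "notes" and their labels below are invented stand-ins for the iDASH/MGH clinical text, and the UMLS-concept features are omitted.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Invented miniature corpus standing in for de-identified clinical notes.
notes = [
    "chest pain radiating to left arm troponin elevated",
    "witnessed seizure eeg shows focal slowing",
    "myocardial infarction treated with stent",
    "migraine with aura neurology follow up",
]
labels = ["cardiology", "neurology", "cardiology", "neurology"]

# tf-idf bag-of-words features into a linear support vector machine.
clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(notes, labels)
pred = clf.predict(["chest pain with rising troponin"])
```

Because both stages are standard estimators, swapping in other feature representations or shallow learners (as the paper's comparison does) only changes the pipeline's components, not its shape.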

  20. Tools for the Knowledge-Based Organization

    DEFF Research Database (Denmark)

    Ravn, Ib

    2002-01-01

    exist. They include a Master’s degree in knowledge management, a web- or print-based intelligence hub for the knowledge society, collaboration with the Danish intellectual capital reporting project, ongoing research on expertise and ethics in knowledge workers, a comparative study of competence...

  1. Neuro-symbolic representation learning on biological knowledge graphs

    KAUST Repository

    Alshahrani, Mona

    2017-04-21

    Biological data and knowledge bases increasingly rely on Semantic Web technologies and the use of knowledge graphs for data integration, retrieval and federated queries. In the past years, feature learning methods that are applicable to graph-structured data are becoming available, but have not yet widely been applied and evaluated on structured biological knowledge. We develop a novel method for feature learning on biological knowledge graphs. Our method combines symbolic methods, in particular knowledge representation using symbolic logic and automated reasoning, with neural networks to generate embeddings of nodes that encode for related information within knowledge graphs. Through the use of symbolic logic, these embeddings contain both explicit and implicit information. We apply these embeddings to the prediction of edges in the knowledge graph representing problems of function prediction, finding candidate genes of diseases, protein-protein interactions, or drug target relations, and demonstrate performance that matches and sometimes outperforms traditional approaches based on manually crafted features. Our method can be applied to any biological knowledge graph, and will thereby open up the increasing amount of Semantic Web based knowledge bases in biology to use in machine learning and data analytics. Availability: https://github.com/bio-ontology-research-group/walking-rdf-and-owl. Contact: robert.hoehndorf@kaust.edu.sa. Supplementary data are available at Bioinformatics online.

  2. Optimization of paper machine heat recovery system; Paperikoneen laemmoentalteenottosysteemin optimointi - PMSY 02

    Energy Technology Data Exchange (ETDEWEB)

    Pettersson, H. [Valmet Oyj Pansio, Turku (Finland)

    1998-12-31

    Conventionally, the energy content of paper and board machine dryer section exhaust air is recovered in a heat recovery tower, which makes a major contribution to the overall energy economy of a paper machine. Modern paper machines have already reached momentary record speeds above 1700 m/min, and speeds above 2000 m/min are being pursued. This is made possible by new, efficient drying technologies, which will in turn require new solutions for the heat recovery systems. Further requirements for new heat recovery solutions come from the gradual closing of paper mill water circulation systems. In this project a discrete optimization-based tool is developed for analyzing, optimizing, and dimensioning paper machine heat recovery systems under different process conditions. Delivering a paper machine process requires transferring ever more process knowledge into calculation model parameters. The overall target of the tool is to decrease energy consumption in view of new drying technologies and the gradual closing of water circulation systems. (orig.)

  3. Optimization of paper machine heat recovery system; Paperikoneen laemmoentalteenottosysteemin optimointi - PMSY 02

    Energy Technology Data Exchange (ETDEWEB)

    Pettersson, H [Valmet Oyj Pansio, Turku (Finland)

    1999-12-31

    Conventionally, the energy content of paper and board machine dryer section exhaust air is recovered in a heat recovery tower, which makes a major contribution to the overall energy economy of a paper machine. Modern paper machines have already reached momentary record speeds above 1700 m/min, and speeds above 2000 m/min are being pursued. This is made possible by new, efficient drying technologies, which will in turn require new solutions for the heat recovery systems. Further requirements for new heat recovery solutions come from the gradual closing of paper mill water circulation systems. In this project a discrete optimization-based tool is developed for analyzing, optimizing, and dimensioning paper machine heat recovery systems under different process conditions. Delivering a paper machine process requires transferring ever more process knowledge into calculation model parameters. The overall target of the tool is to decrease energy consumption in view of new drying technologies and the gradual closing of water circulation systems. (orig.)

  4. Virtual Things for Machine Learning Applications

    OpenAIRE

    Bovet , Gérôme; Ridi , Antonio; Hennebert , Jean

    2014-01-01

    International audience; Internet-of-Things (IoT) devices, especially sensors, are producing large quantities of data that can be used for gathering knowledge. In this field, machine learning technologies are increasingly used to build versatile data-driven models. In this paper, we present a novel architecture able to execute machine learning algorithms within the sensor network, presenting advantages in terms of privacy and data transfer efficiency. We first argue that some classes of ...

  5. SHRIF, a General-Purpose System for Heuristic Retrieval of Information and Facts, Applied to Medical Knowledge Processing.

    Science.gov (United States)

    Findler, Nicholas V.; And Others

    1992-01-01

    Describes SHRIF, a System for Heuristic Retrieval of Information and Facts, and the medical knowledge base that was used in its development. Highlights include design decisions; the user-machine interface, including the language processor; and the organization of the knowledge base in an artificial intelligence (AI) project like this one. (57…

  6. Signal Detection for QPSK Based Cognitive Radio Systems using Support Vector Machines

    Directory of Open Access Journals (Sweden)

    M. T. Mushtaq

    2015-04-01

    Full Text Available Cognitive radio based networks enable opportunistic dynamic spectrum access by sensing, adopting and utilizing the unused portion of licensed spectrum bands. Cognitive radio is intelligent enough to adapt the communication parameters of the unused licensed spectrum. Spectrum sensing is one of the most important tasks of the cognitive radio cycle. In this paper, an auto-correlation function kernel based Support Vector Machine (SVM) classifier along with Welch's Periodogram detector is successfully implemented for the detection of four QPSK (Quadrature Phase Shift Keying) based signals propagating through an AWGN (Additive White Gaussian Noise) channel. It is shown that the combination of statistical signal processing and machine learning concepts improves the spectrum sensing process, and spectrum sensing is possible even at low Signal to Noise Ratio (SNR) values down to -50 dB.
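
The core idea above (autocorrelation features separate structured modulated signals from white noise, and an SVM learns the boundary) can be sketched on synthetic data. The oversampled BPSK burst standing in for the paper's QPSK signals, the lag count, and the SNR are all illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)

def autocorr_features(x, lags=6):
    """Normalized autocorrelation at small lags; a modulated signal
    retains lag structure that white Gaussian noise does not."""
    x = x - x.mean()
    energy = float((x * x).sum())
    return np.array([(x[:-k] * x[k:]).sum() / energy for k in range(1, lags + 1)])

def received(occupied, n=512, snr=1.0):
    """Toy received block: AWGN alone, or AWGN plus a crude
    oversampled BPSK burst when the band is occupied."""
    noise = rng.normal(size=n)
    if not occupied:
        return noise
    symbols = rng.choice([-1.0, 1.0], size=n // 8)
    return np.sqrt(snr) * np.repeat(symbols, 8) + noise

X, y = [], []
for _ in range(300):
    occupied = bool(rng.integers(0, 2))
    X.append(autocorr_features(received(occupied)))
    y.append(int(occupied))

detector = SVC(kernel="rbf").fit(X, y)
train_acc = detector.score(X, y)
```

Because the oversampled signal correlates strongly at lags shorter than one symbol while noise does not, the two classes are well separated in feature space.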

  7. Medical subdomain classification of clinical notes using a machine learning-based natural language processing approach

    OpenAIRE

    Weng, Wei-Hung; Wagholikar, Kavishwar B.; McCray, Alexa T.; Szolovits, Peter; Chueh, Henry C.

    2017-01-01

    Background The medical subdomain of a clinical note, such as cardiology or neurology, is useful content-derived metadata for developing machine learning downstream applications. To classify the medical subdomain of a note accurately, we have constructed a machine learning-based natural language processing (NLP) pipeline and developed medical subdomain classifiers based on the content of the note. Methods We constructed the pipeline using the clinical ...

  8. Behavioral simulation of a nuclear power plant operator crew for human-machine system design

    International Nuclear Information System (INIS)

    Furuta, K.; Shimada, T.; Kondo, S.

    1999-01-01

    This article proposes an architecture of behavioral simulation of an operator crew in a nuclear power plant including group processes and interactions between the operators and their working environment. An operator model was constructed based on the conceptual human information processor and then substantiated as a knowledge-based system with multiple sets of knowledge base and blackboard, each of which represents an individual operator. From a trade-off between reality and practicality, we adopted an architecture of simulation that consists of the operator, plant and environment models in order to consider operator-environment interactions. The simulation system developed on this framework and called OCCS was tested using a scenario of BWR plant operation. The case study showed that operator-environment interactions have significant effects on operator crew performance and that they should be considered properly for simulating behavior of human-machine systems. The proposed architecture contributed to more realistic simulation in comparison with an experimental result, and a good prospect has been obtained that computer simulation of an operator crew is feasible and useful for human-machine system design. (orig.)

  9. Are there intelligent Turing machines?

    OpenAIRE

    Bátfai, Norbert

    2015-01-01

    This paper introduces a new computing model based on cooperation among Turing machines, called orchestrated machines. Like universal Turing machines, orchestrated machines are designed to simulate Turing machines, but they can also modify the original operation of the included Turing machines to create a new layer of collective behavior. Using this new model, we can define some interesting notions related to the cooperation ability of Turing machines, such as the intelligence quo...

  10. OFDM with Index Modulation for Asynchronous mMTC Networks.

    Science.gov (United States)

    Doğan, Seda; Tusha, Armed; Arslan, Hüseyin

    2018-04-21

    One of the critical missions for next-generation wireless communication systems is to fulfill the high demand for massive Machine-Type Communications (mMTC). In mMTC systems, transmission between machine users and the base station (BS) is sporadic. Lack of time coordination between the users and the BS destroys orthogonality between the subcarriers and causes inter-carrier interference (ICI). Providing services to asynchronous massive machine users is therefore a major challenge for Orthogonal Frequency Division Multiplexing (OFDM). In this study, OFDM with index modulation (OFDM-IM) is proposed as an eligible solution to alleviate the ICI caused by asynchronous transmission in uncoordinated mMTC networks. In OFDM-IM, data are transmitted not only by the modulated subcarriers but also by the indices of the active subcarriers. Unlike classical OFDM, fractional subcarrier activation leads to less ICI in OFDM-IM. A novel subcarrier mapping scheme (SMS) named Inner Subcarrier Activation (ISA) is proposed to further alleviate adjacent-user interference in asynchronous OFDM-IM-based systems. ISA reduces inter-user interference because it gives more activation priority to inner subcarriers than the existing SMSs do. The superiority of the proposed SMS over existing mapping schemes for asynchronous systems is shown through both theoretical analysis and computer-based simulations.
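
Index modulation itself is easy to sketch: part of the bit stream selects which subcarriers of a subblock are active, and the remaining bits modulate them. The subblock size, BPSK mapping, and lexicographic index table below are illustrative assumptions; the paper's ISA scheme, which prefers inner subcarriers when choosing activation patterns, is not reproduced here.

```python
import numpy as np
from itertools import combinations
from math import comb, floor, log2

def ofdm_im_subblock(bits, n=4, k=2):
    """Map bits onto one OFDM-IM subblock of n subcarriers with k active.

    The first floor(log2(C(n, k))) bits pick the active-subcarrier
    pattern from a lexicographic table; the next k bits are BPSK
    symbols on the chosen subcarriers. Inactive subcarriers stay zero,
    which is what reduces ICI under asynchronous access.
    """
    patterns = list(combinations(range(n), k))
    index_bits = floor(log2(comb(n, k)))
    pattern_id = int("".join(str(b) for b in bits[:index_bits]), 2)
    active = patterns[pattern_id]
    subblock = np.zeros(n, dtype=complex)
    for carrier, bit in zip(active, bits[index_bits:index_bits + k]):
        subblock[carrier] = 1.0 if bit == 0 else -1.0   # BPSK mapping
    return subblock

sb = ofdm_im_subblock([0, 1, 1, 0])   # index bits 01 -> pattern (0, 2)
```

With n = 4 and k = 2 there are six activation patterns, so two index bits ride "for free" on the choice of pattern in every subblock.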

  11. Research into Financial Position of Listed Companies following Classification via Extreme Learning Machine Based upon DE Optimization

    Directory of Open Access Journals (Sweden)

    Fu Yu

    2016-01-01

    Full Text Available Using a model of the extreme learning machine (ELM) based on differential evolution (DE) optimization, this article centers on the optimization approach behind such a model and its application to classifying the financial position of listed companies. Comparison shows that the improved DE-optimized extreme learning machine algorithm outperforms the traditional extreme learning machine algorithm. The article also introduces research methods from extreme learning machines into the economics classification area, with the aim of computerizing the speedy but effective evaluation of the massive financial statements of listed companies belonging to different classes.
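
The DE optimizer underlying the improved ELM can be illustrated independently of the financial data. A minimal DE/rand/1/bin sketch, here minimizing a simple sphere function rather than tuning ELM weights (all names and parameter choices are assumptions):

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, gens=200, seed=1):
    """Minimal DE/rand/1/bin loop: mutate with a scaled difference of two
    random members, crossover binomially, keep the trial if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = [
                pop[a][d] + F * (pop[b][d] - pop[c][d])
                if (rng.random() < CR or d == j_rand) else pop[i][d]
                for d in range(dim)
            ]
            # Clamp back into the search bounds
            trial = [min(max(v, lo), hi) for v, (lo, hi) in zip(trial, bounds)]
            ft = f(trial)
            if ft <= fit[i]:          # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

x, fx = differential_evolution(lambda v: sum(t * t for t in v), [(-5, 5)] * 3)
print(round(fx, 6))  # close to 0
```

In the ELM setting, `f` would score a candidate set of hidden-layer weights by the resulting classification error.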

  12. Design Methodology of a Brushless IPM Machine for a Zero Speed Injection Based Sensorless Control

    OpenAIRE

    Godbehere, Jonathan; Wrobel, Rafal; Drury, David; Mellor, Phil

    2015-01-01

    In this paper a design approach for a sensorless-controlled, brushless, interior permanent magnet machine is presented. An initial study based on established electrical machine formulas provides the machine’s basic geometrical sizing. The next design stage combines a particle swarm optimisation (PSO) search routine with a magneto-static finite element (FE) solver to provide a more in-depth optimisation. The optimisation system has been formulated to derive alternative machine design variants, ...

  13. The machine in multimedia analytics

    NARCIS (Netherlands)

    Zahálka, J.

    2017-01-01

    This thesis investigates the role of the machine in multimedia analytics, a discipline that combines visual analytics with multimedia analysis algorithms in order to unlock the potential of multimedia collections as sources of knowledge in scientific and applied domains. Specifically, the central

  14. Efficient Hybrid Genetic Based Multi Dimensional Host Load Aware Algorithm for Scheduling and Optimization of Virtual Machines

    OpenAIRE

    Thiruvenkadam, T; Karthikeyani, V

    2014-01-01

    Mapping virtual machines to a cluster of physical machines is called VM placement. Placing a VM on the appropriate host is necessary for ensuring effective resource utilization and minimizing datacenter cost as well as power. Here we present an efficient hybrid genetic based host load aware algorithm for scheduling and optimization of virtual machines in a cluster of physical hosts. We developed the algorithm based on two different methods: first, initial VM packing is done by...
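
The initial VM packing step mentioned above can be approximated by a classical bin-packing heuristic. A first-fit-decreasing sketch (not the paper's hybrid genetic algorithm, which would refine such a placement; all names and numbers are invented):

```python
def first_fit_decreasing(vm_demands, host_capacity):
    """Pack VMs (by a single resource demand, e.g. CPU) onto as few
    equal-capacity hosts as the heuristic finds."""
    hosts = []        # remaining capacity per opened host
    placement = {}    # vm -> host index
    for vm, demand in sorted(vm_demands.items(), key=lambda kv: -kv[1]):
        for h, free in enumerate(hosts):
            if free >= demand:          # first host with room
                hosts[h] -= demand
                placement[vm] = h
                break
        else:                           # no host fits: open a new one
            hosts.append(host_capacity - demand)
            placement[vm] = len(hosts) - 1
    return placement, len(hosts)

demands = {"vm1": 6, "vm2": 5, "vm3": 4, "vm4": 3, "vm5": 2}
placement, n_hosts = first_fit_decreasing(demands, host_capacity=10)
print(n_hosts)  # 2
```

A genetic stage would then search permutations of this packing against multi-dimensional load and power objectives.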

  15. Adaptive Knowledge Management of Project-Based Learning

    Science.gov (United States)

    Tilchin, Oleg; Kittany, Mohamed

    2016-01-01

    The goal of an approach to Adaptive Knowledge Management (AKM) of project-based learning (PBL) is to intensify subject study through guiding, inducing, and facilitating the development of knowledge, accountability skills, and collaborative skills of students. Knowledge development is attained by knowledge acquisition, knowledge sharing, and knowledge…

  16. Layout Design of Human-Machine Interaction Interface of Cabin Based on Cognitive Ergonomics and GA-ACA

    Directory of Open Access Journals (Sweden)

    Li Deng

    2016-01-01

    Full Text Available In order to consider the psychological cognitive characteristics affecting operating comfort and realize automatic layout design, cognitive ergonomics and GA-ACA (genetic algorithm and ant colony algorithm) were introduced into the layout design of the human-machine interaction interface. First, from the perspective of cognitive psychology, according to the information processing process, the cognitive model of the human-machine interaction interface was established. Then, the human cognitive characteristics were analyzed, and the layout principles of the human-machine interaction interface were summarized as the constraints in layout design. Next, the expression form of the fitness function, pheromone, and heuristic information for the layout optimization of the cabin was studied. The layout design model of the human-machine interaction interface was established based on GA-ACA. Finally, a layout design system was developed based on this model. For validation, the human-machine interaction interface layout design of a drilling rig control room was taken as an example, and the optimization result showed the feasibility and effectiveness of the proposed method.

  17. Layout Design of Human-Machine Interaction Interface of Cabin Based on Cognitive Ergonomics and GA-ACA.

    Science.gov (United States)

    Deng, Li; Wang, Guohua; Yu, Suihuai

    2016-01-01

    In order to consider the psychological cognitive characteristics affecting operating comfort and realize automatic layout design, cognitive ergonomics and GA-ACA (genetic algorithm and ant colony algorithm) were introduced into the layout design of the human-machine interaction interface. First, from the perspective of cognitive psychology, according to the information processing process, the cognitive model of the human-machine interaction interface was established. Then, the human cognitive characteristics were analyzed, and the layout principles of the human-machine interaction interface were summarized as the constraints in layout design. Next, the expression form of the fitness function, pheromone, and heuristic information for the layout optimization of the cabin was studied. The layout design model of the human-machine interaction interface was established based on GA-ACA. Finally, a layout design system was developed based on this model. For validation, the human-machine interaction interface layout design of a drilling rig control room was taken as an example, and the optimization result showed the feasibility and effectiveness of the proposed method.
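
The fitness-driven layout search described in these two records can be illustrated with a toy evolutionary loop: frequently used controls should end up in the slots nearest the operator. This mutation-only sketch merely stands in for the paper's GA-ACA hybrid; all data and names are invented:

```python
import random

def layout_cost(perm, freq, dist):
    # Sum of (use frequency x slot distance) over all controls: a simple
    # fitness in the spirit of cognitive-ergonomic layout constraints.
    return sum(freq[c] * dist[s] for s, c in enumerate(perm))

def evolve_layout(freq, dist, pop_size=30, gens=300, seed=7):
    """Toy evolutionary layout search with swap mutation and elitism."""
    rng = random.Random(seed)
    n = len(freq)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: layout_cost(p, freq, dist))
        survivors = pop[: pop_size // 2]      # elitism: keep the best half
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(n), 2)
            child[i], child[j] = child[j], child[i]   # swap mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: layout_cost(p, freq, dist))

freq = [5, 1, 3, 2]          # how often each control is used
dist = [1, 2, 3, 4]          # distance of each slot from the operator
best = evolve_layout(freq, dist)
print(best)  # most-used control assigned to the nearest slot
```

The real method adds crossover, ant-colony pheromone guidance, and layout-principle constraints on top of such a loop.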

  18. Knowledge-based biomedical word sense disambiguation: comparison of approaches

    Directory of Open Access Journals (Sweden)

    Aronson Alan R

    2010-11-01

    Full Text Available Abstract Background Word sense disambiguation (WSD) algorithms attempt to select the proper sense of ambiguous terms in text. Resources like the UMLS provide a reference thesaurus to be used to annotate the biomedical literature. Statistical learning approaches have produced good results, but the size of the UMLS makes it infeasible to produce training data covering the whole domain. Methods We present research on existing WSD approaches based on knowledge bases, which complement the studies performed on statistical learning. We compare four approaches which rely on the UMLS Metathesaurus as the source of knowledge. The first approach compares the overlap of the context of the ambiguous word to the candidate senses, based on a representation built out of the definitions, synonyms and related terms. The second approach collects training data for each of the candidate senses to perform WSD, based on queries built using monosemous synonyms and related terms. These queries are used to retrieve MEDLINE citations. Then, a machine learning approach is trained on this corpus. The third approach is a graph-based method which exploits the structure of the Metathesaurus network of relations to perform unsupervised WSD. This approach ranks nodes in the graph according to their relative structural importance. The last approach uses the semantic types assigned to the concepts in the Metathesaurus to perform WSD. The context of the ambiguous word and the semantic types of the candidate concepts are mapped to Journal Descriptors. These mappings are compared to decide among the candidate concepts. Results are provided estimating the accuracy of the different methods on the WSD test collection available from the NLM. Conclusions We have found that the last approach achieves better results compared to the other methods. 
The graph-based approach, using the structure of the Metathesaurus network to estimate the relevance of the Metathesaurus concepts, does not perform well.
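
The first, overlap-based approach can be sketched as a Lesk-style comparison between context words and sense descriptions. The glosses below are made up for illustration; a real system would build them from UMLS Metathesaurus definitions, synonyms and related terms:

```python
def lesk_overlap(context, sense_glosses):
    """Pick the sense whose gloss shares the most words with the context
    (a toy version of the overlap-based WSD approach)."""
    ctx = set(w.lower() for w in context.split())
    scores = {sense: len(ctx & set(gloss.lower().split()))
              for sense, gloss in sense_glosses.items()}
    return max(scores, key=scores.get), scores

glosses = {
    "cold_temperature": "low temperature sensation chill weather",
    "common_cold": "viral infection nose throat cough sneeze",
}
sense, scores = lesk_overlap("patient with cough and sore throat", glosses)
print(sense)  # common_cold
```

Real systems add stop-word removal and weighting, but the ranking step is this overlap count.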

  19. Monitoring Knowledge Base (MKB)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Monitoring Knowledge Base (MKB) is a compilation of emissions measurement and monitoring techniques associated with air pollution control devices, industrial...

  20. Improved method for SNR prediction in machine-learning-based test

    NARCIS (Netherlands)

    Sheng, Xiaoqin; Kerkhoff, Hans G.

    2010-01-01

    This paper applies an improved method for testing the signal-to-noise ratio (SNR) of Analogue-to-Digital Converters (ADC). In previous work, a noisy and nonlinear pulse signal is exploited as the input stimulus to obtain the signature results of ADC. By applying a machine-learning-based approach,

  1. A Virtual Machine Migration Strategy Based on Time Series Workload Prediction Using Cloud Model

    Directory of Open Access Journals (Sweden)

    Yanbing Liu

    2014-01-01

    Full Text Available Aiming to resolve the imbalance of resources and workloads at data centers, together with the overhead and high cost of virtual machine (VM) migrations, this paper proposes a new VM migration strategy based on a cloud-model time-series workload prediction algorithm. By setting upper and lower workload bounds for host machines, forecasting the tendency of their subsequent workloads by creating a workload time series using the cloud model, and stipulating a general VM migration criterion, workload-aware migration (WAM), the proposed strategy selects a source host machine, a destination host machine, and a VM on the source host machine, and carries out the VM migration. Experimental results and analyses show, through comparison with other peer research works, that the proposed method can effectively avoid VM migrations caused by momentary peak workload values, significantly lower the number of VM migrations, and dynamically reach and maintain a resource and workload balance for virtual machines, promoting an improved utilization of resources in the entire data center.
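
The workload-aware selection of a source host, a VM, and a destination host can be sketched with static loads. Only the upper bound is modeled here; the paper additionally forecasts workloads with a cloud-model time series, and all names and numbers below are illustrative:

```python
def pick_migration(hosts, upper):
    """Pick (vm, source, destination) so that moving the smallest
    sufficient VM brings an overloaded source back under the bound
    without overloading the destination. Returns None if no move helps."""
    loads = {h: sum(vms.values()) for h, vms in hosts.items()}
    over = [h for h in hosts if loads[h] > upper]
    if not over:
        return None
    src = max(over, key=loads.get)   # most overloaded host first
    # Smallest VM whose departure relieves the source
    for vm, load in sorted(hosts[src].items(), key=lambda kv: kv[1]):
        if loads[src] - load <= upper:
            candidates = [h for h in hosts
                          if h != src and loads[h] + load <= upper]
            if candidates:
                dst = min(candidates, key=loads.get)  # least-loaded target
                return vm, src, dst
    return None

hosts = {"h1": {"vm1": 50, "vm2": 40}, "h2": {"vm3": 20}}
print(pick_migration(hosts, 80))  # ('vm2', 'h1', 'h2')
```

Triggering on a *forecast* load rather than the instantaneous one is what lets the strategy ignore momentary peaks.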

  2. Rotating electrical machines

    CERN Document Server

    Le Doeuff, René

    2013-01-01

    In this book a general matrix-based approach to modeling electrical machines is promulgated. The model uses instantaneous quantities for key variables and enables the user to easily take into account associations between rotating machines and static converters (such as in variable speed drives).   General equations of electromechanical energy conversion are established early in the treatment of the topic and then applied to synchronous, induction and DC machines. The primary characteristics of these machines are established for steady state behavior as well as for variable speed scenarios. I

  3. Differentiation of Enhancing Glioma and Primary Central Nervous System Lymphoma by Texture-Based Machine Learning.

    Science.gov (United States)

    Alcaide-Leon, P; Dufort, P; Geraldo, A F; Alshafai, L; Maralani, P J; Spears, J; Bharatha, A

    2017-06-01

    Accurate preoperative differentiation of primary central nervous system lymphoma and enhancing glioma is essential to avoid unnecessary neurosurgical resection in patients with primary central nervous system lymphoma. The purpose of the study was to evaluate the diagnostic performance of a machine-learning algorithm by using texture analysis of contrast-enhanced T1-weighted images for differentiation of primary central nervous system lymphoma and enhancing glioma. Seventy-one adult patients with enhancing gliomas and 35 adult patients with primary central nervous system lymphomas were included. The tumors were manually contoured on contrast-enhanced T1WI, and the resulting volumes of interest were mined for textural features and subjected to a support vector machine-based machine-learning protocol. Three readers classified the tumors independently on contrast-enhanced T1WI. Areas under the receiver operating characteristic curves were estimated for each reader and for the support vector machine classifier. A noninferiority test for diagnostic accuracy based on paired areas under the receiver operating characteristic curve was performed with a noninferiority margin of 0.15. The mean areas under the receiver operating characteristic curve were 0.877 (95% CI, 0.798-0.955) for the support vector machine classifier; 0.878 (95% CI, 0.807-0.949) for reader 1; 0.899 (95% CI, 0.833-0.966) for reader 2; and 0.845 (95% CI, 0.757-0.933) for reader 3. The mean area under the receiver operating characteristic curve of the support vector machine classifier was significantly noninferior to the mean area under the curve of reader 1 ( P = .021), reader 2 ( P = .035), and reader 3 ( P = .007). Support vector machine classification based on textural features of contrast-enhanced T1WI is noninferior to expert human evaluation in the differentiation of primary central nervous system lymphoma and enhancing glioma. © 2017 by American Journal of Neuroradiology.

  4. Associating current knowledge with that of past experience based on knowledge about automata

    Energy Technology Data Exchange (ETDEWEB)

    Koenig, E C

    1982-01-01

    Important to the performance of interactive systems is the ability of their members to associate current knowledge with knowledge of past experience. Knowledge association results in greater detail of current knowledge and is demonstrated through the use of examples. It is based on knowledge about automata, and the knowledge structures are in the form of graphs. 11 references.

  5. Mlifdect: Android Malware Detection Based on Parallel Machine Learning and Information Fusion

    Directory of Open Access Journals (Sweden)

    Xin Wang

    2017-01-01

    Full Text Available In recent years, Android malware has continued to grow at an alarming rate. More recent malicious apps, employing highly sophisticated detection-avoidance techniques, make the traditional machine learning based malware detection methods far less effective. More specifically, those methods cannot cope with various types of Android malware and are limited by their reliance on a single classification algorithm. To address this limitation, we propose a novel approach in this paper that leverages parallel machine learning and information fusion techniques for better Android malware detection, named Mlifdect. To implement this approach, we first extract eight types of features from static analysis of Android apps and build two kinds of feature sets after feature selection. Then, a parallel machine learning detection model is developed to speed up the process of classification. Finally, we investigate probability-analysis-based and Dempster-Shafer-theory-based information fusion approaches which can effectively obtain the detection results. To validate our method, other state-of-the-art detection works are selected for comparison on real-world Android apps. The experimental results demonstrate that Mlifdect is capable of achieving higher detection accuracy as well as remarkable run-time efficiency compared to the existing malware detection solutions.
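
The Dempster-Shafer fusion step that combines the parallel classifiers' outputs follows Dempster's rule of combination, which can be shown on two hypothetical mass functions over {malware, benign} (the numbers are invented, not from the paper):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets of hypotheses."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass assigned to disagreement
    norm = 1.0 - conflict                    # renormalize over agreement
    return {h: v / norm for h, v in combined.items()}

M = frozenset(["malware"]); B = frozenset(["benign"]); T = M | B
m1 = {M: 0.7, B: 0.1, T: 0.2}   # classifier 1's belief masses
m2 = {M: 0.6, B: 0.2, T: 0.2}   # classifier 2's belief masses
fused = dempster_combine(m1, m2)
print(round(fused[M], 3))  # 0.85: agreement sharpens the malware belief
```

Two weakly confident classifiers that agree thus yield a stronger fused verdict than either alone.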

  6. Real-time wavelet-based inline banknote-in-bundle counting for cut-and-bundle machines

    Science.gov (United States)

    Petker, Denis; Lohweg, Volker; Gillich, Eugen; Türke, Thomas; Willeke, Harald; Lochmüller, Jens; Schaede, Johannes

    2011-03-01

    Automatic banknote sheet cut-and-bundle machines are widely used within the scope of banknote production. Besides the cutting and bundling, which is a mature technology, image-processing-based quality inspection for this type of machine is attractive. We present in this work a new real-time touchless counting and perspective cutting-blade quality assurance system, based on a color CCD camera and a dual-core computer, for cut-and-bundle applications in banknote production. The system, which applies wavelet-based multi-scale filtering, is able to count the banknotes inside a 100-bundle within 200-300 ms, depending on the window size.

  7. SU-F-T-352: Development of a Knowledge Based Automatic Lung IMRT Planning Algorithm with Non-Coplanar Beams

    International Nuclear Information System (INIS)

    Zhu, W; Wu, Q; Yuan, L

    2016-01-01

    Purpose: To improve the robustness of a knowledge based automatic lung IMRT planning method and to further validate the reliability of this algorithm by utilizing it for the planning of clinical cases with non-coplanar beams. Methods: A lung IMRT planning method which automatically determines both plan optimization objectives and beam configurations with non-coplanar beams has been reported previously. A beam efficiency index map is constructed to guide beam angle selection in this algorithm. This index takes into account both the dose contributions from individual beams and the combined effect of multiple beams, which is represented by a beam separation score. We studied the effect of this beam separation score on plan quality and determined the optimal weight for this score. Fourteen clinical plans were re-planned with the knowledge-based algorithm. Significant dosimetric metrics for the PTV and OARs in the automatic plans are compared with those in the clinical plans by the two-sample t-test. In addition, a composite dosimetric quality index was defined to obtain the relationship between the plan quality and the beam separation score. Results: On average, we observed more than 15% reduction in conformity index and homogeneity index for the PTV and in V40 and V60 for the heart, while an 8% and 3% increase in V5 and V20 for the lungs, respectively. The variation curve of the composite index as a function of the angle spread score shows that 0.6 is the best value for the weight of the beam separation score. Conclusion: The optimal value for the beam angle spread score in automatic lung IMRT planning is obtained. With this value, the model can result in statistically the “best” achievable plans. This method can potentially improve the quality and planning efficiency of IMRT plans with non-coplanar angles.

  8. A multi-label learning based kernel automatic recommendation method for support vector machine.

    Science.gov (United States)

    Zhang, Xueying; Song, Qinbao

    2015-01-01

    Choosing an appropriate kernel is critical when classifying a new problem with a Support Vector Machine. So far, more attention has been paid to constructing new kernels and choosing suitable parameter values for a specific kernel function than to kernel selection. Furthermore, most current kernel selection methods focus on seeking the best kernel with the highest classification accuracy via cross-validation; they are time-consuming and ignore the differences among the number of support vectors and the CPU time of SVM with different kernels. Considering the tradeoff between classification success ratio and CPU time, there may be multiple kernel functions performing equally well on the same classification problem. Aiming to automatically select appropriate kernel functions for a given data set, we propose a multi-label learning based kernel recommendation method built on the data characteristics. For each data set, a meta-knowledge data base is first created by extracting the feature vector of data characteristics and identifying the corresponding applicable kernel set. Then the kernel recommendation model is constructed on the generated meta-knowledge data base with the multi-label classification method. Finally, appropriate kernel functions are recommended for a new data set by the recommendation model according to the characteristics of the new data set. Extensive experiments over 132 UCI benchmark data sets, with five different types of data set characteristics, eleven typical kernels (Linear, Polynomial, Radial Basis Function, Sigmoidal function, Laplace, Multiquadric, Rational Quadratic, Spherical, Spline, Wave and Circular), and five multi-label classification methods demonstrate that, compared with the existing kernel selection methods and the most widely used RBF kernel function, SVM with the kernel function recommended by our proposed method achieved the highest classification performance.
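
Several of the eleven kernels the study compares are one-line functions of a pair of feature vectors. A sketch of three of them (the parameter values are arbitrary):

```python
import math

def linear_kernel(x, y):
    # Plain dot product
    return sum(a * b for a, b in zip(x, y))

def rbf_kernel(x, y, gamma=0.5):
    # Radial Basis Function: exp(-gamma * ||x - y||^2)
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def poly_kernel(x, y, degree=2, c=1.0):
    # Polynomial: (<x, y> + c)^degree
    return (linear_kernel(x, y) + c) ** degree

x, y = [1.0, 2.0], [2.0, 0.0]
print(linear_kernel(x, y))       # 2.0
print(round(rbf_kernel(x, y), 4))
print(poly_kernel(x, y))         # 9.0
```

The recommendation problem in the record above is precisely choosing among such functions (and their parameters) from data-set characteristics alone, without running cross-validation for each.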

  9. Design of on-power fuelling machines

    International Nuclear Information System (INIS)

    Jackson, W.H.

    2004-01-01

    In May 1957, CGE was asked to design a fuelling machine for the NPD2 Reactor. Two fuelling machines were required, one at each end of the reactor, that could either push the fuel bundles through the reactor or accept the bundles being pushed out. The machines had to connect on to the end fittings of the same tube, seal, fill with heavy water and pressure up to 1000 psi without external leaks. Each machine had to remove the tube seal plug from its end fitting and store it in an indexing magazine, which also had to hold up to six fuel bundles, or retrieve that many, if the magazine was empty. There was also the provision to store a spare plug. When finished moving fuel bundles, the tube plugs were to be replaced and tested for leaks, before the fuelling machines would be detached from the end fittings. This was all to be done by remote control. By late September 1957, sufficient design features were on paper and CGE management made a presentation to AECL at Chalk River Laboratories, and this proposal was later accepted.

  10. Long-Term Precipitation Analysis and Estimation of Precipitation Concentration Index Using Three Support Vector Machine Methods

    Directory of Open Access Journals (Sweden)

    Milan Gocic

    2016-01-01

    Full Text Available The monthly precipitation data from 29 stations in Serbia during the period of 1946–2012 were considered. Precipitation trends were calculated using the linear regression method. Three CLINO periods (1961–1990, 1971–2000, and 1981–2010) in three subregions were analysed. The CLINO 1981–2010 period had a significant increasing trend. The spatial pattern of the precipitation concentration index (PCI) was presented. For the purpose of PCI prediction, three Support Vector Machine (SVM) models, namely, SVM coupled with the discrete wavelet transform (SVM-Wavelet), SVM coupled with the firefly algorithm (SVM-FFA), and SVM using the radial basis function (SVM-RBF), were developed and used. The estimation and prediction results of these models were compared with each other using three statistical indicators, that is, root mean square error, coefficient of determination, and coefficient of efficiency. The experimental results showed that an improvement in predictive accuracy and capability of generalization can be achieved by the SVM-Wavelet approach. Moreover, the results indicated the proposed SVM-Wavelet model can adequately predict the PCI.
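
The precipitation concentration index itself is a simple ratio: PCI = 100 · Σp_i² / (Σp_i)² over the twelve monthly totals, giving 100/12 ≈ 8.33 for a perfectly uniform regime and 100 if all rain falls in one month. A direct implementation of that annual formulation:

```python
def precipitation_concentration_index(monthly):
    """PCI = 100 * sum(p_i^2) / (sum(p_i))^2 over 12 monthly totals."""
    assert len(monthly) == 12
    total = sum(monthly)
    return 100.0 * sum(p * p for p in monthly) / (total * total)

uniform = [50.0] * 12                 # same rainfall every month
print(round(precipitation_concentration_index(uniform), 2))   # 8.33

peaked = [600.0] + [0.0] * 11         # all rainfall in one month
print(precipitation_concentration_index(peaked))              # 100.0
```

The SVM models in the record predict this scalar from station data rather than computing it from known monthly totals.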

  11. A relational data-knowledge base system and its potential in developing a distributed data-knowledge system

    Science.gov (United States)

    Rahimian, Eric N.; Graves, Sara J.

    1988-01-01

    A new approach used in constructing a relational data-knowledge base system is described. The relational database is well suited for distribution due to its property of allowing data fragmentation and fragmentation transparency. An example is formulated of a simple relational data knowledge base which may be generalized for use in developing a relational distributed data knowledge base system. The efficiency and ease of application of such a data knowledge base management system is briefly discussed. Also discussed are the potentials of the developed model for sharing the data knowledge base as well as the possible areas of difficulty in implementing the relational data knowledge base management system.

  12. A Cooperative Approach to Virtual Machine Based Fault Injection

    Energy Technology Data Exchange (ETDEWEB)

    Naughton III, Thomas J [ORNL; Engelmann, Christian [ORNL; Vallee, Geoffroy R [ORNL; Aderholdt, William Ferrol [ORNL; Scott, Stephen L [Tennessee Technological University (TTU)

    2017-01-01

    Resilience investigations often employ fault injection (FI) tools to study the effects of simulated errors on a target system. It is important to keep the target system under test (SUT) isolated from the controlling environment in order to maintain control of the experiment. Virtual machines (VMs) have been used to aid these investigations due to the strong isolation properties of system-level virtualization. A key challenge in fault injection tools is to gain proper insight and context about the SUT. In VM-based FI tools, this challenge of target context is increased due to the separation between host and guest (VM). We discuss an approach to VM-based FI that leverages virtual machine introspection (VMI) methods to gain insight into the target's context running within the VM. The key to this environment is the ability to provide basic information to the FI system that can be used to create a map of the target environment. We describe a proof-of-concept implementation and a demonstration of its use to introduce simulated soft errors into an iterative solver benchmark running in user-space of a guest VM.

  13. Research on intrusion detection based on Kohonen network and support vector machine

    Science.gov (United States)

    Shuai, Chunyan; Yang, Hengcheng; Gong, Zeweiyi

    2018-05-01

    Support vector machines (SVM), when applied directly to network intrusion detection, suffer from low detection accuracy and long detection times. Optimizing the SVM parameters can greatly improve detection accuracy, but the long optimization time makes direct application unsuitable for high-speed networks. A method based on Kohonen neural network feature selection is therefore proposed to reduce the optimization time of the support vector machine parameters. First, the weights of the KDD99 network intrusion data are calculated by a Kohonen network, and features are selected by weight. Then, after feature selection is completed, a genetic algorithm (GA) and grid search are used for parameter optimization to find appropriate parameters, and the data are classified by support vector machines. Comparative experiments show that feature selection can reduce the parameter optimization time with little influence on classification accuracy. The experiments suggest that the support vector machine can be used in network intrusion detection systems and can reduce the miss rate.

  14. New concept of electrical drives for paper and board machines based on energy efficiency principles

    Directory of Open Access Journals (Sweden)

    Jeftenić Borislav

    2006-01-01

    Full Text Available This paper describes the reconstruction of the electrical drives of the press and drying sections of a paper machine, carried out in June 2001, as well as the expansion of the machine with a "third coating" section in July 2002, at the board factory "Umka". The existing old drive of the press and drying groups was built as a line shaft drive, 76 m long. The new drive is based on conventional squirrel-cage induction motors with frequency converters. System control is carried out by a programmable controller, and communication between the controllers, converters, and control boards is accomplished through Profibus. The reconstruction of the coating part of the machine, during its technological reconstruction, was conducted to improve the performance of the machine by adding a device for spreading the "third coating". The requirements for the power facility were to replace the existing facility with a new one, based on energy efficiency principles, and to provide an adequate facility for the new technological sections. The new part of the facility also had to be connected with the remaining part of the machine, i.e. with the press and drying sections reconstructed in 2001. It has to be stressed that energy efficiency principles here mean realizing a new, modernized drive with better performance and greater capacity for as small as possible an increase in the installed power of the separate drives. The paper also graphically presents the achieved energy savings, based on measurements performed on separate parts of the paper machine before and after the reconstruction.

  15. A Collaborative Knowledge Plane for Autonomic Networks

    Science.gov (United States)

    Mbaye, Maïssa; Krief, Francine

    Autonomic networking aims to give network components self-managing capabilities. Several autonomic architectures have been proposed. Each of these architectures includes a sort of knowledge plane, which is very important for mimicking autonomic behavior. The knowledge plane plays a central role for self-functions by providing suitable knowledge to equipment, and it needs to learn new strategies for more accuracy. However, defining the knowledge plane's architecture is still a challenge for researchers, especially defining the way cognitive supports interact with each other in the knowledge plane, and implementing them. The decision-making process depends on these interactions between the reasoning and learning parts of the knowledge plane. In this paper we propose a knowledge plane architecture based on a machine learning (inductive logic programming) paradigm and a situated view to deal with distributed environments. This architecture is focused on two self-functions that subsume all other self-functions: self-adaptation and self-organization. Case studies are given and implemented.

  16. A human-machine cooperation route planning method based on improved A* algorithm

    Science.gov (United States)

    Zhang, Zhengsheng; Cai, Chao

    2011-12-01

    To avoid the tendency of common route planning methods to blindly pursue higher machine intelligence and automation, this paper presents a human-machine cooperation route planning method. The proposed method includes a new A* path searching strategy based on dynamic heuristic searching and a human-cooperated decision strategy to prune the search area. It can overcome the shortcoming of the A* algorithm of falling into prolonged local searches. Experiments showed that this method can quickly plan a feasible route that meets macro-policy thinking.
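
The baseline the paper builds on, plain A* with an admissible heuristic, can be shown on a small grid (the dynamic-heuristic and human-pruning extensions are not modeled here; the grid and names are illustrative):

```python
import heapq

def a_star(grid, start, goal):
    """Plain A* on a 4-connected grid (0 = free, 1 = obstacle) with a
    Manhattan-distance heuristic. Returns the shortest path as a list."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]   # (f, g, node, path)
    best_g = {}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if best_g.get(node, float("inf")) <= g:
            continue                              # already expanded cheaper
        best_g[node] = g
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heapq.heappush(open_set,
                               (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = a_star(grid, (0, 0), (2, 0))
print(len(path) - 1)  # 6 moves around the obstacle row
```

The human-cooperated strategy in the record would prune regions of this search space before expansion, and the dynamic heuristic would reshape `h` during the search.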

  17. Association test based on SNP set: logistic kernel machine based test vs. principal component analysis.

    Directory of Open Access Journals (Sweden)

    Yang Zhao

    Full Text Available GWAS has greatly facilitated the discovery of risk SNPs associated with complex diseases. Traditional methods analyze SNPs individually and are limited by low power and reproducibility, since correction for multiple comparisons is necessary. Several methods have been proposed based on grouping SNPs into SNP sets using biological knowledge and/or genomic features. In this article, we compare the linear kernel machine based test (LKM) and the principal components analysis based approach (PCA) using simulated datasets under scenarios of 0 to 3 causal SNPs, as well as simple and complex linkage disequilibrium (LD) structures of the simulated regions. Our simulation study demonstrates that both LKM and PCA can control the type I error at the significance level of 0.05. If the causal SNP is in strong LD with the genotyped SNPs, both the PCA with a small number of principal components (PCs) and the LKM with a kernel of linear or identical-by-state function are valid tests. However, if the LD structure is complex, such as several LD blocks in the SNP set, or if the causal SNP is not in the LD block in which most of the genotyped SNPs reside, more PCs should be included to capture the information of the causal SNP. Simulation studies also demonstrate the ability of LKM and PCA to combine information from multiple causal SNPs and to provide increased power over individual SNP analysis. We also apply LKM and PCA to analyze two SNP sets extracted from an actual GWAS dataset on non-small cell lung cancer.

  18. Spatial Frequency Multiplexing of Fiber-Optic Interferometric Refractive Index Sensors Based on Graded-Index Multimode Fibers

    Science.gov (United States)

    Liu, Li; Gong, Yuan; Wu, Yu; Zhao, Tian; Wu, Hui-Juan; Rao, Yun-Jiang

    2012-01-01

    Fiber-optic interferometric sensors based on graded-index multimode fibers have very high refractive-index sensitivity, as we previously demonstrated. In this paper, spatial-frequency multiplexing of this type of fiber-optic refractive index sensor is investigated. It is estimated that multiplexing of more than 10 such sensors is possible. In the multiplexing scheme, one of the sensors is used to investigate the refractive index and temperature responses. The fast Fourier transform (FFT) of the combined reflective spectra is analyzed. The intensity of the FFT spectra is linearly related to the refractive index and is not sensitive to temperature.
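    The multiplexing principle can be demonstrated numerically: interferometers with different optical path differences (OPDs) contribute cosine fringes of different spatial frequencies to the combined spectrum, so an FFT separates them into distinct peaks. The OPD values, amplitudes, and axis scaling below are illustrative, not taken from the paper.

```python
import numpy as np

# Combined reflection spectrum of two interferometric sensors with
# different OPDs (values illustrative), on a normalized wavenumber axis.
k = np.linspace(0, 1, 4096)
opd1, opd2 = 60.0, 140.0               # OPDs -> distinct spatial frequencies
spectrum = (1.0 * np.cos(2 * np.pi * opd1 * k)
            + 0.7 * np.cos(2 * np.pi * opd2 * k))

fft_mag = np.abs(np.fft.rfft(spectrum))
freqs = np.fft.rfftfreq(k.size, d=k[1] - k[0])

# Each sensor appears as an isolated peak; demultiplex by peak position.
peaks = freqs[np.argsort(fft_mag)[-2:]]
```

    Reading out each sensor then reduces to tracking the magnitude of its own spatial-frequency peak, which is what makes the scheme scale to many sensors on one fiber.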

  19. An Automatic Assembling System for Sealing Rings Based on Machine Vision

    Directory of Open Access Journals (Sweden)

    Mingyu Gao

    2017-01-01

    Full Text Available In order to grab and place the sealing rings of battery lids quickly and accurately, an automatic assembling system for sealing rings based on machine vision is developed in this paper. The whole system is composed of light sources, cameras, industrial control units, and a 4-degree-of-freedom industrial robot. Specifically, the sealing rings are recognized and located automatically by the machine vision module. The industrial robot is then controlled to grab the sealing rings dynamically under the joint work of multiple control units and visual feedback. Furthermore, the coordinates of the fast-moving battery lid are tracked by the machine vision module. Finally, the sealing rings are placed on the sealing ports of the battery lid accurately and automatically. Experimental results demonstrate that the proposed system can grab the sealing rings and place them on the sealing port of the fast-moving battery lid successfully. More importantly, the proposed system significantly improves the efficiency of the battery production line.
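    A dependency-free stand-in for the vision module's "locate" step: threshold a grayscale image and take the centroid of the bright pixels as the ring centre. The real system uses cameras, industrial light sources, and more robust detection; the image and threshold here are synthetic.

```python
def locate_ring(image, thresh=128):
    """Return the (row, col) centroid of pixels at or above `thresh`,
    or None if no pixel qualifies."""
    rs, cs, n = 0.0, 0.0, 0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v >= thresh:
                rs += r
                cs += c
                n += 1
    if n == 0:
        return None
    return (rs / n, cs / n)

# Synthetic 9x9 image: a bright ring of radius 3 centred at (4, 4)
# on a dark background.
img = [[200 if abs(((r - 4) ** 2 + (c - 4) ** 2) ** 0.5 - 3) < 0.6 else 20
        for c in range(9)] for r in range(9)]
center = locate_ring(img)
```

    In the described system, a coordinate like `center` would be transformed into the robot's frame and combined with lid-tracking feedback to time the grab-and-place motion.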

  20. Problem-Oriented Corporate Knowledge Base Models on the Case-Based Reasoning Approach Basis

    Science.gov (United States)

    Gluhih, I. N.; Akhmadulin, R. K.

    2017-07-01

    One promising direction for enhancing the efficiency of production processes and enterprise management is the creation and use of corporate knowledge bases. The article suggests a concept of problem-oriented corporate knowledge bases (PO CKB), in which knowledge is arranged around possible problem situations and serves as a tool for making and implementing decisions in such situations. For knowledge representation in PO CKB, a case-based reasoning approach is proposed. Under this approach, the content of a case as a knowledge base component has been defined; based on the situation tree, a PO CKB knowledge model has been developed, in which knowledge about typical situations as well as specific examples of situations and solutions is represented. A generalized structural chart of a problem-oriented corporate knowledge base and possible modes of its operation have been suggested. The obtained models allow creating and using corporate knowledge bases to support decision making and implementation, training, staff skill upgrading, and analysis of the decisions taken. The universal interpretation of the terms “situation” and “solution” adopted in the work allows the suggested models to be used to develop problem-oriented corporate knowledge bases in different subject domains. It has been suggested to use the developed models to build corporate knowledge bases for enterprises that operate engineering systems and networks at large production facilities.

  1. Knowledge based economy in European Union

    Directory of Open Access Journals (Sweden)

    Ecaterina Stănculescu

    2012-04-01

    Full Text Available Nowadays we are witnessing a fundamental shift from an economy based mainly on resources to one based mostly on knowledge. The concept was launched in the last decade of the past century. Knowledge has become a production factor and a value-creation instrument for any country, and of course for an entire community such as the European Union, which is constantly concerned with its development and competitiveness. This paper presents the principal characteristics of the EU's current efforts to expand a knowledge-based economy through the Europe 2020 strategy for a smart, sustainable and inclusive economy, and especially through the Framework Programmes (the Seventh Framework Programme and the Competitiveness and Innovation Framework Programme).

  2. Gaussian processes for machine learning.

    Science.gov (United States)

    Seeger, Matthias

    2004-04-01

    Gaussian processes (GPs) are natural generalisations of multivariate Gaussian random variables to infinite (countable or continuous) index sets. GPs have been applied in a large number of fields to a diverse range of ends, and very many deep theoretical analyses of various properties are available. This paper gives an introduction to Gaussian processes on a fairly elementary level with special emphasis on characteristics relevant in machine learning. It draws explicit connections to branches such as spline smoothing models and support vector machines in which similar ideas have been investigated. Gaussian process models are routinely used to solve hard machine learning problems. They are attractive because of their flexible non-parametric nature and computational simplicity. Treated within a Bayesian framework, very powerful statistical methods can be implemented which offer valid estimates of uncertainties in our predictions and generic model selection procedures cast as nonlinear optimization problems. Their main drawback of heavy computational scaling has recently been alleviated by the introduction of generic sparse approximations. The mathematical literature on GPs is large and often uses deep concepts which are not required to fully understand most machine learning applications. In this tutorial paper, we aim to present characteristics of GPs relevant to machine learning and to point out precise connections to other "kernel machines" popular in the community. Our focus is on a simple presentation, but references to more detailed sources are provided.
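    The standard GP regression equations (posterior mean k*' K^-1 y and variance k** - k*' K^-1 k*) fit in a few lines. The toy 1-D dataset, RBF kernel, and hyperparameter values below are illustrative assumptions for the sketch.

```python
import numpy as np

def rbf(a, b, length=1.0, var=1.0):
    """Squared-exponential (RBF) kernel between two 1-D point sets."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length ** 2)

# Toy 1-D dataset; the noise level and kernel parameters are illustrative.
x = np.array([-2.0, -1.0, 0.0, 1.5])
y = np.sin(x)
noise = 1e-4

K = rbf(x, x) + noise * np.eye(x.size)   # kernel matrix + noise on diagonal
xs = np.array([0.5])                     # test input
ks = rbf(x, xs)

alpha = np.linalg.solve(K, y)
mean = ks.T @ alpha                                  # posterior mean at xs
cov = rbf(xs, xs) - ks.T @ np.linalg.solve(K, ks)    # posterior variance at xs
```

    The cubic cost of solving with `K` is the "heavy computational scaling" the abstract mentions; sparse approximations replace `K` with a low-rank surrogate to cut this down.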

  3. Generating Big Data Sets from Knowledge-based Decision Support Systems to Pursue Value-based Healthcare

    Directory of Open Access Journals (Sweden)

    Arturo González-Ferrer

    2018-03-01

    Full Text Available When talking about Big Data in healthcare, we usually refer to how data collected from current electronic medical records, either structured or unstructured, can be used to answer clinically relevant questions. This is typically carried out by means of analytics tools (e.g. machine learning) or by extracting relevant data from patient summaries through natural language processing techniques. From another perspective of research in medical informatics, powerful initiatives have emerged to help physicians make decisions, in both diagnostics and therapeutics, built from the existing medical evidence (i.e. knowledge-based decision support systems). Many of the problems these tools have shown when used in real clinical settings relate to their implementation and deployment rather than to the quality of their support, and technology is slowly overcoming interoperability and integration issues. Beyond the point-of-care decision support these tools can provide, the data generated when using them, even in controlled trials, could be used to further analyze facts that are traditionally ignored in current clinical practice. In this paper, we reflect on the technologies available to make this leap and how they could help drive healthcare organizations' shift to a value-based healthcare philosophy.

  4. SUPPORT VECTOR MACHINE CLASSIFICATION OF OBJECT-BASED DATA FOR CROP MAPPING, USING MULTI-TEMPORAL LANDSAT IMAGERY

    Directory of Open Access Journals (Sweden)

    R. Devadas

    2012-07-01

    Full Text Available Crop mapping and time series analysis of agronomic cycles are critical for monitoring land use and land management practices, and for analysing the issues of agro-environmental impacts and climate change. Multi-temporal Landsat data can be used to analyse decadal changes in cropping patterns at field level, owing to its medium spatial resolution and historical availability. This study attempts to develop robust remote sensing techniques, applicable across a large geographic extent, for state-wide mapping of cropping history in Queensland, Australia. In this context, traditional pixel-based classification was analysed in comparison with image object-based classification using advanced supervised machine-learning algorithms such as the Support Vector Machine (SVM). For the Darling Downs region of southern Queensland we gathered a set of Landsat TM images from the 2010–2011 cropping season. Landsat data, along with the vegetation index images, were subjected to multiresolution segmentation to obtain polygon objects. Object-based methods enabled the analysis of aggregated sets of pixels, and exploited shape-related and textural variation, as well as spectral characteristics. SVM models were chosen after examining three shape-based parameters, twenty-three textural parameters and ten spectral parameters of the objects. We found that the object-based methods were superior to the pixel-based methods for classifying four major land use/land cover classes, considering the complexities of within-field spectral heterogeneity and spectral mixing. Comparative analysis clearly revealed that higher overall classification accuracy (95%) was observed with the object-based SVM compared with traditional pixel-based classification (89%) using the maximum likelihood classifier (MLC). Object-based classification also resulted in speckle-free classification maps. Further, object-based SVM models were used to classify different broadacre crop types for summer and winter seasons. The influence of
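    A minimal sketch of training a linear SVM on per-object feature vectors, using a Pegasos-style sub-gradient solver so the example stays self-contained. The two-feature toy "objects" and their cluster parameters are assumptions; the study's SVMs used dozens of shape, texture, and spectral features per segmented object.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos-style sub-gradient training of a linear SVM.
    Labels y must be in {-1, +1}; a bias term is folded into the weights."""
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xb.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1.0 / (lam * t)
            if y[i] * (w @ Xb[i]) < 1:          # margin violated: hinge step
                w = (1 - eta * lam) * w + eta * y[i] * Xb[i]
            else:                               # only regularization shrinkage
                w = (1 - eta * lam) * w
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.sign(Xb @ w)

# Toy "objects": two well-separated clusters of (mean NDVI, texture) features.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0.2, 0.3], 0.05, (40, 2)),
               rng.normal([0.7, 0.8], 0.05, (40, 2))])
y = np.array([-1] * 40 + [1] * 40)
w = train_linear_svm(X, y)
```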

  5. Methods, systems and apparatus for controlling operation of two alternating current (AC) machines

    Science.gov (United States)

    Gallegos-Lopez, Gabriel [Torrance, CA; Nagashima, James M [Cerritos, CA; Perisic, Milun [Torrance, CA; Hiti, Silva [Redondo Beach, CA

    2012-02-14

    A system is provided for controlling two AC machines. The system comprises a DC input voltage source that provides a DC input voltage, a voltage boost command control module (VBCCM), a five-phase PWM inverter module coupled to the two AC machines, and a boost converter coupled to the inverter module and the DC input voltage source. The boost converter is designed to supply a new DC input voltage to the inverter module having a value greater than or equal to that of the DC input voltage. The VBCCM generates a boost command signal (BCS) based on the modulation indexes from the two AC machines. The BCS controls the boost converter such that it generates the new DC input voltage in response to the BCS. When the two AC machines require additional voltage, exceeding the DC input voltage, to meet their combined target mechanical power, the BCS drives the new DC input voltage generated by the boost converter to a value greater than the DC input voltage.

  6. Duality-based algorithms for scheduling on unrelated parallel machines

    NARCIS (Netherlands)

    van de Velde, S.L.; van de Velde, S.L.

    1993-01-01

    We consider the following parallel machine scheduling problem. Each of n independent jobs has to be scheduled on one of m unrelated parallel machines. The processing of job J_l on machine M_i requires an uninterrupted period of positive length p_li. The objective is to find an assignment of

  7. A knowledge base browser using hypermedia

    Science.gov (United States)

    Pocklington, Tony; Wang, Lui

    1990-01-01

    A hypermedia system is being developed to browse CLIPS (C Language Integrated Production System) knowledge bases. This system will be used to help train flight controllers for the Mission Control Center. Browsing a knowledge base will be accomplished either by navigating through the various collection nodes that have already been defined, or through query languages.

  8. Prototype-based models in machine learning.

    Science.gov (United States)

    Biehl, Michael; Hammer, Barbara; Villmann, Thomas

    2016-01-01

    An overview is given of prototype-based models in machine learning. In this framework, observations, i.e., data, are stored in terms of typical representatives. Together with a suitable measure of similarity, the systems can be employed in the context of unsupervised and supervised analysis of potentially high-dimensional, complex datasets. We discuss basic schemes of competitive vector quantization as well as the so-called neural gas approach and Kohonen's topology-preserving self-organizing map. Supervised learning in prototype systems is exemplified in terms of learning vector quantization. Most frequently, the familiar Euclidean distance serves as a dissimilarity measure. We present extensions of the framework to nonstandard measures and give an introduction to the use of adaptive distances in relevance learning. © 2016 Wiley Periodicals, Inc.
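    The learning vector quantization (LVQ) scheme mentioned above reduces to a simple update rule: move the nearest prototype toward a sample of its own class and away from a sample of another class. The toy 2-D data, learning rate, and one-prototype-per-class setup below are illustrative assumptions.

```python
import random

def train_lvq1(data, labels, prototypes, proto_labels,
               lr=0.1, epochs=50, seed=0):
    """LVQ1 with squared Euclidean distance: attract the winning prototype
    on a same-class sample, repel it on a different-class sample."""
    rnd = random.Random(seed)
    for _ in range(epochs):
        order = list(range(len(data)))
        rnd.shuffle(order)
        for i in order:
            x, c = data[i], labels[i]
            j = min(range(len(prototypes)),
                    key=lambda k: sum((a - b) ** 2
                                      for a, b in zip(prototypes[k], x)))
            step = lr if proto_labels[j] == c else -lr
            prototypes[j] = [p + step * (a - p)
                             for p, a in zip(prototypes[j], x)]
    return prototypes

# Two classes, one prototype each, on a toy 2-D set (values illustrative).
data = [[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]]
labels = [0, 0, 1, 1]
protos = train_lvq1(data, labels, [[0.4, 0.4], [0.6, 0.6]], [0, 1])
```

    Relevance learning, as discussed in the overview, would additionally adapt per-dimension weights inside the distance computation instead of using the fixed Euclidean metric.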

  9. Machine learning based Intelligent cognitive network using fog computing

    Science.gov (United States)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

    In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. In addition, the computing nodes send a periodic signal summary, which is much smaller than the original signal, to the cloud so that the overall spectrum resource allocation strategies are dynamically updated. By applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, system security is further strengthened by reducing the burden on the communications network.

  10. Anodic solubility and electrochemical machining of hard alloys on the base of chromium and titanium carbides

    Energy Technology Data Exchange (ETDEWEB)

    Davydov, A D; Klepikov, A N; Malofeeva, A N; Moroz, I I

    1985-01-01

    The regularities of anodic behaviour and electrochemical machining (ECM) are investigated for samples of three materials with the following compositions: 25% Cr3C2 and 15% Ni; 70% TiC, 25% Ni, and 5% Cr; 70% TiC, 15% Ni, and 15% Mo. It is shown that the electrochemical method is applicable to the machining of hard alloys based on chromium and titanium carbides, which present serious difficulties for mechanical machining. The machining rate of the alloys with a moving cathode is about 0.5 mm/min.

  11. Slope Deformation Prediction Based on Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei JIA

    2013-07-01

    Full Text Available This paper principally studies the prediction of slope deformation based on the Support Vector Machine (SVM). In the prediction process, we explore how to reconstruct the phase space. The geological body's displacement data, treated as a chaotic time series, are used as the SVM's training samples. Slope displacement caused by multivariable coupling is predicted by means of a single variable. Results show that this model achieves high fitting accuracy and good generalization, and provides a reference for deformation prediction in slope engineering.
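    The phase-space reconstruction step can be sketched with delay embedding: each training sample pairs a delay vector of past displacements with the next value. A nearest-neighbour predictor stands in below for the SVM so the sketch stays dependency-free; the embedding dimension, delay, and series are illustrative assumptions.

```python
def embed(series, dim=3, tau=1):
    """Delay embedding: return (delay_vector, next_value) training pairs
    from a scalar time series."""
    pairs = []
    for i in range(len(series) - (dim - 1) * tau - 1):
        v = tuple(series[i + j * tau] for j in range(dim))
        pairs.append((v, series[i + (dim - 1) * tau + 1]))
    return pairs

def predict_next(series, dim=3, tau=1):
    """Predict the next value from the most similar past delay vector
    (a 1-nearest-neighbour stand-in for the SVM regressor)."""
    pairs = embed(series, dim, tau)
    start = len(series) - 1 - (dim - 1) * tau
    query = tuple(series[start + j * tau] for j in range(dim))
    best = min(pairs, key=lambda p: sum((a - b) ** 2
                                        for a, b in zip(p[0], query)))
    return best[1]
```

    In the paper's setting, the `(delay_vector, next_value)` pairs produced by `embed` would be the SVM training samples; the single-variable prediction corresponds to embedding only one observed displacement series.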

  12. Study of on-machine error identification and compensation methods for micro machine tools

    International Nuclear Information System (INIS)

    Wang, Shih-Ming; Yu, Han-Jen; Lee, Chun-Yi; Chiu, Hung-Sheng

    2016-01-01

    Micro machining plays an important role in the manufacturing of miniature products which are made of various materials with complex 3D shapes and tight machining tolerances. To further improve the accuracy of a micro machining process without increasing the manufacturing cost of a micro machine tool, an effective machining error measurement method and a software-based compensation method are essential. To avoid introducing additional errors caused by the re-installation of the workpiece, the measurement and compensation should be conducted on-machine. In addition, because the contour of a miniature workpiece machined with a micro machining process is very small, the measurement method should be non-contact. By integrating the image reconstruction method, camera pixel correction, coordinate transformation, an error identification algorithm, and a trajectory auto-correction method, a vision-based error measurement and compensation method was developed in this study that can inspect micro machining errors on-machine and automatically generate an error-corrected numerical control (NC) program for error compensation. With the use of the Canny edge detection algorithm and camera pixel calibration, the edges of the contour of a machined workpiece were identified and used to reconstruct the actual contour of the workpiece. The actual contour was then mapped to the theoretical contour to identify the actual cutting points and compute the machining errors. With the use of a moving matching window and calculation of the similarity between the actual and theoretical contours, the errors between the actual and theoretical cutting points were calculated and used to correct the NC program. With the use of the error-corrected NC program, the accuracy of a micro machining process can be effectively improved. To prove the feasibility and effectiveness of the proposed methods, micro-milling experiments on a micro machine tool were conducted, and the results

  13. A Tailored Ontology Supporting Sensor Implementation for the Maintenance of Industrial Machines.

    Science.gov (United States)

    Maleki, Elaheh; Belkadi, Farouk; Ritou, Mathieu; Bernard, Alain

    2017-09-08

    The long-term productivity of an industrial machine is improved by condition-based maintenance strategies. To do this, the integration of sensors and other cyber-physical devices is necessary in order to capture and analyze a machine's condition through its lifespan. Thus, choosing the best sensor is a critical step in ensuring the efficiency of the maintenance process. Indeed, considering the variety of sensors and their features and performance, a formal classification of the sensor domain knowledge is crucial. This classification facilitates the search for and reuse of solutions during the design of a new maintenance service. Following a Knowledge Management methodology, the paper proposes and develops a new sensor ontology that structures the domain knowledge, covering both theoretical and experimental sensor attributes. An industrial case study is conducted to validate the proposed ontology and to demonstrate its utility as a guideline to ease the search for suitable sensors. Based on the ontology, the final solution will be implemented in a shared repository connected to legacy CAD (computer-aided design) systems. The selection of the best sensor is first obtained by matching application requirements with the sensor specifications proposed by this sensor repository. It is then refined from the experimental results. The achieved solution is recorded in the sensor repository for future reuse. As a result, the time and cost of designing new condition-based maintenance services are reduced.

  14. A Machine Learning-based Rainfall System for GPM Dual-frequency Radar

    Science.gov (United States)

    Tan, H.; Chandrasekar, V.; Chen, H.

    2017-12-01

    Precipitation measurements produced by the Global Precipitation Measurement (GPM) Dual-frequency Precipitation Radar (DPR) play an important role in studying the water cycle and forecasting extreme weather events. Compared with its predecessor, the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR), GPM DPR measures precipitation at two different frequencies (i.e., Ku and Ka band), which can provide detailed information on the microphysical properties of precipitation particles, quantify particle size distribution, and quantitatively measure light rain and falling snow. This paper presents a novel machine learning system for ground-based and spaceborne radar rainfall estimation. The system first trains on ground radar data for rainfall estimation using rainfall measurements from gauges, and subsequently uses the ground-radar-based rainfall estimates to train on GPM DPR data in order to obtain a space-based rainfall product. Therein, data alignment between the space DPR and ground radar is conducted using the methodology proposed by Bolen and Chandrasekar (2013), which can minimize the effects of potential geometric distortion of GPM DPR observations. For demonstration purposes, rainfall measurements from three rain gauge networks near Melbourne, Florida, are used for training and validation. These three gauge networks, which are located in the Kennedy Space Center (KSC), the South Florida Water Management District (SFL), and the St. Johns Water Management District (STJ), include 33, 46, and 99 rain gauge stations, respectively. Collocated ground radar observations from the National Weather Service (NWS) Weather Surveillance Radar - 1988 Doppler (WSR-88D) in Melbourne (i.e., the KMLB radar) are trained with the gauge measurements. The trained model is then used to derive a KMLB radar based rainfall product, which is used to train GPM DPR data collected from coincident overpass events. The machine learning based rainfall product is compared against the GPM standard products.
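    The two-stage training cascade described above can be sketched with synthetic numbers: gauges train a ground-radar model, and the resulting radar rainfall estimates then train the spaceborne model. Simple linear least squares stands in for the machine-learning models, and all observable/rainfall values are invented for illustration.

```python
def fit_line(x, y):
    """Ordinary least-squares fit y ~ a + b*x; returns (b, a)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return b, my - b * mx

# Stage 1: ground-radar observable vs. collocated gauge rainfall (mm/h).
radar_obs = [10, 20, 30, 40, 50]
gauge_rain = [1.1, 2.0, 3.1, 3.9, 5.0]
b1, a1 = fit_line(radar_obs, gauge_rain)
radar_rain = [a1 + b1 * z for z in radar_obs]    # ground-radar product

# Stage 2: spaceborne-radar observable vs. the ground-radar rainfall product.
dpr_obs = [12, 22, 33, 41, 52]
b2, a2 = fit_line(dpr_obs, radar_rain)
dpr_rain = [a2 + b2 * z for z in dpr_obs]        # space-based product
```

    The design point is that stage 2 never sees the gauges directly: the ground-radar product acts as the (dense, spatially aligned) training target for the spaceborne data.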

  15. Downscaling of MODIS One Kilometer Evapotranspiration Using Landsat-8 Data and Machine Learning Approaches

    Directory of Open Access Journals (Sweden)

    Yinghai Ke

    2016-03-01

    Full Text Available This study presented a MODIS 8-day 1 km evapotranspiration (ET) downscaling method based on Landsat 8 data (30 m) and machine learning approaches. Eleven indicators including albedo, land surface temperature (LST), and vegetation indices (VIs) derived from Landsat 8 data were first upscaled to 1 km resolution. Machine learning algorithms including Support Vector Regression (SVR), Cubist, and Random Forest (RF) were used to model the relationship between the Landsat indicators and MODIS 8-day 1 km ET. The models were then used to predict 30 m ET based on the Landsat 8 indicators. A total of thirty-two pairs of Landsat 8 images/MODIS ET data were evaluated at four study sites, including two in the United States and two in South Korea. Among the three models, RF produced the lowest error, with a relative Root Mean Square Error (rRMSE) less than 20%. Vegetation greenness related indicators such as the Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), and Soil Adjusted Vegetation Index (SAVI), and vegetation moisture related indicators such as the Normalized Difference Infrared Index—Landsat 8 OLI band 7 (NDIIb7) and Normalized Difference Water Index (NDWI) were the five most important features used in the RF model. Temperature-based indicators were less important than vegetation greenness and moisture-related indicators because LST could have considerable variation during each 8-day period. The predicted Landsat downscaled ET had good overall agreement with MODIS ET (average rRMSE = 22%) and showed a similar temporal trend to MODIS ET. Compared to the MODIS ET product, the downscaled product demonstrated more spatial detail and had better agreement with in situ ET observations (R2 = 0.56). However, we found that the accuracy of MODIS ET was the main control factor of the accuracy of the downscaled product. Improved coarse-resolution ET estimation would result in better finer-resolution estimation. This study proved the potential of using machine learning
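    The downscaling logic itself is scale transfer: aggregate the fine-resolution indicator to the coarse grid, fit a model against coarse ET, then apply that model back at fine resolution. A least-squares line stands in here for the Random Forest, and the NDVI/ET numbers are synthetic, chosen so the relationship is exactly linear.

```python
def block_mean(values, block):
    """Aggregate a 1-D fine-resolution series to coarse blocks
    (a stand-in for 30 m -> 1 km spatial aggregation)."""
    return [sum(values[i:i + block]) / block
            for i in range(0, len(values), block)]

fine_ndvi = [0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.3, 0.5, 0.7, 0.2, 0.4, 0.6]
coarse_ndvi = block_mean(fine_ndvi, 3)          # upscaled indicator
coarse_et = [2.4, 4.8, 4.0, 3.2]                # "MODIS" ET at coarse scale

# Fit the coarse-scale model (least squares standing in for RF).
n = len(coarse_ndvi)
mx = sum(coarse_ndvi) / n
my = sum(coarse_et) / n
b = (sum((x - mx) * (y - my) for x, y in zip(coarse_ndvi, coarse_et))
     / sum((x - mx) ** 2 for x in coarse_ndvi))
a = my - b * mx

# Apply the coarse-trained model at fine resolution.
fine_et = [a + b * x for x in fine_ndvi]        # downscaled ET
```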

  16. A rule-based approach to model checking of UML state machines

    Science.gov (United States)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.

  17. Fault Diagnosis for Engine Based on Single-Stage Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Fei Gao

    2016-01-01

    Full Text Available A Single-Stage Extreme Learning Machine (SS-ELM) is presented in this paper to address mechanical fault diagnosis. Based on it, the traditional mapping of the extreme learning machine (ELM) is changed: the eigenvectors extracted by signal processing methods are directly regarded as the outputs of the network's hidden layer. The uncertainty introduced when training data are transformed from the input space to the ELM feature space by the ELM mapping, and the problem of selecting the hidden nodes, are thereby avoided effectively. Experimental results on diesel engine fault diagnosis show the good performance of the SS-ELM algorithm.
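    The contrast between a classical ELM and the single-stage variant can be shown in a few lines: the ELM projects inputs through a random hidden layer and solves for the output weights analytically, while the single-stage variant treats the extracted features themselves as the hidden-layer outputs. The toy regression task below stands in for the engine-fault eigenvectors; all data and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task standing in for the fault-diagnosis eigenvectors.
X = rng.uniform(-1, 1, (200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2

# Classical ELM: random hidden layer, output weights by pseudoinverse.
n_hidden = 40
W = rng.normal(size=(2, n_hidden))    # random input weights (never trained)
b = rng.normal(size=n_hidden)         # random biases
H = np.tanh(X @ W + b)                # hidden-layer output matrix
beta = np.linalg.pinv(H) @ y          # analytic output weights

# Single-stage variant (in the spirit of SS-ELM): the feature vectors are
# used directly as hidden-layer outputs, so only beta is solved for.
H_ss = np.hstack([X, np.ones((len(X), 1))])
beta_ss = np.linalg.pinv(H_ss) @ y
```

    The single-stage form removes the random mapping and the hidden-node count as sources of variability, at the cost of relying entirely on the quality of the extracted features.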

  18. Optimization of machining parameters of turning operations based on multi performance criteria

    Directory of Open Access Journals (Sweden)

    N.K.Mandal

    2013-01-01

    Full Text Available The selection of optimum machining parameters plays a significant role in ensuring product quality, reducing manufacturing cost, and increasing productivity in computer-controlled manufacturing processes. Owing to the inherent complexity of the process, multi-objective optimization of turning has long been a challenging engineering issue. This study investigates multi-response optimization of the turning process to find an optimal parametric combination yielding minimum power consumption, surface roughness, and frequency of tool vibration, using Grey relational analysis (GRA). A confirmation test is conducted for the optimal machining parameters to validate the result. Various turning parameters, such as spindle speed, feed, and depth of cut, are considered. Experiments are designed and conducted based on a full factorial design of experiments.
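    The GRA computation itself is short: normalize each response, measure each trial's deviation from the ideal, convert deviations to grey relational coefficients, and average them into a grade. All three responses here are treated as "smaller is better", and the experimental values are illustrative, not the paper's measurements.

```python
def grey_relational_grade(alternatives, zeta=0.5):
    """Grey relational grades for rows of smaller-is-better responses.
    zeta is the conventional distinguishing coefficient."""
    cols = list(zip(*alternatives))
    # Normalize each response so that 1 is best (smaller-is-better).
    norm = [[(max(c) - v) / (max(c) - min(c)) for v in c] for c in cols]
    rows = list(zip(*norm))
    deltas = [[abs(1.0 - v) for v in r] for r in rows]   # deviation from ideal
    dmin = min(min(d) for d in deltas)
    dmax = max(max(d) for d in deltas)
    coeff = [[(dmin + zeta * dmax) / (d + zeta * dmax) for d in row]
             for row in deltas]
    return [sum(row) / len(row) for row in coeff]        # equal weights

# Rows: parameter combinations; columns: power (W), Ra (um), vibration (Hz).
trials = [[320, 1.8, 90], [280, 2.4, 110], [300, 1.2, 70], [350, 2.0, 120]]
grades = grey_relational_grade(trials)
best = grades.index(max(grades))
```

    The trial with the highest grade is taken as the optimal parametric combination; unequal response weights, if desired, would replace the plain average in the last line.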

  19. Machine Learning and Radiology

    Science.gov (United States)

    Wang, Shijun; Summers, Ronald M.

    2012-01-01

    In this paper, we give a short introduction to machine learning and survey its applications in radiology. We focused on six categories of applications in radiology: medical image segmentation, registration, computer aided detection and diagnosis, brain function or activity analysis and neurological disease diagnosis from fMR images, content-based image retrieval systems for CT or MRI images, and text analysis of radiology reports using natural language processing (NLP) and natural language understanding (NLU). This survey shows that machine learning plays a key role in many radiology applications. Machine learning identifies complex patterns automatically and helps radiologists make intelligent decisions on radiology data such as conventional radiographs, CT, MRI, and PET images and radiology reports. In many applications, the performance of machine learning-based automatic detection and diagnosis systems has been shown to be comparable to that of a well-trained and experienced radiologist. Technology development in machine learning and radiology will benefit from each other in the long run. Key contributions and common characteristics of machine learning techniques in radiology are discussed. We also discuss the problem of translating machine learning applications to the radiology clinical setting, including advantages and potential barriers. PMID:22465077

  20. A Comparison Study of Machine Learning Based Algorithms for Fatigue Crack Growth Calculation.

    Science.gov (United States)

    Wang, Hongxun; Zhang, Weifang; Sun, Fuqiang; Zhang, Wei

    2017-05-18

    The relationships between the fatigue crack growth rate (da/dN) and the stress intensity factor range (ΔK) are not always linear, even in the Paris region. The effects of stress ratio on fatigue crack growth rate differ among materials. However, most existing fatigue crack growth models cannot handle these nonlinearities appropriately. The machine learning method provides a flexible approach to the modeling of fatigue crack growth because of its excellent nonlinear approximation and multivariable learning ability. In this paper, a fatigue crack growth calculation method is proposed based on three different machine learning algorithms (MLAs): the extreme learning machine (ELM), the radial basis function network (RBFN), and the genetic-algorithm-optimized back propagation network (GABP). The MLA based method is validated using test data for different materials. The three MLAs are compared with each other as well as with the classical two-parameter model (the K* approach). The results show that the predictions of the MLAs are superior to those of the K* approach in accuracy and effectiveness, and the ELM based algorithm shows the best overall agreement with the experimental data among the three MLAs, owing to its global optimization and extrapolation ability.
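    For reference, the classical baseline the MLAs are compared against is the Paris-type power law da/dN = C(ΔK)^m, which is linear in log-log space and is fitted by least squares there. The (ΔK, da/dN) pairs below are synthetic, generated from an exact power law so the fit recovers the parameters; they are not experimental data from the paper.

```python
import math

# Synthetic Paris-law data: da/dN = C * (dK)^m with C = 3e-8, m = 3.
dK = [10, 15, 20, 30, 40]                 # MPa*sqrt(m)
dadN = [3e-8 * k ** 3.0 for k in dK]      # mm/cycle

# Linear least squares on (log dK, log da/dN) recovers m (slope)
# and log C (intercept).
lx = [math.log(k) for k in dK]
ly = [math.log(r) for r in dadN]
n = len(lx)
mx, my = sum(lx) / n, sum(ly) / n
m = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
     / sum((x - mx) ** 2 for x in lx))
C = math.exp(my - m * mx)
```

    The nonlinearity the abstract highlights is exactly what this log-log line cannot capture: curvature in log-log space, or a stress-ratio dependence of C and m, motivates the machine-learning models.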

  1. Integrating machine learning and physician knowledge to improve the accuracy of breast biopsy.

    Science.gov (United States)

    Dutra, I; Nassif, H; Page, D; Shavlik, J; Strigel, R M; Wu, Y; Elezaby, M E; Burnside, E

    2011-01-01

    In this work we show that combining physician rules and machine learned rules may improve the performance of a classifier that predicts whether a breast cancer is missed on percutaneous, image-guided breast core needle biopsy (subsequently referred to as "breast core biopsy"). Specifically, we show how advice in the form of logical rules, derived by a sub-specialty, i.e. fellowship trained breast radiologists (subsequently referred to as "our physicians") can guide the search in an inductive logic programming system, and improve the performance of a learned classifier. Our dataset of 890 consecutive benign breast core biopsy results along with corresponding mammographic findings contains 94 cases that were deemed non-definitive by a multidisciplinary panel of physicians, from which 15 were upgraded to malignant disease at surgery. Our goal is to predict upgrade prospectively and avoid surgery in women who do not have breast cancer. Our results, some of which trended toward significance, show evidence that inductive logic programming may produce better results for this task than traditional propositional algorithms with default parameters. Moreover, we show that adding knowledge from our physicians into the learning process may improve the performance of the learned classifier trained only on data.

  2. Project-based knowledge in organizing open innovation

    CERN Document Server

    Comacchio, Anna; Pizzi, Claudio

    2014-01-01

    Enriching understanding of the current theoretical debate on project-based open innovation, ‘Project-based Knowledge in Organizing Open Innovation’ draws on innovation management literature and knowledge-based perspectives to investigate the relationship between knowledge development at project level and the strategic organization of open innovation. Addressing the still open issue of how the firm level should be complemented by studies at the project level of analysis, this book provides theoretical and empirical arguments on the advantages of a more fine-grained level of analysis to understand how firms organize their innovation processes across boundaries. The book also addresses the emerging interest in the management literature on project-based organizations, and on the relevance of project forms of organizing in a knowledge-based economy. Through field research in different industrial settings, this book provides empirical evidence on how firms design open innovation project-by-project and it will ...

  3. Content-based analysis and indexing of sports video

    Science.gov (United States)

    Luo, Ming; Bai, Xuesheng; Xu, Guang-you

    2001-12-01

    An explosion of on-line image and video data in digital form is already well underway. With the exponential rise in interactive information exploration and dissemination through the World-Wide Web, the major inhibitors of rapid access to on-line video data are the management of capture and storage, and content-based intelligent search and indexing techniques. This paper proposes an approach for content-based analysis and event-based indexing of sports video. It includes a novel method to organize shots - classifying shots as close shots and far shots, an original idea of blur extent-based event detection, and an innovative local mutation-based algorithm for caption detection and retrieval. Results on extensive real TV programs demonstrate the applicability of our approach.

  4. Coronary heart disease index based on longitudinal electrocardiography

    Science.gov (United States)

    Townsend, J. C.; Cronin, J. P.

    1977-01-01

    A coronary heart disease (CHD) index was developed from longitudinal ECG (LCG) tracings to serve as a cardiac health measure in studies of working, essentially asymptomatic populations, such as pilots and executives. For a given subject, the index consisted of a composite score based on the presence of LCG aberrations and the weighted values previously assigned to them. The index was validated by correlating it with the known presence or absence of CHD as determined by a complete physical examination, including treadmill testing, resting ECG, and risk factor information. The validating sample consisted of 111 subjects drawn by a stratified random procedure from 5000 available case histories. The CHD index was found to be significantly more valid as a sole indicator of CHD than the LCG without the index. The index consistently produced higher validity coefficients in identifying CHD than did treadmill testing, resting ECG, or risk factor analysis.

  5. Knowledge-based diagnosis for aerospace systems

    Science.gov (United States)

    Atkinson, David J.

    1988-01-01

    The need for automated diagnosis in aerospace systems and the approach of using knowledge-based systems are examined. Research issues in knowledge-based diagnosis which are important for aerospace applications are treated along with a review of recent relevant research developments in Artificial Intelligence. The design and operation of some existing knowledge-based diagnosis systems are described. The systems described and compared include the LES expert system for liquid oxygen loading at NASA Kennedy Space Center, the FAITH diagnosis system developed at the Jet Propulsion Laboratory, the PES procedural expert system developed at SRI International, the CSRL approach developed at Ohio State University, the StarPlan system developed by Ford Aerospace, the IDM integrated diagnostic model, and the DRAPhys diagnostic system developed at NASA Langley Research Center.

  6. Exploring machine-learning-based control plane intrusion detection techniques in software defined optical networks

    Science.gov (United States)

    Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie

    2017-12-01

    In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats that compromise the security level of provisioned services. In this paper, the issue of control plane security is studied, and two machine-learning-based control plane intrusion detection techniques are proposed for SDON with properly selected features such as bandwidth and route length. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based techniques.
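    A toy version of the idea can be sketched as follows: a tiny k-nearest-neighbour classifier over two of the features named above (bandwidth, route length), trained on synthetic "normal" versus "intrusion" requests. The feature distributions and the choice of k-NN are assumptions for illustration; the paper's actual models and data are not reproduced here.

```python
import numpy as np

# Illustrative sketch only: nearest-neighbour intrusion detection on
# synthetic (bandwidth, route-length) feature vectors.
rng = np.random.default_rng(4)

# Normal requests: moderate bandwidth, short routes; intrusions: extreme values
normal = np.column_stack([rng.normal(50, 10, 100), rng.normal(3, 1, 100)])
attack = np.column_stack([rng.normal(90, 10, 100), rng.normal(8, 1, 100)])
X = np.vstack([normal, attack])
y = np.array([0] * 100 + [1] * 100)

# Standardize features so bandwidth does not dominate the distance metric
mu, sd = X.mean(0), X.std(0)
Xz = (X - mu) / sd

def predict(x, k=5):
    xz = (x - mu) / sd
    idx = np.argsort(((Xz - xz) ** 2).sum(1))[:k]   # k nearest training points
    return int(y[idx].sum() > k // 2)               # majority vote

label = predict(np.array([95.0, 9.0]))   # an extreme request
```

    In practice the features would come from the SDON controller's service-provisioning records rather than synthetic draws.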

  7. Novel indexes based on network structure to indicate financial market

    Science.gov (United States)

    Zhong, Tao; Peng, Qinke; Wang, Xiao; Zhang, Jing

    2016-02-01

    Complex network models have produced various achievements in understanding and analyzing the financial market. However, current studies analyze the financial network model but seldom present quantified indexes to indicate or forecast the price action of the market. In this paper, the stock market is modeled as a dynamic network in which the vertices are listed companies and the edges represent their rank-based correlations computed from price series. Characteristics of the network are analyzed, and novel indexes, calculated from the maximum and fully-connected subnets, are introduced into market analysis. The indexes are compared with existing ones, and the results confirm that our indexes perform better at indicating the daily trend of the market composite index in advance. Via investment simulation, the performance of our indexes is analyzed in detail. The results indicate that the dynamic complex network model can not only serve as a structural description of the financial market, but also predict the market and guide investment through these indexes.
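    The network construction step described above can be sketched in a few lines: vertices are companies, and an edge connects two companies whose Spearman rank correlation of price series exceeds a threshold. The price series, the 0.8 threshold, and the edge-density "index" below are illustrative assumptions, not the paper's actual indexes.

```python
import numpy as np

# Hedged sketch of a rank-based correlation network over price series.
rng = np.random.default_rng(1)

def rankdata(x):
    # simple ranking (no tie handling needed for continuous prices)
    return np.argsort(np.argsort(x)).astype(float)

def spearman(x, y):
    rx, ry = rankdata(x), rankdata(y)
    rx -= rx.mean(); ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Synthetic daily prices: two stocks tracking a common factor, one independent
market = np.cumsum(rng.normal(0, 1, 250))
prices = np.array([
    market + rng.normal(0, 0.1, 250),
    market + rng.normal(0, 0.1, 250),
    np.cumsum(rng.normal(0, 1, 250)),
])

n = len(prices)
threshold = 0.8          # illustrative edge threshold
adj = np.zeros((n, n), dtype=int)
for i in range(n):
    for j in range(i + 1, n):
        if abs(spearman(prices[i], prices[j])) >= threshold:
            adj[i, j] = adj[j, i] = 1

# One simple network-structure quantity: edge density of the graph
density = adj.sum() / (n * (n - 1))
```

    Quantities derived from this adjacency matrix (subnet sizes, densities) are the kind of structural indexes the paper tracks over time.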

  8. Neuro-symbolic representation learning on biological knowledge graphs.

    Science.gov (United States)

    Alshahrani, Mona; Khan, Mohammad Asif; Maddouri, Omar; Kinjo, Akira R; Queralt-Rosinach, Núria; Hoehndorf, Robert

    2017-09-01

    Biological data and knowledge bases increasingly rely on Semantic Web technologies and on knowledge graphs for data integration, retrieval and federated queries. In the past years, feature learning methods applicable to graph-structured data have become available, but have not yet been widely applied and evaluated on structured biological knowledge. Results: We develop a novel method for feature learning on biological knowledge graphs. Our method combines symbolic methods, in particular knowledge representation using symbolic logic and automated reasoning, with neural networks to generate embeddings of nodes that encode related information within knowledge graphs. Through the use of symbolic logic, these embeddings contain both explicit and implicit information. We apply these embeddings to the prediction of edges in the knowledge graph, representing problems such as function prediction, finding candidate genes of diseases, protein-protein interactions, and drug-target relations, and demonstrate performance that matches and sometimes outperforms traditional approaches based on manually crafted features. Our method can be applied to any biological knowledge graph, and thereby opens up the increasing number of Semantic Web based knowledge bases in biology to use in machine learning and data analytics. Availability: https://github.com/bio-ontology-research-group/walking-rdf-and-owl. Contact: robert.hoehndorf@kaust.edu.sa. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  9. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive-index-based measurement of a property of a fluid, such as chemical composition or temperature, by observing an apparent angular shift in an interference fringe pattern produced by back- or forward-scattering interferometry, ambiguities in the measurement, caused by the apparent shift being consistent with any one of a number of numerical possibilities for the real shift which differ by 2π, are resolved by combining measurements performed on the same sample using light paths through it of differing lengths.
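    The ambiguity-resolution idea can be illustrated numerically: a fringe shift is only known modulo 2π, but because the true phase scales with path length, observations at two different path lengths single out one consistent pair of integer wrap counts. All numbers below are synthetic, and the brute-force search is an illustrative stand-in for whatever procedure the method actually uses.

```python
import numpy as np

# Hedged sketch: recover a phase known only modulo 2*pi from two path lengths.
L1, L2 = 1.0, 1.7            # two path lengths (arbitrary units)
true_phase_per_unit = 9.4    # true phase accumulated per unit length (rad)

two_pi = 2 * np.pi
meas1 = (true_phase_per_unit * L1) % two_pi   # wrapped observation, path 1
meas2 = (true_phase_per_unit * L2) % two_pi   # wrapped observation, path 2

# Try small wrap counts; the right pair makes the per-unit phases agree.
best = None
for n1 in range(10):
    for n2 in range(10):
        phi1 = meas1 + two_pi * n1
        phi2 = meas2 + two_pi * n2
        mismatch = abs(phi1 / L1 - phi2 / L2)
        if best is None or mismatch < best[0]:
            best = (mismatch, phi1 / L1)

recovered = best[1]   # recovered phase per unit length
```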

  10. Knowledge-Based Economy in Argentina, Costa Rica and Mexico: A Comparative Analysis from the Bio-Economy Perspective

    Directory of Open Access Journals (Sweden)

    Ana Barbara MUNGARAY-MOCTEZUMA

    2015-06-01

    The objective of this article is to determine the institutional characteristics of technology and human capital that Argentina, Costa Rica and Mexico need in order to evolve towards a knowledge-based economy, addressing the importance of institutions for their development. In particular, the knowledge-based economy is analyzed from the perspective of bioeconomics. From the Knowledge Economy Index (KEI), which considers 148 indicators in the categories of (a) economic performance and institutional regime, (b) education and human resources, (c) innovation, and (d) information and communication technologies, we selected 13 indicators. We aim to identify the strengths and opportunities of these countries for meeting the challenges that arise from the paradoxes of technological progress and globalization. In this sense, bioeconomy is approached as part of the economy. The analysis shows, among other things, that Argentina has greater potential to compete in an economy sustained by the creation and dissemination of knowledge, while Costa Rica has an institutional and regulatory environment more conducive to the development of business activities, and Mexico faces significant challenges regarding its institutional structure, economic performance and human resources.

  11. Stator fault detection for multi-phase machines with multiple reference frames transformation

    DEFF Research Database (Denmark)

    Bianchini, Claudio; Fornasiero, Emanuele; Matzen, T.N.

    2009-01-01

    The paper focuses on a new diagnostic index for fault detection of a five-phase permanent-magnet machine. This machine has been designed for fault tolerant applications, and it is characterized by a mutual inductance equal to zero and a high self inductance, in order to limit the short-circuit cu...

  12. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    Science.gov (United States)

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.

  13. The household-based socio-economic deprivation index in Setiu Wetlands, Malaysia

    Science.gov (United States)

    Zakaria, Syerrina; May, Chin Sin; Rahman, Nuzlinda Abdul

    2017-08-01

    Deprivation indices are usually used in public health studies, but they can also measure the level of deprivation in an area or a village. Such indices are also referred to as indices of inequality or disadvantage. Although many indices have been constructed before, it is believed to be inappropriate to apply existing indices to countries or areas with different socio-economic conditions and geographical characteristics. The objective of this study is to construct an index based on socio-economic factors in the Setiu Wetlands (Jajaran Merang, Jajaran Setiu and Jajaran Kuala Besut) in Terengganu, Malaysia, defined as a weighted household-based socio-economic deprivation index. The study employed variables on income level, education level and employment rate, obtained from a questionnaire administered to 1024 respondents across 64 villages. Factor analysis is used to reduce the observed variables to a smaller number of components or factors; one factor, extracted from the 3 variables, is taken as the socio-economic deprivation index. Based on the result, areas ranging from low to high index values were identified.
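    The extraction step can be sketched with principal component analysis, a common stand-in for the factor analysis described above: three standardized indicators are reduced to a single component whose scores serve as the index. The variable names, loadings and noise levels are assumptions for the example.

```python
import numpy as np

# Hedged sketch: one factor from three socio-economic indicators via PCA.
rng = np.random.default_rng(2)

# Synthetic data for 64 villages: one latent deprivation level drives all three
latent = rng.normal(0, 1, 64)
income     = -1.0 * latent + rng.normal(0, 0.3, 64)
education  = -0.8 * latent + rng.normal(0, 0.3, 64)
employment = -0.9 * latent + rng.normal(0, 0.3, 64)

X = np.column_stack([income, education, employment])
Z = (X - X.mean(0)) / X.std(0)            # standardize each indicator

corr = (Z.T @ Z) / len(Z)                 # correlation matrix
vals, vecs = np.linalg.eigh(corr)         # eigenvalues in ascending order
loadings = vecs[:, -1]                    # first principal component
index = Z @ loadings                      # factor scores = deprivation index

explained = vals[-1] / vals.sum()         # variance explained by the factor
```

    A single strong factor (high explained variance) is what justifies collapsing the three indicators into one weighted index.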

  14. Density Based Support Vector Machines for Classification

    OpenAIRE

    Zahra Nazari; Dongshik Kang

    2015-01-01

    Support Vector Machines (SVM) is the most successful algorithm for classification problems. SVM learns the decision boundary from two classes of training points (in binary classification). However, training sets sometimes contain less meaningful samples, corrupted by noise or placed on the wrong side of the boundary, called outliers. These outliers affect the margin and classification performance, and the machine would do better to discard them. SVM as a popular and widely used cl...

  15. The definition of insulin resistance using HOMA-IR for Americans of Mexican descent using machine learning.

    Science.gov (United States)

    Qu, Hui-Qi; Li, Quan; Rentfro, Anne R; Fisher-Hoch, Susan P; McCormick, Joseph B

    2011-01-01

    The lack of a standardized reference range for the homeostasis model assessment-estimated insulin resistance (HOMA-IR) index has limited its clinical application. This study defines the reference range of the HOMA-IR index in an adult Hispanic population using machine learning methods. The study investigated a Hispanic population of 1854 adults, randomly selected on the basis of 2000 Census tract data in the city of Brownsville, Cameron County. Machine learning methods, the support vector machine (SVM) and Bayesian logistic regression (BLR), were used to automatically identify measurable variables, using standardized values, that correlate with HOMA-IR; K-means clustering was then used to classify the individuals by insulin resistance. Our study showed that the best cutoff of HOMA-IR for identifying those with insulin resistance is 3.80, and that 39.1% of individuals in this Hispanic population have HOMA-IR > 3.80. Our results are dramatically different from those obtained using the popular clinical cutoff of 2.60. The high sensitivity and specificity of HOMA-IR > 3.80 for insulin resistance provide a critical foundation for our further efforts to improve the public health of this Hispanic population.
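    The clustering step can be sketched as one-dimensional k-means with k = 2 on HOMA-IR-like values, taking the midpoint between the two cluster centers as the cutoff. The synthetic subpopulations and the resulting cutoff below are illustrative assumptions, not the study's data or its 3.80 threshold.

```python
import numpy as np

# Hedged sketch: deriving an insulin-resistance cutoff by 1-D k-means (k = 2).
rng = np.random.default_rng(3)

# Two synthetic subpopulations: insulin-sensitive (low) and resistant (high)
homa_ir = np.concatenate([
    rng.lognormal(mean=0.5, sigma=0.35, size=1100),   # centered near 1.6
    rng.lognormal(mean=1.8, sigma=0.35, size=750),    # centered near 6.0
])

# Plain 1-D k-means, initialized at the data extremes
centers = np.array([homa_ir.min(), homa_ir.max()])
for _ in range(50):
    labels = np.abs(homa_ir[:, None] - centers[None, :]).argmin(1)
    centers = np.array([homa_ir[labels == k].mean() for k in (0, 1)])

cutoff = centers.mean()                        # boundary between clusters
resistant_fraction = (homa_ir > cutoff).mean() # share classified as resistant
```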

  16. Process fault diagnosis using knowledge-based systems

    International Nuclear Information System (INIS)

    Sudduth, A.L.

    1991-01-01

    Advancing technology in process plants has led to an increased need for computer-based process diagnostic systems to assist the operator. One approach to this problem is to use an embedded knowledge-based system to interpret measurement signals. Knowledge-based systems using only symptom-based rules are inadequate for real-time diagnosis of dynamic systems; therefore a model-based approach is necessary. Though several forms of model-based reasoning have been proposed, the use of qualitative causal models incorporating first-principles knowledge of process behavior, structure, and function appears to be the most promising robust modeling methodology. In this paper the structure of a diagnostic system is described which uses model-based reasoning and conventional numerical methods to perform process diagnosis. This system is being applied to the emergency diesel generator system in nuclear stations.

  17. Ontology Language to Support Description of Experiment Control System Semantics, Collaborative Knowledge-Base Design and Ontology Reuse

    International Nuclear Information System (INIS)

    Gyurjyan, Vardan; Abbott, D.; Heyes, G.; Jastrzembski, E.; Moffit, B.; Timmer, C.; Wolin, E.

    2009-01-01

    In this paper we discuss the control domain specific ontology that is built on top of the domain-neutral Resource Definition Framework (RDF). Specifically, we will discuss the relevant set of ontology concepts along with the relationships among them in order to describe experiment control components and generic event-based state machines. Control Oriented Ontology Language (COOL) is a meta-data modeling language that provides generic means for representation of physics experiment control processes and components, and their relationships, rules and axioms. It provides a semantic reference frame that is useful for automating the communication of information for configuration, deployment and operation. COOL has been successfully used to develop a complete and dynamic knowledge-base for experiment control systems, developed using the AFECS framework.

  18. Transductive and matched-pair machine learning for difficult target detection problems

    Science.gov (United States)

    Theiler, James

    2014-06-01

    This paper will describe the application of two non-traditional kinds of machine learning (transductive machine learning and the more recently proposed matched-pair machine learning) to the target detection problem. The approach combines explicit domain knowledge to model the target signal with a more agnostic machine-learning approach to characterize the background. The concept is illustrated with simulated data from an elliptically-contoured background distribution, on which a subpixel target of known spectral signature but unknown spatial extent has been implanted.

  19. On-machine measurement of a slow slide servo diamond-machined 3D microstructure with a curved substrate

    International Nuclear Information System (INIS)

    Zhu, Wu-Le; Yang, Shunyao; Ju, Bing-Feng; Jiang, Jiacheng; Sun, Anyu

    2015-01-01

    A scanning tunneling microscope-based multi-axis measuring system is specially developed for the on-machine measurement of three-dimensional (3D) microstructures, to address the quality control difficulty with the traditional off-line measurement process. A typical 3D microstructure of the curved compound eye was diamond-machined by the slow slide servo technique, and then the whole surface was on-machine scanned three-dimensionally based on the tip-tracking strategy by utilizing a spindle, two linear motion stages, and an additional rotary stage. The machined surface profile and its shape deviation were accurately measured on-machine. The distortion of imaged ommatidia on the curved substrate was distinctively evaluated based on the characterized points extracted from the measured surface. Furthermore, the machining errors were investigated in connection with the on-machine measured surface and its characteristic parameters. Through experiments, the proposed measurement system is demonstrated to feature versatile on-machine measurement of 3D microstructures with a curved substrate, which is highly meaningful for quality control in the fabrication field. (paper)

  20. Sine-Bar Attachment For Machine Tools

    Science.gov (United States)

    Mann, Franklin D.

    1988-01-01

    A sine-bar attachment for collets, spindles, and chucks helps machinists set up quickly for angular cuts that require greater precision than the graduations of machine tools provide. The machinist uses the attachment to index the head or carriage of a milling machine or lathe relative to the table or turning axis of the tool. The attachment is accurate to 1 minute of arc, depending on the length of the sine bar and the precision of the gauge blocks in the setup. It installs quickly and easily on almost any type of lathe or mill, requires no special clamps or fixtures, and eliminates many trial-and-error measurements. It is also more stable than improvised setups and is not readily jarred out of position.

  1. The Relationship between Agriculture Knowledge Bases for Teaching and Sources of Knowledge

    Science.gov (United States)

    Rice, Amber H.; Kitchel, Tracy

    2015-01-01

    The purpose of this study was to describe the agriculture knowledge bases for teaching of agriculture teachers and to see if a relationship existed between years of teaching experience, sources of knowledge, and development of pedagogical content knowledge (PCK), using quantitative methods. A model of PCK from mathematics was utilized as a…

  2. An Event-Triggered Machine Learning Approach for Accelerometer-Based Fall Detection.

    Science.gov (United States)

    Putra, I Putu Edy Suardiyana; Brusey, James; Gaura, Elena; Vesilo, Rein

    2017-12-22

    The fixed-size non-overlapping sliding window (FNSW) and fixed-size overlapping sliding window (FOSW) approaches are the most commonly used data-segmentation techniques in machine learning-based fall detection using accelerometer sensors. However, these techniques do not segment by fall stages (pre-impact, impact, and post-impact) and thus useful information is lost, which may reduce the detection rate of the classifier. Aligning the segment with the fall stage is difficult, as the segment size varies. We propose an event-triggered machine learning (EvenT-ML) approach that aligns each fall stage so that the characteristic features of the fall stages are more easily recognized. To evaluate our approach, two publicly accessible datasets were used. Classification and regression tree (CART), k-nearest neighbor (k-NN), logistic regression (LR), and the support vector machine (SVM) were used to train the classifiers. EvenT-ML gives classifier F-scores of 98% for a chest-worn sensor and 92% for a waist-worn sensor, and significantly reduces the computational cost compared with the FNSW- and FOSW-based approaches, with reductions of up to 8-fold and 78-fold, respectively. EvenT-ML achieves a significantly better F-score than existing fall detection approaches. These results indicate that aligning feature segments with fall stages significantly increases the detection rate and reduces the computational cost.
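    The contrast with fixed windows can be sketched as a simple peak-triggered segmentation of an accelerometer magnitude trace into pre-impact, impact, and post-impact parts. The trigger rule, thresholds and segment durations are illustrative assumptions, not those of EvenT-ML.

```python
import numpy as np

# Hedged sketch: segment an accelerometer trace around a detected impact peak
# instead of at fixed window boundaries.
fs = 50                                  # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
accel = np.ones_like(t)                  # ~1 g magnitude at rest
accel[250] = 3.5                         # a brief impact spike at t = 5 s

impact_idx = int(np.argmax(accel))       # event trigger: the impact peak
pre  = accel[max(0, impact_idx - fs):impact_idx]        # 1 s before impact
imp  = accel[impact_idx:impact_idx + fs // 2]           # 0.5 s around impact
post = accel[impact_idx + fs // 2:impact_idx + 3 * fs]  # recovery phase

segments = {"pre-impact": pre, "impact": imp, "post-impact": post}
```

    Features computed per stage-aligned segment (rather than per arbitrary window) are what the paper reports as improving the classifiers' F-scores.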

  3. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    Science.gov (United States)

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie

  4. Man machine interface based on speech recognition

    International Nuclear Information System (INIS)

    Jorge, Carlos A.F.; Aghina, Mauricio A.C.; Mol, Antonio C.A.; Pereira, Claudio M.N.A.

    2007-01-01

    This work reports the development of a man-machine interface based on speech recognition. The system must recognize spoken commands and execute the desired tasks without manual intervention by operators. The range of applications extends from the execution of commands in an industrial plant's control room to navigation and interaction in virtual environments. Results are reported for isolated-word recognition, the isolated words corresponding to the spoken commands. In the pre-processing stage, relevant parameters are extracted from the speech signals using the cepstral analysis technique; these parameters then serve as the inputs to an artificial neural network that performs the recognition task. (author)
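    The cepstral pre-processing step can be sketched as follows: the real cepstrum of a windowed analysis frame, computed as the inverse FFT of the log magnitude spectrum, with the low-quefrency coefficients kept as features. The frame length, synthetic signal and number of coefficients are illustrative choices, not values from the paper.

```python
import numpy as np

# Hedged sketch: real-cepstrum features from one frame of a speech-like signal.
fs = 8000                       # sample rate (Hz)
t = np.arange(512) / fs         # one 64 ms analysis frame

# Synthetic voiced-speech stand-in: harmonics of a 100 Hz source with a
# crude decaying spectral envelope
signal = sum(np.exp(-0.002 * k) * np.sin(2 * np.pi * 100 * k * t)
             for k in range(1, 30))

frame = signal * np.hamming(len(signal))          # window the frame
spectrum = np.abs(np.fft.rfft(frame)) + 1e-12     # magnitude spectrum
cepstrum = np.fft.irfft(np.log(spectrum))         # real cepstrum

features = cepstrum[1:14]   # low-quefrency coefficients -> classifier inputs
```

    Vectors like `features`, computed per frame, are the kind of input a small feed-forward neural network can be trained on for isolated-word recognition.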

  5. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support.

    Science.gov (United States)

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and 20% of missing values.

  6. Game-powered machine learning.

    Science.gov (United States)

    Barrington, Luke; Turnbull, Douglas; Lanckriet, Gert

    2012-04-24

    Searching for relevant content in a massive amount of multimedia information is facilitated by accurately annotating each image, video, or song with a large number of relevant semantic keywords, or tags. We introduce game-powered machine learning, an integrated approach to annotating multimedia content that combines the effectiveness of human computation, through online games, with the scalability of machine learning. We investigate this framework for labeling music. First, a socially-oriented music annotation game called Herd It collects reliable music annotations based on the "wisdom of the crowds." Second, these annotated examples are used to train a supervised machine learning system. Third, the machine learning system actively directs the annotation games to collect new data that will most benefit future model iterations. Once trained, the system can automatically annotate a corpus of music much larger than what could be labeled using human computation alone. Automatically annotated songs can be retrieved based on their semantic relevance to text-based queries (e.g., "funky jazz with saxophone," "spooky electronica," etc.). Based on the results presented in this paper, we find that actively coupling annotation games with machine learning provides a reliable and scalable approach to making searchable massive amounts of multimedia data.

  7. Frame model of knowledge in quality control systems

    Energy Technology Data Exchange (ETDEWEB)

    Macherauskas, V.Yu.

    1982-09-01

    The purpose of this article is to develop a semiotic model for the representation of data and knowledge in a system that supplies operational information to management personnel on the progress of a technological process, with the aim of enabling analysis of deviations in product quality and the formulation of recommendations to the technologists on how to eliminate them. Since any human knowledge that can realistically be utilized in machine systems is expressed in natural language, special languages for the representation of knowledge, based on the concept of frames, are being developed for the formation of semiotic models in computers. This article defines the frames, then describes a mechanism of knowledge manipulation and some aspects of the realization of a frame model of knowledge. 9 references.

  8. Validation of ozone monitoring instrument ultraviolet index against ground-based UV index in Kampala, Uganda.

    Science.gov (United States)

    Muyimbwa, Dennis; Dahlback, Arne; Ssenyonga, Taddeo; Chen, Yi-Chun; Stamnes, Jakob J; Frette, Øyvind; Hamre, Børge

    2015-10-01

    The Ozone Monitoring Instrument (OMI) overpass solar ultraviolet (UV) indices have been validated against the ground-based UV indices derived from Norwegian Institute for Air Research UV measurements in Kampala (0.31° N, 32.58° E, 1200 m), Uganda for the period between 2005 and 2014. An excessive use of old cars, which would imply a high loading of absorbing aerosols, could cause the OMI retrieval algorithm to overestimate the surface UV irradiances. The UV index values were found to follow a seasonal pattern with maximum values in March and October. Under all-sky conditions, the OMI retrieval algorithm was found to overestimate the UV index values with a mean bias of about 28%. When only days with radiation modification factor greater than or equal to 65%, 70%, 75%, and 80% were considered, the mean bias between ground-based and OMI overpass UV index values was reduced to 8%, 5%, 3%, and 1%, respectively. The overestimation of the UV index by the OMI retrieval algorithm was found to be mainly due to clouds and aerosols.

  9. A carbon risk prediction model for Chinese heavy-polluting industrial enterprises based on support vector machine

    International Nuclear Information System (INIS)

    Zhou, Zhifang; Xiao, Tian; Chen, Xiaohong; Wang, Chang

    2016-01-01

    Chinese heavy-polluting industrial enterprises, especially in the petrochemical and chemical industries, are characterized by low carbon efficiency and high emission loads, and face tremendous pressure to reduce emissions against a background of global energy shortages and constraints on carbon emissions. However, because theoretical and practical research in this field is limited, problems such as the lack of prediction indicators and models, and of a quantified standard for carbon risk, remain unsolved. This paper presents the concept of carbon risk and an assessment index system for Chinese heavy-polluting industrial enterprises (e.g., coal, petrochemical, and chemical enterprises) based on support vector machines. Using data from several heavy-polluting industrial enterprises, an SVM model is trained to predict the carbon risk level of a specific enterprise, allowing the enterprise to identify and manage its carbon risks. The results show that this method can predict an enterprise's carbon risk level efficiently and accurately and has high practical and generalization value.
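    The classification step described above can be sketched with scikit-learn. The indicator names, data, and labeling rule below are hypothetical stand-ins for the paper's assessment index system.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Hypothetical indicators per enterprise: emission intensity, energy mix,
    # abatement investment ratio, regulatory exposure score.
    X = rng.normal(size=(60, 4))
    # Hypothetical risk labels: 0 = low, 1 = medium, 2 = high carbon risk.
    y = (X[:, 0] + 0.5 * X[:, 3] > 0.3).astype(int) + (X[:, 0] > 1.0).astype(int)

    # Standardize the indicators, then train an RBF-kernel SVM classifier.
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    model.fit(X, y)

    # Predict the carbon risk level of a new enterprise.
    new_enterprise = rng.normal(size=(1, 4))
    print(int(model.predict(new_enterprise)[0]))
    ```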

  10. Mortality risk prediction in burn injury: Comparison of logistic regression with machine learning approaches.

    Science.gov (United States)

    Stylianou, Neophytos; Akbarov, Artur; Kontopantelis, Evangelos; Buchan, Iain; Dunn, Ken W

    2015-08-01

    Predicting mortality from burn injury has traditionally employed logistic regression models. Alternative machine learning methods have been introduced in some areas of clinical prediction as the necessary software and computational facilities have become accessible. Here we compare logistic regression and machine learning predictions of mortality from burn injury. An established logistic mortality model was compared to machine learning methods (artificial neural network, support vector machine, random forests, and naïve Bayes) using a population-based (England & Wales) case-cohort registry. Predictive evaluation used the area under the receiver operating characteristic curve, sensitivity, specificity, positive predictive value, and Youden's index. All methods had comparable discriminatory abilities and similar sensitivities, specificities, and positive predictive values. Although some machine learning methods performed marginally better than logistic regression, the differences were seldom statistically significant and were clinically insubstantial. Random forests were marginally better for high positive predictive value and reasonable sensitivity. Neural networks yielded slightly better prediction overall. Logistic regression gives an optimal mix of performance and interpretability. The established logistic regression model of burn mortality performs well against more complex alternatives. Clinical prediction with a small set of strong, stable, independent predictors is unlikely to gain much from machine learning outside specialist research contexts. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
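    The evaluation metrics named above (sensitivity, specificity, positive predictive value, and Youden's index) all derive from the confusion matrix. A minimal sketch, with illustrative labels rather than registry data:

    ```python
    def binary_metrics(y_true, y_pred):
        """Confusion-matrix metrics for binary outcomes (1 = death, 0 = survival)."""
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
        tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
        sensitivity = tp / (tp + fn)          # true positive rate
        specificity = tn / (tn + fp)          # true negative rate
        ppv = tp / (tp + fp)                  # positive predictive value
        youden = sensitivity + specificity - 1.0  # Youden's J statistic
        return {"sensitivity": sensitivity, "specificity": specificity,
                "ppv": ppv, "youden": youden}

    # Illustrative observed outcomes and model predictions
    y_true = [1, 1, 1, 0, 0, 0, 0, 1]
    y_pred = [1, 1, 0, 0, 0, 1, 0, 1]
    m = binary_metrics(y_true, y_pred)
    print(m)
    ```

    Youden's index ranges from 0 (no better than chance) to 1 (perfect discrimination), which makes it a convenient single-number summary when comparing models.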

  11. Knowledge-Basing Teaching Professions and Professional Practice

    DEFF Research Database (Denmark)

    Thingstrup, Signe Hvid

    This paper discusses the demand for knowledge-based practice and two different answers to this demand, namely evidence-based thinking and critical-political thinking. The paper discusses the implications these have for views on knowledge and professional development. The paper presents and discus...

  12. Research on bearing life prediction based on support vector machine and its application

    International Nuclear Information System (INIS)

    Sun Chuang; Zhang Zhousuo; He Zhengjia

    2011-01-01

    Life prediction of rolling element bearings is in urgent demand in engineering practice, and effective life prediction techniques benefit predictive maintenance. The support vector machine (SVM) is a machine learning method based on statistical learning theory that is well suited to prediction. This paper develops an SVM-based model for bearing life prediction. The inputs of the model are features of the bearing vibration signal, and the output is the ratio of bearing running time to bearing failure time. The model is built from data on a few failed bearings and can fuse information about the bearing being predicted, which makes it advantageous for bearing life prediction in practice. The model is applied to life prediction of a bearing, and the results show that the proposed model achieves high precision.
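    The idea of regressing the running-time/failure-time ratio from vibration features can be sketched with support vector regression. The feature names, synthetic data, and degradation relation below are invented for illustration; they are not the paper's model.

    ```python
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)

    # Hypothetical features per observation: RMS, kurtosis, peak value of the
    # vibration signal, each scaled to [0, 1].
    X = rng.uniform(0.0, 1.0, size=(40, 3))
    # Synthetic target: running time / failure time ratio, assumed here to grow
    # with vibration severity.
    y = np.clip(0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2], 0.0, 1.0)

    # Train a support vector regressor on the (feature, ratio) pairs.
    model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)

    # A ratio near 1.0 means the bearing is close to failure.
    ratio = float(model.predict([[0.9, 0.8, 0.7]])[0])
    print(round(ratio, 2))
    ```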

  13. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Ricardo Andres Pizarro

    2016-12-01

    Full Text Available High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI yielding irreproducible results, from both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm in the quality assessment of structural brain images, using global and region of interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed, by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI.

  14. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm.

    Science.gov (United States)

    Pizarro, Ricardo A; Cheng, Xi; Barnett, Alan; Lemaitre, Herve; Verchinski, Beth A; Goldman, Aaron L; Xiao, Ena; Luo, Qian; Berman, Karen F; Callicott, Joseph H; Weinberger, Daniel R; Mattay, Venkata S

    2016-01-01

    High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI yielding irreproducible results, from both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm in the quality assessment of structural brain images, using global and region of interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed, by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI.
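    The evaluation described above, training an SVM on labeled volumes and measuring accuracy against held-out investigator labels, can be sketched as follows. The features and labels are synthetic placeholders, not the study's in-house quality features.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(42)
    # Hypothetical global + ROI image quality features per 3D-MRI volume.
    X = rng.normal(size=(200, 5))
    # Synthetic rater labels: 1 = usable volume, 0 = artifact-ridden.
    y = (X[:, 0] + X[:, 1] > 0).astype(int)

    # Hold out a quarter of the volumes to simulate unseen data.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)

    # Fraction of SVM-predicted labels that match the investigator labels.
    accuracy = clf.score(X_te, y_te)
    print(round(accuracy, 2))
    ```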

  15. Machine learning and radiology.

    Science.gov (United States)

    Wang, Shijun; Summers, Ronald M

    2012-07-01

    In this paper, we give a short introduction to machine learning and survey its applications in radiology. We focus on six categories of applications: medical image segmentation; registration; computer-aided detection and diagnosis; brain function or activity analysis and neurological disease diagnosis from fMR images; content-based image retrieval systems for CT or MRI images; and text analysis of radiology reports using natural language processing (NLP) and natural language understanding (NLU). This survey shows that machine learning plays a key role in many radiology applications. Machine learning identifies complex patterns automatically and helps radiologists make intelligent decisions on radiology data such as conventional radiographs, CT, MRI, and PET images, and radiology reports. In many applications, the performance of machine learning-based automatic detection and diagnosis systems has been shown to be comparable to that of a well-trained and experienced radiologist. Technology development in machine learning and radiology will benefit from each other in the long run. Key contributions and common characteristics of machine learning techniques in radiology are discussed. We also discuss the problem of translating machine learning applications to the radiology clinical setting, including advantages and potential barriers. Copyright © 2012. Published by Elsevier B.V.

  16. Towards Modeling False Memory With Computational Knowledge Bases.

    Science.gov (United States)

    Li, Justin; Kohanyi, Emma

    2017-01-01

    One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
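    A minimal spreading-activation sketch over a toy semantic network, in the spirit of the Deese-Roediger-McDermott (DRM) model described above. The network, edge weights, and decay parameter are invented for illustration; a real model would draw its edges from WordNet or DBpedia.

    ```python
    def spread_activation(graph, sources, decay=0.5, steps=2):
        """Propagate activation from source nodes along weighted directed edges."""
        activation = {node: 0.0 for node in graph}
        for s in sources:
            activation[s] = 1.0
        for _ in range(steps):
            incoming = {node: 0.0 for node in graph}
            for node, neighbours in graph.items():
                for nbr, weight in neighbours.items():
                    incoming[nbr] += decay * weight * activation[node]
            for node in graph:
                activation[node] += incoming[node]
        return activation

    # Toy DRM-style list: studied words all associate with the unstudied lure "sleep".
    graph = {
        "bed":   {"sleep": 1.0},
        "rest":  {"sleep": 1.0},
        "dream": {"sleep": 1.0},
        "sleep": {},
    }
    act = spread_activation(graph, sources=["bed", "rest", "dream"])

    # The never-presented lure accumulates activation from all its associates,
    # which is the mechanism behind false recall of "sleep".
    print(act["sleep"])
    ```

    In a large knowledge base the same propagation runs over millions of edges, which is where the noise from irrelevant associations the authors describe becomes a practical problem.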

  17. The strength study of the rotating device driver indexing spatial mechanism

    Science.gov (United States)

    Zakharenkov, N. V.; Kvasov, I. N.

    2018-04-01

    Indexing spatial mechanisms are widely used in automatic machines. The maximum load-bearing capacity of such a mechanism can be determined from tests of both physical and numerical models. This paper deals with a numerical model of a spatial cam mechanism that indexes a driven disk at constant angular cam velocity. The kinematic and geometric parameters of the mechanism and its finite element model are analyzed in the SolidWorks design environment. The initial data for the calculation, together with the missing parameters found from the structural analysis, are identified. The structural and kinematic analysis revealed possible causes of mechanism failure. The results of the numerical calculations, showing the structure's performance under contact and bending stresses, are presented.

  18. Enriched Title-Based Keyword Index Generation Using dBase II.

    Science.gov (United States)

    Rajendran, P. P.

    1986-01-01

    Describes the use of a database management system (DBMS)--dBase II--to create an enriched title-based keyword index for a collection of news items at the Renewable Energy Resources Information Center of the Asian Institute of Technology. The use of DBMSs in libraries in developing countries is emphasized. (Author/LRW)
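    The core of a title-based keyword index is an inverted index: every non-stopword in a (possibly enriched) title points back to its record number. A toy sketch, with an invented stopword list and sample titles:

    ```python
    STOPWORDS = {"a", "an", "and", "for", "in", "of", "on", "the", "to"}

    def build_keyword_index(titles):
        """Map each significant title word to the record numbers containing it."""
        index = {}
        for record_no, title in enumerate(titles, start=1):
            for word in title.lower().replace("-", " ").split():
                word = word.strip(".,:;()")
                if word and word not in STOPWORDS:
                    index.setdefault(word, []).append(record_no)
        return index

    # Illustrative news-item titles
    titles = [
        "Solar drying of rice in the Mekong delta",
        "Biogas digesters for rural households",
    ]
    idx = build_keyword_index(titles)
    print(idx["solar"])
    print(idx["biogas"])
    ```

    "Enrichment" in this context means adding descriptive terms to uninformative titles before indexing, so that the index covers concepts the original title omits.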

  19. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency, and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single-variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on the initially identified predictors, changes to IRB workflow and staffing procedures were instituted, and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
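    Predicting a processing time from categorical protocol characteristics, as described above, can be sketched with a tree-based learner. The feature ("type of review"), the day counts, and the model choice below are invented to illustrate the kind of analysis, not the IRBs' actual records or method.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.preprocessing import OneHotEncoder

    # Hypothetical protocol characteristic: type of IRB review required.
    review_type = np.array([["full"], ["expedited"], ["exempt"], ["full"],
                            ["expedited"], ["exempt"], ["full"], ["expedited"]])
    # Synthetic processing times in days for each submission.
    days = np.array([120, 45, 20, 110, 50, 25, 130, 40])

    # One-hot encode the categorical predictor, then fit the regressor.
    enc = OneHotEncoder()
    X = enc.fit_transform(review_type).toarray()
    model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, days)

    # Estimated processing time for a hypothetical full-board review.
    pred = float(model.predict(enc.transform([["full"]]).toarray())[0])
    print(round(pred))
    ```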

  20. The Integration of Project-Based Methodology into Teaching in Machine Translation

    Science.gov (United States)

    Madkour, Magda

    2016-01-01

    This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…