WorldWideScience

Sample records for intelligent computer tools

  1. Intelligent cloud computing security using genetic algorithm as a computational tools

    Science.gov (United States)

    Razuky AL-Shaikhly, Mazin H.

    2018-05-01

    A fundamental change has occurred in the field of Information Technology with cloud computing: the cloud provides virtual resources over the web, but it also raises great challenges in the fields of information security and privacy protection. The main problem with cloud computing today is how to improve its privacy and security, which are critical for the cloud. This paper attempts to address cloud security by using an intelligent system with a genetic algorithm as a wall that keeps cloud data secure: every service the cloud provides must detect who receives it and register that user, building a list of trusted and untrusted users based on their behavior. Execution of the present proposal has shown good outcomes.
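    The abstract does not give the algorithm in detail, so the following is only a generic sketch of the genetic-algorithm machinery such a system relies on: a minimal generational GA with tournament selection, one-point crossover and bit-flip mutation. The `onemax` fitness (counting 1-bits, here read as "benign actions" in a user's behavior vector) is a stand-in assumption, not the paper's actual fitness function.

    ```python
    import random

    def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=60,
                          crossover_rate=0.9, mutation_rate=0.02):
        """Minimal generational GA: tournament selection, one-point
        crossover, bit-flip mutation; returns the best bitstring seen."""
        pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        best = max(pop, key=fitness)[:]

        def tournament():
            # Pick two individuals at random, keep the fitter one.
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b

        for _ in range(generations):
            children = []
            while len(children) < pop_size:
                p1, p2 = tournament(), tournament()
                if random.random() < crossover_rate:
                    cut = random.randint(1, n_bits - 1)
                    c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
                else:
                    c1, c2 = p1[:], p2[:]
                for c in (c1, c2):
                    for i in range(n_bits):
                        if random.random() < mutation_rate:
                            c[i] ^= 1  # flip the bit
                children.extend([c1, c2])
            pop = children[:pop_size]
            gen_best = max(pop, key=fitness)
            if fitness(gen_best) > fitness(best):
                best = gen_best[:]
        return best

    random.seed(42)
    # Stand-in fitness (assumption): count of 1-bits, read as the number of
    # benign actions in a user's behavior vector; the paper's real fitness
    # function is not specified in the abstract.
    onemax = lambda bits: sum(bits)
    solution = genetic_algorithm(onemax)
    ```

    In a trust-classification setting, the bitstring would instead encode a rule over behavioral features, and the fitness would score how well the rule separates trusted from untrusted users on logged data.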

  2. Designing with computational intelligence

    CERN Document Server

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  3. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers from the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India. The papers selected for this Track cover several topics in Distributed Computing and related areas, including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  4. Land Cover Classification from Multispectral Data Using Computational Intelligence Tools: A Comparative Study

    Directory of Open Access Journals (Sweden)

    André Mora

    2017-11-01

    Full Text Available This article discusses how computational intelligence techniques can be applied to fuse spectral images into a higher-level image of land cover distribution for remote sensing, specifically for satellite image classification. We compare a fuzzy-inference method with two other computational intelligence methods, decision trees and neural networks, using a case study of land cover classification from satellite images. An unsupervised approach based on k-means clustering has also been considered for comparison. The fuzzy-inference method involves training the classifier with a fuzzy-fusion technique and then performing land cover classification using reinforcement aggregation operators. To assess the robustness of the four methods, a comparative study covering three years of land cover maps for the district of Mandimba, Niassa province, Mozambique, was undertaken. Our results show that the fuzzy-fusion method performs similarly to decision trees, achieving reliable classifications; neural networks suffer from overfitting; while k-means clustering constitutes a promising technique for identifying land cover types in unknown areas.
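    As a minimal sketch of the unsupervised baseline the study compares against, the following clusters synthetic multi-band "pixels" with plain k-means. The band means for the two spectral classes are invented for illustration and are not taken from the Mandimba data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def kmeans(X, k, iters=50):
        """Plain k-means with farthest-point initialisation: the first
        centre is a random data point, each further centre is the point
        farthest from all centres chosen so far."""
        centers = [X[rng.integers(len(X))]]
        while len(centers) < k:
            d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
            centers.append(X[int(d.argmax())])
        centers = np.array(centers)
        labels = np.full(len(X), -1)
        for _ in range(iters):
            # Assignment step: nearest centre for every pixel.
            dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            new_labels = dists.argmin(axis=1)
            if np.array_equal(new_labels, labels):
                break  # converged
            labels = new_labels
            # Update step: each centre moves to the mean of its cluster.
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = X[labels == j].mean(axis=0)
        return labels, centers

    # Synthetic 4-band reflectance "pixels" from two hypothetical spectral
    # classes (say, water vs. vegetation); the band means are assumptions.
    water = rng.normal(loc=[0.10, 0.10, 0.20, 0.05], scale=0.02, size=(100, 4))
    veg   = rng.normal(loc=[0.05, 0.30, 0.10, 0.60], scale=0.02, size=(100, 4))
    X = np.vstack([water, veg])
    labels, centers = kmeans(X, k=2)
    ```

    With real imagery, each row of `X` would hold one pixel's band values, and the resulting cluster labels would still need to be matched to land cover classes by an analyst, which is exactly why the article treats k-means as a tool for unknown areas.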

  5. Artificial intelligence-based computer modeling tools for controlling slag foaming in electric arc furnaces

    Science.gov (United States)

    Wilson, Eric Lee

    Due to increased competition in a world economy, steel companies are currently interested in developing techniques that will allow for the improvement of the steelmaking process, either by increasing output efficiency or by improving the quality of their product, or both. Slag foaming is one practice that has been shown to contribute to both these goals. However, slag foaming is highly dynamic and difficult to model or control. This dissertation describes an effort to use artificial intelligence-based tools (genetic algorithms, fuzzy logic, and neural networks) to both model and control the slag foaming process. Specifically, a neural network is trained and tested on slag foaming data provided by a steel plant. This neural network model is then controlled by a fuzzy logic controller, which in turn is optimized by a genetic algorithm. The tuned controller was installed at a steel plant and shown to be a more efficient slag foaming controller than the one previously used by the plant.
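    The dissertation's actual rule base and membership functions are not given in the abstract; the following is only an illustrative sketch of the kind of fuzzy controller involved, mapping a foam-height error to an injection rate with triangular memberships and weighted-average defuzzification. All numeric ranges and rates are invented for illustration, and these are exactly the parameters a genetic algorithm would tune.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b,
        falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_injection_rate(foam_error):
        """Map foam-height error (target minus measured, hypothetical
        units of metres) to a carbon injection rate via three rules and
        weighted-average defuzzification."""
        # Rule antecedents: error is Negative / Zero / Positive.
        mu_neg  = tri(foam_error, -2.0, -1.0, 0.0)
        mu_zero = tri(foam_error, -1.0,  0.0, 1.0)
        mu_pos  = tri(foam_error,  0.0,  1.0, 2.0)
        # Rule consequents: singleton rates (hypothetical kg/min).
        rates = {"low": 5.0, "medium": 15.0, "high": 25.0}
        num = (mu_neg * rates["low"] + mu_zero * rates["medium"]
               + mu_pos * rates["high"])
        den = mu_neg + mu_zero + mu_pos
        return num / den if den > 0 else rates["medium"]
    ```

    A GA would encode the membership breakpoints and consequent rates as a chromosome and score each candidate controller against the neural-network plant model, which is the tuning loop the dissertation describes.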

  6. Design of Intelligent Robot as A Tool for Teaching Media Based on Computer Interactive Learning and Computer Assisted Learning to Improve the Skill of University Student

    Science.gov (United States)

    Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.

    2018-01-01

    The focus of this research is a teaching module that incorporates manufacturing, mechanical design planning, control systems based on microprocessor technology, and maneuverability of the robot. Computer interactive learning and computer-assisted learning are strategies that emphasize the use of computers and learning aids in teaching and learning activities. This research applied the 4-D research and development model suggested by Thiagarajan et al. (1974), which consists of four stages: Define, Design, Develop, and Disseminate. The research was conducted with the objective of producing a learning tool in the form of intelligent robot modules and kits based on Computer Interactive Learning and Computer Assisted Learning. Data from the Indonesia Robot Contest during the period 2009-2015 show that the developed modules confirm the fourth stage of the development research method, dissemination. The developed modules guide students in producing an intelligent robot as a tool for teaching based on Computer Interactive Learning and Computer Assisted Learning. Students' responses also showed positive feedback on the robotics module and computer-based interactive learning.

  7. Computational Intelligence in Image Processing

    CERN Document Server

    Siarry, Patrick

    2013-01-01

    Computational intelligence based techniques have firmly established themselves as viable alternative mathematical tools for more than a decade. They have been extensively employed in many systems and application domains, among these signal processing, automatic control, industrial and consumer electronics, robotics, finance, manufacturing systems, electric power systems, and power electronics. Image processing is also an extremely potent area which has attracted the attention of many researchers who are interested in the development of new computational intelligence-based techniques and their suitable applications, in both research problems and in real-world problems. Part I of the book discusses several image preprocessing algorithms; Part II broadly covers image compression algorithms; Part III demonstrates how computational intelligence-based techniques can be effectively utilized for image analysis purposes; and Part IV shows how pattern recognition, classification and clustering-based techniques can ...

  8. New trends in computational collective intelligence

    CERN Document Server

    Kim, Sang-Wook; Trawiński, Bogdan

    2015-01-01

    This book consists of 20 chapters in which the authors deal with different theoretical and practical aspects of new trends in Collective Computational Intelligence techniques. Computational Collective Intelligence methods and algorithms are among the current trending research topics in areas related to Artificial Intelligence, Soft Computing or Data Mining, among others. Computational Collective Intelligence is a rapidly growing field that is most often understood as an AI sub-field dealing with soft computing methods which enable group decision making and knowledge processing among autonomous units acting in distributed environments. Web-based Systems, Social Networks, and Multi-Agent Systems very often need these tools for working out consistent knowledge states, resolving conflicts and making decisions. The chapters included in this volume cover a selection of topics and new trends in several domains related to Collective Computational Intelligence: Language and Knowledge Processing, Data Mining Methods an...

  9. Intelligent Tools and Instructional Simulations

    National Research Council Canada - National Science Library

    Murray, William R; Sams, Michelle; Belleville, Michael

    2001-01-01

    This intelligent tools and instructional simulations project was an investigation into the utility of a knowledge-based performance support system to support learning and on-task performance for using...

  10. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous: they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to solve them completely, mathematically or analytically, at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As one of only a few researchers in the field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On the occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  11. Computational intelligence techniques in health care

    CERN Document Server

    Zhou, Wengang; Satheesh, P

    2016-01-01

    This book presents research on emerging computational intelligence techniques and tools, with a particular focus on new trends and applications in health care. Healthcare is a multi-faceted domain, which incorporates advanced decision-making, remote monitoring, healthcare logistics, operational excellence and modern information systems. In recent years, the use of computational intelligence methods to address the scale and the complexity of the problems in healthcare has been investigated. This book discusses various computational intelligence methods that are implemented in applications in different areas of healthcare. It includes contributions by practitioners, technology developers and solution providers.

  12. Affective Computing and Intelligent Interaction

    CERN Document Server

    2012-01-01

    2012 International Conference on Affective Computing and Intelligent Interaction (ICACII 2012) was the most comprehensive conference focused on the various aspects of advances in Affective Computing and Intelligent Interaction. The conference provided a rare opportunity to bring together worldwide academic researchers and practitioners for exchanging the latest developments and applications in this field such as Intelligent Computing, Affective Computing, Machine Learning, Business Intelligence and HCI.   This volume is a collection of 119 papers selected from 410 submissions from universities and industries all over the world, based on their quality and relevancy to the conference. All of the papers have been peer-reviewed by selected experts.  

  13. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  14. Computational intelligence, medicine and biology selected links

    CERN Document Server

    Zaitseva, Elena

    2015-01-01

    This book contains an interesting and state-of-the-art collection of chapters presenting several examples of attempts at developing modern tools utilizing computational intelligence in different real-life problems encountered by humans. Reasoning, prediction, modeling, optimization, decision making, etc. need modern, soft and intelligent algorithms, methods and methodologies to solve, in efficient ways, problems appearing in human activity. The contents of the book are divided into two parts. Part I, consisting of four chapters, is devoted to selected links between computational intelligence, medicine, health care and biomechanics. Several problems are considered: estimation of healthcare system reliability, classification of ultrasound thyroid images, application of fuzzy logic to measure weight status and central fatness, and deriving kinematics directly from video records. Part II, also consisting of four chapters, is devoted to selected links between computational intelligence and biology. The common denominato...

  15. Computer automation and artificial intelligence

    International Nuclear Information System (INIS)

    Hasnain, S.B.

    1992-01-01

    Rapid advances in computing resulting from the microchip revolution have increased its applications manifold, particularly in computer automation. Yet the level of automation available has limited its application to more complex and dynamic systems, which require intelligent computer control. In this paper a review of artificial intelligence techniques used to augment automation is presented. The sequential processing approach usually adopted in artificial intelligence has succeeded in emulating the symbolic-processing part of intelligence, but the processing power required to capture the more elusive aspects of intelligence leads towards parallel processing. An overview of parallel processing, with emphasis on the transputer, is also provided. A fuzzy knowledge-based controller for drug delivery in muscle relaxant anesthesia, implemented on a transputer, is described. 4 figs. (author)

  16. Computational Intelligence for Engineering Systems

    CERN Document Server

    Madureira, A; Vale, Zita

    2011-01-01

    "Computational Intelligence for Engineering Systems" provides an overview and original analysis of new developments and advances in several areas of computational intelligence. Computational intelligence has become the road-map for engineers to develop and analyze novel techniques to solve problems in basic sciences (such as physics, chemistry and biology) and engineering, environmental, life and social sciences. The contributions are written by international experts, who provide up-to-date aspects of the topics discussed and present recent, original insights into their own experien

  17. Computational Foundations of Natural Intelligence.

    Science.gov (United States)

    van Gerven, Marcel

    2017-01-01

    New developments in AI and neuroscience are revitalizing the quest to understand natural intelligence, offering insight into how to equip machines with human-like capabilities. This paper reviews some of the computational principles relevant for understanding natural intelligence and, ultimately, achieving strong AI. After reviewing basic principles, a variety of computational modeling approaches is discussed. Subsequently, I concentrate on the use of artificial neural networks as a framework for modeling cognitive processes. This paper ends by outlining some of the challenges that remain to fulfill the promise of machines that show human-like intelligence.

  18. Computational Foundations of Natural Intelligence

    Directory of Open Access Journals (Sweden)

    Marcel van Gerven

    2017-12-01

    Full Text Available New developments in AI and neuroscience are revitalizing the quest to understand natural intelligence, offering insight into how to equip machines with human-like capabilities. This paper reviews some of the computational principles relevant for understanding natural intelligence and, ultimately, achieving strong AI. After reviewing basic principles, a variety of computational modeling approaches is discussed. Subsequently, I concentrate on the use of artificial neural networks as a framework for modeling cognitive processes. This paper ends by outlining some of the challenges that remain to fulfill the promise of machines that show human-like intelligence.

  19. Computational intelligence in biomedical imaging

    CERN Document Server

    2014-01-01

    This book provides a comprehensive overview of state-of-the-art computational intelligence research and technologies in biomedical imaging, with emphasis on biomedical decision making. Biomedical imaging offers useful information on patients' medical conditions and clues to the causes of their symptoms and diseases. It produces, however, large numbers of images that physicians must interpret. Therefore, computer aids are in demand and have become indispensable in physicians' decision making. This book discusses major technical advancements and research findings in the field of computational intelligence in biomedical imaging, for example, computational intelligence in computer-aided diagnosis for breast cancer, prostate cancer, and brain disease, in lung function analysis, and in radiation therapy. The book examines technologies and studies that have reached the practical level, and those that are rapidly becoming available in clinical practice in hospitals, such as computational inte...

  20. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is detecting more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
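    To illustrate the general approach, here is a minimal sketch, not the paper's implementation: a bare-bones global-best particle swarm optimizer, plus a deflection transform that divides the objective by a squared `tanh` of the distance to each already-found minimizer so a restart can locate a different global minimum. The toy objective, the squared-`tanh` variant of deflection, and all parameters are illustrative assumptions; the construction relies on the objective being nonnegative with zero-valued global minima, as in the Nash-equilibrium formulation described above.

    ```python
    import math
    import random

    def pso(f, dim=1, bounds=(-3.0, 3.0), swarm=40, iters=300,
            w=0.7, c1=1.5, c2=1.5):
        """Bare-bones global-best particle swarm optimisation (minimisation)."""
        lo, hi = bounds
        pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
        vel = [[0.0] * dim for _ in range(swarm)]
        pbest = [p[:] for p in pos]
        pbest_val = [f(p) for p in pos]
        gbest_val = min(pbest_val)
        gbest = pbest[pbest_val.index(gbest_val)][:]
        for _ in range(iters):
            for i in range(swarm):
                for d in range(dim):
                    # Inertia + cognitive pull (pbest) + social pull (gbest).
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * random.random() * (pbest[i][d] - pos[i][d])
                                 + c2 * random.random() * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
                v = f(pos[i])
                if v < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], v
                    if v < gbest_val:
                        gbest, gbest_val = pos[i][:], v
        return gbest

    def deflect(f, found, lam=1.0):
        """Deflection transform: divide f by tanh^2 of the distance to each
        already-found minimiser, flattening those (zero-valued) minima."""
        def g(x):
            val = f(x)
            for m in found:
                dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, m)))
                val /= max(math.tanh(lam * dist) ** 2, 1e-12)
            return val
        return g

    random.seed(3)
    # Toy nonnegative objective with two global minima (value 0) at x = -1, +1.
    f = lambda x: (x[0] ** 2 - 1.0) ** 2
    first = pso(f)
    second = pso(deflect(f, [first]))
    ```

    The first run finds one zero of the objective; the deflected second run is repelled from it and settles on the other, which is the mechanism for detecting more than one equilibrium.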

  1. Case studies in intelligent computing achievements and trends

    CERN Document Server

    Issac, Biju

    2014-01-01

    Although the field of intelligent systems has grown rapidly in recent years, there has been a need for a book that supplies a timely and accessible understanding of this important technology. Filling this need, Case Studies in Intelligent Computing: Achievements and Trends provides an up-to-date introduction to intelligent systems. This edited book captures the state of the art in intelligent computing research through case studies that examine recent developments, developmental tools, programming, and approaches related to artificial intelligence (AI). The case studies illustrate successful ma

  2. Soft computing for business intelligence

    CERN Document Server

    Pérez, Rafael; Cobo, Angel; Marx, Jorge; Valdés, Ariel

    2014-01-01

    The book Soft Computing for Business Intelligence is the remarkable output of a program based on the idea of joint trans-disciplinary research, as supported by the Eureka Iberoamerica Network and the University of Oldenburg. It contains twenty-seven papers allocated to three sections: Soft Computing, Business Intelligence and Knowledge Discovery, and Knowledge Management and Decision Making. Although the contents touch different domains, they are similar in so far as they follow the BI principle of "Observation and Analysis" while keeping a practically oriented theoretical eye on sound methodologies, like Fuzzy Logic, Compensatory Fuzzy Logic (CFL), Rough Sets and other soft-computing elements. The book tears down the traditional focus on business, and extends Business Intelligence techniques in an impressive way to a broad range of fields like medicine, environment, wind farming, social collaboration and interaction, car sharing and sustainability.

  3. Visualizing the Computational Intelligence Field

    NARCIS (Netherlands)

    L. Waltman (Ludo); J.H. van den Berg (Jan); U. Kaymak (Uzay); N.J.P. van Eck (Nees Jan)

    2006-01-01

    In this paper, we visualize the structure and the evolution of the computational intelligence (CI) field. Based on our visualizations, we analyze the way in which the CI field is divided into several subfields. The visualizations provide insight into the characteristics of each subfield

  4. 1st International Conference on Intelligent Computing, Communication and Devices

    CERN Document Server

    Patnaik, Srikanta; Ichalkaranje, Nikhil

    2015-01-01

    In the history of mankind, three revolutions that have impacted human life are the tool-making revolution, the agricultural revolution and the industrial revolution. They have transformed not only the economy and civilization but also the overall development of society. The intelligence revolution is probably next, and society will perceive it within the next 10 years. ICCD-2014 covers all dimensions of the intelligent sciences, i.e. Intelligent Computing, Intelligent Communication and Intelligent Devices. This volume covers contributions on Intelligent Communication from areas such as Communications and Wireless Ad Hoc & Sensor Networks, Speech & Natural Language Processing, including Signal, Image and Video Processing, and Mobile Broadband and Optical Networks, which are key to the ground-breaking inventions in intelligent communication technologies. Secondly, an Intelligent Device is any type of equipment, instrument, or machine that has its own computing capability. Contributions from ...

  5. Artificial intelligence and computer vision

    CERN Document Server

    Li, Yujie

    2017-01-01

    This edited book presents essential findings in the research fields of artificial intelligence and computer vision, with a primary focus on new research ideas and results for mathematical problems involved in computer vision systems. The book provides an international forum for researchers to summarize the most recent developments and ideas in the field, with a special emphasis on the technical and observational results obtained in the past few years.

  6. Computational Intelligence and Decision Making Trends and Applications

    CERN Document Server

    Madureira, Ana; Marques, Viriato

    2013-01-01

    This book provides a general overview and original analysis of new developments and applications in several areas of Computational Intelligence and Information Systems. Computational Intelligence has become an important tool for engineers to develop and analyze novel techniques to solve problems in basic sciences such as physics, chemistry, biology, engineering, environment and social sciences.   The material contained in this book addresses the foundations and applications of Artificial Intelligence and Decision Support Systems, Complex and Biological Inspired Systems, Simulation and Evolution of Real and Artificial Life Forms, Intelligent Models and Control Systems, Knowledge and Learning Technologies, Web Semantics and Ontologies, Intelligent Tutoring Systems, Intelligent Power Systems, Self-Organized and Distributed Systems, Intelligent Manufacturing Systems and Affective Computing. The contributions have all been written by international experts, who provide current views on the topics discussed and pr...

  7. Intelligent Buildings and pervasive computing

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Krogh, Peter Gall

    2001-01-01

    Intelligent Buildings have been the subject of research and commercial interest for more than two decades. The different perspectives range from monitoring and controlling energy consumption, over interactive rooms supporting work in offices and leisure in the home, to buildings providing information to by-passers in plazas and urban environments. This paper puts forward the hypothesis that the coming decade will witness a dramatic increase in both the quality and quantity of intelligent buildings due to the emerging field of pervasive computing: the next generation of computing environments, where computers are everywhere, for everyone, at all times, and where IT becomes a still more integrated part of our environments, with processors, sensors, and actuators connected via high-speed networks and combined with new visualization devices ranging from projections directly in the eye to large panorama...

  8. Computational intelligence in nuclear engineering

    International Nuclear Information System (INIS)

    Uhrig, Robert E.; Hines, J. Wesley

    2005-01-01

    Approaches to several recent issues in the operation of nuclear power plants using computational intelligence are discussed. These issues include 1) noise analysis techniques, 2) on-line monitoring and sensor validation, 3) regularization of ill-posed surveillance and diagnostic measurements, 4) transient identification, 5) artificial intelligence-based core monitoring and diagnostic systems, 6) continuous efficiency improvement of nuclear power plants, and 7) autonomous anticipatory control and intelligent agents. Several changes to the focus of computational intelligence in nuclear engineering have occurred in the past few years. Whereas earlier activities focused on the development of condition monitoring and diagnostic techniques for current nuclear power plants, recent activities have focused on the implementation of those methods and on the development of methods for next-generation plants and space reactors. These advanced techniques are expected to become increasingly important as current-generation nuclear power plants have their licenses extended to 60 years and next-generation reactors are designed to operate for extended fuel cycles (up to 25 years), with less operator oversight, especially for nuclear plants operating in severe environments such as space or ice-bound locations

  9. Improving Tools in Artificial Intelligence

    Directory of Open Access Journals (Sweden)

    Angel Garrido

    2010-01-01

    Full Text Available The historical origin of Artificial Intelligence (AI) is usually placed at the Dartmouth Conference of 1956, but we can find many more arcane origins [1]. In more recent times we can also consider very great thinkers such as János Neumann (later John von Neumann, after his arrival in the USA), Norbert Wiener, Alan Mathison Turing, or Lotfi Zadeh, for instance [12, 14]. AI frequently requires logic, but the classical version shows too many insufficiencies, so it was necessary to introduce more sophisticated tools, such as Fuzzy Logic, Modal Logic, Non-Monotonic Logic and so on [1, 2]. Among the things that AI needs to represent are categories, objects, properties, relations between objects, situations, states, time, events, causes and effects, knowledge about knowledge, and so on. The problems in AI can be classified into two general types [3, 5]: search problems and representation problems. For this last "peak", there exist different ways to reach the summit: we have [4] Logics, Rules, Frames, Associative Nets, Scripts, and so on, many of them interconnected. In this paper we attempt a panoramic view of the scope of application of such representation methods in AI. The two most disputable questions of both modern philosophy of mind and AI are perhaps the Turing Test and the Chinese Room Argument; to elucidate these very difficult questions, see our final note.

  10. Wind power systems. Applications of computational intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Lingfeng [Toledo Univ., OH (United States). Dept. of Electrical Engineering and Computer Science; Singh, Chanan [Texas A and M Univ., College Station, TX (United States). Electrical and Computer Engineering Dept.; Kusiak, Andrew (eds.) [Iowa Univ., Iowa City, IA (United States). Mechanical and Industrial Engineering Dept.

    2010-07-01

    Renewable energy sources such as wind power have attracted much attention because they are environmentally friendly, do not produce carbon dioxide and other emissions, and can enhance a nation's energy security. For example, recently more significant amounts of wind power are being integrated into conventional power grids. Therefore, it is necessary to address various important and challenging issues related to wind power systems, which are significantly different from the traditional generation systems. This book is a resource for engineers, practitioners, and decision-makers interested in studying or using the power of computational intelligence based algorithms in handling various important problems in wind power systems at the levels of power generation, transmission, and distribution. Researchers have been developing biologically-inspired algorithms in a wide variety of complex large-scale engineering domains. Distinguished from the traditional analytical methods, the new methods usually accomplish the task through their computationally efficient mechanisms. Computational intelligence methods such as evolutionary computation, neural networks, and fuzzy systems have attracted much attention in electric power systems. Meanwhile, modern electric power systems are becoming more and more complex in order to meet the growing electricity market. In particular, the grid complexity is continuously enhanced by the integration of intermittent wind power as well as the current restructuring efforts in electricity industry. Quite often, the traditional analytical methods become less efficient or even unable to handle this increased complexity. As a result, it is natural to apply computational intelligence as a powerful tool to deal with various important and pressing problems in the current wind power systems. This book presents the state-of-the-art development in the field of computational intelligence applied to wind power systems by reviewing the most up

  11. Computational Intelligence : International Joint Conference

    CERN Document Server

    Rosa, Agostinho; Cadenas, José; Dourado, António; Madani, Kurosh; Filipe, Joaquim

    2016-01-01

    The present book includes a set of selected extended papers from the sixth International Joint Conference on Computational Intelligence (IJCCI 2014), held in Rome, Italy, from 22 to 24 October 2014. The conference was composed of three co-located conferences: the International Conference on Evolutionary Computation Theory and Applications (ECTA), the International Conference on Fuzzy Computation Theory and Applications (FCTA), and the International Conference on Neural Computation Theory and Applications (NCTA). Recent progress in scientific developments and applications in these three areas is reported in this book. IJCCI received 210 submissions, from 51 countries, in all continents. After a double-blind paper review performed by the Program Committee, 15% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience in...

  12. Computational Intelligence : International Joint Conference

    CERN Document Server

    Dourado, António; Rosa, Agostinho; Filipe, Joaquim; Kacprzyk, Janusz

    2016-01-01

    The present book includes a set of selected extended papers from the fifth International Joint Conference on Computational Intelligence (IJCCI 2013), held in Vilamoura, Algarve, Portugal, from 20 to 22 September 2013. The conference was composed of three co-located conferences: the International Conference on Evolutionary Computation Theory and Applications (ECTA), the International Conference on Fuzzy Computation Theory and Applications (FCTA), and the International Conference on Neural Computation Theory and Applications (NCTA). Recent progress in scientific developments and applications in these three areas is reported in this book. IJCCI received 111 submissions from 30 countries across all continents. After a double-blind paper review performed by the Program Committee, only 24 submissions were accepted as full papers and thus selected for oral presentation, leading to a full-paper acceptance ratio of 22%. Additional papers were accepted as short papers and posters. A further selection was made after ...

  13. Information granularity, big data, and computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2015-01-01

    Recent pursuits in the realm of big data processing, interpretation, collection and organization have emerged in numerous sectors, including business, industry, and government organizations. Data sets such as customer transactions for a mega-retailer, weather monitoring, and intelligence gathering quickly outpace the capacities of traditional techniques and tools of data analysis. The 3V (volume, variability and velocity) challenges led to the emergence of new techniques and tools in data visualization, acquisition, and serialization. Soft Computing, regarded as a plethora of technologies of fuzzy sets (or Granular Computing), neurocomputing and evolutionary optimization, brings forward a number of unique features that might be instrumental to the development of concepts and algorithms to deal with big data. This carefully edited volume provides the reader with updated, in-depth material on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligenc...

  14. Computational intelligence in automotive applications

    Energy Technology Data Exchange (ETDEWEB)

    Prokhorov, Danil (ed.) [Toyota Motor Engineering and Manufacturing (TEMA), Ann Arbor, MI (United States). Toyota Technical Center

    2008-07-01

    What is computational intelligence (CI)? Traditionally, CI is understood as a collection of methods from the fields of neural networks (NN), fuzzy logic and evolutionary computation. This edited volume is the first of its kind, suitable for automotive researchers, engineers and students. It provides a representative sample of contemporary CI activities in the area of automotive technology. The volume consists of 13 chapters, including but not limited to these topics: vehicle diagnostics and vehicle system safety, control of vehicular systems, quality control of automotive processes, driver state estimation, safety of pedestrians, and intelligent vehicles. All chapters contain overviews of the state of the art, and several chapters illustrate their methodologies on examples of real-world systems. About the Editor: Danil Prokhorov began his technical career in St. Petersburg, Russia, after graduating with Honors from Saint Petersburg State University of Aerospace Instrumentation in 1992 (MS in Robotics). He worked as a research engineer in the St. Petersburg Institute for Informatics and Automation, one of the institutes of the Russian Academy of Sciences. He came to the US in late 1993 for Ph.D. studies. He became involved in automotive research in 1995 when he was a Summer intern at Ford Scientific Research Lab in Dearborn, MI. Upon his graduation from the EE Department of Texas Tech University, Lubbock, in 1997, he joined Ford to pursue application-driven research on neural networks and other machine learning algorithms. While at Ford, he took part in several production-bound projects, including neural-network-based engine misfire detection. Since 2005 he has been with Toyota Technical Center, Ann Arbor, MI, overseeing important mid- and long-term research projects in computational intelligence. (orig.)

  15. International Conference on Computational Intelligence 2015

    CERN Document Server

    Saha, Sujan

    2017-01-01

    This volume comprises the proceedings of the International Conference on Computational Intelligence 2015 (ICCI15). This book aims to bring together work from leading academicians, scientists, researchers and research scholars from across the globe on all aspects of computational intelligence. The work is composed mainly of original and unpublished results of conceptual, constructive, empirical, experimental, or theoretical work in all areas of computational intelligence. Specifically, the major topics covered include classical computational intelligence models and artificial intelligence, neural networks and deep learning, evolutionary swarm and particle algorithms, hybrid systems optimization, constraint programming, human-machine interaction, computational intelligence for the web analytics, robotics, computational neurosciences, neurodynamics, bioinspired and biomorphic algorithms, cross disciplinary topics and applications. The contents of this volume will be of use to researchers and professionals alike....

  16. Applied Computational Intelligence for finance and economics

    OpenAIRE

    Isasi Viñuela, Pedro; Quintana Montero, David; Sáez Achaerandio, Yago; Mochón, Asunción

    2007-01-01

    This article introduces some relevant research works on computational intelligence applied to finance and economics. The objective is to offer an appropriate context and a starting point for those who are new to computational intelligence in finance and economics and to give an overview of the most recent works. A classification with five different main areas is presented. Those areas are related with different applications of the most modern computational intelligence techniques showing a ne...

  17. Computational Intelligence in Information Systems Conference

    CERN Document Server

    Au, Thien-Wan; Omar, Saiful

    2017-01-01

    This book constitutes the Proceedings of the Computational Intelligence in Information Systems conference (CIIS 2016), held in Brunei, November 18–20, 2016. The CIIS conference provides a platform for researchers to exchange the latest ideas and to present new research advances in general areas related to computational intelligence and its applications. The 26 revised full papers presented in this book have been carefully selected from 62 submissions. They cover a wide range of topics and application areas in computational intelligence and informatics.

  18. Application of computational intelligence to biology

    CERN Document Server

    Sekhar, Akula

    2016-01-01

    This book is a contribution of translational and allied research to the proceedings of the International Conference on Computational Intelligence and Soft Computing. It explains how various computational intelligence techniques can be applied to investigate various biological problems. It is a good read for research scholars, engineers, medical doctors and bioinformatics researchers.

  19. Applications of computational intelligence in nuclear reactors

    International Nuclear Information System (INIS)

    Jayalal, M.L.; Jehadeesan, R.

    2016-01-01

    Computational intelligence techniques have been successfully employed in a wide range of applications, which include the domains of medicine, bioinformatics, electronics, communications and business. There has been progress in the application of computational intelligence in the nuclear reactor domain during the last two decades. The stringent nuclear safety regulations pertaining to the reactor environment present challenges in the application of computational intelligence in various nuclear sub-systems. The applications of various methods of computational intelligence in the domain of nuclear reactors are discussed in this paper. (author)

  20. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with applications in several fields of engineering, such as automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, in a multidisciplinary approach. Authors from five countries and 16 different research centers contribute their expertise in both the fundamentals and real-world applications, based upon their strong background in modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation, where tools such as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  1. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    The book at hand explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real-world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented, including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  2. Hypercard Another Computer Tool.

    Science.gov (United States)

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  3. 16th UK Workshop on Computational Intelligence

    CERN Document Server

    Gegov, Alexander; Jayne, Chrisina; Shen, Qiang

    2017-01-01

    The book is a timely report on advanced methods and applications of computational intelligence systems. It covers a long list of interconnected research areas, such as fuzzy systems, neural networks, evolutionary computation, evolving systems and machine learning. The individual chapters are based on peer-reviewed contributions presented at the 16th Annual UK Workshop on Computational Intelligence, held on September 7-9, 2016, in Lancaster, UK. The book puts a special emphasis on novel methods and reports on their use in a wide range of application areas, thus providing both academics and professionals with a comprehensive and timely overview of new trends in computational intelligence.

  4. Soft computing in artificial intelligence

    CERN Document Server

    Matson, Eric

    2014-01-01

    This book explores the concept of artificial intelligence based on knowledge-based algorithms. Given current hardware and software technologies and artificial intelligence theories, we can consider how efficiently to provide a solution, how best to implement a model and how successfully to achieve it. This edition provides readers with the most recent progress and novel solutions in artificial intelligence. This book aims at presenting the research results and solutions of applications relevant to artificial intelligence technologies. We propose to researchers and practitioners some methods to advance intelligent systems and apply artificial intelligence to specific or general purposes. This book consists of 13 contributions that feature fuzzy (r, s)-minimal pre- and β-open sets, handling big co-occurrence matrices, Xie-Beni-type fuzzy cluster validation, fuzzy c-regression models, combination of genetic algorithm and ant colony optimization, building expert systems, fuzzy logic and neural networks, ind...

  5. 10th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Seghrouchni, Amal; Beynier, Aurélie; Camacho, David; Herpson, Cédric; Hindriks, Koen; Novais, Paulo

    2017-01-01

    This book presents the combined peer-reviewed proceedings of the tenth International Symposium on Intelligent Distributed Computing (IDC’2016), which was held in Paris, France from October 10th to 12th, 2016. The 23 contributions address a range of topics related to theory and application of intelligent distributed computing, including: Intelligent Distributed Agent-Based Systems, Ambient Intelligence and Social Networks, Computational Sustainability, Intelligent Distributed Knowledge Representation and Processing, Smart Networks, Networked Intelligence and Intelligent Distributed Applications, amongst others.

  6. Firearm microstamping technology: counterinsurgency intelligence gathering tool

    Science.gov (United States)

    Lizotte, Todd E.; Ohar, Orest P.

    2009-05-01

    Warfare relies on effective, accurate and timely intelligence, an especially critical task when conducting a counterinsurgency operation [1]. Simply stated, counterinsurgency is an intelligence war. Both insurgents and counterinsurgents need effective intelligence capabilities to be successful. Insurgents and counterinsurgents therefore attempt to create and maintain intelligence networks and fight continuously to neutralize each other's intelligence capabilities [1][2]. In such an environment it is obviously an advantage to target, or proactively create, opportunities to track and map an insurgent movement. Quickly identifying insurgency intelligence assets (infiltrators) within a host government's infrastructure is the goal. Infiltrators can occupy various areas of government such as security personnel, the national police force, government offices or military units. Intentional Firearm Microstamping offers such opportunities when implemented into firearms. Outfitted within firearms purchased and distributed to the host nation's security forces (civilian and military), Intentional Firearm Microstamping (IFM) marks bullet cartridge casings with codes as they are fired from the firearm. IFM is incorporated onto optimum surfaces within the firearm mechanism. The intentional microstamp tooling marks can take the form of alphanumeric codes or encoded geometric codes that identify the firearm. As the firearm is discharged, the intentional tooling marks transfer a code to the cartridge casing, which is ejected out of the firearm. When recovered at the scene of a firefight or engagement, the technology will provide forensic intelligence allowing the mapping and tracking of small-arms traffic patterns within the host nation, or identify insurgency force strength and pinpoint firearm sources, such as corrupt/rogue military units or police forces. Intentional Firearm Microstamping is a passive mechanical trace technology that can be outfitted or retrofitted to semiautomatic handguns and...

  7. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    The 2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) was held on July 7, 2012 in Wuhan, China. The conference was sponsored by the Information Technology & Industrial Engineering Research Center. ICEC 2012 is a forum for the presentation of new research results in intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware and newly emerging technologies is strongly encouraged. The forum aims to bring together researchers, developers, and users from around the world, in both industry and academia, for sharing state-of-the-art results, for exploring new areas of research and development, and to discuss emerging issues facing intelligent computation and evolutionary computation.

  8. New challenges in computational collective intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ngoc Thanh; Katarzyniak, Radoslaw Piotr [Wroclaw Univ. of Technology (Poland). Inst. of Informatics; Janiak, Adam (eds.) [Wroclaw Univ. of Technology (Poland). Inst. of Computer Engineering, Control and Robotics

    2009-07-01

    The book consists of 29 chapters which have been selected and invited from the submissions to the 1st International Conference on Collective Intelligence - Semantic Web, Social Networks and Multiagent Systems (ICCCI 2009). All chapters in the book discuss various examples of applications of computational collective intelligence and related technologies to such fields as the semantic web, information systems ontologies, social networks, and agent and multiagent systems. The editors hope that the book can be useful for graduate and Ph.D. students in Computer Science, in particular participants in courses on Soft Computing, Multi-Agent Systems and Robotics. This book can also be useful for researchers working on the concept of computational collective intelligence in artificial populations. It is the hope of the editors that readers of this volume can find many inspiring ideas and use them to create new cases of intelligent collectives. Many such challenges are suggested by particular approaches and models presented in particular chapters of this book. (orig.)

  9. Computational Intelligence. Mortality Models for the Actuary

    NARCIS (Netherlands)

    Willemse, W.J.

    2001-01-01

    This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, this thesis deals with life insurance, where mortality modelling is important. Actuaries use ancient models (mortality laws) from the nineteenth century, for example Gompertz' and Makeham's laws.
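
    For context, the two nineteenth-century mortality laws named above take the following standard forms (textbook actuarial notation; not taken from the thesis itself):

    ```latex
    % Gompertz (1825): the force of mortality grows exponentially with age x
    \mu(x) = B c^{x}
    % Makeham (1860): adds an age-independent accident term A
    \mu(x) = A + B c^{x}, \qquad A \ge 0,\ B > 0,\ c > 1
    ```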

  10. Intelligent computing for sustainable energy and environment

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang [Queen's Univ. Belfast (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Li, Shaoyuan; Li, Dewei [Shanghai Jiao Tong Univ., Shanghai (China). Dept. of Automation; Niu, Qun (eds.) [Shanghai Univ. (China). School of Mechatronic Engineering and Automation

    2013-07-01

    Fast track conference proceedings. State of the art research. Up to date results. This book constitutes the refereed proceedings of the Second International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2012, held in Shanghai, China, in September 2012. The 60 full papers presented were carefully reviewed and selected from numerous submissions and present theories and methodologies as well as the emerging applications of intelligent computing in sustainable energy and environment.

  11. Soft computing in intelligent control

    CERN Document Server

    Jung, Jin-Woo; Kubota, Naoyuki

    2014-01-01

    Nowadays, people tend to be fond of smarter machines that are able to collect data, learn, recognize things, infer meanings, communicate with humans and perform behaviors. Thus, we have built advanced intelligent control affecting all areas of society: automotive, rail, aerospace, defense, energy, healthcare, telecoms and consumer electronics, finance, and urbanization. Consequently, users and consumers can gain new experiences through intelligent control systems. We can reshape the technology world and provide new opportunities for industry and business by offering cost-effective, sustainable and innovative business models. We will have to know how to create our own digital life. Intelligent control systems enable people to build complex applications, to implement system integration and to meet society's demand for safety and security. This book aims at presenting the research results and solutions of applications relevant to intelligent control systems. We propose to researchers ...

  12. Artificial intelligence - New tools for aerospace project managers

    Science.gov (United States)

    Moja, D. C.

    1985-01-01

    Artificial Intelligence (AI) is currently being used for business-oriented, money-making applications, such as medical diagnosis, computer system configuration, and geological exploration. The objective of the present paper is to assess new AI tools and techniques that will be available to assist aerospace managers in the accomplishment of their tasks. A study conducted by Brown and Cheeseman (1983) indicates that AI will be employed in all traditional management areas, taking into account goal setting, decision making, policy formulation, evaluation, planning, budgeting, auditing, personnel management, training, legal affairs, and procurement. Artificial intelligence/expert systems are discussed, giving attention to the three primary areas concerned with intelligent robots, natural language interfaces, and expert systems. Aspects of information retrieval are also considered, along with decision support systems and expert systems for project planning and scheduling.

  13. A computer architecture for intelligent machines

    Science.gov (United States)

    Lefebvre, D. R.; Saridis, G. N.

    1992-01-01

    The theory of intelligent machines proposes a hierarchical organization for the functions of an autonomous robot based on the principle of increasing precision with decreasing intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed. The authors present a computer architecture that implements the lower two levels of the intelligent machine. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Execution-level controllers for motion and vision systems are briefly addressed, as well as the Petri net transducer software used to implement coordination-level functions. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.

  14. Computational intelligence applications in modeling and control

    CERN Document Server

    Vaidyanathan, Sundarapandian

    2015-01-01

    The development of computational intelligence (CI) systems was inspired by observable and imitable aspects of the intelligent activity of human beings and nature. The essence of systems based on computational intelligence is to process and interpret data of various natures, so CI is strictly connected with the increase of available data as well as the capabilities for processing them, mutually supportive factors. Developed theories of computational intelligence were quickly applied in many fields of engineering, data analysis, forecasting, biomedicine and others. They are used in image and sound processing and identification, signal processing, multidimensional data visualization, steering of objects, analysis of lexicographic data, requesting systems in banking, diagnostic systems, expert systems and many other practical implementations. This book consists of 16 contributed chapters by subject experts who are specialized in the various topics addressed in this book. The special chapters have been brought ...

  15. Tools for computational finance

    CERN Document Server

    Seydel, Rüdiger U

    2017-01-01

    Computational and numerical methods are used in a number of ways across the field of finance. It is the aim of this book to explain how such methods work in financial engineering. By concentrating on the field of option pricing, a core task of financial engineering and risk analysis, this book explores a wide range of computational tools in a coherent and focused manner and will be of use to anyone working in computational finance. Starting with an introductory chapter that presents the financial and stochastic background, the book goes on to detail computational methods using both stochastic and deterministic approaches. Now in its sixth edition, Tools for Computational Finance has been significantly revised and contains:    Several new parts such as a section on extended applications of tree methods, including multidimensional trees, trinomial trees, and the handling of dividends; Additional material in the field of generating normal variates with acceptance-rejection methods, and on Monte Carlo methods...

  16. Delamination detection using methods of computational intelligence

    Science.gov (United States)

    Ihesiulor, Obinna K.; Shankar, Krishna; Zhang, Zhifang; Ray, Tapabrata

    2012-11-01

    A reliable delamination prediction scheme is indispensable in order to prevent potential risks of catastrophic failures in composite structures. The existence of delaminations changes the vibration characteristics of composite laminates, and hence such indicators can be used to quantify the health characteristics of laminates. An approach for online health monitoring of in-service composite laminates is presented in this paper that relies on methods based on computational intelligence. Typical changes in the observed vibration characteristics (i.e. changes in natural frequencies) are considered as inputs to identify the existence, location and magnitude of delaminations. The performance of the proposed approach is demonstrated using numerical models of composite laminates. Since this identification problem essentially involves the solution of an optimization problem, the use of finite element (FE) methods as the underlying tool for analysis turns out to be computationally expensive. A surrogate-assisted optimization approach is hence introduced to contain the computational time within affordable limits. An artificial neural network (ANN) model with Bayesian regularization is used as the underlying approximation scheme, while an improved rate of convergence is achieved using a memetic algorithm. However, building ANN surrogate models usually requires large training datasets. K-means clustering is effectively employed to reduce the size of the datasets. ANN is also used via inverse modeling to determine the position, size and location of delaminations using changes in measured natural frequencies. The results clearly highlight the efficiency and the robustness of the approach.
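
    As a rough illustration of the K-means dataset-reduction step described in this abstract, the sketch below clusters a set of candidate training samples (here, random stand-ins for simulated frequency-shift vectors) and keeps one centroid per cluster. The function name, parameters, and data are all hypothetical; this is not the authors' implementation.

    ```python
    import numpy as np

    def kmeans_reduce(X, k, iters=50, seed=0):
        """Reduce a training set X to k representative points via plain k-means.

        Sketch only: each centroid stands in for its cluster, so a surrogate
        model can be trained on k points instead of the full dataset.
        """
        rng = np.random.default_rng(seed)
        # initialize centroids from k distinct random samples
        centroids = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            # assign every sample to its nearest centroid
            d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            # move each centroid to the mean of its assigned samples
            for j in range(k):
                members = X[labels == j]
                if len(members):
                    centroids[j] = members.mean(axis=0)
        return centroids

    # e.g. 500 hypothetical frequency-shift vectors (3 modes) reduced to 20
    X = np.random.default_rng(1).normal(size=(500, 3))
    reduced = kmeans_reduce(X, k=20)
    print(reduced.shape)  # (20, 3)
    ```

    An ANN surrogate would then be trained on the 20 centroids rather than all 500 samples, cutting training cost while still covering the input space.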

  17. Computational intelligence and neuromorphic computing potential for cybersecurity applications

    Science.gov (United States)

    Pino, Robinson E.; Shevenell, Michael J.; Cam, Hasan; Mouallem, Pierre; Shumaker, Justin L.; Edwards, Arthur H.

    2013-05-01

    In today's highly mobile, networked, and interconnected internet world, the flow and volume of information is overwhelming and continuously increasing. Therefore, it is believed that the next frontier in technological evolution and development will rely on our ability to develop intelligent systems that can help us process, analyze, and make sense of information autonomously, just as a well-trained and educated human expert would. In computational intelligence, neuromorphic computing promises to allow for the development of computing systems able to imitate natural neurobiological processes and form the foundation for intelligent system architectures.

  18. Computational Intelligence Techniques for New Product Design

    CERN Document Server

    Chan, Kit Yan; Dillon, Tharam S

    2012-01-01

    Applying computational intelligence to product design is a fast-growing and promising research area in computer science and industrial engineering. However, there is currently a lack of books which discuss this research area. This book discusses a wide range of computational intelligence techniques for implementation in product design. It covers common issues in product design, from identification of customer requirements, determination of the importance of customer requirements, determination of optimal design attributes, relating design attributes and customer satisfaction, integration of marketing aspects into product design, and affective product design, to quality control of new products. Approaches for refinement of computational intelligence are discussed in order to address different issues in product design. Case studies of product design in terms of development of real-world new products are included, in order to illustrate the design procedures, as well as the effectiveness of the com...

  19. Intelligent Distributed Computing VI : Proceedings of the 6th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Badica, Costin; Malgeri, Michele; Unland, Rainer

    2013-01-01

    This book represents the combined peer-reviewed proceedings of the Sixth International Symposium on Intelligent Distributed Computing -- IDC 2012, of the International Workshop on Agents for Cloud -- A4C 2012 and of the Fourth International Workshop on Multi-Agent Systems Technology and Semantics -- MASTS 2012. All the events were held in Calabria, Italy during September 24-26, 2012. The 37 contributions published in this book address many topics related to theory and applications of intelligent distributed computing and multi-agent systems, including: adaptive and autonomous distributed systems, agent programming, ambient assisted living systems, business process modeling and verification, cloud computing, coalition formation, decision support systems, distributed optimization and constraint satisfaction, gesture recognition, intelligent energy management in WSNs, intelligent logistics, machine learning, mobile agents, parallel and distributed computational intelligence, parallel evolutionary computing, trus...

  20. Computational Intelligence Agent-Oriented Modelling

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman

    2006-01-01

    Roč. 5, č. 2 (2006), s. 430-433 ISSN 1109-2777 R&D Projects: GA MŠk 1M0567 Institutional research plan: CEZ:AV0Z10300504 Keywords : multi-agent systems * adaptive agents * computational intelligence Subject RIV: IN - Informatics, Computer Science

  1. Intelligent systems and soft computing for nuclear science and industry

    International Nuclear Information System (INIS)

    Ruan, D.; D'hondt, P.; Govaerts, P.; Kerre, E.E.

    1996-01-01

    The second international workshop on Fuzzy Logic and Intelligent Technologies in Nuclear Science (FLINS) addresses topics related to intelligent systems and soft computing for nuclear science and industry. The proceedings contain 52 papers in different fields such as radiation protection, nuclear safety (human factors and reliability), safeguards, nuclear reactor control, production processes in the fuel cycle, dismantling, waste and disposal, and decision making. A clear link is made between the theory and applications of fuzzy logic, such as neural networks, expert systems, robotics, man-machine interfaces, and decision-support techniques, by using modern and advanced technologies and tools. The papers are grouped in three sections. The first section (Soft computing techniques) deals with basic tools to treat fuzzy logic, neural networks, genetic algorithms, decision-making, and software used for general soft-computing aspects. The second section (Intelligent engineering systems) includes contributions on engineering problems such as knowledge-based engineering, expert systems, process control integration, diagnosis, measurements, and interpretation by soft computing. The third section (Nuclear applications) focuses on the application of soft computing and intelligent systems in nuclear science and industry

  2. 2nd International Conference on Intelligent Computing, Communication & Devices

    CERN Document Server

    Popentiu-Vladicescu, Florin

    2017-01-01

    The book presents high-quality papers from the 2nd International Conference on Intelligent Computing, Communication & Devices (ICCD 2016), organized by Interscience Institute of Management and Technology (IIMT), Bhubaneswar, Odisha, India, during 13-14 August 2016. The book covers all dimensions of intelligent sciences in its three tracks, namely intelligent computing, intelligent communication, and intelligent devices. The intelligent computing track covers areas such as intelligent and distributed computing, intelligent grid and cloud computing, internet of things, soft computing and engineering applications, data mining and knowledge discovery, semantic and web technology, hybrid systems, agent computing, bioinformatics, and recommendation systems. Intelligent communication covers communication and network technologies, including mobile broadband and all-optical networks that are the key to groundbreaking inventions of intelligent communication technologies. This covers communication hardware, soft...

  3. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry presented at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India, during December 17-19, 2015. The book covers innovations in broad research areas such as computational modeling, computational intelligence, and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing the design, analysis, and modeling of the aforementioned key areas.

  4. An intelligent tool for activity data collection.

    Science.gov (United States)

    Sarkar, A M Jehad

    2011-01-01

    Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive and alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool's performance in producing reliable datasets.
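    The log-generation step described above can be sketched in a few lines. The activity names, time windows, and jitter below are illustrative assumptions standing in for the web-mined activity knowledge the paper describes:

```python
# Hedged sketch: generating a synthetic activity log from simple
# "activity knowledge" (typical start times and durations). The
# activities and times are hypothetical examples.
import random

ACTIVITY_KNOWLEDGE = {
    "breakfast": {"start_hour": 7, "duration_min": 20},
    "commute": {"start_hour": 8, "duration_min": 35},
    "lunch": {"start_hour": 12, "duration_min": 30},
}

def generate_log(seed=0, jitter_min=10):
    """Produce one day's activity log, jittering start times slightly
    so repeated runs yield a diverse dataset."""
    rng = random.Random(seed)
    log = []
    for name, info in ACTIVITY_KNOWLEDGE.items():
        start = info["start_hour"] * 60 + rng.randint(-jitter_min, jitter_min)
        log.append({"activity": name,
                    "start": f"{start // 60:02d}:{start % 60:02d}",
                    "duration_min": info["duration_min"]})
    return log

for entry in generate_log():
    print(entry["activity"], entry["start"], entry["duration_min"])
```

    Varying the seed plays the role of collecting data from different simulated users.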

  5. Using generic tool kits to build intelligent systems

    Science.gov (United States)

    Miller, David J.

    1994-01-01

    The Intelligent Systems and Robots Center at Sandia National Laboratories is developing technologies for the automation of processes associated with environmental remediation and information-driven manufacturing. These technologies, which focus on automated planning and programming and sensor-based and model-based control, are used to build intelligent systems which are able to generate plans of action, program the necessary devices, and use sensors to react to changes in the environment. By automating tasks through the use of programmable devices tied to computer models which are augmented by sensing, requirements for faster, safer, and cheaper systems are being satisfied. However, because of the need for rapid, cost-effective prototyping and multi-laboratory teaming, it is also necessary to define a consistent approach to the construction of controllers for such systems. As a result, the Generic Intelligent System Controller (GISC) concept has been developed. This concept promotes the philosophy of producing generic tool kits which can be used and reused to build intelligent control systems.

  6. An Intelligent Tool for Activity Data Collection

    Directory of Open Access Journals (Sweden)

    A. M. Jehad Sarkar

    2011-04-01

    Full Text Available Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive and alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user’s activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool’s performance in producing reliable datasets.

  7. Computational Intelligence Paradigms in Advanced Pattern Classification

    CERN Document Server

    Jain, Lakhmi

    2012-01-01

    This monograph presents selected areas of application of pattern recognition and classification approaches, including handwriting recognition, medical image analysis and interpretation, development of cognitive systems for computer image understanding, moving object detection, advanced image filtration, and intelligent multi-object labelling and classification. It is directed at scientists, application engineers, professors, and students, who will find this book useful.

  8. Artificial Intelligence, Computational Thinking, and Mathematics Education

    Science.gov (United States)

    Gadanidis, George

    2017-01-01

    Purpose: The purpose of this paper is to examine the intersection of artificial intelligence (AI), computational thinking (CT), and mathematics education (ME) for young students (K-8). Specifically, it focuses on three key elements that are common to AI, CT and ME: agency, modeling of phenomena and abstracting concepts beyond specific instances.…

  9. Computational intelligence in medical informatics

    CERN Document Server

    Gunjan, Vinit

    2015-01-01

    This Brief introduces informatics and related techniques to computer science professionals, engineers, medical doctors, bioinformatics researchers, and other interdisciplinary researchers. Chapters cover the bioinformatics of diabetes, along with several computational algorithms and statistical analysis approaches to effectively study the disorders and their possible causes, together with medical applications.

  10. Intelligent Tutoring System: A Tool for Testing the Research Curiosities of Artificial Intelligence Researchers

    Science.gov (United States)

    Yaratan, Huseyin

    2003-01-01

    An ITS (Intelligent Tutoring System) is a teaching-learning medium that uses artificial intelligence (AI) technology for instruction. Roberts and Park (1983) define AI as the attempt to get computers to perform tasks that, if performed by a human being, would require intelligence. The design of an ITS comprises two distinct…

  11. Intelligent computational systems for space applications

    Science.gov (United States)

    Lum, Henry; Lau, Sonie

    Intelligent computational systems can be described as an adaptive computational system integrating both traditional computational approaches and artificial intelligence (AI) methodologies to meet the science and engineering data processing requirements imposed by specific mission objectives. These systems will be capable of integrating, interpreting, and understanding sensor input information; correlating that information to the "world model" stored within its data base and understanding the differences, if any; defining, verifying, and validating a command sequence to merge the "external world" with the "internal world model"; and, controlling the vehicle and/or platform to meet the scientific and engineering mission objectives. Performance and simulation data obtained to date indicate that the current flight processors baselined for many missions such as Space Station Freedom do not have the computational power to meet the challenges of advanced automation and robotics systems envisioned for the year 2000 era. Research issues which must be addressed to achieve greater than giga-flop performance for on-board intelligent computational systems have been identified, and a technology development program has been initiated to achieve the desired long-term system performance objectives.

  12. Artificial Intelligence for Human Computing

    NARCIS (Netherlands)

    Huang, Th.S.; Nijholt, Antinus; Pantic, Maja; Pentland, A.

    2007-01-01

    This book constitutes the thoroughly refereed post-proceedings of two events discussing AI for Human Computing: one Special Session during the Eighth International ACM Conference on Multimodal Interfaces (ICMI 2006), held in Banff, Canada, in November 2006, and a Workshop organized in conjunction

  13. Unified Computational Intelligence for Complex Systems

    CERN Document Server

    Seiffertt, John

    2010-01-01

    Computational intelligence encompasses a wide variety of techniques that allow computation to learn, to adapt, and to seek. That is, they may be designed to learn information without explicit programming regarding the nature of the content to be retained, they may be imbued with the functionality to adapt to maintain their course within a complex and unpredictably changing environment, and they may help us seek out truths about our own dynamics and lives through their inclusion in complex system modeling. These capabilities place our ability to compute in a category apart from our ability to e

  14. Operator support system using computational intelligence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bueno, Elaine Inacio, E-mail: ebueno@ifsp.edu.br [Instituto Federal de Educacao, Ciencia e Tecnologia de Sao Paulo (IFSP), Sao Paulo, SP (Brazil); Pereira, Iraci Martinez, E-mail: martinez@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Computational Intelligence Systems have been widely applied in Monitoring and Fault Detection Systems in several processes and in different kinds of applications. These systems use interdependent components ordered in modules, and it is a typical behavior of such systems to ensure early detection and diagnosis of faults. Monitoring and Fault Detection Techniques can be divided into two categories: estimative and pattern recognition methods. The estimative methods use a mathematical model, which describes the process behavior. The pattern recognition methods use a database to describe the process. In this work, an operator support system using Computational Intelligence Techniques was developed. This system shows the information obtained by different CI techniques in order to help operators take decisions in real time and to guide them in fault diagnosis before the normal alarm limits are reached. (author)
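    The estimative category can be illustrated with a minimal residual check: a mathematical model predicts the expected sensor value, and a fault is flagged when the measurement deviates from the prediction by more than a threshold. The model, sensor values, and threshold below are hypothetical, not taken from the paper:

```python
# Hedged sketch: residual-based ("estimative") fault detection.
# The linear process model and the 5-degree threshold are
# illustrative assumptions.

def model_prediction(power_level):
    """Stand-in mathematical model of the process: predicted coolant
    temperature (deg C) as a linear function of power level (%)."""
    return 250.0 + 0.3 * power_level

def detect_fault(measured, predicted, threshold):
    """Flag a fault when the residual between the measured sensor
    value and the model prediction exceeds the threshold."""
    residual = abs(measured - predicted)
    return residual > threshold

expected = model_prediction(100)           # 280.0
print(detect_fault(280.5, expected, 5.0))  # small residual: no fault
print(detect_fault(310.0, expected, 5.0))  # large residual: fault
```

    A pattern recognition method would instead compare the current measurement vector against a database of labeled process states.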

  15. Operator support system using computational intelligence techniques

    International Nuclear Information System (INIS)

    Bueno, Elaine Inacio; Pereira, Iraci Martinez

    2015-01-01

    Computational Intelligence Systems have been widely applied in Monitoring and Fault Detection Systems in several processes and in different kinds of applications. These systems use interdependent components ordered in modules, and it is a typical behavior of such systems to ensure early detection and diagnosis of faults. Monitoring and Fault Detection Techniques can be divided into two categories: estimative and pattern recognition methods. The estimative methods use a mathematical model, which describes the process behavior. The pattern recognition methods use a database to describe the process. In this work, an operator support system using Computational Intelligence Techniques was developed. This system shows the information obtained by different CI techniques in order to help operators take decisions in real time and to guide them in fault diagnosis before the normal alarm limits are reached. (author)

  16. Adaptation and hybridization in computational intelligence

    CERN Document Server

    Jr, Iztok

    2015-01-01

    This carefully edited book takes a walk through recent advances in adaptation and hybridization in the Computational Intelligence (CI) domain. It consists of ten chapters that are divided into three parts. The first part illustrates background information and provides some theoretical foundation tackling the CI domain, the second part deals with adaptation in CI algorithms, while the third part focuses on hybridization in CI. This book can serve as an ideal reference for researchers and students of computer science, electrical and civil engineering, economics, and the natural sciences who are confronted with solving optimization, modeling, and simulation problems. It covers recent advances in CI that encompass nature-inspired algorithms such as Artificial Neural Networks, Evolutionary Algorithms, and Swarm Intelligence-based algorithms.

  17. Computational intelligence and quantitative software engineering

    CERN Document Server

    Succi, Giancarlo; Sillitti, Alberto

    2016-01-01

    In a down-to-earth manner, the volume lucidly presents how the fundamental concepts, methodology, and algorithms of Computational Intelligence are efficiently exploited in Software Engineering and opens up a novel and promising avenue for the comprehensive analysis and advanced design of software artifacts. It shows how the paradigm and the best practices of Computational Intelligence can be creatively explored to carry out comprehensive software requirement analysis, support design, testing, and maintenance. Software Engineering is an intensive knowledge-based endeavor of inherent human-centric nature, which profoundly relies on acquiring semiformal knowledge and then processing it to produce a running system. The knowledge spans a wide variety of artifacts, from requirements, captured in the interaction with customers, to design practices, testing, and code management strategies, which rely on the knowledge of the running system. This volume consists of contributions written by widely acknowledged experts ...

  18. Social networks a framework of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2014-01-01

    This volume provides the audience with an updated, in-depth and highly coherent material on the conceptually appealing and practically sound information technology of Computational Intelligence applied to the analysis, synthesis and evaluation of social networks. The volume involves studies devoted to key issues of social networks including community structure detection in networks, online social networks, knowledge growth and evaluation, and diversity of collaboration mechanisms.  The book engages a wealth of methods of Computational Intelligence along with well-known techniques of linear programming, Formal Concept Analysis, machine learning, and agent modeling.  Human-centricity is of paramount relevance and this facet manifests in many ways including personalized semantics, trust metric, and personal knowledge management; just to highlight a few of these aspects. The contributors to this volume report on various essential applications including cyber attacks detection, building enterprise social network...

  19. Cloud Computing and Business Intelligence

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2015-03-01

    Full Text Available The complexity of data resulting from business processes is becoming overwhelming for systems that don't use shared resources. Many aspects of the business process must be recorded and analysed in a short period of time with no errors at all. In order to obtain these results, so that management and other departments know what their next decision/job will be, there must be a continuous exchange and processing of information. "Cloud Computing" is the solution to the problem of processing large amounts of data. By using this technology, organizations have the benefit of using shared resources from various systems that are able to handle large amounts of data processing. These benefits are not limited to high system performance; the costs of using such an architecture are also much lower.

  20. Application of computational intelligence in emerging power systems

    African Journals Online (AJOL)

    ... in the electrical engineering applications. This paper highlights the application of computational intelligence methods in power system problems. Various types of CI methods, which are widely used in power system, are also discussed in the brief. Keywords: Power systems, computational intelligence, artificial intelligence.

  1. Intelligent computer-aided training and tutoring

    Science.gov (United States)

    Loftin, R. Bowen; Savely, Robert T.

    1991-01-01

    Specific autonomous training systems based on artificial intelligence technology for use by NASA astronauts, flight controllers, and ground-based support personnel that demonstrate an alternative to current training systems are described. In addition to these specific systems, the evolution of a general architecture for autonomous intelligent training systems that integrates many of the features of traditional training programs with artificial intelligence techniques is presented. These Intelligent Computer-Aided Training (ICAT) systems would provide, for the trainee, much of the same experience that could be gained from the best on-the-job training. By integrating domain expertise with a knowledge of appropriate training methods, an ICAT session should duplicate, as closely as possible, the trainee undergoing on-the-job training in the task environment, benefitting from the full attention of a task expert who is also an expert trainer. Thus, the philosophy of the ICAT system is to emulate the behavior of an experienced individual devoting his full time and attention to the training of a novice - proposing challenging training scenarios, monitoring and evaluating the actions of the trainee, providing meaningful comments in response to trainee errors, responding to trainee requests for information, giving hints (if appropriate), and remembering the strengths and weaknesses displayed by the trainee so that appropriate future exercises can be designed.

  2. 1st International Conference on Intelligent Computing and Communication

    CERN Document Server

    Satapathy, Suresh; Sanyal, Manas; Bhateja, Vikrant

    2017-01-01

    The book covers a wide range of topics in Computer Science and Information Technology including swarm intelligence, artificial intelligence, evolutionary algorithms, and bio-inspired algorithms. It is a collection of papers presented at the First International Conference on Intelligent Computing and Communication (ICIC2) 2016. The prime areas of the conference are Intelligent Computing, Intelligent Communication, Bio-informatics, Geo-informatics, Algorithm, Graphics and Image Processing, Graph Labeling, Web Security, Privacy and e-Commerce, Computational Geometry, Service Orient Architecture, and Data Engineering.

  3. A proposal of ubiquitous fuzzy computing for ambient Intelligence

    NARCIS (Netherlands)

    Acampora, G.; Loia, V.

    2008-01-01

    Ambient Intelligence is considered as the composition of three emergent technologies: Ubiquitous Computing, Ubiquitous Communication and Intelligent User Interfaces. The aim of integrating these technologies is to broaden the interaction between human beings and information technology.

  4. Ubiquitous fuzzy computing in open ambient intelligence environments

    NARCIS (Netherlands)

    Acampora, G.; Loia, V.

    2006-01-01

    Ambient intelligence (AmI) is considered as the composition of three emergent technologies: ubiquitous computing, ubiquitous communication and intelligent user interfaces. The aim of integrating these technologies is to broaden the interaction between human beings and information

  5. Trends in ambient intelligent systems the role of computational intelligence

    CERN Document Server

    Khan, Mohammad; Abraham, Ajith

    2016-01-01

    This book demonstrates the success of Ambient Intelligence in providing possible solutions for the daily needs of humans. The book addresses implications of ambient intelligence in areas of domestic living, elderly care, robotics, communication, philosophy and others. The objective of this edited volume is to show that Ambient Intelligence is a boon to humanity with conceptual, philosophical, methodical and applicative understanding. The book also aims to schematically demonstrate developments in the direction of augmented sensors, embedded systems and behavioral intelligence towards Ambient Intelligent Networks or Smart Living Technology. It contains chapters in the field of Ambient Intelligent Networks, which received highly positive feedback during the review process. The book contains research work, with in-depth state of the art from augmented sensors, embedded technology and artificial intelligence along with cutting-edge research and development of technologies and applications of Ambient Intelligent N...

  6. Accelerating artificial intelligence with reconfigurable computing

    Science.gov (United States)

    Cieszewski, Radoslaw

    Reconfigurable computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated by placing the computationally intense portions of an algorithm into reconfigurable hardware. Reconfigurable computing combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible, and can be changed over the lifetime of the system. Similar to an ASIC, reconfigurable systems provide a method to map circuits into hardware. Reconfigurable systems therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Artificial intelligence is one such field, with many different algorithms that can be accelerated. This paper presents example hardware implementations of Artificial Neural Networks, Genetic Algorithms and Expert Systems.
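    Of the algorithm families named above, a genetic algorithm is the easiest to sketch in software before mapping it to hardware. The population size, fitness function, and operators below are illustrative choices, not taken from the paper:

```python
# Hedged sketch: a minimal genetic algorithm of the kind such papers
# map onto reconfigurable hardware. Each generation evaluates fitness,
# keeps the fittest half (selection), and fills the rest of the
# population by averaging parent pairs (crossover) plus Gaussian
# noise (mutation).
import random

def fitness(x):
    return -(x - 3.0) ** 2  # toy objective, maximum at x = 3

def evolve(generations=100, pop_size=20, mutation=0.1, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2               # crossover
            child += rng.gauss(0, mutation)   # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(round(evolve(), 2))  # converges near 3
```

    The appeal of an FPGA implementation is that the fitness evaluations of all individuals, the inner loop here, can run in parallel.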

  7. The role of soft computing in intelligent machines.

    Science.gov (United States)

    de Silva, Clarence W

    2003-08-15

    An intelligent machine relies on computational intelligence in generating its intelligent behaviour. This requires a knowledge system in which representation and processing of knowledge are central functions. Approximation is a 'soft' concept, and the capability to approximate for the purposes of comparison, pattern recognition, reasoning, and decision making is a manifestation of intelligence. This paper examines the use of soft computing in intelligent machines. Soft computing is an important branch of computational intelligence, where fuzzy logic, probability theory, neural networks, and genetic algorithms are synergistically used to mimic the reasoning and decision making of a human. This paper explores several important characteristics and capabilities of machines that exhibit intelligent behaviour. Approaches that are useful in the development of an intelligent machine are introduced. The paper presents a general structure for an intelligent machine, giving particular emphasis to its primary components, such as sensors, actuators, controllers, and the communication backbone, and their interaction. The role of soft computing within the overall system is discussed. Common techniques and approaches that will be useful in the development of an intelligent machine are introduced, and the main steps in the development of an intelligent machine for practical use are given. An industrial machine, which employs the concepts of soft computing in its operation, is presented, and one aspect of intelligent tuning, which is incorporated into the machine, is illustrated.
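    The intelligent-tuning idea mentioned above can be illustrated with a one-input fuzzy rule base: membership functions grade how "small", "medium", or "large" a tracking error is, and a weighted average of the rule outputs yields a gain adjustment. The membership functions and rule outputs below are hypothetical, not the machine described in the paper:

```python
# Hedged sketch: fuzzy inference for controller-gain tuning.
# Three rules: small error -> no change, medium -> modest increase,
# large -> big increase. All breakpoints are illustrative.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_gain_adjust(error):
    """Weighted-average (centroid-style) defuzzification."""
    memberships = {
        "small": tri(error, -0.5, 0.0, 0.5),
        "medium": tri(error, 0.2, 0.6, 1.0),
        "large": tri(error, 0.8, 1.5, 10.0),
    }
    outputs = {"small": 0.0, "medium": 0.1, "large": 0.3}
    total = sum(memberships.values())
    if total == 0:
        return 0.0
    return sum(memberships[k] * outputs[k] for k in outputs) / total

print(fuzzy_gain_adjust(0.0))  # zero error: no adjustment
print(fuzzy_gain_adjust(0.6))  # medium error: modest increase
```

    In a real machine the rule base would combine several inputs and be tuned from operator knowledge, which is exactly the knowledge-representation role the paper assigns to soft computing.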

  8. Tool path strategy and cutting process monitoring in intelligent machining

    Science.gov (United States)

    Chen, Ming; Wang, Chengdong; An, Qinglong; Ming, Weiwei

    2018-06-01

    Intelligent machining is a current focus in advanced manufacturing technology, and is characterized by high accuracy and efficiency. A central technology of intelligent machining—the cutting process online monitoring and optimization—is urgently needed for mass production. In this research, the cutting process online monitoring and optimization in jet engine impeller machining, cranio-maxillofacial surgery, and hydraulic servo valve deburring are introduced as examples of intelligent machining. Results show that intelligent tool path optimization and cutting process online monitoring are efficient techniques for improving the efficiency, quality, and reliability of machining.

  9. Intelligent tools for building a scientific information platform from research to implementation

    CERN Document Server

    Skonieczny, Łukasz; Rybiński, Henryk; Kryszkiewicz, Marzena; Niezgódka, Marek

    2014-01-01

    This book is a selection of results obtained within three years of research performed under SYNAT—a nation-wide scientific project aiming at creating an infrastructure for scientific content storage and sharing for academia, education and open knowledge society in Poland. The book is intended to be the last of the series related to the SYNAT project. The previous books, titled “Intelligent Tools for Building a Scientific Information Platform” and “Intelligent Tools for Building a Scientific Information Platform: Advanced Architectures and Solutions”, were published as volumes 390 and 467 in Springer's Studies in Computational Intelligence. Its content is based on the SYNAT 2013 Workshop held in Warsaw. The papers included in this volume present an overview and insight into information retrieval, repository systems, text processing, ontology-based systems, text mining, multimedia data processing and advanced software engineering, addressing the problems of implementing intelligent tools for building...

  10. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    Science.gov (United States)

    1991-06-01

    Proceedings of the National Conference on Artificial Intelligence, pages 181-184, The American Association for Artificial Intelligence, Pittsburgh. Interim Report: Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource: Intelligent Executive Computer Communication, John Lyman and Carla J. Conaway, University of California at Los Angeles.

  11. Intelligent Support for a Computer Aided Design Optimisation Cycle

    OpenAIRE

    B. Dolšak; M. Novak; J. Kaljun

    2006-01-01

    It is becoming more and more evident that adding intelligence to existing computer aids, such as computer aided design systems, can lead to significant improvements in the effective and reliable performance of various engineering tasks, including design optimisation. This paper presents three different intelligent modules to be applied within a computer aided design optimisation cycle to enable more intelligent and less experience-dependent design performance.

  12. Smart information systems computational intelligence for real-life applications

    CERN Document Server

    Hopfgartner, Frank

    2015-01-01

    This must-read text/reference presents an overview of smart information systems for both the private and public sector, highlighting the research questions that can be studied by applying computational intelligence. The book demonstrates how to transform raw data into effective smart information services, covering the challenges and potential of this approach. Each chapter describes the algorithms, tools, measures and evaluations used to answer important questions. This is then further illustrated by a diverse selection of case studies reflecting genuine problems faced by SMEs, multinational

  13. Using artificial intelligence to control fluid flow computations

    Science.gov (United States)

    Gelsey, Andrew

    1992-01-01

    Computational simulation is an essential tool for the prediction of fluid flow. Many powerful simulation programs exist today. However, using these programs to reliably analyze fluid flow and other physical situations requires considerable human effort and expertise to set up a simulation, determine whether the output makes sense, and repeatedly run the simulation with different inputs until a satisfactory result is achieved. Automating this process is not only of considerable practical importance but will also significantly advance basic artificial intelligence (AI) research in reasoning about the physical world.
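    The run-inspect-rerun cycle described above is exactly what an automated controller replaces. The sketch below shows the loop under stated assumptions: `simulate` is a hypothetical stand-in for a real flow solver, and mesh refinement is the only input being adjusted:

```python
# Hedged sketch of automating the simulation loop: run, check the
# result against a quality criterion, refine the input, and rerun
# until the result is satisfactory, as a human expert would.

def simulate(mesh_resolution):
    """Stand-in for a CFD run: finer meshes reduce the (pretend)
    discretization-error estimate."""
    return 1.0 / mesh_resolution

def auto_run(target_error=0.01, start_resolution=10, max_iters=20):
    """Double the mesh resolution until the error estimate is
    acceptable, or give up after a fixed iteration budget."""
    resolution = start_resolution
    for _ in range(max_iters):
        error = simulate(resolution)
        if error <= target_error:
            return resolution, error
        resolution *= 2  # refine and rerun
    raise RuntimeError("no satisfactory result within the iteration budget")

res, err = auto_run()
print(res, err)
```

    The AI research challenge the paper points to is replacing the simple numeric criterion here with genuine reasoning about whether the output is physically sensible.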

  14. 7th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Jung, Jason; Badica, Costin

    2014-01-01

    This book represents the combined peer-reviewed proceedings of the Seventh International Symposium on Intelligent Distributed Computing - IDC-2013, of the Second Workshop on Agents for Clouds - A4C-2013, of the Fifth International Workshop on Multi-Agent Systems Technology and Semantics - MASTS-2013, and of the International Workshop on Intelligent Robots - iR-2013. All the events were held in Prague, Czech Republic during September 4-6, 2013. The 41 contributions published in this book address many topics related to theory and applications of intelligent distributed computing and multi-agent systems, including: agent-based data processing, ambient intelligence, bio-informatics, collaborative systems, cryptography and security, distributed algorithms, grid and cloud computing, information extraction, intelligent robotics, knowledge management, linked data, mobile agents, ontologies, pervasive computing, self-organizing systems, peer-to-peer computing, social networks and trust, and swarm intelligence.

  15. Computational intelligence in time series forecasting theory and engineering applications

    CERN Document Server

    Palit, Ajoy K

    2005-01-01

    Foresight in an engineering enterprise can make the difference between success and failure, and can be vital to the effective control of industrial systems. Applying time series analysis in the on-line milieu of most industrial plants has been problematic owing to the time and computational effort required. The advent of soft computing tools offers a solution. The authors harness the power of intelligent technologies individually and in combination. Examples of the particular systems and processes susceptible to each technique are investigated, cultivating a comprehensive exposition of the improvements on offer in quality, model building and predictive control and the selection of appropriate tools from the plethora available. Application-oriented engineers in process control, manufacturing, production industry and research centres will find much to interest them in this book. It is suitable for industrial training purposes, as well as serving as valuable reference material for experimental researchers.

  16. Artificial intelligence tool development and applications to nuclear power

    International Nuclear Information System (INIS)

    Naser, J.A.

    1987-01-01

    Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools, which are tailored to electric utility industry applications. The second effort is the development of expert system applications. These two efforts complement each other. The application development tests the tools and identifies additional tool capabilities that are required. The tool development helps define the applications that can be successfully developed. Artificial intelligence, as demonstrated by the developments described, is being established as a credible technological tool for the electric utility industry. The challenge in transferring artificial intelligence technology and an understanding of its potential to the electric utility industry is to gain an understanding of the problems that reduce power plant performance and to identify which can be successfully addressed using artificial intelligence.

  17. 9th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Camacho, David; Analide, Cesar; Seghrouchni, Amal; Badica, Costin

    2016-01-01

    This book represents the combined peer-reviewed proceedings of the ninth International Symposium on Intelligent Distributed Computing – IDC’2015, of the Workshop on Cyber Security and Resilience of Large-Scale Systems – WSRL’2015, and of the International Workshop on Future Internet and Smart Networks – FI&SN’2015. All the events were held in Guimarães, Portugal during October 7th-9th, 2015. The 46 contributions published in this book address many topics related to theory and applications of intelligent distributed computing, including: Intelligent Distributed Agent-Based Systems, Ambient Intelligence and Social Networks, Computational Sustainability, Intelligent Distributed Knowledge Representation and Processing, Smart Networks, Networked Intelligence and Intelligent Distributed Applications, amongst others.

  18. 1st International Conference on Computational Intelligence and Informatics

    CERN Document Server

    Prasad, V; Rani, B; Udgata, Siba; Raju, K

    2017-01-01

    The book covers a variety of topics which include data mining and data warehousing, high performance computing, parallel and distributed computing, computational intelligence, soft computing, big data, cloud computing, grid computing, cognitive computing, image processing, computer networks, wireless networks, social networks, wireless sensor networks, information and network security, web security, internet of things, bioinformatics and geoinformatics. The book is a collection of the best papers submitted to the First International Conference on Computational Intelligence and Informatics (ICCII 2016), held during 28-30 May 2016 at JNTUH CEH, Hyderabad, India. It was hosted by the Department of Computer Science and Engineering, JNTUH College of Engineering, in association with Division V (Education & Research), CSI, India.

  19. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Udgata, Siba; Biswal, Bhabendra

    2014-01-01

    This volume contains the papers presented at the Second International Conference on Frontiers in Intelligent Computing: Theory and Applications (FICTA-2013), held during 14-16 November 2013 and organized by Bhubaneswar Engineering College (BEC), Bhubaneswar, Odisha, India. It contains 63 papers focusing on the application of intelligent techniques, including evolutionary computation techniques such as genetic algorithms, particle swarm optimization and teaching-learning-based optimization, to various engineering applications such as data mining, fuzzy systems, machine intelligence and ANN, web technologies, multimedia applications, and intelligent computing and networking.

  20. 2nd International Conference on Intelligent Computing and Applications

    CERN Document Server

    Dash, Subhransu; Das, Swagatam; Panigrahi, Bijaya

    2017-01-01

    The Second International Conference on Intelligent Computing and Applications was an annual research conference that aimed to bring together researchers from around the world to exchange research results and address open issues in all aspects of intelligent computing and applications. The main objective of the second edition of the conference was for scientists, scholars, engineers and students from academia and industry to present ongoing research activities and thereby foster research relations between universities and industry. The theme of the conference unified the picture of contemporary intelligent computing techniques as an integral concept that highlights the trends in computational intelligence and bridges theoretical research with applications. The conference covered vital issues ranging from intelligent computing, soft computing and communication to machine learning, industrial automation, process technology and robotics. This conference also provided a variety of opportunities for ...

  1. A Survey of Open Source Tools for Business Intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2009-01-01

    The industrial use of open source Business Intelligence (BI) tools is becoming more common, but is still not as widespread as for other types of software. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we co...

  2. Some Notes About Artificial Intelligence as New Mathematical Tool

    Directory of Open Access Journals (Sweden)

    Angel Garrido

    2010-04-01

    Mathematics is, in essence, an instance of First-Order Predicate Calculus and therefore belongs to applied monotonic logic. This exposes the limitations of classical logical reasoning and the clear advantages of Fuzzy Logic and many other new and interesting tools. We present here some of the most useful tools of this new field of Mathematics known as Artificial Intelligence.
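
    To make the contrast with classical two-valued logic concrete, here is a minimal fuzzy-logic sketch in Python; the triangular membership function and the temperature ranges are invented for illustration:

```python
def triangular(a, b, c):
    """Triangular fuzzy membership function supported on [a, c], peaking at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

warm = triangular(15.0, 25.0, 35.0)       # degree to which a temperature is "warm"

# Truth comes in degrees rather than {0, 1}:
degrees = [warm(t) for t in (15.0, 20.0, 25.0, 30.0)]
# The standard fuzzy AND (t-norm min) of two graded truths:
both_warm = min(warm(20.0), warm(30.0))
```

    Where classical logic must declare 24.9 °C either warm or not, the fuzzy membership grades it continuously.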

  3. THE COMPUTATIONAL INTELLIGENCE TECHNIQUES FOR PREDICTIONS - ARTIFICIAL NEURAL NETWORKS

    OpenAIRE

    Mary Violeta Bar

    2014-01-01

    Computational intelligence techniques are used in problems which cannot be solved by traditional techniques, when there is insufficient data to develop a model of the problem or when the data contain errors. Computational intelligence, as Bezdek (1992) termed it, aims at modeling biological intelligence. Artificial Neural Networks (ANNs) have been applied to an increasing number of real-world problems of considerable complexity. Their most important advantage is solving problems that are too c...

  4. Integrating Human and Computer Intelligence. Technical Report No. 32.

    Science.gov (United States)

    Pea, Roy D.

    This paper explores the thesis that advances in computer applications and artificial intelligence have important implications for the study of development and learning in psychology. Current approaches to the use of computers as devices for problem solving, reasoning, and thinking--i.e., expert systems and intelligent tutoring systems--are…

  5. Artificial Intelligence Support for Computational Chemistry

    Science.gov (United States)

    Duch, Wlodzislaw

    Possible forms of artificial intelligence (AI) support for quantum chemistry are discussed. Questions addressed include: what kind of support is desirable, what kind of support is feasible, and what can we expect in the coming years? Advantages and disadvantages of current AI techniques are presented, and it is argued that at present memory-based systems are the most effective for large-scale applications. Such systems may be used to predict the accuracy of calculations and to select the least expensive methods and basis sets belonging to the same accuracy class. Advantages of the Feature Space Mapping as an improvement on memory-based systems are outlined, and some results obtained in classification problems are given. The relevance of such classification systems to computational chemistry is illustrated with two examples showing the similarity of results obtained by different methods that take electron correlation into account.
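
    The memory-based selection idea — retrieve previously computed cases with similar features, then choose the least expensive method in the requested accuracy class — can be sketched as follows. The case base, molecular features, and cost figures are all fabricated for the example:

```python
import math

# Hypothetical case base: (feature vector, method/basis, accuracy class, cost)
CASES = [
    ((10, 2), "HF/small",   "low",  1.0),
    ((10, 2), "MP2/medium", "high", 8.0),
    ((40, 6), "HF/small",   "low",  2.0),
    ((40, 6), "DFT/medium", "high", 5.0),
]

def cheapest_method(features, accuracy):
    """Find the stored case nearest to `features` and return the least
    expensive method of the requested accuracy class for that case."""
    nearest = min(CASES, key=lambda c: math.dist(c[0], features))[0]
    candidates = [c for c in CASES if c[0] == nearest and c[2] == accuracy]
    return min(candidates, key=lambda c: c[3])[1]

choice = cheapest_method((38, 5), "high")   # a "large molecule", high accuracy
```

    A real memory-based system would of course use richer molecular descriptors and interpolate predicted accuracy, but the retrieve-then-select structure is the same.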

  6. The fundamentals of computational intelligence system approach

    CERN Document Server

    Zgurovsky, Mikhail Z

    2017-01-01

    This monograph is dedicated to the systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays particular attention to important novel CI technologies: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNNs, including a new class of FNN, cascade neo-fuzzy neural networks, are considered, and their training algorithms are described and analyzed. The applications of FNN to forecasting in macroeconomics and at stock markets are examined. The book presents the problem of portfolio optimization under uncertainty and a novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, as well as an application to portfolio optimization at Ukrainian, Russian and American stock exchanges. The book also presents the problem of forecasting corporate bankruptcy risk under incomplete and fuzzy information, as well as new methods based on fuzzy set theory and fuzzy neural networks and the results of their application to bankruptcy ris...

  7. Artificial Intelligence In Computational Fluid Dynamics

    Science.gov (United States)

    Vogel, Alison Andrews

    1991-01-01

    Paper compares four first-generation artificial-intelligence (AI) software systems for computational fluid dynamics. Includes: Expert Cooling Fan Design System (EXFAN), PAN AIR Knowledge System (PAKS), grid-adaptation program MITOSIS, and Expert Zonal Grid Generation (EZGrid). Focuses on knowledge-based ("expert") software systems. Analyzes intended tasks, kinds of knowledge possessed, magnitude of effort required to codify knowledge, how quickly constructed, performances, and return on investment. On basis of comparison, concludes AI most successful when applied to well-formulated problems solved by classifying or selecting preenumerated solutions. In contrast, application of AI to poorly understood or poorly formulated problems generally results in long development time and large investment of effort, with no guarantee of success.

  8. Applications of computational intelligence in biomedical technology

    CERN Document Server

    Majernik, Jaroslav; Pancerz, Krzysztof; Zaitseva, Elena

    2016-01-01

    This book presents the latest results and selected applications of Computational Intelligence in Biomedical Technologies. Most of the contributions deal with problems of Biomedical and Medical Informatics, ranging from theoretical considerations to practical applications. Various aspects of development methods and algorithms in Biomedical and Medical Informatics, as well as algorithms for medical image processing and modeling methods, are discussed. Individual contributions also cover medical decision making support, estimation of treatment risks, reliability of medical systems, problems of practical clinical applications and many other topics. This book is intended for scientists interested in problems of Biomedical Technologies, for researchers and academic staff, and for all dealing with Biomedical and Medical Informatics, as well as PhD students. Useful information is also offered to IT companies, developers of equipment and/or software for medicine, and medical professionals.

  9. 2010 IEEE World Congress on Computational Intelligence (IEEE WCCI 2010)

    CERN Document Server

    Solanas, Agusti; Martinez-Balleste, Antoni; Computational Intelligence for Privacy and Security

    2012-01-01

    The book is a collection of invited papers on Computational Intelligence for Privacy and Security. The majority of the chapters are extended versions of works presented at the special session on Computational Intelligence for Privacy and Security of the International Joint Conference on Neural Networks (IJCNN-2010) held July 2010 in Barcelona, Spain. The book is devoted to Computational Intelligence for Privacy and Security. It provides an overview of the most recent advances on the Computational Intelligence techniques being developed for Privacy and Security. The book will be of interest to researchers in industry and academics and to post-graduate students interested in the latest advances and developments in the field of Computational Intelligence for Privacy and Security.

  10. Computational intelligence in digital forensics forensic investigation and applications

    CERN Document Server

    Choo, Yun-Huoy; Abraham, Ajith; Srihari, Sargur

    2014-01-01

    Computational Intelligence techniques have been widely explored in various domains, including forensics. Forensic analysis encompasses the study of pattern analysis that answers questions of interest in security, medical, legal and genetic studies, among others. However, forensic analysis is usually performed through experiments in the lab, which is expensive in both cost and time. Therefore, this book seeks to explore the progress and advancement of computational intelligence techniques in different focus areas of forensic studies. It aims to build a stronger connection between computer scientists and forensic field experts. This book, Computational Intelligence in Digital Forensics: Forensic Investigation and Applications, is the first volume in the Intelligent Systems Reference Library series. The book presents original research results and innovative applications of computational intelligence in digital forensics. This edited volume contains seventeen chapters and presents the latest state-of-the-art advancement ...

  11. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Bhateja, Vikrant; Udgata, Siba; Pattnaik, Prasant

    2017-01-01

    The book is a collection of high-quality peer-reviewed research papers presented at International Conference on Frontiers of Intelligent Computing: Theory and applications (FICTA 2016) held at School of Computer Engineering, KIIT University, Bhubaneswar, India during 16 – 17 September 2016. The book presents theories, methodologies, new ideas, experiences and applications in all areas of intelligent computing and its applications to various engineering disciplines like computer science, electronics, electrical and mechanical engineering.

  12. Some Steps towards Intelligent Computer Tutoring Systems.

    Science.gov (United States)

    Tchogovadze, Gotcha G.

    1986-01-01

    Describes one way of structuring an intelligent tutoring system (ITS) in light of developments in artificial intelligence. A specialized intelligent operating system (SIOS) is proposed for software for a network of microcomputers, and it is postulated that a general learning system must be used as a basic framework for the SIOS. (Author/LRW)

  13. Computer-aided translation tools

    DEFF Research Database (Denmark)

    Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

    The paper reports on a questionnaire survey from 2013 of the uptake and use of computer-aided translation (CAT) tools by Danish translation service providers (TSPs) and discusses how these tools appear to have impacted on the Danish translation industry. According to our results, the uptake in Denmark is rather high in general, but limited in the case of machine translation (MT) tools: While most TSPs use translation-memory (TM) software, often in combination with a terminology management system (TMS), only very few have implemented MT, which is criticised for its low quality output, especially...

  14. Computational intelligence synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: Covers all the aspect
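
    One synergy the book covers, evolutionary computing applied to neural models, can be illustrated with a toy (1+1) evolution strategy that tunes the weights of a single sigmoid neuron. The OR-gate task, mutation strength, and iteration budget are chosen arbitrarily for the sketch:

```python
import math, random

random.seed(0)

DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]   # OR gate

def neuron(w, x):
    s = w[0] * x[0] + w[1] * x[1] + w[2]          # weighted sum plus bias
    return 1.0 / (1.0 + math.exp(-s))             # sigmoid activation

def loss(w):
    return sum((neuron(w, x) - y) ** 2 for x, y in DATA)

# (1+1) evolution strategy: mutate the weights, keep the child if it is better.
w = [0.0, 0.0, 0.0]
for _ in range(3000):
    child = [wi + random.gauss(0, 0.3) for wi in w]
    if loss(child) < loss(w):
        w = child

predictions = [round(neuron(w, x)) for x, _ in DATA]
```

    No gradients are used: selection pressure alone drives the weights toward a configuration that fits the data, which is the essence of an evolutionary neural system.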

  15. Computer Assisted Advising Tool (CAAT).

    Science.gov (United States)

    Matsen, Marie E.

    Lane Community College's Computer Assisted Advising Tool (CAAT) is used by counselors to assist students in developing a plan for the completion of a degree or certificate. CAAT was designed to facilitate student advisement from matriculation to graduation by comparing degree requirements with the courses completed by students. Three major sources…

  16. Intelligent agents in data-intensive computing

    CERN Document Server

    Correia, Luís; Molina, José

    2016-01-01

    This book presents new approaches that advance research in all aspects of agent-based models, technologies, simulations and implementations for data-intensive applications. The nine chapters contain a review of recent cross-disciplinary approaches in cloud environments and multi-agent systems, and important formulations of data-intensive problems in distributed computational environments, together with the presentation of new agent-based tools to handle those problems and Big Data in general. This volume can serve as a reference for students, researchers and industry practitioners working in or interested in joining interdisciplinary work in the areas of data-intensive computing and Big Data systems using emergent large-scale distributed computing paradigms. It will also allow newcomers to grasp key concepts and potential solutions on advanced topics of theory, models, technologies, system architectures and implementation of applications in multi-agent systems and data-intensive computing.

  17. A study of an intelligent FME system for SFCR tools

    Energy Technology Data Exchange (ETDEWEB)

    Hassan, H.A., E-mail: Hassan.hassan@opg.com [Ontario Power Generation, Toronto, Ontario (Canada)

    2008-07-01

    In the nuclear field, the accurate identification, tracking and history documentation of every nuclear tool, equipment or component is key to safety, operational and maintenance excellence, and security of the nuclear reactor. This paper offers a study of a possible development of the present Foreign Material Exclusion (FME) system using an Intelligent Nuclear Tools Identification System (INTIS) that was created and customized for the Single Fuel Channel Replacement (SFCR) tools. The conceptual design of the INTIS is presented, comparing the current and the proposed systems in terms of the time, the cost and the radiation doses received by employees during SFCR maintenance jobs. A model was created to help better understand and analyze the effects of deployment of the INTIS on time, performance, accuracy, received dose and, finally, total cost. The model may also be extended to solve other nuclear application problems. The INTIS is based on Radio Frequency Identification (RFID) smart tags which are networked with readers and service computers. The system software was designed to communicate with the network to provide the coordinate information for any component at any time. It also allows digital signatures for use and/or approval to use the components and automatically updates their Data Base Management System (DBMS) history in terms of the person performing the job and the time period and date of use. This feature, together with information on a part's life span, could be used in the planning process for predictive and preventive maintenance. As a case study, the model was applied to a pilot project for SFCR tools FME. The INTIS automatically records all the tools to be used inside the vault and performs real-time tracking of any misplaced tool. It also automatically performs a continuous check of all tools, sending an alarm if any tool is left inside the vault after the job is done. Finally, a discussion of the results of the
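
    The FME bookkeeping that the INTIS automates — logging every tagged tool that enters the vault, tracking check-outs, and alarming if anything remains inside when the job closes — can be sketched as a small state machine. The class, tag identifiers, and tool names below are fabricated; a real deployment would be driven by RFID reader events:

```python
class FMETracker:
    """Sketch of foreign-material-exclusion bookkeeping: every tagged
    tool entering the vault is logged, and the job may only be closed
    out once the vault is empty again."""

    def __init__(self):
        self.inside = {}                       # tag id -> tool name

    def tool_in(self, tag, name):
        self.inside[tag] = name                # reader event at vault entry

    def tool_out(self, tag):
        self.inside.pop(tag, None)             # reader event at vault exit

    def close_out(self):
        if self.inside:                        # alarm: tools left behind
            raise RuntimeError("FME alarm: " + ", ".join(sorted(self.inside.values())))
        return "vault clear"

vault = FMETracker()
vault.tool_in("tag-001", "torque wrench")
vault.tool_in("tag-002", "channel gauge")
vault.tool_out("tag-001")
```

    With the channel gauge still inside, `close_out()` raises the alarm; only after it is checked out does close-out report the vault clear.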

  18. A study of an intelligent FME system for SFCR tools

    International Nuclear Information System (INIS)

    Hassan, H.A.

    2008-01-01

    In the nuclear field, the accurate identification, tracking and history documentation of every nuclear tool, equipment or component is key to safety, operational and maintenance excellence, and security of the nuclear reactor. This paper offers a study of a possible development of the present Foreign Material Exclusion (FME) system using an Intelligent Nuclear Tools Identification System (INTIS) that was created and customized for the Single Fuel Channel Replacement (SFCR) tools. The conceptual design of the INTIS is presented, comparing the current and the proposed systems in terms of the time, the cost and the radiation doses received by employees during SFCR maintenance jobs. A model was created to help better understand and analyze the effects of deployment of the INTIS on time, performance, accuracy, received dose and, finally, total cost. The model may also be extended to solve other nuclear application problems. The INTIS is based on Radio Frequency Identification (RFID) smart tags which are networked with readers and service computers. The system software was designed to communicate with the network to provide the coordinate information for any component at any time. It also allows digital signatures for use and/or approval to use the components and automatically updates their Data Base Management System (DBMS) history in terms of the person performing the job and the time period and date of use. This feature, together with information on a part's life span, could be used in the planning process for predictive and preventive maintenance. As a case study, the model was applied to a pilot project for SFCR tools FME. The INTIS automatically records all the tools to be used inside the vault and performs real-time tracking of any misplaced tool. It also automatically performs a continuous check of all tools, sending an alarm if any tool is left inside the vault after the job is done. Finally, a discussion of the results of the system

  19. A survey of open source tools for business intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2005-01-01

    The industrial use of open source Business Intelligence (BI) tools is not yet common. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI. In the paper, we consider three Extract-Transform-Load (ETL) tools, three On-Line Analytical Processing (OLAP) servers, two OLAP clients, and four database management systems (DBMSs). Further, we describe the licenses that the products are released under. It is argued that the ETL tools are still not very...

  20. Air quality estimation by computational intelligence methodologies

    Directory of Open Access Journals (Sweden)

    Ćirić Ivan T.

    2012-01-01

    The subject of this study is to compare different computational intelligence methodologies based on artificial neural networks used for forecasting an air quality parameter - the emission of CO2 - in the city of Niš. Firstly, the inputs of the CO2 emission estimator are analyzed and their measurement is explained. It is known that traffic is the single largest emitter of CO2 in Europe. Therefore, a proper treatment of this component of pollution is very important for precise estimation of emission levels. With this in mind, measurements of traffic frequency and CO2 concentration were carried out at critical intersections in the city, as well as monitoring of vehicle directions at the crossroads. Finally, based on experimental data, different soft computing estimators were developed, such as a feed-forward neural network, a recurrent neural network, and a hybrid neuro-fuzzy estimator of CO2 emission levels. Test data for some characteristic cases, presented at the end of the paper, show good agreement of the developed estimators' outputs with experimental data. The presented results are a true indicator of the implemented method's usability. [Projects of the Ministry of Science of the Republic of Serbia, no. III42008-2/2011: Evaluation of Energy Performances, and no. TR35016/2011: Indoor Environment Quality of Educational Buildings in Serbia with Impact to Health and Research of MHD Flows around the Bodies, in the Tip Clearances and Channels and Application in the MHD Pumps Development]

  1. Life system modeling and intelligent computing. Pt. I. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen' s Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation

    2010-07-01

    This book is part I of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010 and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The 55 papers in this volume are organized in topical sections on intelligent modeling, monitoring, and control of complex nonlinear systems; autonomy-oriented computing and intelligent agents; advanced theory and methodology in fuzzy systems and soft computing; computational intelligence in utilization of clean and renewable energy resources; intelligent modeling, control and supervision for energy saving and pollution reduction; intelligent methods in developing vehicles, engines and equipments; computational methods and intelligence in modeling genetic and biochemical networks and regulation. (orig.)

  2. Computational Intelligence in Early Diabetes Diagnosis: A Review

    Science.gov (United States)

    Shankaracharya; Odedra, Devang; Samanta, Subir; Vidyarthi, Ambarish S.

    2010-01-01

    The development of an effective diabetes diagnosis system by taking advantage of computational intelligence is regarded as a primary goal nowadays. Many approaches based on artificial network and machine learning algorithms have been developed and tested against diabetes datasets, which were mostly related to individuals of Pima Indian origin. Yet, despite high accuracies of up to 99% in predicting the correct diabetes diagnosis, none of these approaches have reached clinical application so far. One reason for this failure may be that diabetologists or clinical investigators are sparsely informed about, or trained in the use of, computational diagnosis tools. Therefore, this article aims at sketching out an outline of the wide range of options, recent developments, and potentials in machine learning algorithms as diabetes diagnosis tools. One focus is on supervised and unsupervised methods, which have made significant impacts in the detection and diagnosis of diabetes at primary and advanced stages. Particular attention is paid to algorithms that show promise in improving diabetes diagnosis. A key advance has been the development of a more in-depth understanding and theoretical analysis of critical issues related to algorithmic construction and learning theory. These include trade-offs for maximizing generalization performance, use of physically realistic constraints, and incorporation of prior knowledge and uncertainty. The review presents and explains the most accurate algorithms, and discusses advantages and pitfalls of methodologies. This should provide a good resource for researchers from all backgrounds interested in computational intelligence-based diabetes diagnosis methods, and allows them to extend their knowledge into this kind of research. PMID:21713313

  3. Computational intelligence in early diabetes diagnosis: a review.

    Science.gov (United States)

    Shankaracharya; Odedra, Devang; Samanta, Subir; Vidyarthi, Ambarish S

    2010-01-01

    The development of an effective diabetes diagnosis system by taking advantage of computational intelligence is regarded as a primary goal nowadays. Many approaches based on artificial network and machine learning algorithms have been developed and tested against diabetes datasets, which were mostly related to individuals of Pima Indian origin. Yet, despite high accuracies of up to 99% in predicting the correct diabetes diagnosis, none of these approaches have reached clinical application so far. One reason for this failure may be that diabetologists or clinical investigators are sparsely informed about, or trained in the use of, computational diagnosis tools. Therefore, this article aims at sketching out an outline of the wide range of options, recent developments, and potentials in machine learning algorithms as diabetes diagnosis tools. One focus is on supervised and unsupervised methods, which have made significant impacts in the detection and diagnosis of diabetes at primary and advanced stages. Particular attention is paid to algorithms that show promise in improving diabetes diagnosis. A key advance has been the development of a more in-depth understanding and theoretical analysis of critical issues related to algorithmic construction and learning theory. These include trade-offs for maximizing generalization performance, use of physically realistic constraints, and incorporation of prior knowledge and uncertainty. The review presents and explains the most accurate algorithms, and discusses advantages and pitfalls of methodologies. This should provide a good resource for researchers from all backgrounds interested in computational intelligence-based diabetes diagnosis methods, and allows them to extend their knowledge into this kind of research.
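
    As a concrete illustration of the supervised methods surveyed, here is a minimal logistic-regression diagnoser trained by gradient descent. The single feature, toy data, and threshold are fabricated for the sketch; this is not the Pima Indian dataset:

```python
import math

# Toy supervised data: one feature (e.g. normalised plasma glucose),
# binary label (1 = diabetic). Entirely fabricated for illustration.
DATA = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Logistic regression fitted by plain stochastic gradient descent.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(5000):
    for x, y in DATA:
        err = sigmoid(w * x + b) - y      # gradient of the log loss
        w -= lr * err * x
        b -= lr * err

predict = lambda x: int(sigmoid(w * x + b) >= 0.5)
```

    The generalization trade-offs the review discusses concern exactly how such a fitted decision rule behaves on patients outside the training sample.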

  4. IT-tool Concept for Design and Intelligent Motion Control

    DEFF Research Database (Denmark)

    Conrad, Finn; Hansen, Poul Erik; Sørensen, Torben

    2000-01-01

    The paper presents results obtained from a Danish mechatronic research program focusing on intelligent motion control as well as results from the Esprit project SWING on IT-tools for rapid prototyping of fluid power components and systems. A mechatronic test facility with digital controllers for ....... Furthermore, a developed IT-tool concept for controller and system design utilising the ISO 10303 STEP Standard is proposed....

  5. Intelligent Computer Vision System for Automated Classification

    International Nuclear Information System (INIS)

    Jordanov, Ivan; Georgieva, Antoniya

    2010-01-01

    In this paper we investigate an Intelligent Computer Vision System applied for recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (features number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: combination of feature generation techniques; application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and use of suitable NN design and learning method.
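
    The feature-reduction-then-classification pipeline described above can be sketched as follows. Hedged: synthetic vectors replace the cork-tile image features, the PCA dimensionality and network size are illustrative assumptions, and the paper's GLPτS training method is not reproduced (standard backpropagation is used instead).

```python
# Sketch of a feature-reduction-then-NN classification pipeline.
# Assumptions: synthetic vectors replace the cork-tile image
# features; the PCA dimensionality and network size are illustrative,
# and the paper's GLP-tau-S training method is not reproduced here
# (standard backpropagation is used instead).
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=64,
                           n_informative=10, n_classes=4,
                           n_clusters_per_class=1, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

model = make_pipeline(
    PCA(n_components=10),                     # dimensionality reduction
    MLPClassifier(hidden_layer_sizes=(16,),   # small feed-forward NN
                  max_iter=2000, random_state=1))
model.fit(X_tr, y_tr)
score = model.score(X_te, y_te)
print(f"test success rate: {score:.2f}")
```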

  6. Foundational Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton [Univ. of Wisconsin, Madison, WI (United States)

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  7. Analysis of optoelectronic strategic planning in Taiwan by artificial intelligence portfolio tool

    Science.gov (United States)

    Chang, Rang-Seng

    1992-05-01

    Taiwan ROC has achieved significant advances in the optoelectronic industry, with some Taiwan products ranked high in the world market and technology. Six segments of the optoelectronic industry were planned, and each was divided into several strategic items. An artificial intelligence portfolio tool (AIPT) was designed to analyze optoelectronic strategic planning in Taiwan. The portfolio is designed to provoke strategic thinking intelligently; the computer-generated strategy should be selected and modified by the individual. Some strategies for the development of the Taiwan optoelectronic industry are also discussed in this paper.

  8. Intelligent decision support systems for sustainable computing paradigms and applications

    CERN Document Server

    Abraham, Ajith; Siarry, Patrick; Sheng, Michael

    2017-01-01

    This unique book discusses the latest research, innovative ideas, challenges and computational intelligence (CI) solutions in sustainable computing. It presents novel, in-depth fundamental research on achieving a sustainable lifestyle for society, either from a methodological or from an application perspective. Sustainable computing has expanded to become a significant research area covering the fields of computer science and engineering, electrical engineering and other engineering disciplines, and there has been an increase in the amount of literature on aspects of sustainable computing such as energy efficiency and natural resources conservation that emphasizes the role of ICT (information and communications technology) in achieving system design and operation objectives. The energy impact/design of more efficient IT infrastructures is a key challenge in realizing new computing paradigms. The book explores the uses of computational intelligence (CI) techniques for intelligent decision support that can be explo...

  9. Intelligent Tools for Building a Scientific Information Platform

    CERN Document Server

    Skonieczny, Lukasz; Rybiński, Henryk; Niezgodka, Marek

    2012-01-01

    This book is a selection of results obtained within one year of research performed under SYNAT - a nation-wide scientific project aiming to create an infrastructure for scientific content storage and sharing for academia, education and the open knowledge society in Poland. The selection refers to research in artificial intelligence, knowledge discovery and data mining, information retrieval and natural language processing, addressing the problems of implementing intelligent tools for building a scientific information platform. The idea of this book is based on the very successful SYNAT Project Conference and the SYNAT Workshop accompanying the 19th International Symposium on Methodologies for Intelligent Systems (ISMIS 2011). The papers included in this book present an overview and insight into such topics as architecture of scientific information platforms, semantic clustering, ontology-based systems, as well as multimedia data processing.

  10. Laser formed intentional firearm microstamping technology: counterinsurgency intelligence gathering tool

    Science.gov (United States)

    Lizotte, Todd E.; Ohar, Orest P.

    2009-09-01

    Warfare relies on effective, accurate and timely intelligence, an especially critical requirement when conducting a counterinsurgency operation [1]. Simply stated, counterinsurgency is an intelligence war. Both insurgents and counterinsurgents need effective intelligence capabilities to be successful. Insurgents and counterinsurgents therefore attempt to create and maintain intelligence networks and fight continuously to neutralize each other's intelligence capabilities [1][2]. In such an environment it is obviously an advantage to target, or proactively create, opportunities to track and map an insurgent movement. The goal is to quickly identify insurgency intelligence assets (infiltrators) within a host government's infrastructure. Infiltrators can occupy various areas of government such as security personnel, the national police force, government offices or military units. Intentional Firearm Microstamping offers such opportunities when implemented into firearms. Outfitted within firearms purchased and distributed to the host nation's security forces (civilian and military), Intentional Firearm Microstamping (IFM) marks bullet cartridge casings with codes as they are fired from the firearm. IFM is incorporated onto optimum surfaces within the firearm mechanism. The intentional microstamp tooling marks can take the form of alphanumeric codes or encoded geometric codes that identify the firearm. As the firearm is discharged, the intentional tooling marks transfer a code to the cartridge casing, which is ejected out of the firearm. When recovered at the scene of a firefight or engagement, the technology will provide forensic intelligence allowing the mapping and tracking of small-arms traffic patterns within the host nation, or identifying insurgency force strength and pinpointing firearm sources, such as corrupt/rogue military units or police forces.
    Intentional Firearm Microstamping is a passive mechanical trace technology that can be outfitted or retrofitted to semiautomatic handguns and

  11. International Conference on Computational Intelligence in Data Mining

    CERN Document Server

    Mohapatra, Durga

    2017-01-01

    The book presents high quality papers presented at the International Conference on Computational Intelligence in Data Mining (ICCIDM 2016), organized by the School of Computer Engineering, Kalinga Institute of Industrial Technology (KIIT), Bhubaneswar, Odisha, India during December 10 – 11, 2016. The book disseminates knowledge about innovative, active research directions in the field of data mining, machine and computational intelligence, along with current issues and applications of related topics. The volume aims to explicate and address the difficulties and challenges of the seamless integration of the two core disciplines of computer science.

  12. Advances in soft computing, intelligent robotics and control

    CERN Document Server

    Fullér, Robert

    2014-01-01

    Soft computing, intelligent robotics and control are at the core of contemporary engineering interest. Essential characteristics of soft computing methods are the ability to handle vague information, to apply human-like reasoning, their learning capability, and ease of application. Soft computing techniques are widely applied in the control of dynamic systems, including mobile robots. The present volume is a collection of 20 chapters written by respected experts in these fields, addressing various theoretical and practical aspects of soft computing, intelligent robotics and control. The first part of the book concerns issues of intelligent robotics, including robust fixed point transformation design, experimental verification of the input-output feedback linearization of a differentially driven mobile robot, and applying kinematic synthesis to micro electro-mechanical systems design. The second part of the book is devoted to fundamental aspects of soft computing. This includes practical aspects of fuzzy rule ...

  13. Computational intelligence for technology enhanced learning

    Energy Technology Data Exchange (ETDEWEB)

    Xhafa, Fatos [Polytechnic Univ. of Catalonia, Barcelona (Spain). Dept. of Languages and Informatics Systems; Caballe, Santi; Daradoumis, Thanasis [Open Univ. of Catalonia, Barcelona (Spain). Dept. of Computer Sciences Multimedia and Telecommunications; Abraham, Ajith [Machine Intelligence Research Labs (MIR Labs), Auburn, WA (United States). Scientific Network for Innovation and Research Excellence; Juan Perez, Angel Alejandro (eds.) [Open Univ. of Catalonia, Barcelona (Spain). Dept. of Information Sciences

    2010-07-01

    E-Learning has become one of the most widespread forms of distance teaching and learning. Technologies such as Web, Grid, and Mobile and Wireless networks are pushing teaching and learning communities to find new and intelligent ways of using these technologies to enhance teaching and learning activities. Indeed, these new technologies can play an important role in increasing support to teachers and learners and in shortening the time needed for learning and teaching; yet, it is necessary to use intelligent techniques to take advantage of these new technologies to achieve the desired support to teachers and learners and enhance learners' performance in distributed learning environments. The chapters of this volume bring advances in using intelligent techniques for technology enhanced learning, as well as the development of e-Learning applications based on such techniques and supported by technology. Such intelligent techniques include clustering and classification for personalization of learning, intelligent context-aware techniques, adaptive learning, data mining techniques and ontologies in e-Learning systems, among others. Academics, scientists, software developers, teachers and tutors and students interested in e-Learning will find this book useful for their academic, research and practice activity. (orig.)

  14. Computer Vision for Artificially Intelligent Robotic Systems

    Science.gov (United States)

    Ma, Chialo; Ma, Yung-Lung

    1987-04-01

    In this paper, an Acoustic Imaging Recognition System (AIRS) is introduced which is installed on an intelligent robotic system and can recognize different types of hand tools by dynamic pattern recognition. The dynamic pattern recognition is approached by a look-up table method in this case; the method saves considerable calculation time and is practicable. The AIRS consists of four parts: a position control unit, a pulse-echo signal processing unit, a pattern recognition unit and a main control unit. The position control of AIRS can rotate through an angle of ±5 degrees horizontally and vertically; the purpose of the rotation is to find the area of maximum reflection intensity. From the distance, angles and intensity of the target we can determine the characteristics of the target, and all of these decisions are processed by the main control unit. In the pulse-echo signal processing unit, we utilize the correlation method to overcome the limitation of short ultrasonic bursts, because the correlation system can transmit large time-bandwidth signals and obtain their resolution and increased intensity through pulse compression in the correlation receiver. The output of the correlator is sampled and transferred into digital data by the μ-law coding method, and this data, together with the delay time T and the angle information θH, θV, is sent to the main control unit for further analysis. For the recognition process in this paper, we use a dynamic look-up table method: first we set up several recognition pattern tables, and then the new pattern scanned by the transducer array is divided into several stages and compared with the sampled tables. The comparison is implemented by dynamic programming and a Markovian process. All the hardware control signals, such as the optimum delay time for the correlator receiver and the horizontal and vertical rotation angles for the transducer plate, are controlled by the Main Control Unit, the Main
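
    The pulse-compression step in the correlation receiver can be illustrated with a matched filter: cross-correlating the noisy received signal with the transmitted waveform concentrates the echo energy into a sharp peak at the echo delay. The waveform and delay parameters below are invented for illustration.

```python
# Pulse compression in a correlation receiver: a long chirp is
# transmitted, and cross-correlating the noisy echo with the
# transmitted waveform compresses it into a sharp peak at the
# echo delay.  All waveform parameters are illustrative.
import numpy as np

fs = 100_000                          # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)        # 10 ms transmit window
chirp = np.sin(2 * np.pi * (5_000 + 2e6 * t) * t)   # 5-45 kHz chirp

delay = 300                           # true echo delay, in samples
rx = np.zeros(3000)
rx[delay:delay + chirp.size] += 0.2 * chirp          # weak echo
rx += 0.05 * np.random.default_rng(0).normal(size=rx.size)  # noise

# Matched filter: correlate the received signal with the transmitted
# chirp; the large time-bandwidth product yields a narrow main lobe.
corr = np.correlate(rx, chirp, mode="valid")
est = int(np.argmax(np.abs(corr)))
print("estimated echo delay (samples):", est)
```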

  15. Tools for remote computing in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.; Frammery, V.; Wilcke, R.

    1990-01-01

    In modern accelerator control systems, the intelligence of the equipment is distributed in the geographical and the logical sense. Control processes for a large variety of tasks reside in both the equipment and the control computers. Hence successful operation hinges on the availability and reliability of the communication infrastructure. The computers are interconnected by a communication system and use remote procedure calls and message passing for information exchange. These communication mechanisms need a well-defined convention, i.e. a protocol. They also require flexibility in both the setup and changes to the protocol specification. The network compiler is a tool which provides the programmer with a means of establishing such a protocol for his application. Input to the network compiler is a single interface description file provided by the programmer. This file is written according to a grammar, and completely specifies the interprocess communication interfaces. Passed through the network compiler, the interface description file automatically produces the additional source code needed for the protocol. Hence the programmer does not have to be concerned about the details of the communication calls. Any further additions and modifications are made easy, because all the information about the interface is kept in a single file. (orig.)
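
    A toy sketch of the network-compiler idea: a single interface description file is parsed and communication stubs are generated from it, so the programmer never hand-writes the protocol calls. The description syntax, the service below, and the `_rpc_call` runtime helper are all invented for illustration; the real tool emits source code for the control system's own communication layer.

```python
# Toy version of the network-compiler idea: parse one interface
# description and generate client stubs, so application code never
# hand-writes communication calls.  The description syntax, the
# service below, and the _rpc_call helper are all invented for
# illustration.
interface_description = """
service magnet_control
  rpc set_current(float) -> bool
  rpc read_temperature() -> float
"""

def generate_stubs(text):
    """Emit one Python stub per 'rpc name(args) -> ret' line."""
    stubs = []
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("rpc "):
            name = line[len("rpc "):].split("(")[0]
            stubs.append(
                f"def {name}(*args):\n"
                f"    # marshal args, send the request, await the reply\n"
                f"    return _rpc_call({name!r}, args)\n")
    return "\n".join(stubs)

print(generate_stubs(interface_description))
```

    Because all interface information lives in the one description file, regenerating the stubs after a change keeps both sides of the protocol consistent.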

  16. 4th International Conference on Frontiers in Intelligent Computing : Theory and Applications

    CERN Document Server

    Pal, Tandra; Kar, Samarjit; Satapathy, Suresh; Mandal, Jyotsna

    2016-01-01

    The proceedings of the 4th International Conference on Frontiers in Intelligent Computing: Theory and Applications 2015 (FICTA 2015) serve as a knowledge centre not only for scientists and researchers in the field of intelligent computing but also for post-graduate students in various engineering disciplines. The book covers a comprehensive overview of the theory, methods, applications and tools of intelligent computing. Researchers are now working in interdisciplinary areas, and the proceedings of FICTA 2015 play a major role in accumulating those significant works in one arena. The chapters included in the proceedings cover both theoretical and practical aspects of different areas like Nature Inspired Algorithms, Fuzzy Systems, Data Mining, Signal Processing, Image Processing, Text Processing, Wireless Sensor Networks, Network Security and Cellular Automata.

  17. 4th International Joint Conference on Computational Intelligence

    CERN Document Server

    Correia, António; Rosa, Agostinho; Filipe, Joaquim

    2015-01-01

    The present book includes extended and revised versions of a set of selected papers from the Fourth International Joint Conference on Computational Intelligence (IJCCI 2012), held in Barcelona, Spain, from 5 to 7 October 2012. The conference was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and was organized in cooperation with the Association for the Advancement of Artificial Intelligence (AAAI). The conference brought together researchers, engineers and practitioners in computational technologies, especially those related to the areas of fuzzy computation, evolutionary computation and neural computation. It is composed of three co-located conferences, each one specialized in one of the aforementioned knowledge areas, namely: - International Conference on Evolutionary Computation Theory and Applications (ECTA) - International Conference on Fuzzy Computation Theory and Applications (FCTA) - International Conference on Neural Computation Theory a...

  18. Correlation between crystallographic computing and artificial intelligence research

    Energy Technology Data Exchange (ETDEWEB)

    Feigenbaum, E A [Stanford Univ., CA; Engelmore, R S; Johnson, C K

    1977-01-01

    Artificial intelligence research, as a part of computer science, has produced a variety of programs of experimental and applications interest: programs for scientific inference, chemical synthesis, planning, robot control, extraction of meaning from English sentences, speech understanding, interpretation of visual images, and so on. The symbolic manipulation techniques used in artificial intelligence provide a framework for analyzing and coding the knowledge base of a problem independently of an algorithmic implementation. A possible application of artificial intelligence methodology to protein crystallography is described. 2 figures, 2 tables.

  19. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence approaches constitute a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, four types of probabilistic convergence are given for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
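
    To make the two indices concrete, here is a canonical particle swarm optimization run with illustrative stand-ins: the paper's exact definitions are not reproduced, so the "variation rate" below is taken as the mean particle displacement per iteration and the "progress rate" as the per-iteration improvement of the best objective value.

```python
# Canonical particle swarm optimization on the 5-D sphere function,
# with illustrative stand-ins for the paper's two indices: here the
# "variation rate" is the mean particle displacement per iteration
# and the "progress rate" is the per-iteration improvement of the
# best objective value.  These are assumptions, not the paper's
# exact definitions.
import numpy as np

rng = np.random.default_rng(0)
f = lambda pts: np.sum(pts ** 2, axis=1)      # sphere objective

n, dim = 20, 5
x = rng.uniform(-5, 5, (n, dim))
v = np.zeros((n, dim))
pbest, pbest_f = x.copy(), f(x)
gbest = pbest[np.argmin(pbest_f)]

for it in range(50):
    prev_best = pbest_f.min()
    r1, r2 = rng.random((2, n, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x_new = x + v
    variation_rate = np.mean(np.linalg.norm(x_new - x, axis=1))
    x = x_new
    fx = f(x)
    better = fx < pbest_f
    pbest[better], pbest_f[better] = x[better], fx[better]
    gbest = pbest[np.argmin(pbest_f)]
    progress_rate = prev_best - pbest_f.min()  # >= 0: pbest never worsens

print(f"best value after 50 iterations: {pbest_f.min():.4f}")
```

    As the swarm converges, the variation rate decays toward zero while the progress rate stays non-negative, which is the qualitative behavior the convergence analysis formalizes.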

  20. 2nd International Conference on Computational Intelligence in Data Mining

    CERN Document Server

    Mohapatra, Durga

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented in the Second International Conference on Computational Intelligence in Data Mining (ICCIDM 2015) held at Bhubaneswar, Odisha, India during 5 – 6 December 2015. The two-volume Proceedings address the difficulties and challenges for the seamless integration of two core disciplines of computer science, i.e., computational intelligence and data mining. The book addresses different methods and techniques of integration for enhancing the overall goal of data mining. The book helps to disseminate the knowledge about some innovative, active research directions in the field of data mining, machine and computational intelligence, along with some current issues and applications of related topics.

  1. Computational intelligence for decision support in cyber-physical systems

    CERN Document Server

    Ali, A; Riaz, Zahid

    2014-01-01

    This book is dedicated to applied computational intelligence and soft computing techniques with special reference to decision support in Cyber Physical Systems (CPS), where the physical as well as the communication segment of the networked entities interact with each other. The joint dynamics of such systems result in a complex combination of computers, software, networks and physical processes all combined to establish a process flow at system level. This volume provides the audience with an in-depth vision about how to ensure dependability, safety, security and efficiency in real time by making use of computational intelligence in various CPS applications ranging from the nano-world to large scale wide area systems of systems. Key application areas include healthcare, transportation, energy, process control and robotics where intelligent decision support has key significance in establishing dynamic, ever-changing and high confidence future technologies. A recommended text for graduate students and researche...

  2. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Udgata, Siba; Biswal, Bhabendra

    2013-01-01

    The volume contains the papers presented at FICTA 2012: International Conference on Frontiers in Intelligent Computing: Theory and Applications, held on December 22-23, 2012 at Bhubaneswar Engineering College, Bhubaneswar, Odisha, India. It contains 86 papers contributed by authors from around the globe. These research papers mainly focus on the application of intelligent techniques, including evolutionary computation techniques such as genetic algorithms, particle swarm optimization and teaching-learning based optimization, to various engineering applications such as data mining, image processing, cloud computing and networking.

  3. Intelligent computational control of multi-fingered dexterous robotic hand

    OpenAIRE

    Chen, Disi; Li, Gongfa; Jiang, Guozhang; Fang, Yinfeng; Ju, Zhaojie; Liu, Honghai

    2015-01-01

    We discuss intelligent computational control theory and introduce the hardware structure of the HIT/DLR II dexterous robotic hand, a typical dexterous robotic hand. We show how DSP or FPGA controllers can be used in the dexterous robotic hand. A popular intelligent dexterous robotic hand control system, named electromyography (EMG) control, is investigated. We introduce some mathematical algorithms used in EMG control, such as the Gaussian mixture model (GMM) and artificial neural n...

  4. Artificial and Computational Intelligence for Games on Mobile Platforms

    OpenAIRE

    Congdon, Clare Bates; Hingston, Philip; Kendall, Graham

    2013-01-01

    In this chapter, we consider the possibilities of creating new and innovative games that are targeted for mobile devices, such as smart phones and tablets, and that showcase AI (Artificial Intelligence) and CI (Computational Intelligence) approaches. Such games might take advantage of the sensors and facilities that are not available on other platforms, or might simply rely on the "app culture" to facilitate getting the games into users' hands. While these games might be profitable in themsel...

  5. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. In total, the concept of managing the information available in this data repository is known as Business Intelligence or BI. This paper describes the concepts used in Business Intelligence, their importance to modern Radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.
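
    The data-warehouse idea can be sketched in miniature with an open-source stack: here SQLite (from the Python standard library) plays the warehouse, a single fact table stands in for data integrated from several imaging systems, and an aggregate query produces the composite workflow view. The schema and figures are invented for illustration, not the prototype's actual model.

```python
# BI prototype in miniature with an open-source stack: SQLite (from
# the Python standard library) plays the data warehouse, one fact
# table stands in for data integrated from several imaging systems,
# and an aggregate query builds the composite workflow view.  The
# schema and numbers are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE exam_fact (
                  modality TEXT, status TEXT, wait_minutes INTEGER)""")
db.executemany("INSERT INTO exam_fact VALUES (?, ?, ?)", [
    ("CT", "read", 35), ("CT", "pending", 60),
    ("MR", "read", 50), ("MR", "pending", 90),
    ("CT", "pending", 45),
])

# Composite view: pending exams and average wait, per modality.
rows = db.execute("""
    SELECT modality, COUNT(*) AS pending, AVG(wait_minutes) AS avg_wait
    FROM exam_fact
    WHERE status = 'pending'
    GROUP BY modality
    ORDER BY modality""").fetchall()
for row in rows:
    print(row)
```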

  6. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI”) is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring about many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  7. 5th International Conference on Computational Collective Intelligence

    CERN Document Server

    Trawinski, Bogdan; Nguyen, Ngoc

    2014-01-01

    The book consists of 19 extended and revised chapters based on original works presented during a poster session organized within the 5th International Conference on Computational Collective Intelligence that was held between 11 and 13 of September 2013 in Craiova, Romania. The book is divided into three parts. The first part is titled “Agents and Multi-Agent Systems” and consists of 8 chapters that concentrate on many problems related to agent and multi-agent systems, including: formal models, agent autonomy, emergent properties, agent programming, agent-based simulation and planning. The second part of the book is titled “Intelligent Computational Methods” and consists of 6 chapters. The authors present applications of various intelligent computational methods like neural networks, mathematical optimization and multistage decision processes in areas like cooperation, character recognition, wireless networks, transport, and metal structures. The third part of the book is titled “Language and Knowled...

  8. 9th International conference on distributed computing and artificial intelligence

    CERN Document Server

    Santana, Juan; González, Sara; Molina, Jose; Bernardos, Ana; Rodríguez, Juan; DCAI 2012; International Symposium on Distributed Computing and Artificial Intelligence 2012

    2012-01-01

    The International Symposium on Distributed Computing and Artificial Intelligence 2012 (DCAI 2012) is a stimulating and productive forum where the scientific community can work towards future cooperation in Distributed Computing and Artificial Intelligence areas. This conference is a forum in which applications of innovative techniques for solving complex problems will be presented. Artificial intelligence is changing our society. Its application in distributed environments, such as the internet, electronic commerce, environment monitoring, mobile communications, wireless devices, distributed computing, to mention only a few, is continuously increasing, becoming an element of high added value with social and economic potential, in industry, quality of life, and research. These technologies are changing constantly as a result of the large research and technical effort being undertaken in both universities and businesses. The exchange of ideas between scientists and technicians from both the academic and indus...

  9. 1st International Conference on Computational Intelligence in Data Mining

    CERN Document Server

    Behera, Himansu; Mandal, Jyotsna; Mohapatra, Durga

    2015-01-01

    The contributed volume aims to explicate and address the difficulties and challenges for the seamless integration of two core disciplines of computer science, i.e., computational intelligence and data mining. Data Mining aims at the automatic discovery of underlying non-trivial knowledge from datasets by applying intelligent analysis techniques. The interest in this research area has experienced a considerable growth in the last years due to two key factors: (a) knowledge hidden in organizations’ databases can be exploited to improve strategic and managerial decision-making; (b) the large volume of data managed by organizations makes it impossible to carry out a manual analysis. The book addresses different methods and techniques of integration for enhancing the overall goal of data mining. The book helps to disseminate the knowledge about some innovative, active research directions in the field of data mining, machine and computational intelligence, along with some current issues and applications of relate...

  10. Distributed computing and artificial intelligence : 10th International Conference

    CERN Document Server

    Neves, José; Rodriguez, Juan; Santana, Juan; Gonzalez, Sara

    2013-01-01

    The International Symposium on Distributed Computing and Artificial Intelligence 2013 (DCAI 2013) is a forum in which applications of innovative techniques for solving complex problems are presented. Artificial intelligence is changing our society. Its application in distributed environments, such as the internet, electronic commerce, environment monitoring, mobile communications, wireless devices, distributed computing, to mention only a few, is continuously increasing, becoming an element of high added value with social and economic potential, in industry, quality of life, and research. This conference is a stimulating and productive forum where the scientific community can work towards future cooperation in Distributed Computing and Artificial Intelligence areas. These technologies are changing constantly as a result of the large research and technical effort being undertaken in both universities and businesses. The exchange of ideas between scientists and technicians from both the academic and industry se...

  11. Artificial intelligence, expert systems, computer vision, and natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1984-01-01

    An overview of artificial intelligence (AI), its core ingredients, and its applications is presented. The knowledge representation, logic, problem solving approaches, languages, and computers pertaining to AI are examined, and the state of the art in AI is reviewed. The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined. Basic AI topics, including automation, search-oriented problem solving, knowledge representation, and computational logic, are discussed.

  12. SOME PARADIGMS OF ARTIFICIAL INTELLIGENCE IN FINANCIAL COMPUTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    Jerzy Balicki

    2015-12-01

    Full Text Available The article discusses some paradigms of artificial intelligence in the context of their applications in computer financial systems. The proposed approach has a significant potential to increase the competitiveness of enterprises, including financial institutions. However, it requires the effective use of supercomputers, grids and cloud computing. A reference is made to the computing environment for Bitcoin. In addition, we characterize genetic programming and artificial neural networks to prepare investment strategies on the stock exchange market.
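    As a toy illustration of the genetic-algorithm side of such investment strategies (not the article's actual method), the sketch below evolves a single real-valued parameter of a naive trading rule on synthetic prices. The price series, the rule, and the fitness function are all invented for illustration.

```python
import random

random.seed(42)

# Synthetic daily closing prices with a slight upward drift (hypothetical data).
prices = [100 + 10 * random.random() + 0.1 * t for t in range(250)]

def fitness(threshold):
    """In-sample return of a naive rule: stay in the market on any day whose
    price change exceeds -threshold (threshold 0 keeps only up-moves)."""
    ret = 0.0
    for prev, cur in zip(prices, prices[1:]):
        if cur - prev > -threshold:
            ret += cur - prev
    return ret

# Minimal genetic algorithm over a single real-valued gene (the threshold).
pop = [random.uniform(0.0, 5.0) for _ in range(20)]
for generation in range(50):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                        # truncation selection
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        child = (a + b) / 2                   # arithmetic crossover
        child += random.gauss(0.0, 0.2)       # Gaussian mutation
        children.append(max(0.0, child))      # keep the gene non-negative
    pop = parents + children

best = max(pop, key=fitness)
print(f"best threshold: {best:.3f}, in-sample return: {fitness(best):.2f}")
```

    Real genetic programming, as discussed in the article, evolves whole program trees rather than a single parameter, but the selection/crossover/mutation loop has the same shape.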

  13. SOME PARADIGMS OF ARTIFICIAL INTELLIGENCE IN FINANCIAL COMPUTER SYSTEMS

    OpenAIRE

    Jerzy Balicki

    2015-01-01

    The article discusses some paradigms of artificial intelligence in the context of their applications in computer financial systems. The proposed approach has a significant potential to increase the competitiveness of enterprises, including financial institutions. However, it requires the effective use of supercomputers, grids and cloud computing. A reference is made to the computing environment for Bitcoin. In addition, we characterize genetic programming and artificial neural networks to p...

  14. Computer-aided diagnosis and artificial intelligence in clinical imaging.

    Science.gov (United States)

    Shiraishi, Junji; Li, Qiang; Appelbaum, Daniel; Doi, Kunio

    2011-11-01

    Computer-aided diagnosis (CAD) is rapidly entering the radiology mainstream. It has already become a part of the routine clinical work for the detection of breast cancer with mammograms. The computer output is used as a "second opinion" in assisting radiologists' image interpretations. The computer algorithm generally consists of several steps that may include image processing, image feature analysis, and data classification via the use of tools such as artificial neural networks (ANN). In this article, we will explore these and other current processes that have come to be referred to as "artificial intelligence." One element of CAD, temporal subtraction, has been applied for enhancing interval changes and for suppressing unchanged structures (eg, normal structures) between 2 successive radiologic images. To reduce misregistration artifacts on the temporal subtraction images, a nonlinear image warping technique for matching the previous image to the current one has been developed. Development of the temporal subtraction method originated with chest radiographs, with the method subsequently being applied to chest computed tomography (CT) and nuclear medicine bone scans. The usefulness of the temporal subtraction method for bone scans was demonstrated by an observer study in which reading times and diagnostic accuracy improved significantly. An additional prospective clinical study verified that the temporal subtraction image could be used as a "second opinion" by radiologists with negligible detrimental effects. ANN was first used in 1990 for computerized differential diagnosis of interstitial lung diseases in CAD. Since then, ANN has been widely used in CAD schemes for the detection and diagnosis of various diseases in different imaging modalities, including the differential diagnosis of lung nodules and interstitial lung diseases in chest radiography, CT, and positron emission tomography/CT. It is likely that CAD will be integrated into picture archiving and

  15. Distributed Computing and Artificial Intelligence, 12th International Conference

    CERN Document Server

    Malluhi, Qutaibah; Gonzalez, Sara; Bocewicz, Grzegorz; Bucciarelli, Edgardo; Giulioni, Gianfranco; Iqba, Farkhund

    2015-01-01

    The 12th International Symposium on Distributed Computing and Artificial Intelligence 2015 (DCAI 2015) is a forum to present applications of innovative techniques for studying and solving complex problems. The exchange of ideas between scientists and technicians from both the academic and industrial sector is essential to facilitate the development of systems that can meet the ever-increasing demands of today’s society. The present edition brings together past experience, current work and promising future trends associated with distributed computing, artificial intelligence and their application in order to provide efficient solutions to real problems. This symposium is organized by the Osaka Institute of Technology, Qatar University and the University of Salamanca.

  16. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Full Text Available Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm to run interconnected multiple nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications, in which a large number of computing nodes are running. We reveal that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
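    The abstract does not give the protocol itself, but distributed snapshot protocols of this kind build on the classic Chandy-Lamport algorithm, which a small single-process simulation can illustrate: the initiator records its state and floods marker messages, each node records its state on the first marker, and messages arriving on still-unrecorded channels are logged as in-flight channel state. The node names, scenario, and message handling below are illustrative, not the paper's protocol.

```python
from collections import defaultdict, deque

MARKER = object()  # distinguished marker message

class Node:
    def __init__(self, name):
        self.name = name
        self.state = 0                     # application state (e.g. iteration count)
        self.inbox = deque()
        self.recorded_state = None         # local snapshot, once taken
        self.recording = set()             # incoming channels still being recorded
        self.channel_state = defaultdict(list)

network = {}

def send(src, dst, msg):
    network[dst].inbox.append((src, msg))

def start_snapshot(node):
    node.recorded_state = node.state
    node.recording = {p for p in network if p != node.name}
    for p in node.recording:
        send(node.name, p, MARKER)

def deliver(node):
    """Process one queued message on `node`; returns False if the inbox is empty."""
    if not node.inbox:
        return False
    src, msg = node.inbox.popleft()
    if msg is MARKER:
        if node.recorded_state is None:    # first marker: record state, relay markers
            node.recorded_state = node.state
            node.recording = {p for p in network if p not in (node.name, src)}
            for p in network:
                if p != node.name:
                    send(node.name, p, MARKER)
        else:                              # channel src -> node is now fully recorded
            node.recording.discard(src)
    else:
        node.state += msg                  # ordinary application message
        if node.recorded_state is not None and src in node.recording:
            node.channel_state[src].append(msg)  # in-flight message joins the snapshot
    return True

# Tiny scenario: three workers exchanging partial results.
for name in ("A", "B", "C"):
    network[name] = Node(name)

send("A", "B", 5)             # messages in flight before the snapshot starts
send("B", "C", 7)
start_snapshot(network["A"])  # A initiates the global snapshot
send("C", "A", 9)             # C sends before it has seen a marker

# Drain all queues until the system is quiescent.
while any(deliver(n) for n in network.values()):
    pass

snapshot = {n.name: (n.recorded_state, dict(n.channel_state)) for n in network.values()}
print(snapshot)
```

    The message from C to A is captured as channel state because it arrives at A after A recorded its own state but before C's marker, which is exactly the consistency property a global snapshot must preserve.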

  17. A Survey of Open Source Tools for Business Intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    The industrial use of open source Business Intelligence (BI) tools is becoming more common, but is still not as widespread as for other types of software. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI. In the paper, we consider a number of Extract-Transform-Load (ETL) tools, database management systems (DBMSs), On-Line Analytical Processing (OLAP) servers, and OLAP clients. We find that, unlike the situation a few years ago, there now...

  18. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  19. Methodology, Algorithms, and Emerging Tool for Automated Design of Intelligent Integrated Multi-Sensor Systems

    Directory of Open Access Journals (Sweden)

    Andreas König

    2009-11-01

    Full Text Available The emergence of novel sensing elements, computing nodes, wireless communication and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence. Currently, a significant part of this overall algorithmic chain of the computational system model still has to be assembled manually by experienced designers in a time- and labor-consuming process. In this research work, this challenge is picked up, and a methodology and algorithms for the automated design of intelligent, integrated and resource-aware multi-sensor systems employing multi-objective evolutionary computation are introduced. The proposed methodology tackles the challenge of rapid prototyping of such systems under realization constraints and, additionally, includes features of system-instance-specific self-correction for sustained operation at large volume and in a dynamically changing environment. The extension of these concepts to the reconfigurable hardware platform yields so-called self-x sensor systems, which stands, e.g., for self-monitoring, -calibrating, -trimming, and -repairing/-healing systems. Selected experimental results prove the applicability and effectiveness of our proposed methodology and emerging tool. By our approach, competitive results were achieved with regard to classification accuracy, flexibility, and design speed under additional design constraints.

  20. ECO-INTELLIGENT TOOLS – A NECESSITY FOR SUSTAINABLE BUSINESSES

    Directory of Open Access Journals (Sweden)

    S. Nate

    2014-04-01

    Full Text Available Many of the challenges associated with sustainable development can be traced to the way modern society produces and consumes. Production, distribution and supply of goods and services require material and energy consumption, affecting natural resources both quantitatively and qualitatively, generating waste and pollution, and disrupting ecosystems. Eco-business intelligence is the capacity of people, processes and applications/tools to organize business information, to facilitate consistent access to it, and to analyse it in order to improve management decisions and the performance management of organizations, which are increasingly pressed to synchronize their processes and services with a sustainable development agenda through the development, testing and implementation of decision support software. By adopting sustainable practices, eco-intelligent companies can gain added value, increase market share and boost shareholder value. Moreover, the growing demand for "green" products has created new markets, and visionary entrepreneurs already reap the rewards of approaching sustainability. Large and small companies are learning that sustainable business practices not only help the environment but can also improve profitability through higher efficiency, fewer harmful side-effects, better relationships with the community, and more. Gaining competitive advantage is a core concern of companies, and the existence of systems for identifying, extracting and analysing the data available within a company, as well as from the external environment, to provide real support for business decisions is an essential ingredient of success. This paper highlights the necessity of eco-intelligent tools that help determine the organization's strategies, identify the perceptions and capabilities of competitors, analyze the effectiveness of current operations, deploy long-term prospects for environmental action and establish

  1. Information security system quality assessment through the intelligent tools

    Science.gov (United States)

    Trapeznikov, E. V.

    2018-04-01

    The development of technology has shown the necessity of a comprehensive analysis of information security in automated systems. An analysis of the subject area indicates the relevance of the study. The research objective is to develop a methodology for assessing the quality of an information security system based on intelligent tools. The basis of the methodology is a model that assesses the information security of an information system by means of a neural network. The paper presents the security assessment model and its algorithm. The results of the practical implementation of the methodology are represented in the form of a software flow diagram. The conclusions note the practical significance of the model being developed.
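    As a minimal sketch of the kind of neural-network assessment model described (not the paper's actual model), the code below trains a single logistic unit by gradient descent on synthetic "security indicator" vectors. The four indicators, the labelling rule, and the data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical indicators per system: [patch level, password policy,
# audit coverage, exposed ports (normalised)]; label 1 = "adequate security".
X = rng.random((200, 4))
y = ((X[:, 0] + X[:, 1] + X[:, 2] - X[:, 3]) > 1.5).astype(float)

# Single-layer network (one logistic unit) trained by full-batch gradient descent.
w = np.zeros(4)
b = 0.0
lr = 0.5
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
    grad_w = X.T @ (p - y) / len(y)          # cross-entropy gradient
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean(((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```

    A practical assessment system would use a multi-layer network and real audit data, but the train-then-score loop is the same: indicators in, a security quality estimate out.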

  2. Life system modeling and intelligent computing. Pt. II. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Queen's Univ. Belfast (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation

    2010-07-01

    This book is part II of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010, and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The papers in this volume are organized in topical sections on advanced evolutionary computing theory and algorithms; advanced neural network and fuzzy system theory and algorithms; modeling and simulation of societies and collective behavior; biomedical signal processing, imaging, and visualization; intelligent computing and control in distributed power generation systems; intelligent methods in power and energy infrastructure development; intelligent modeling, monitoring, and control of complex nonlinear systems. (orig.)

  3. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements in big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and applications to handling real-life problems. The applications are mostly drawn from real-life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques, as well as some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  4. Cloud Computing Boosts Business Intelligence of Telecommunication Industry

    Science.gov (United States)

    Xu, Meng; Gao, Dan; Deng, Chao; Luo, Zhiguo; Sun, Shaoling

    Business Intelligence has become an attractive topic in today's data-intensive applications, especially in the telecommunication industry. Meanwhile, Cloud Computing, which provides IT supporting infrastructure with excellent scalability, large-scale storage, and high performance, has become an effective way to implement parallel data processing and data mining algorithms. BC-PDM (Big Cloud based Parallel Data Miner) is a new MapReduce-based parallel data mining platform developed by CMRI (China Mobile Research Institute) to meet the urgent requirements of business intelligence in the telecommunication industry. In this paper, the architecture, functionality and performance of BC-PDM are presented, together with the experimental evaluation and case studies of its applications. The evaluation results demonstrate both the usability and the cost-effectiveness of a Cloud Computing based Business Intelligence system in applications of the telecommunication industry.

  5. Advanced intelligent computational technologies and decision support systems

    CERN Document Server

    Kountchev, Roumen

    2014-01-01

    This book offers a state-of-the-art collection covering themes related to Advanced Intelligent Computational Technologies and Decision Support Systems, which can be applied to fields like healthcare, assisting humans in solving problems. The book brings forward a wealth of ideas, algorithms and case studies on themes such as: intelligent predictive diagnosis; intelligent analysis of medical images; a new format for coding single medical images and image sequences; Medical Decision Support Systems; diagnosis of Down’s syndrome; computational perspectives for electronic fetal monitoring; efficient compression of CT images; adaptive interpolation and halftoning for medical images; applications of artificial neural networks to solving real-life problems; the present state of and perspectives for Electronic Healthcare Record Systems; adaptive approaches to noise reduction in sequences of CT images, etc.

  6. Advances in neural networks computational intelligence for ICT

    CERN Document Server

    Esposito, Anna; Morabito, Francesco; Pasero, Eros

    2016-01-01

    This carefully edited book puts emphasis on computational and artificial intelligent methods for learning and their applications in robotics, embedded systems, and ICT interfaces for psychological and neurological diseases. The book is a follow-up to the scientific workshop on Neural Networks (WIRN 2015) held in Vietri sul Mare, Italy, from the 20th to the 22nd of May 2015. The workshop, at its 27th edition, has become a traditional scientific event bringing together scientists from many countries and several scientific disciplines. Each chapter is an extended version of the original contribution presented at the workshop; together with the reviewers’ peer revisions, it also benefits from the live discussion during the presentation. The content of the book is organized in the following sections: 1. Introduction, 2. Machine Learning, 3. Artificial Neural Networks: Algorithms and models, 4. Intelligent Cyberphysical and Embedded System, 5. Computational Intelligence Methods for Biomedical ICT in...

  7. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

    Science.gov (United States)

    Moore, Gwendolyn B.; And Others

    The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…

  8. An Intelligent Computer Assisted Language Learning System for Arabic Learners

    Science.gov (United States)

    Shaalan, Khaled F.

    2005-01-01

    This paper describes the development of an intelligent computer-assisted language learning (ICALL) system for learning Arabic. This system could be used for learning Arabic by students at primary schools or by learners of Arabic as a second or foreign language. It explores the use of Natural Language Processing (NLP) techniques for learning…

  9. Computational Intelligence in a Human Brain Model

    Directory of Open Access Journals (Sweden)

    Viorel Gaftea

    2016-06-01

    Full Text Available This paper focuses on current trends in the brain research domain and the current stage of development of research into software and hardware solutions, communication capabilities between human beings and machines, new technologies, nano-science and Internet of Things (IoT) devices. The proposed model of the Human Brain assumes a fundamental similarity between human intelligence and the thinking process of the chess game. Tactical and strategic reasoning and the need to follow the rules of the chess game are all very similar to the activities of the human brain. The main objectives of a living being and of a chess player are the same: securing a position, surviving and eliminating the adversaries. The brain pursues these goals; moreover, movement, actions and speech are sustained by the five vital senses and equilibrium. The chess game strategy helps us understand the human brain better and replicate it more easily in the proposed ‘Software and Hardware’ SAH Model.

  10. Seismic activity prediction using computational intelligence techniques in northern Pakistan

    Science.gov (United States)

    Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat

    2017-10-01

    An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology includes an interdisciplinary interaction of seismology and computational intelligence. Eight seismic parameters are computed based upon the past earthquakes. The predictive ability of these eight seismic parameters is evaluated in terms of information gain, which leads to the selection of six parameters to be used in prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters. These models include a feed-forward neural network, recurrent neural network, random forest, multilayer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions along with an accuracy of 75% and a positive predictive value of 78% in the context of northern Pakistan.
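    The information-gain step used above for parameter selection can be illustrated directly: a parameter's gain is the reduction in label entropy obtained by splitting the data on that parameter. The discretised "seismic parameters" and labels below are invented toy data, not the study's.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """Reduction in label entropy after splitting on a discretised feature."""
    n = len(labels)
    remainder = 0.0
    for value in set(feature_values):
        subset = [lab for f, lab in zip(feature_values, labels) if f == value]
        remainder += len(subset) / n * entropy(subset)
    return entropy(labels) - remainder

# Hypothetical discretised parameters for 8 time windows, labelled 1 if a
# large earthquake followed, 0 otherwise (toy values for illustration).
b_value        = ["low", "low", "high", "high", "low", "high", "low", "high"]
energy_release = ["up", "up", "up", "up", "down", "down", "down", "down"]
labels         = [1, 1, 0, 0, 1, 0, 1, 0]

# b_value predicts the label perfectly; energy_release is uninformative.
print(information_gain(b_value, labels))         # → 1.0
print(information_gain(energy_release, labels))  # → 0.0
```

    Ranking all candidate parameters by this score and keeping the top ones is the selection scheme the abstract describes (eight parameters in, six retained).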

  11. Third International Joint Conference on Computational Intelligence (IJCCI 2011)

    CERN Document Server

    Dourado, António; Rosa, Agostinho; Filipe, Joaquim; Computational Intelligence

    2013-01-01

    The present book includes a set of selected extended papers from the third International Joint Conference on Computational Intelligence (IJCCI 2011), held in Paris, France, from 24 to 26 October 2011. The conference was composed of three co-located conferences:  The International Conference on Fuzzy Computation (ICFC), the International Conference on Evolutionary Computation (ICEC), and the International Conference on Neural Computation (ICNC). Recent progresses in scientific developments and applications in these three areas are reported in this book. IJCCI received 283 submissions, from 59 countries, in all continents. This book includes the revised and extended versions of a strict selection of the best papers presented at the conference.

  12. 7th International Joint Conference on Computational Intelligence

    CERN Document Server

    Rosa, Agostinho; Cadenas, José; Correia, António; Madani, Kurosh; Ruano, António; Filipe, Joaquim

    2017-01-01

    This book includes a selection of revised and extended versions of the best papers from the seventh International Joint Conference on Computational Intelligence (IJCCI 2015), held in Lisbon, Portugal, from 12 to 14 November 2015, which was composed of three co-located conferences: The International Conference on Evolutionary Computation Theory and Applications (ECTA), the International Conference on Fuzzy Computation Theory and Applications (FCTA), and the International Conference on Neural Computation Theory and Applications (NCTA). The book presents recent advances in scientific developments and applications in these three areas, reflecting the IJCCI’s commitment to high quality standards.

  13. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques.

  14. International conference on Advances in Intelligent Control and Innovative Computing

    CERN Document Server

    Castillo, Oscar; Huang, Xu; Intelligent Control and Innovative Computing

    2012-01-01

    In the lightning-fast world of intelligent control and cutting-edge computing, it is vitally important to stay abreast of developments that seem to follow each other without pause. This publication features the very latest and some of the very best current research in the field, with 32 revised and extended research articles written by prominent researchers in the field. Culled from contributions to the key 2011 conference Advances in Intelligent Control and Innovative Computing, held in Hong Kong, the articles deal with a wealth of relevant topics, from the most recent work in artificial intelligence and decision-supporting systems, to automated planning, modelling and simulation, signal processing, and industrial applications. Not only does this work communicate the current state of the art in intelligent control and innovative computing, it is also an illuminating guide to up-to-date topics for researchers and graduate students in the field. The quality of the contents is absolutely assured by the high pro...

  15. Visualization Tools for Teaching Computer Security

    Science.gov (United States)

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  16. Computational neuroscience for advancing artificial intelligence

    Directory of Open Access Journals (Sweden)

    Fernando P. Ponce

    2011-07-01

    Full Text Available Summary of the book by Alonso, E. and Mondragón, E. (2011). Hershey, NY: Medical Information Science Reference. Neuroscience as a discipline pursues an understanding of the brain and its relation to the functioning of the mind through analysis of the interaction of diverse physical, chemical and biological processes (Bassett & Gazzaniga, 2011). Moreover, numerous disciplines have progressively made significant contributions to this endeavor, among them mathematics, psychology and philosophy. As a product of this effort, complementary disciplines such as cognitive neuroscience, neuropsychology and computational neuroscience have emerged alongside traditional neuroscience (Bengio, 2007; Dayan & Abbott, 2005). In the context of computational neuroscience as a discipline complementary to traditional neuroscience, Alonso and Mondragón (2011) edit the book Computational Neuroscience for Advancing Artificial Intelligence: Models, Methods and Applications.

  17. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  18. Intelligent Buildings and pervasive computing - research perspectives and discussions

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Krogh, Peter Gall; Kyng, Morten

    2001-01-01

    Intelligent Buildings have been the subject of research and commercial interest for more than two decades. The different perspectives range from monitoring and controlling energy consumption, over interactive rooms supporting work in offices and leisure in the home, to buildings providing information to by-passers in plazas and urban environments. This paper puts forward the hypothesis that the coming decade will witness a dramatic increase in both quality and quantity of intelligent buildings due to the emerging field of pervasive computing: the next generation of computing environments where computers are everywhere, for everyone, at all times, and where IT becomes a still more integrated part of our environments, with processors, sensors, and actuators connected via high-speed networks and combined with new visualization devices ranging from projections directly in the eye to large panorama...

  19. Secure data exchange between intelligent devices and computing centers

    Science.gov (United States)

    Naqvi, Syed; Riguidel, Michel

    2005-03-01

    The advent of reliable spontaneous networking technologies (commonly known as wireless ad-hoc networks) has raised the stakes for the conception of computing-intensive environments using intelligent devices as their interface with the external world. These smart devices are used as data gateways for the computing units. They are employed in highly volatile environments where the secure exchange of data between the devices and their computing centers is of paramount importance. Moreover, their mission-critical applications require dependable measures against attacks such as denial of service (DoS), eavesdropping, and masquerading. In this paper, we propose a mechanism to assure reliable data exchange between an intelligent environment composed of smart devices and distributed computing units collectively called a 'computational grid'. The notion of infosphere is used to define a digital space made up of a persistent and a volatile asset in an often indefinite geographical space. We study different infospheres and present general evolutions and issues in the security of such technology-rich and intelligent environments. It is beyond any doubt that these environments will likely face a proliferation of users, applications, networked devices, and their interactions on a scale never experienced before. It would be better to build in the ability to deal with these systems uniformly. As a solution, we propose a concept of virtualization of security services. We try to solve the difficult problems of implementation and maintenance of trust on the one hand, and those of security management in heterogeneous infrastructure on the other hand.

  20. [INVITED] Computational intelligence for smart laser materials processing

    Science.gov (United States)

    Casalino, Giuseppe

    2018-03-01

Computational intelligence (CI) involves using computer algorithms to capture hidden knowledge from data and to use it for training an 'intelligent machine' to make complex decisions without human intervention. As simulation becomes more prevalent from design and planning to manufacturing and operations, laser materials processing can also benefit from computer-generated knowledge through soft computing. This work is a review of the state of the art in the methodology and applications of CI in laser materials processing (LMP), which is nowadays receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on the methods that have proven effective and robust in solving several problems in welding, cutting, drilling, surface treatment and additive manufacturing using the laser beam. After a basic description of the computational intelligence techniques most commonly employed in manufacturing, four sections, namely laser joining, machining, surface treatment, and additive manufacturing, cover the most recent applications in the already extensive literature on CI in LMP. Finally, emerging trends and future challenges are identified and discussed.

  1. Chips challenging champions games, computers and artificial intelligence

    CERN Document Server

    Schaeffer, J

    2002-01-01

    One of the earliest dreams of the fledgling field of artificial intelligence (AI) was to build computer programs that could play games as well as or better than the best human players. Despite early optimism in the field, the challenge proved to be surprisingly difficult. However, the 1990s saw amazing progress. Computers are now better than humans in checkers, Othello and Scrabble; are at least as good as the best humans in backgammon and chess; and are rapidly improving at hex, go, poker, and shogi. This book documents the progress made in computers playing games and puzzles. The book is the

  2. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  3. CATO: a CAD tool for intelligent design of optical networks and interconnects

    Science.gov (United States)

    Chlamtac, Imrich; Ciesielski, Maciej; Fumagalli, Andrea F.; Ruszczyk, Chester; Wedzinga, Gosse

    1997-10-01

Increasing communication speed requirements have created a great interest in very high speed optical and all-optical networks and interconnects. The design of these optical systems is a highly complex task, requiring the simultaneous optimization of various parts of the system, ranging from optical components' characteristics to access protocol techniques. Currently there are no computer aided design (CAD) tools on the market to support the interrelated design of all parts of optical communication systems, thus the designer has to rely on costly and time-consuming testbed evaluations. The objective of the CATO (CAD tool for optical networks and interconnects) project is to develop a prototype of an intelligent CAD tool for the specification, design, simulation and optimization of optical communication networks. CATO allows the user to build an abstract, possibly incomplete, model of the system, and determine its expected performance. Based on design constraints provided by the user, CATO will automatically complete an optimum design, using mathematical programming techniques, intelligent search methods and artificial intelligence (AI). Initial design and testing of a CATO prototype (CATO-1) has been completed recently. The objective was to prove the feasibility of combining AI techniques, simulation techniques, an optical device library and a graphical user interface into a flexible CAD tool for obtaining optimal communication network designs in terms of system cost and performance. CATO-1 is an experimental tool for designing packet-switching wavelength division multiplexing all-optical communication systems using a LAN/MAN ring topology as the underlying network. The two specific AI algorithms incorporated are simulated annealing and a genetic algorithm. CATO-1 finds the optimal number of transceivers for each network node, using an objective function that includes the cost of the devices and the overall system performance.
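As a sketch of how simulated annealing can search for per-node transceiver counts of the kind described above, the following minimal Python example minimizes a hypothetical objective combining device cost with a quadratic penalty for unmet per-node traffic demand. The demand values, cost weights, and cooling schedule are illustrative assumptions, not CATO-1's actual objective function.

```python
import math
import random

def objective(transceivers, device_cost=1.0, demand=(3, 1, 2, 4)):
    """Hypothetical design objective: total device cost plus a heavy
    quadratic penalty for any node whose traffic demand is unmet."""
    cost = device_cost * sum(transceivers)
    penalty = sum(max(0, d - t) ** 2 for t, d in zip(transceivers, demand))
    return cost + 10.0 * penalty

def simulated_annealing(n_nodes=4, max_tx=8, steps=5000, seed=42):
    rng = random.Random(seed)
    state = [rng.randint(1, max_tx) for _ in range(n_nodes)]
    best, best_val = state[:], objective(state)
    temp = 10.0
    for _ in range(steps):
        # Propose a neighbour: change one node's transceiver count by +/- 1.
        cand = state[:]
        i = rng.randrange(n_nodes)
        cand[i] = min(max_tx, max(1, cand[i] + rng.choice((-1, 1))))
        delta = objective(cand) - objective(state)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            state = cand
            if objective(state) < best_val:
                best, best_val = state[:], objective(state)
        temp *= 0.999  # geometric cooling schedule
    return best, best_val

best_plan, best_cost = simulated_annealing()
```

The temperature lets the search escape local minima early on; as it cools, the walk settles into a low-cost configuration.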

  4. An intelligent multi-media human-computer dialogue system

    Science.gov (United States)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  5. Customer Data Analysis Model using Business Intelligence Tools in Telecommunication Companies

    Directory of Open Access Journals (Sweden)

    Monica LIA

    2015-10-01

Full Text Available This article presents a customer data analysis model in a telecommunication company and business intelligence tools for data modelling, transformation, data visualization and dynamic report building. In a mature market, knowing the information inside the data and making forecasts for strategic decisions have become more important in the Romanian market. Business intelligence tools are used in business organizations as support for decision making.

  6. 11th International Conference on Distributed Computing and Artificial Intelligence

    CERN Document Server

    Bersini, Hugues; Corchado, Juan; Rodríguez, Sara; Pawlewski, Paweł; Bucciarelli, Edgardo

    2014-01-01

The 11th International Symposium on Distributed Computing and Artificial Intelligence 2014 (DCAI 2014) is a forum for presenting applications of innovative techniques for studying and solving complex problems. The exchange of ideas between scientists and technicians from both the academic and industrial sectors is essential to facilitate the development of systems that can meet the ever-increasing demands of today’s society. The present edition brings together past experience, current work and promising future trends associated with distributed computing, artificial intelligence and their applications, in order to provide efficient solutions to real problems. This year’s technical program presents both high quality and diversity, with contributions in well-established and evolving areas of research from a wide range of countries (Algeria, Brazil, China, Croatia, Czech Republic, Denmark, France, Germany, Ireland, Italy, Japan, Malaysia, Mexico, Poland, Portugal, Republic of Korea, Spain, Taiwan, Tunisia, Ukraine, United Kingdom), representing ...

  7. Recent advances in swarm intelligence and evolutionary computation

    CERN Document Server

    2015-01-01

    This timely review volume summarizes the state-of-the-art developments in nature-inspired algorithms and applications with the emphasis on swarm intelligence and bio-inspired computation. Topics include the analysis and overview of swarm intelligence and evolutionary computation, hybrid metaheuristic algorithms, bat algorithm, discrete cuckoo search, firefly algorithm, particle swarm optimization, and harmony search as well as convergent hybridization. Application case studies have focused on the dehydration of fruits and vegetables by the firefly algorithm and goal programming, feature selection by the binary flower pollination algorithm, job shop scheduling, single row facility layout optimization, training of feed-forward neural networks, damage and stiffness identification, synthesis of cross-ambiguity functions by the bat algorithm, web document clustering, truss analysis, water distribution networks, sustainable building designs and others. As a timely review, this book can serve as an ideal reference f...

  8. The ethical intelligence: a tool guidance in the process of the negotiation

    Directory of Open Access Journals (Sweden)

    Cristina Seijo

    2014-08-01

Full Text Available This article is the result of a research project whose objective is to present a theoretical contrast that invites reflection on ethical intelligence as a guiding tool in negotiation. It addresses the different types of intelligence involved (spatial intelligence, rational intelligence, and emotional intelligence, among others) and likewise relates associative intelligence to negotiation processes and negotiation tactics. In this respect, ethical intelligence can be understood as the ability to examine the moral standards of the individual and of society in order to decide between what is right and what is wrong, and thus to resolve the various problems that an individual or a society faces. For this reason, the article calls for mechanisms of transparency and participation through which ethical intelligence is kept in mind as the threshold that orients the negotiation process.

  9. Automating Commercial Video Game Development using Computational Intelligence

    OpenAIRE

    Tse G. Tan; Jason Teo; Patricia Anthony

    2011-01-01

Problem statement: The retail sales of computer and video games have grown enormously during the last few years, not just in the United States (US) but all over the world. This is the reason a lot of game developers and academic researchers have focused on game-related technologies, such as graphics, audio, physics and Artificial Intelligence (AI), with the goal of creating newer and more fun games. In recent years, there has been an increasing interest in game AI for pro...

  10. Short-term electric load forecasting using computational intelligence methods

    OpenAIRE

    Jurado, Sergio; Peralta, J.; Nebot, Àngela; Mugica, Francisco; Cortez, Paulo

    2013-01-01

    Accurate time series forecasting is a key issue to support individual and organizational decision making. In this paper, we introduce several methods for short-term electric load forecasting. All the presented methods stem from computational intelligence techniques: Random Forest, Nonlinear Autoregressive Neural Networks, Evolutionary Support Vector Machines and Fuzzy Inductive Reasoning. The performance of the suggested methods is experimentally justified with several experiments carried out...

  11. Computational Intelligence in Highway Management: A Review

    Directory of Open Access Journals (Sweden)

    Ondrej Pribyl

    2015-10-01

    Full Text Available Highway management systems are used to improve safety and driving comfort on highways by using control strategies and providing information and warnings to drivers. They use several strategies starting from speed and lane management, through incident detection and warning systems, ramp metering, weather information up to, for example, informing drivers about alternative roads. This paper provides a review of the existing approaches to highway management systems, particularly speed harmonization and ramp metering. It is focused only on modern and advanced approaches, such as soft computing, multi-agent methods and their interconnection. Its objective is to provide guidance in the wide field of highway management and to point out the most relevant recent activities which demonstrate that development in the field of highway management is still important and that the existing research exhibits potential for further enhancement.

  12. Expertik: Experience with Artificial Intelligence and Mobile Computing

    Directory of Open Access Journals (Sweden)

    José Edward Beltrán Lozano

    2013-06-01

Full Text Available This article presents an experience in the development of services based on Artificial Intelligence, Service-Oriented Architecture and mobile computing. It aims to combine the technology offered by mobile computing with artificial intelligence techniques, through a service that provides diagnostic solutions to problems in industrial maintenance. For service creation, the elements of an expert system are identified: the knowledge base, the inference engine, and the interfaces for knowledge acquisition and consultation. The applications were developed in ASP.NET under a three-layer architecture. The data layer was developed in SQL Server together with data-management classes; the business layer in VB.NET; and the presentation layer in ASP.NET with XHTML. The interfaces for knowledge acquisition and query were developed for the Web and the mobile Web. The inference engine was implemented as a web service developed for a fuzzy logic model (initially an exact rule-based logic) to resolve requests from the applications consulting the knowledge base. This experience seeks to strengthen a technology-based company offering AI-based services to service companies in Colombia.

  13. Artificial intelligence. Application of the Statistical Neural Networks computer program in nuclear medicine

    International Nuclear Information System (INIS)

    Stefaniak, B.; Cholewinski, W.; Tarkowska, A.

    2005-01-01

Artificial Neural Networks (ANN) may be a tool alternative and complementary to typical statistical analysis. However, in spite of the many ready-to-use computer implementations of various ANN algorithms, artificial intelligence is relatively rarely applied to data processing. In this paper, practical aspects of the scientific application of ANN in medicine using the Statistical Neural Networks computer program are presented. Several steps of data analysis with the above ANN software package are discussed briefly, from material selection and its division into groups to the types of results obtained. The typical problems connected with assessing scintigrams by ANN are also described. (author)

  14. Engineering Courses on Computational Thinking Through Solving Problems in Artificial Intelligence

    Directory of Open Access Journals (Sweden)

    Piyanuch Silapachote

    2017-09-01

Full Text Available Computational thinking sits at the core of every engineering and computing related discipline. It has increasingly emerged as a subject in its own right at all levels of education. It is a powerful cornerstone for cognitive development, creative problem solving, algorithmic thinking and design, and programming. How to effectively teach computational thinking skills poses real challenges and creates opportunities. Targeting entering computer science and engineering undergraduates, we resourcefully integrate elements from artificial intelligence (AI) into introductory computing courses. In addition to comprehension of the essence of computational thinking, practical exercises in AI inspire collaborative problem solving that goes beyond abstraction, logical reasoning, and critical and analytical thinking. Problems in machine intelligence systems intrinsically connect students to algorithm-oriented computing and essential mathematical foundations. Beyond knowledge representation, AI fosters a gentle introduction to data structures and algorithms. Focused on engaging the mind, a computer is never a necessity. Neither coding nor programming is ever required. Instead, students enjoy constructivist classrooms designed to always be active, flexible, and highly dynamic. Learning to learn and reflecting on cognitive experiences, they rigorously construct knowledge from collectively solving exciting puzzles, competing in strategic games, and participating in intellectual discussions.

  15. Artificial intelligence program in a computer application supporting reactor operations

    International Nuclear Information System (INIS)

    Stratton, R.C.; Town, G.G.

    1985-01-01

    Improving nuclear reactor power plant operability is an ever-present concern for the nuclear industry. The definition of plant operability involves a complex interaction of the ideas of reliability, safety, and efficiency. This paper presents observations concerning the issues involved and the benefits derived from the implementation of a computer application which combines traditional computer applications with artificial intelligence (AI) methodologies. A system, the Component Configuration Control System (CCCS), is being installed to support nuclear reactor operations at the Experimental Breeder Reactor II

  16. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  17. A proposal of an open ubiquitous fuzzy computing system for Ambient Intelligence

    NARCIS (Netherlands)

    Acampora, G.; Loia, V.; Lee, R.S.T.; Lioa, V.

    2007-01-01

    Ambient Intelligence (AmI) is considered as the composition of three emergent technologies: Ubiquitous Computing, Ubiquitous Communication and Intelligent User Interfaces. The aim of integration of aforesaid technologies is to make wider the interaction between human beings and information

  18. On Using Intelligent Computer-Assisted Language Learning in Real-Life Foreign Language Teaching and Learning

    Science.gov (United States)

    Amaral, Luiz A.; Meurers, Detmar

    2011-01-01

    This paper explores the motivation and prerequisites for successful integration of Intelligent Computer-Assisted Language Learning (ICALL) tools into current foreign language teaching and learning (FLTL) practice. We focus on two aspects, which we argue to be important for effective ICALL system development and use: (i) the relationship between…

  19. Computational intelligence from AI to BI to NI

    Science.gov (United States)

    Werbos, Paul J.

    2015-05-01

    This paper gives highlights of the history of the neural network field, stressing the fundamental ideas which have been in play. Early neural network research was motivated mainly by the goals of artificial intelligence (AI) and of functional neuroscience (biological intelligence, BI), but the field almost died due to frustrations articulated in the famous book Perceptrons by Minsky and Papert. When I found a way to overcome the difficulties by 1974, the community mindset was very resistant to change; it was not until 1987/1988 that the field was reborn in a spectacular way, leading to the organized communities now in place. Even then, it took many more years to establish crossdisciplinary research in the types of mathematical neural networks needed to really understand the kind of intelligence we see in the brain, and to address the most demanding engineering applications. Only through a new (albeit short-lived) funding initiative, funding crossdisciplinary teams of systems engineers and neuroscientists, were we able to fund the critical empirical demonstrations which put our old basic principle of "deep learning" firmly on the map in computer science. Progress has rightly been inhibited at times by legitimate concerns about the "Terminator threat" and other possible abuses of technology. This year, at SPIE, in the quantum computing track, we outline the next stage ahead of us in breaking out of the box, again and again, and rising to fundamental challenges and opportunities still ahead of us.

  20. Ambient Intelligence and Wearable Computing: Sensors on the Body, in the Home, and Beyond

    OpenAIRE

    Cook, Diane J.; Song, WenZhan

    2009-01-01

    Ambient intelligence has a history of focusing on technologies that are integrated into a person’s environment. However, ambient intelligence can be found on a person’s body as well. In this thematic issue we examine the role of wearable computing in the field of ambient intelligence. In this article we provide an overview of the field of wearable computing and discuss its relationship to the fields of smart environments and ambient intelligence. In addition, we introduce the papers presented...

  1. Machine learning based Intelligent cognitive network using fog computing

    Science.gov (United States)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. In addition, the computing nodes send a periodic signal summary, which is much smaller than the original signal, to the cloud, so that the overall spectrum resource allocation strategies of the system are dynamically updated. By applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, security is further strengthened by reducing the communication burden on the network.

  2. Second International Joint Conference on Computational Intelligence (IJCCI 2010)

    CERN Document Server

    Correia, António; Rosa, Agostinho; Filipe, Joaquim; Computational Intelligence

    2012-01-01

The present book includes a set of selected extended papers from the second International Joint Conference on Computational Intelligence (IJCCI 2010), held in Valencia, Spain, from 24 to 26 October 2010. The conference was composed of three co-located conferences: the International Conference on Fuzzy Computation (ICFC), the International Conference on Evolutionary Computation (ICEC), and the International Conference on Neural Computation (ICNC). Recent progress in scientific developments and applications in these three areas is reported in this book. IJCCI received 236 submissions, from 49 countries, on all continents. After a double-blind paper review performed by the Program Committee, only 30 submissions were accepted as full papers and thus selected for oral presentation, leading to a full paper acceptance ratio of 13%. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience inte...

  3. First International Conference on Intelligent Computing and Applications

    CERN Document Server

    Kar, Rajib; Das, Swagatam; Panigrahi, Bijaya

    2015-01-01

The idea of the 1st International Conference on Intelligent Computing and Applications (ICICA 2014) is to bring Research Engineers, Scientists, Industrialists, Scholars and Students together from around the globe to present ongoing research activities, and hence to encourage research interactions between universities and industries. The conference provides opportunities for the delegates to exchange new ideas, applications and experiences, to establish research relations and to find global partners for future collaboration. The proceedings cover the latest progress in cutting-edge research on various areas of Image and Language Processing, Computer Vision and Pattern Recognition, Machine Learning, Data Mining and Computational Life Sciences, Management of Data including Big Data and Analytics, Distributed and Mobile Systems including Grid and Cloud infrastructure, Information Security and Privacy, VLSI, Electronic Circuits, Power Systems, Antenna, Computational fluid dynamics & Hea...

  4. Contemporary cybernetics and its facets of cognitive informatics and computational intelligence.

    Science.gov (United States)

    Wang, Yingxu; Kinsner, Witold; Zhang, Du

    2009-08-01

    This paper explores the architecture, theoretical foundations, and paradigms of contemporary cybernetics from perspectives of cognitive informatics (CI) and computational intelligence. The modern domain and the hierarchical behavioral model of cybernetics are elaborated at the imperative, autonomic, and cognitive layers. The CI facet of cybernetics is presented, which explains how the brain may be mimicked in cybernetics via CI and neural informatics. The computational intelligence facet is described with a generic intelligence model of cybernetics. The compatibility between natural and cybernetic intelligence is analyzed. A coherent framework of contemporary cybernetics is presented toward the development of transdisciplinary theories and applications in cybernetics, CI, and computational intelligence.

  5. Software tool for resolution of inverse problems using artificial intelligence techniques: an application in neutron spectrometry

    International Nuclear Information System (INIS)

    Castaneda M, V. H.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Leon P, A. A.; Hernandez P, C. F.; Espinoza G, J. G.; Ortiz R, J. M.; Vega C, H. R.; Mendez, R.; Gallego, E.; Sousa L, M. A.

    2016-10-01

The Taguchi methodology has proved to be highly efficient for solving inverse problems, in which the values of some parameters of a model must be obtained from the observed data. Intrinsic mathematical characteristics make a problem known as inverse. Inverse problems appear in many branches of science, engineering and mathematics. To solve this type of problem, researchers have used different techniques; recently, the use of techniques based on Artificial Intelligence is being explored. This paper presents the use of a software tool based on artificial neural networks of generalized regression for the solution of inverse problems, with application in high-energy physics, specifically the problem of neutron spectrometry. To solve this problem we use a software tool developed in the MATLAB programming environment, which offers a friendly, intuitive and easy-to-use interface. This computational tool solves the inverse problem involved in the reconstruction of the neutron spectrum from measurements made with a Bonner sphere spectrometric system. Given this information, the neural network is able to reconstruct the neutron spectrum with high performance and generalization capability. The tool does not require the end user to have extensive training or technical knowledge in software development or use, which facilitates its application to the inverse problems found in several areas of knowledge. Artificial Intelligence techniques are particularly well suited to solving inverse problems, given the characteristics of artificial neural networks and their network topology; the tool developed has therefore been very useful, since the results generated by the artificial neural network take little time compared to other techniques and agree with the actual data of the experiment. (Author)
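A generalized regression neural network is essentially a Nadaraya-Watson kernel estimator: each prediction is a Gaussian-kernel-weighted average of the training targets. The following is a minimal sketch of that idea, with a random toy dataset standing in for real Bonner-sphere measurements (the input/output dimensions and the data are illustrative assumptions, not the tool described in the record).

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=0.5):
    """Generalized regression neural network (Nadaraya-Watson estimator):
    the prediction is a kernel-weighted average of the training targets,
    with sigma as the single smoothing parameter."""
    d2 = np.sum((X_train - x) ** 2, axis=1)   # squared distances to training inputs
    w = np.exp(-d2 / (2.0 * sigma ** 2))      # Gaussian kernel weights
    return (w @ Y_train) / np.sum(w)          # weighted average of target spectra

# Toy stand-in for a real training set: 7 hypothetical Bonner-sphere
# count rates mapped to 10 hypothetical spectrum bins by a random linear map.
rng = np.random.default_rng(0)
X = rng.random((50, 7))
Y = X @ rng.random((7, 10))
reconstructed = grnn_predict(X, Y, X[0], sigma=0.1)
```

There is no iterative training: the network memorizes the training pairs, and only sigma needs tuning, which is one reason GRNNs are attractive for small spectrometry datasets.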

  6. BRAIN. Broad Research in Artificial Intelligence and Neuroscience-Review of Recent Trends in Measuring the Computing Systems Intelligence

    OpenAIRE

    Laszlo Barna Iantovics; Adrian Gligor; Muaz A. Niazi; Anna Iuliana Biro; Sandor Miklos Szilagyi; Daniel Tokody

    2018-01-01

Many difficult problems, from the philosophy-of-computation point of view, could require computing systems that have some kind of intelligence in order to be solved. Recently, we have seen a large number of intelligent artificial systems used in scientific, technical and social domains. Such approaches often focus on healthcare. These systems can provide solutions to a very large set of problems such as, but not limited to: elder patient care; medica...

  7. Training Software in Artificial-Intelligence Computing Techniques

    Science.gov (United States)

    Howard, Ayanna; Rogstad, Eric; Chalfant, Eugene

    2005-01-01

The Artificial Intelligence (AI) Toolkit is a computer program for training scientists, engineers, and university students in three soft-computing techniques (fuzzy logic, neural networks, and genetic algorithms) used in artificial-intelligence applications. The program provides an easily understandable tutorial interface, including an interactive graphical component through which the user can gain hands-on experience with soft-computing techniques applied to realistic example problems. The tutorial provides step-by-step instructions on the workings of soft-computing technology, whereas the hands-on examples allow interaction and reinforcement of the techniques explained throughout the tutorial. In the fuzzy-logic example, the user can interact with a robot and an obstacle course to see how fuzzy logic is used to command a rover to traverse from an arbitrary start to the goal location. For the genetic-algorithm example, the problem is to determine the minimum-length path for visiting a user-chosen set of planets in the solar system. For the neural-network example, the problem is to decide, on the basis of input data on physical characteristics, whether a person is a man, woman, or child. The AI Toolkit is compatible with the Windows 95, 98, ME, NT 4.0, 2000, and XP operating systems. A computer with a processor speed of at least 300 MHz and random-access memory of at least 56 MB is recommended for optimal performance. The program can be run on a slower computer with less memory, but some functions may not execute properly.
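The genetic-algorithm planet-path example can be sketched roughly as follows. The orbital distances, the simple one-dimensional distance model, and the GA operators (truncation selection, order crossover, swap mutation) are illustrative assumptions, not the AI Toolkit's actual implementation.

```python
import random

# Approximate mean orbital distances from the Sun in AU (illustrative values).
PLANETS = {"Mercury": 0.39, "Venus": 0.72, "Earth": 1.00, "Mars": 1.52,
           "Jupiter": 5.20, "Saturn": 9.58}

def path_length(order):
    """Total travel distance visiting planets in the given order,
    using a simple 1-D (radial distance) model."""
    return sum(abs(PLANETS[a] - PLANETS[b]) for a, b in zip(order, order[1:]))

def genetic_shortest_path(names, pop_size=60, generations=200, seed=1):
    rng = random.Random(seed)
    # Each individual is a visiting order (a permutation of the planet names).
    pop = [rng.sample(names, len(names)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=path_length)                  # rank by fitness (shorter is fitter)
        survivors = pop[: pop_size // 2]           # elitist truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(names))     # one-point order crossover
            child = p1[:cut] + [g for g in p2 if g not in p1[:cut]]
            if rng.random() < 0.2:                 # occasional swap mutation
                i, j = rng.sample(range(len(names)), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=path_length)

best_route = genetic_shortest_path(list(PLANETS))
```

The order crossover keeps every child a valid permutation, so the population always consists of legal visiting orders; for this 1-D model the optimum is simply the planets sorted by distance.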

  8. Methodology Investigation of AI(Artificial Intelligence) Test Officer Support Tool. Volume 1

    Science.gov (United States)

    1989-03-01

    Keywords: Artificial Intelligence, Expert Systems, Automated Aids to Testing. This report covers the application of Artificial Intelligence techniques to the problem of creating automated tools to

  9. Advanced and intelligent computations in diagnosis and control

    CERN Document Server

    2016-01-01

    This book is devoted to the demands of research and industrial centers for diagnostics, monitoring and decision making systems that result from the increasing complexity of automation and systems, the need to ensure the highest level of reliability and safety, and continuing research and the development of innovative approaches to fault diagnosis. The contributions combine domains of engineering knowledge for diagnosis, including detection, isolation, localization, identification, reconfiguration and fault-tolerant control. The book is divided into six parts:  (I) Fault Detection and Isolation; (II) Estimation and Identification; (III) Robust and Fault Tolerant Control; (IV) Industrial and Medical Diagnostics; (V) Artificial Intelligence; (VI) Expert and Computer Systems.

  10. MANAGEMENT OPTIMISATION OF MASS CUSTOMISATION MANUFACTURING USING COMPUTATIONAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    Louwrens Butler

    2018-05-01

    Full Text Available Computational intelligence paradigms can be used for advanced manufacturing system optimisation. A static simulation model of an advanced manufacturing system was developed in order to simulate the system's behaviour. The purpose of this advanced manufacturing system was to mass-produce a customisable product range at a competitive cost. The aim of this study was to determine whether the proposed algorithm could outperform traditional optimisation methods. The algorithm produced a lower-cost plan than a simulated annealing algorithm did, and had a lower impact on the workforce.
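
The simulated annealing baseline mentioned above can be sketched in a few lines. The cost function (deviation from a demand target plus a changeover penalty), the move operator, and the cooling schedule are illustrative assumptions rather than the study's actual model:

```python
import math
import random

def production_cost(plan):
    # Hypothetical cost model: squared deviation from a demand target plus
    # a penalty whenever consecutive batch sizes differ (a changeover).
    target = [3, 1, 4, 1, 5]
    deviation = sum((p - t) ** 2 for p, t in zip(plan, target))
    changeovers = sum(1 for a, b in zip(plan, plan[1:]) if a != b)
    return deviation + 2 * changeovers

def anneal(initial, steps=5000, t0=10.0):
    plan, cost = initial[:], production_cost(initial)
    best, best_cost = plan[:], cost
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-9      # linear cooling schedule
        candidate = plan[:]
        i = random.randrange(len(candidate))
        candidate[i] += random.choice([-1, 1])   # local move: nudge one batch
        c = production_cost(candidate)
        # Accept improvements always, worse moves with Boltzmann probability.
        if c < cost or random.random() < math.exp((cost - c) / temp):
            plan, cost = candidate, c
            if cost < best_cost:
                best, best_cost = plan[:], cost
    return best, best_cost

plan, cost = anneal([0, 0, 0, 0, 0])
print(plan, cost)
```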

  11. Business Intelligence tools as an element of information supply system

    Directory of Open Access Journals (Sweden)

    Agnieszka Szmelter

    2013-12-01

    Full Text Available This paper aims to present Business Intelligence tools as an element improving the flow of information within the management information system and as a tool facilitating the objectives of the information supply system. In the first part of the paper the author presents issues related to the specific character of information as a kind of resource and to the functioning of the information supply system in the enterprise. The second part of the article includes the characteristics of Business Intelligence systems. The third part deals with the impact of Business Intelligence tools on the ongoing activities of the information supply system.

  12. Profiling nonhuman intelligence: An exercise in developing unbiased tools for describing other "types" of intelligence on earth

    Science.gov (United States)

    Herzing, Denise L.

    2014-02-01

    Intelligence has historically been studied by comparing nonhuman cognitive and language abilities with human abilities. Primate-like species, which show human-like anatomy and share evolutionary lineage, have been the most studied. However, when comparing animals of non-primate origins, our ability to profile the potential for intelligence remains inadequate. Historically, our measures of nonhuman intelligence have included a variety of tools: (1) physical measurements - brain-to-body ratio, brain structure/convolution/neural density, presence of artifacts and physical tools; (2) observational and sensory measurements - sensory signals, complexity of signals, cross-modal abilities, social complexity; (3) data mining - information theory, signal/noise, pattern recognition; (4) experimentation - memory, cognition, language comprehension/use, theory of mind; (5) direct interfaces - one-way and two-way interfaces with primates, dolphins, birds; and (6) accidental interactions - human/animal symbiosis, cross-species enculturation. Because humans tend to focus on "human-like" attributes and measures, and scientists are often unwilling to consider other "types" of intelligence that cannot be equated with human intelligence, our ability to profile "types" of intelligence that differ on a variety of scales is weak. Just as biologists stretch their definitions of life to look at extremophiles in unusual conditions, so must we stretch our descriptions of types of minds and begin profiling, rather than equating, other life forms we may encounter.

  13. Use of Business Intelligence Tools in the DSN

    Science.gov (United States)

    Statman, Joseph I.; Zendejas, Silvino C.

    2010-01-01

    JPL has operated the Deep Space Network (DSN) on behalf of NASA since the 1960s. Over the last two decades, the DSN budget has generally declined in real-year dollars while the aging assets required more attention and the missions became more complex. As a result, the DSN budget has been increasingly consumed by Operations and Maintenance (O&M), significantly reducing the funding wedge available for technology investment and for enhancing the DSN capability and capacity. Responding to this budget squeeze, the DSN launched an effort to improve the cost-efficiency of the O&M. In this paper we elaborate on the methodology adopted to understand where the time and money are used. Surprisingly, most of the data required for metrics development was readily available in existing databases; we used commercial Business Intelligence (BI) tools to mine the databases, automatically extract the metrics (including trends), and distribute them weekly to interested parties. We describe the DSN-specific effort to convert the intuitive understanding of where the time is spent into meaningful and actionable metrics that quantify use of resources, highlight candidate areas of improvement, and establish trends. We also discuss the use of the BI-derived metrics: one of the most fascinating outcomes was the dramatic improvement in some areas of operations when the metrics were shared with the operators. The visibility of the metrics, and a self-induced competition, caused almost immediate improvement in some areas. While the near-term use of the metrics is to quantify the processes and track the improvement, these techniques will be just as useful in monitoring the process, e.g., as an input to a lean-six-sigma process.

  14. Fundamentals of computational intelligence neural networks, fuzzy systems, and evolutionary computation

    CERN Document Server

    Keller, James M; Fogel, David B

    2016-01-01

    This book covers the three fundamental topics that form the basis of computational intelligence: neural networks, fuzzy systems, and evolutionary computation. The text focuses on inspiration, design, theory, and practical aspects of implementing procedures to solve real-world problems. While other books in the three fields that comprise computational intelligence are written by specialists in one discipline, this book is co-written by a former Editor-in-Chief of IEEE Transactions on Neural Networks and Learning Systems, a former Editor-in-Chief of IEEE Transactions on Fuzzy Systems, and the founding Editor-in-Chief of IEEE Transactions on Evolutionary Computation. The coverage across the three topics is both uniform and consistent in style and notation. Discusses single-layer and multilayer neural networks, radial-basis function networks, and recurrent neural networks. Covers fuzzy set theory, fuzzy relations, fuzzy logic inference, fuzzy clustering and classification, fuzzy measures and fuzz...

  15. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczyński, Tadeusz

    2012-01-01

    One of the most challenging issues in today’s large-scale computational modeling and design is to effectively manage the complex distributed environments, such as computational clouds, grids, ad hoc, and P2P networks, operating under various types of users with evolving relationships fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainties are presented to the system at hand in various forms of information that is incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, sequencing and selection, and the assignment of scores. Intelligent scalable systems enable flexible routing and charging, advanced user interactions, and the aggregation and sharing of geographically distributed resources in modern large-scale systems. This book presents new ideas, theories, models...

  16. Intelligent Information Retrieval: Diagnosing Information Need. Part I. The Theoretical Framework for Developing an Intelligent IR Tool.

    Science.gov (United States)

    Cole, Charles

    1998-01-01

    Suggests that the principles underlying the procedure used by doctors to diagnose a patient's disease are useful in the design of intelligent information-retrieval systems because the task of the doctor is conceptually similar to the computer or human intermediary's task in information retrieval: to draw out the user's query/information need.…

  17. MAINS: MULTI-AGENT INTELLIGENT SERVICE ARCHITECTURE FOR CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    T. Joshva Devadas

    2014-04-01

    Full Text Available Computing has been transformed into a model of commoditized services. These services are modeled similarly to utility services such as water and electricity. The Internet has been stunningly successful over the past three decades in supporting a multitude of distributed applications and a wide variety of network technologies. However, its popularity has become the biggest impediment to its further growth with handheld mobile devices and laptops. Agents are intelligent software systems that work on behalf of others. Agents are incorporated into many innovative applications in order to improve the performance of the system. An agent uses its possessed knowledge to react with the system and helps to improve its performance. Agents are introduced into cloud computing to minimize the response time when a similar request is raised by an end user anywhere in the globe. In this paper, we introduce a Multi-Agent Intelligent system (MAINS) that sits prior to the cloud service models, and it was tested using a sample dataset. Performance of the MAINS layer was analyzed in three aspects, and the outcome of the analysis proves that the MAINS layer provides a flexible model for creating cloud applications and deploying them in a variety of applications.
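
The response-time idea in the abstract (answering repeated similar requests from a cached result rather than re-invoking the service) can be illustrated with a minimal caching agent. The backend function and the notion of "similar" (case- and whitespace-insensitive match) are simplifying assumptions, not the MAINS design:

```python
import time

class ServiceAgent:
    """Toy service agent that caches responses so repeated (similar)
    requests are answered without recomputation."""
    def __init__(self, backend):
        self.backend = backend
        self.cache = {}

    def handle(self, request):
        key = request.strip().lower()      # crude notion of "similar" requests
        if key not in self.cache:
            self.cache[key] = self.backend(key)
        return self.cache[key]

def slow_backend(request):
    # Stand-in for a real cloud service with noticeable latency.
    time.sleep(0.05)
    return f"result for {request}"

agent = ServiceAgent(slow_backend)
t0 = time.perf_counter(); agent.handle("Weather Report"); first = time.perf_counter() - t0
t0 = time.perf_counter(); agent.handle("weather report "); second = time.perf_counter() - t0
print(second < first)   # the cached reply skips the backend call
```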

  18. Airline company management: 'Defining of necessary number of employees in airline by using artificial intelligence tools'

    OpenAIRE

    Petrović, Dragan M.; Puharic, Mirjana A.; Jovanović, Tomislav Ž.

    2015-01-01

    In this paper, a model for preliminary estimation of the number of employees in an airline using artificial intelligence tools is presented. It is assumed that the tools of artificial intelligence can be applied even to complex tasks such as defining the number of employees in an airline. The results obtained can be used for planning the number of employees, i.e. planning the necessary financial investments in human resources, and may also be useful for a preliminary analysis of the airlines that choose ...

  19. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  20. Computer Aided Automatic Control - CAAC artificial intelligence block

    Energy Technology Data Exchange (ETDEWEB)

    Balate, J.; Chramcov, B.; Princ, M. [Brno Univ. of Technology (Czech Republic). Faculty of Technology in Zlin

    2000-07-01

    The aim of the plan to build up the CAAC - Computer Aided Automatic Control - system is to create a modular setup of partial computing programs covering the theory of automatic control, algorithms for signal processing, and programs of control algorithms. To make its content accessible to students and the professional public, the CAAC system utilizes the Internet HTTP service in the form of WWW pages. The CAAC system is being developed at the Institute of Automation and Control Technique of the Faculty of Technology in Zlin of the Brno University of Technology and is intended particularly for pedagogic purposes. Recently, methods of artificial intelligence have also been included in the open CAAC system, and these are described in this article. (orig.)

  1. Intelligent Aggregation Based on Content Routing Scheme for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jiachen Xu

    2017-10-01

    Full Text Available Cloud computing has emerged as today’s most exciting computing paradigm for providing services using a shared framework, which opens a new door for solving the problems of the explosive growth of digital resource demands and their corresponding convenience. With the exponential growth of the number of data types and data size in so-called big data work, the backbone network is under great pressure because its transmission capacity grows more slowly than the data size, which would seriously hinder the development of the network without an effective approach to this problem. In this paper, an Intelligent Aggregation based on Content Routing (IACR) scheme for cloud computing, which could effectively reduce the amount of data in the network and play a basic supporting role in the development of cloud computing, is first put forward. The main innovations in this paper are: (1) a framework for intelligent aggregation based on content routing is proposed, which can support aggregation-based content routing; (2) the proposed IACR scheme could effectively route high-aggregation-ratio data to the data center through the same routing path so as to effectively reduce the amount of data that the network transmits. The theoretical analyses and experimental results show that, compared with the previous original routing scheme, the IACR scheme can balance the load of the whole network, reduce the amount of data transmitted in the network by 41.8%, and reduce the transmission time by 31.6% in the same network with a more balanced network load.

  2. New Research Perspectives in the Emerging Field of Computational Intelligence to Economic Modeling

    Directory of Open Access Journals (Sweden)

    Vasile MAZILESCU

    2009-01-01

    Full Text Available Computational Intelligence (CI) is a new development paradigm of intelligent systems which has resulted from a synergy between fuzzy sets, artificial neural networks, evolutionary computation, machine learning, etc., broadening computer science, physics, economics, engineering, mathematics, statistics. It is imperative to know why these tools can be potentially relevant and effective for economic and financial modeling. This paper presents, after a synergic new paradigm of intelligent systems, as a practical case study the fuzzy and temporal properties of the knowledge formalism embedded in an Intelligent Control System (ICS) based on the FT-algorithm. We are not dealing with high-level reasoning methods, because we think that real-time problems can only be solved by rather low-level reasoning. Most of the overall run-time of fuzzy expert systems is spent in the match phase. To achieve fast reasoning, the number of fuzzy set operations must be reduced. For this, we use a compiled fuzzy knowledge structure, like Rete, because it is required for real-time responses. Solving the match-time predictability problem would allow us to build much more powerful reasoning techniques.

  3. An intelligent condition monitoring system for on-line classification of machine tool wear

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Fu; Hope, A D; Javed, M [Systems Engineering Faculty, Southampton Institute (United Kingdom)

    1998-12-31

    The development of intelligent tool condition monitoring systems is a necessary requirement for successful automation of manufacturing processes. This presentation introduces a tool wear monitoring system for milling operations. The system utilizes power, force, acoustic emission and vibration sensors to monitor tool condition comprehensively. Features relevant to tool wear are drawn from time and frequency domain signals and a fuzzy pattern recognition technique is applied to combine the multisensor information and provide reliable classification results of tool wear states. (orig.) 10 refs.
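
The fuzzy pattern recognition step can be sketched as follows: each sensor feature receives a membership in every wear state, and the per-feature memberships are combined with a fuzzy AND (minimum). The membership shapes, feature choices, and state prototypes below are invented for illustration; the paper's actual classifier is not specified here:

```python
def tri(x, a, b, c):
    # Triangular membership function rising from a, peaking at b, falling to c.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical membership parameters per wear state for two normalized
# features: (vibration RMS, acoustic-emission energy). Illustrative only.
STATES = {
    "fresh":  [(0.0, 0.1, 0.4), (0.0, 0.1, 0.4)],
    "worn":   [(0.2, 0.5, 0.8), (0.2, 0.5, 0.8)],
    "failed": [(0.6, 0.9, 1.0), (0.6, 0.9, 1.0)],
}

def classify(features):
    # Fuzzy AND (min) across features, then pick the strongest state.
    scores = {}
    for state, protos in STATES.items():
        scores[state] = min(tri(x, a, b, c)
                            for x, (a, b, c) in zip(features, protos))
    return max(scores, key=scores.get)

print(classify([0.55, 0.60]))   # prints "worn"
```

A real system would add more features (force, power), overlapping shoulder functions at the range ends, and calibrated prototypes learned from labelled cutting tests.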

  4. An intelligent condition monitoring system for on-line classification of machine tool wear

    Energy Technology Data Exchange (ETDEWEB)

    Fu Pan; Hope, A.D.; Javed, M. [Systems Engineering Faculty, Southampton Institute (United Kingdom)

    1997-12-31

    The development of intelligent tool condition monitoring systems is a necessary requirement for successful automation of manufacturing processes. This presentation introduces a tool wear monitoring system for milling operations. The system utilizes power, force, acoustic emission and vibration sensors to monitor tool condition comprehensively. Features relevant to tool wear are drawn from time and frequency domain signals and a fuzzy pattern recognition technique is applied to combine the multisensor information and provide reliable classification results of tool wear states. (orig.) 10 refs.

  5. Artificial intelligence for the EChO mission planning tool

    Science.gov (United States)

    Garcia-Piquer, Alvaro; Ribas, Ignasi; Colomé, Josep

    2015-12-01

    The Exoplanet Characterisation Observatory (EChO) has as its main goal the measurement of atmospheres of transiting planets. This requires the observation of two types of events: primary and secondary eclipses. In order to yield measurements of sufficient signal-to-noise ratio to fulfil the mission objectives, the events of each exoplanet have to be observed several times. In addition, several criteria have to be considered in carrying out each observation, such as the exoplanet's visibility, its event duration, and no overlapping with other tasks. A suitable mission plan is expected to increase the efficiency of telescope operation, which represents an important benefit in terms of scientific return and operational costs. Nevertheless, obtaining a long-term mission plan is unaffordable for human planners due to the complexity of computing the huge number of possible combinations to find an optimum solution. In this contribution we present a long-term mission planning tool based on Genetic Algorithms, which are well suited to solving optimization problems such as the planning of several tasks. Specifically, the proposed tool finds a solution that highly optimizes the defined objectives, which are based on the maximization of the time spent on scientific observations and of the scientific return (e.g., the coverage of the mission survey). The results obtained on a large experimental setup show that the proposed scheduler technology is robust and can function in a variety of scenarios, offering competitive performance that does not depend on the collection of exoplanets to be observed. Specifically, the results show that, with the proposed tool, EChO uses 94% of the available mission time, so the amount of downtime is small, and it completes 98% of the targets.
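
A genetic algorithm for this kind of scheduling can be sketched as evolving subsets of candidate observations, rewarding total scientific value and rejecting overlapping time windows. The windows, values, and GA parameters below are illustrative assumptions, not EChO's actual planner:

```python
import random

# Hypothetical observation windows: (start hour, end hour, scientific value).
WINDOWS = [(0, 3, 5), (2, 5, 6), (5, 8, 4), (7, 9, 3), (9, 12, 7), (1, 9, 9)]

def fitness(mask):
    # A chromosome is a bit mask selecting a subset of the windows.
    chosen = sorted(WINDOWS[i] for i in range(len(WINDOWS)) if mask[i])
    # Overlapping observations are infeasible: reject them outright.
    for (s1, e1, _), (s2, e2, _) in zip(chosen, chosen[1:]):
        if s2 < e1:
            return -1
    return sum(v for _, _, v in chosen)

def evolve(pop_size=40, generations=100):
    pop = [[random.randint(0, 1) for _ in WINDOWS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]               # keep the best half
        children = []
        for _ in range(pop_size - len(elite)):
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(1, len(WINDOWS))
            child = p1[:cut] + p2[cut:]           # one-point crossover
            if random.random() < 0.3:             # bit-flip mutation
                i = random.randrange(len(WINDOWS))
                child[i] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

With only six candidate windows the optimum can be checked by brute force; the GA formulation is what scales to mission-sized instances where enumeration is impossible.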

  6. Mobile Imaging and Computing for Intelligent Structural Damage Inspection

    Directory of Open Access Journals (Sweden)

    ZhiQiang Chen

    2014-01-01

    Full Text Available Optical imaging is a commonly used technique in civil engineering for aiding the archival of damage scenes and, more recently, for image analysis-based damage quantification. However, the limitations are evident when applying optical imaging in the field. The most significant one is the lack of real-time computing and processing capability. The advancement of mobile imaging and computing technologies provides a promising opportunity to change this norm. This paper first provides a timely introduction to the state-of-the-art mobile imaging and computing technologies for the purpose of engineering application development. Further, we propose a mobile imaging and computing (MIC) framework for conducting intelligent condition assessment of constructed objects, which features in situ imaging and real-time damage analysis. This framework synthesizes advanced mobile technologies with three innovative features: (i) context-enabled image collection, (ii) interactive image preprocessing, and (iii) real-time image analysis and analytics. Through performance evaluation and field experiments, this paper demonstrates the feasibility and efficiency of the proposed framework.

  7. System diagnostic builder: a rule-generation tool for expert systems that do intelligent data evaluation

    Science.gov (United States)

    Nieten, Joseph L.; Burke, Roger

    1993-03-01

    The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state-of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge-based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMEs can be used as black box simulations by intelligent computer-aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
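
The inductive rule-generation idea (learn rules from examples an expert has already classified) can be sketched with a OneR-style learner: pick the single feature whose value-to-majority-class mapping makes the fewest training errors. The data set and feature names below are invented for illustration and are not the SDB's algorithm:

```python
from collections import Counter, defaultdict

# Expert-classified observations of a hypothetical subject system:
# (sensor reading band, temperature band) -> state labelled by the SME.
DATA = [
    (("high", "hot"), "fault"),
    (("high", "cold"), "fault"),
    (("low", "hot"), "nominal"),
    (("low", "cold"), "nominal"),
    (("mid", "hot"), "fault"),
    (("mid", "cold"), "nominal"),
]

def one_rule(data):
    """OneR-style induction: for each feature, map every observed value to
    its majority class, then keep the feature with the fewest training errors."""
    n_features = len(data[0][0])
    best = None
    for f in range(n_features):
        counts = defaultdict(Counter)
        for features, label in data:
            counts[features[f]][label] += 1
        rule = {v: c.most_common(1)[0][0] for v, c in counts.items()}
        errors = sum(rule[features[f]] != label for features, label in data)
        if best is None or errors < best[2]:
            best = (f, rule, errors)
    return best

feature, rule, errors = one_rule(DATA)
print(feature, rule, errors)
```

Production systems use richer learners (decision trees, rule sets with pruning), but the capture-classify-induce loop is the same.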

  8. Business intelligence as a tool in the management academic

    Directory of Open Access Journals (Sweden)

    Juan Jose Camargo Vega

    2016-06-01

    Full Text Available This paper presents a study that analyzes and evaluates characteristics of existing data in the academic community in order to recommend a model for applying Business Intelligence. It starts from the assumption that it is important to know the effects on the academic community of having a corporate strategy that facilitates decision-making in educational institutions. The results are based on a three-dimensional cube, which combines the information needed to support decision-making. Finally, we reach conclusions that give sufficient grounds to recommend a model that integrates Business Intelligence into the academic environment.

  9. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme, as an effective approach to medical diagnosis. Having come to know the advantages and disadvantages of each computational intelligence technique in recent years, the time has come...

  10. HPCToolkit: performance tools for scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M [Department of Computer Science, Rice University, Houston, TX 77005 (United States)

    2008-07-15

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei.

  11. HPCToolkit: performance tools for scientific computing

    International Nuclear Information System (INIS)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M

    2008-01-01

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei

  12. Artificial Intelligence and the Teaching of Reading and Writing by Computers.

    Science.gov (United States)

    Balajthy, Ernest

    1985-01-01

    Discusses how computers can "converse" with students for teaching purposes, demonstrates how these interactions are becoming more complex, and explains how the computer's role is becoming more "human" in giving intelligent responses to students. (HOD)

  13. Best of Affective Computing and Intelligent Interaction 2013 in Multimodal Interactions

    NARCIS (Netherlands)

    Soleymani, Mohammad; Soleymani, M.; Pun, T.; Pun, Thierry; Nijholt, Antinus

    The fifth biannual Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 2013) was held in Geneva, Switzerland. This conference featured recent advances in affective computing and relevant applications in education, entertainment and health. A number of

  14. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  15. Recent advances in computational intelligence in defense and security

    CERN Document Server

    Falcon, Rafael; Zincir-Heywood, Nur; Abbass, Hussein

    2016-01-01

    This volume is an initiative undertaken by the IEEE Computational Intelligence Society’s Task Force on Security, Surveillance and Defense to consolidate and disseminate the role of CI techniques in the design, development and deployment of security and defense solutions. Applications range from the detection of buried explosive hazards in a battlefield to the control of unmanned underwater vehicles, the delivery of superior video analytics for protecting critical infrastructures or the development of stronger intrusion detection systems and the design of military surveillance networks. Defense scientists, industry experts, academicians and practitioners alike will all benefit from the wide spectrum of successful applications compiled in this volume. Senior undergraduate or graduate students may also discover uncharted territory for their own research endeavors.

  16. DESIGN ANALYSIS OF BUSINESS INTELLIGENCE BASED ON SAAS CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    I Gede Adnyana

    2014-05-01

    Full Text Available Intense business competition drives every company to formulate a business strategy in order to survive against its competitors. Formulating a business strategy absolutely requires precise and accurate information, so processing and analyzing data into precise and accurate information is a very important process. Business Intelligence (BI) offers business solutions for analyzing data and enables a company to make decisions that increase profit and business performance. However, BI is expensive to implement, requiring considerable maintenance costs and a strong infrastructure. This encourages companies to reduce costs while still having the right technology to enable them to make decisions, identify opportunities and proactively identify risks that can affect the business. The Software as a Service (SaaS) Cloud Computing concept can answer the challenges faced by BI. Before designing SaaS-based BI, the evaluation parameters as well as its advantages and disadvantages need to be known.

  17. 13th International Conference on Distributed Computing and Artificial Intelligence

    CERN Document Server

    Silvestri, Marcello; González, Sara

    2016-01-01

    The special session Decision Economics (DECON) 2016 is a scientific forum for sharing ideas, projects, research results, models and experiences associated with the complexity of behavioral decision processes, aiming at explaining socio-economic phenomena. DECON 2016 was held at the University of Seville, Spain, as part of the 13th International Conference on Distributed Computing and Artificial Intelligence (DCAI) 2016. In the tradition of Herbert A. Simon’s interdisciplinary legacy, this book dedicates itself to the interdisciplinary study of decision-making, in recognition that relevant decision-making takes place in a range of critical subject areas and research fields, including economics, finance, information systems, small and international business, management, operations, and production. Decision-making issues are of crucial importance in economics. Not surprisingly, the study of decision-making has received growing empirical research effort in the applied economic literature over the last ...

  18. A COMPARISON BETWEEN THREE PREDICTIVE MODELS OF COMPUTATIONAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    DUMITRU CIOBANU

    2013-12-01

    Full Text Available Time series prediction is an open problem and many researchers are trying to find new predictive methods and improvements for the existing ones. Lately, methods based on neural networks have been used extensively for time series prediction. Also, support vector machines have solved some of the problems faced by neural networks and have begun to be widely used for time series prediction. The main drawback of those two methods is that they are global models, and in the case of a chaotic time series such a model is unlikely to be found. In this paper a comparison is presented between three predictive models from the computational intelligence field: one based on neural networks, one based on support vector machines and another based on chaos theory. We show that the model based on chaos theory is a viable alternative to the other two methods.
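
An illustrative sketch (not the paper's code) of the chaos-theory-style local-model idea the abstract contrasts with global models: delay-embed the series and predict the next value from the successor of the nearest embedded neighbour. Function names and the toy series are invented for the example.

```python
# Local, chaos-theory-style prediction via delay embedding and a
# nearest-neighbour lookup, in contrast to global models such as
# neural networks or support vector machines.

def embed(series, dim, tau):
    """Delay-embed a scalar series into vectors of length `dim`."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + j * tau] for j in range(dim)) for i in range(n)]

def predict_next(series, dim=3, tau=1):
    """Predict the next value from the nearest embedded neighbour's successor."""
    vectors = embed(series, dim, tau)
    query = vectors[-1]
    best_i, best_d = None, float("inf")
    # Skip the last vector: its successor is exactly the value we want.
    for i, v in enumerate(vectors[:-1]):
        d = sum((a - b) ** 2 for a, b in zip(v, query))
        if d < best_d:
            best_i, best_d = i, d
    # Return the value that followed the nearest neighbour.
    return series[best_i + (dim - 1) * tau + 1]

# A periodic series: the local model recovers the pattern exactly.
data = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2]
print(predict_next(data))  # → 3
```

For a truly chaotic series one would average the successors of several neighbours; a single neighbour keeps the sketch short.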

  19. Intelligent Tutoring Systems for Collaborative Learning: Enhancements to Authoring Tools

    Science.gov (United States)

    Olsen, Jennifer K.; Belenky, Daniel M.; Aleven, Vincent; Rummel, Nikol

    2013-01-01

    Collaborative and individual instruction may support different types of knowledge. Optimal instruction for a subject domain may therefore need to combine these two modes of instruction. There has not been much research, however, on combining individual and collaborative learning with Intelligent Tutoring Systems (ITSs). A first step is to expand…

  20. Prediction of 5-year overall survival in cervical cancer patients treated with radical hysterectomy using computational intelligence methods.

    Science.gov (United States)

    Obrzut, Bogdan; Kusy, Maciej; Semczuk, Andrzej; Obrzut, Marzanna; Kluska, Jacek

    2017-12-12

    Computational intelligence methods, including non-linear classification algorithms, can be used in medical research and practice as a decision making tool. This study aimed to evaluate the usefulness of artificial intelligence models for 5-year overall survival prediction in patients with cervical cancer treated by radical hysterectomy. The data set was collected from 102 patients with cervical cancer FIGO stage IA2-IIB who underwent primary surgical treatment. Twenty-three demographic and tumor-related parameters and selected perioperative data of each patient were collected. The simulations involved six computational intelligence methods: the probabilistic neural network (PNN), multilayer perceptron network, gene expression programming classifier, support vector machines algorithm, radial basis function neural network and k-Means algorithm. The prediction ability of the models was determined based on the accuracy, sensitivity, specificity, as well as the area under the receiver operating characteristic curve. The results of the computational intelligence methods were compared with the results of linear regression analysis as a reference model. The best results were obtained by the PNN model. This neural network provided very high prediction ability with an accuracy of 0.892 and sensitivity of 0.975. The area under the receiver operating characteristic curve of PNN was also high, 0.818. The outcomes obtained by other classifiers were markedly worse. The PNN model is an effective tool for predicting 5-year overall survival in cervical cancer patients treated with radical hysterectomy.
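
A hedged sketch of how the reported metrics (accuracy, sensitivity, specificity) are derived from a binary confusion matrix. The label vectors below are invented for illustration, not the study's data.

```python
# Accuracy, sensitivity (true-positive rate) and specificity
# (true-negative rate) from binary ground-truth and predictions.

def binary_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

y_true = [1, 1, 1, 1, 0, 0, 0, 0]   # 1 = survived 5 years (invented)
y_pred = [1, 1, 1, 0, 0, 0, 1, 0]
m = binary_metrics(y_true, y_pred)
print(m["accuracy"], m["sensitivity"], m["specificity"])  # → 0.75 0.75 0.75
```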

  1. Final Report: Correctness Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, University of Maryland, the University of Wisconsin—Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.

  2. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.

    Science.gov (United States)

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.
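
A minimal sketch of the hypergraph idea the framework builds on: a hyperedge may connect any number of vertices, so a multi-lateral relationship (e.g. a finding linking a patient, a gene and a drug) is one edge rather than many pairwise links. The class and data below are illustrative assumptions, not the BioIntelligence Framework's API.

```python
# A toy hypergraph: edges are named sets of vertices; a query returns
# every vertex that shares at least one hyperedge with a given vertex.

class Hypergraph:
    def __init__(self):
        self.edges = {}  # edge name -> set of vertices

    def add_edge(self, name, vertices):
        self.edges[name] = set(vertices)

    def neighbours(self, vertex):
        """All vertices sharing at least one hyperedge with `vertex`."""
        out = set()
        for members in self.edges.values():
            if vertex in members:
                out |= members
        out.discard(vertex)
        return out

hg = Hypergraph()
hg.add_edge("finding1", ["patientA", "geneTP53", "drugX"])
hg.add_edge("finding2", ["patientA", "geneBRCA1"])
print(sorted(hg.neighbours("patientA")))  # → ['drugX', 'geneBRCA1', 'geneTP53']
```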

  3. An integrated computational tool for precipitation simulation

    Science.gov (United States)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer-aided materials design is of increasing interest because the conventional approach, relying solely on experimentation, is no longer viable within the constraints of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  4. Developing an Intelligent Diagnosis and Assessment E-Learning Tool for Introductory Programming

    Science.gov (United States)

    Huang, Chenn-Jung; Chen, Chun-Hua; Luo, Yun-Cheng; Chen, Hong-Xin; Chuang, Yi-Ta

    2008-01-01

    Recently, many open-source e-learning platforms have been offered for free on the Internet. We thus incorporate the intelligent diagnosis and assessment tool into an open-source e-learning platform developed for programming language courses, wherein the proposed learning diagnosis assessment tools based on text mining and machine learning…

  5. Artificial Intelligence as a Business Forecasting and Error Handling Tool

    OpenAIRE

    Md. Tabrez Quasim; Rupak Chattopadhyay

    2015-01-01

    Any business enterprise relies heavily on how well it can predict future happenings. To cope with modern global customer demand, technological challenges, market competition, etc., any organization is compelled to foresee the future with maximum impact and the least chance of error. The traditional forecasting approaches have some limitations. That is why the business world is adopting modern Artificial Intelligence based forecasting techniques. This paper has tried to presen...

  6. 14th ACIS/IEEE International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    Studies in Computational Intelligence: Volume 492

    2013-01-01

    This edited book presents scientific results of the 14th ACIS/IEEE International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2013), held in Honolulu, Hawaii, USA on July 1-3, 2013. The aim of this conference was to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas and research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected 17 outstanding papers from those accepted for presentation at the conference.

  7. 15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    2015-01-01

    This edited book presents scientific results of the 15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2014), held on June 30 – July 2, 2014 in Las Vegas, Nevada, USA. The aim of this conference was to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas and research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected 13 outstanding papers from those accepted for presentation at the conference.

  8. Novel approach for dam break flow modeling using computational intelligence

    Science.gov (United States)

    Seyedashraf, Omid; Mehrabi, Mohammad; Akhtari, Ali Akbar

    2018-04-01

    A new methodology based on the computational intelligence (CI) system is proposed and tested for modeling the classic 1D dam-break flow problem. The reason to seek a new solution lies in the shortcomings of the existing analytical and numerical models. These include the difficulty of using the exact solutions and the unwanted fluctuations that arise in the numerical results. In this research, the application of the radial-basis-function (RBF) and multi-layer-perceptron (MLP) systems is detailed for the solution of twenty-nine dam-break scenarios. The models are developed using seven variables, i.e. the length of the channel, the depths of the up- and downstream sections, time, and distance as the inputs. Moreover, the depths and velocities of each computational node in the flow domain are considered as the model outputs. The models are validated against the analytical solution and the Lax-Wendroff and MacCormack FDM schemes. The findings indicate that the employed CI models are able to replicate the overall shape of the shock- and rarefaction-waves. Furthermore, the MLP system outperforms RBF and the tested numerical schemes. A new monolithic equation is proposed based on the best fitting model, which can be used as an efficient alternative to the existing piecewise analytic equations.
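
A hedged sketch of one of the references such CI models are validated against: the classic Ritter analytical solution for an instantaneous 1D dam break over a dry, frictionless bed (the paper's own scenarios are more general). The function name is an invented convenience.

```python
import math

# Ritter solution: reservoir of depth h0 behind a dam at x = 0,
# released instantaneously at t = 0 onto a dry bed.

def ritter_depth(x, t, h0, g=9.81):
    """Water depth at position x and time t > 0."""
    c0 = math.sqrt(g * h0)          # celerity of the upstream wave
    if x <= -c0 * t:
        return h0                    # undisturbed reservoir
    if x >= 2 * c0 * t:
        return 0.0                   # dry bed ahead of the wave front
    return (2 * c0 - x / t) ** 2 / (9 * g)  # rarefaction fan

# At the dam site the depth is always 4/9 of the initial depth.
print(round(ritter_depth(0.0, 5.0, h0=9.0), 6))  # → 4.0
```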

  9. Natural Inspired Intelligent Visual Computing and Its Application to Viticulture.

    Science.gov (United States)

    Ang, Li Minn; Seng, Kah Phooi; Ge, Feng Lu

    2017-05-23

    This paper presents an investigation of nature-inspired intelligent computing and its corresponding application towards visual information processing systems for viticulture. The paper has three contributions: (1) a review of visual information processing applications for viticulture; (2) the development of nature-inspired computing algorithms based on artificial immune system (AIS) techniques for grape berry detection; and (3) the application of the developed algorithms towards real-world grape berry images captured in natural conditions from vineyards in Australia. The AIS algorithms in (2) were developed based on a nature-inspired clonal selection algorithm (CSA) which is able to detect the arcs in the berry images with precision, based on a fitness model. The arcs detected are then extended to perform the multiple arcs and ring detectors information processing for the berry detection application. The performance of the developed algorithms was compared with traditional image processing algorithms like the circular Hough transform (CHT) and other well-known circle detection methods. The proposed AIS approach gave an F-score of 0.71 compared with F-scores of 0.28 and 0.30 for the CHT and a parameter-free circle detection technique (RPCD) respectively.
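
An illustrative sketch of a clonal selection algorithm (CSA) of the kind the paper adapts for arc detection, shown here maximising a simple 1D fitness function. All parameters and the fitness are invented for the example, not taken from the paper.

```python
import random

# Clonal selection in miniature: select high-affinity antibodies, clone
# them, mutate clones (less for better antibodies), keep the best.

def clonal_selection(fitness, lo, hi, pop=20, clones=5, gens=60, seed=1):
    rng = random.Random(seed)
    population = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        best = population[: pop // 2]
        offspring = []
        for rank, antibody in enumerate(best):
            # Higher-affinity antibodies mutate less (smaller radius).
            radius = (hi - lo) * 0.1 * (rank + 1) / len(best)
            for _ in range(clones):
                clone = antibody + rng.uniform(-radius, radius)
                offspring.append(min(hi, max(lo, clone)))
        # Keep the best of parents and mutated clones.
        population = sorted(best + offspring, key=fitness, reverse=True)[:pop]
    return population[0]

# Toy fitness with a single peak at 3.0.
x = clonal_selection(lambda v: -(v - 3.0) ** 2, lo=0.0, hi=10.0)
print(abs(x - 3.0) < 0.05)  # → True
```

In the paper's setting the "antibodies" are candidate arcs and the fitness measures how well an arc matches edge pixels; the optimisation loop is the same shape.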

  10. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    Energy Technology Data Exchange (ETDEWEB)

    Mark A. Sippel; William C. Carrigan; Kenneth D. Luff; Lyn Canter

    2003-11-12

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). The software tools in ICS have been developed for characterization of reservoir properties and evaluation of hydrocarbon potential using a combination of inter-disciplinary data sources such as geophysical, geologic and engineering variables. The ICS tools provide a means for logical and consistent reservoir characterization and oil reserve estimates. The tools can be broadly characterized as (1) clustering tools, (2) neural solvers, (3) multiple-linear regression, (4) entrapment-potential calculator and (5) file utility tools. ICS tools are extremely flexible in their approach and use, and applicable to most geologic settings. The tools are primarily designed to correlate relationships between seismic information and engineering and geologic data obtained from wells, and to convert or translate seismic information into engineering and geologic terms or units. It is also possible to apply ICS in a simple framework that may include reservoir characterization using only engineering, seismic, or geologic data in the analysis. ICS tools were developed and tested using geophysical, geologic and engineering data obtained from an exploitation and development project involving the Red River Formation in Bowman County, North Dakota and Harding County, South Dakota. Data obtained from 3D seismic surveys, and 2D seismic lines encompassing nine prospective field areas were used in the analysis. The geologic setting of the Red River Formation in Bowman and Harding counties is that of a shallow-shelf, carbonate system. Present-day depth of the Red River formation is approximately 8000 to 10,000 ft below ground surface. This report summarizes production results from well demonstration activity, results of reservoir characterization of the Red River Formation at demonstration sites, descriptions of ICS tools and strategies for their application.
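
A hedged sketch of one ICS-style tool from the list above: multiple linear regression relating a target property to two predictor variables (e.g. seismic attributes), fitted by solving the normal equations. The data and function name are synthetic, not from the Red River study.

```python
# Least-squares fit of y = b0 + b1*x1 + b2*x2 via the normal equations
# (X^T X) beta = X^T y, solved by Gaussian elimination.

def fit_linear(rows, y):
    X = [[1.0, r[0], r[1]] for r in rows]
    A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(3)]
         for i in range(3)]
    v = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(3)]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    # Back substitution.
    beta = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        beta[r] = (v[r] - sum(A[r][c] * beta[c] for c in range(r + 1, 3))) / A[r][r]
    return beta

rows = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 3)]
y = [1 + 2 * a + 3 * b for a, b in rows]        # exact plane: b0=1, b1=2, b2=3
b0, b1, b2 = fit_linear(rows, y)
print(round(b0), round(b1), round(b2))  # → 1 2 3
```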

  11. Analysis of Computer-Aided and Artificial Intelligence Technologies and Solutions in Service Industries in Russia

    OpenAIRE

    Rezanov, Vladislav

    2013-01-01

    The primary objective of this research study was to investigate the relationship between Computer-Aided and Artificial Intelligence Technologies and customer satisfaction in the context of businesses in Russia. The research focuses on methods of Artificial Intelligence technology application in business and its effect on customer satisfaction. The researcher introduces Artificial Intelligence and studies the forecasting approaches in relation to business operations. The rese...

  12. Forensic drug intelligence: an important tool in law enforcement.

    Science.gov (United States)

    Esseiva, Pierre; Ioset, Sylvain; Anglada, Frédéric; Gasté, Laëtitia; Ribaux, Olivier; Margot, Pierre; Gallusser, Alain; Biedermann, Alex; Specht, Yves; Ottinger, Edmond

    2007-04-11

    Organised criminality is a great concern for national/international security. The demonstration of complex crimes is increasingly dependent on knowledge distributed within law-enforcement agencies and scientific disciplines. This separation of knowledge creates difficulties in reconstructing and prosecuting such crimes. Basic interdisciplinary research in drug intelligence combined with crime analysis, forensic intelligence, and traditional law enforcement investigation is leading to important advances in crime investigation support. Laboratory results constitute one highly dependable source of information that is both reliable and testable. Their operational use can support investigation and even reveal undetected connections or organisational structure. The foremost difficulties encountered by drug analysts are not principally of a chemical or analytical nature, but concern methodologies for extracting parameters or features that are deemed crucial for handling and contextualising drug profiling data. An organised memory has been developed in order to provide accurate, timely, useful and meaningful information for linking spatially and temporally distinct events on a national and international level (including cross-border phenomena). Literature has already pointed out that forensic case data are amenable for use in an intelligence perspective if data and knowledge of specialised actors are appropriately organised, shared and processed. As a particular form of forensic case data, the authors' research focuses on parameters obtained through the systematic physical and chemical profiling of samples of illicit drugs. The procedure is used to infer and characterise links between samples that originate from the same and different seizures. The discussion will not, however, focus on how samples are actually analysed and compared as substantial literature on this topic already exists. Rather, attention is primarily drawn to an active and close collaboration between

  13. Design tools for computer-generated display of information to operators

    International Nuclear Information System (INIS)

    O'Brien, J.F.; Cain, D.G.; Sun, B.K.H.

    1985-01-01

    More and more computers are being used to process and display information to operators who control nuclear power plants. Implementation of computer-generated displays in power plant control rooms represents a considerable design challenge for industry designers. Over the last several years, EPRI has conducted research aimed at providing industry designers with tools to meet this new design challenge. These tools provide guidance in defining more 'intelligent' information for plant control and in developing effective displays to communicate this information to the operators. (orig./HP)

  14. A prototype system for perinatal knowledge engineering using an artificial intelligence tool.

    Science.gov (United States)

    Sokol, R J; Chik, L

    1988-01-01

    Though several perinatal expert systems are extant, the use of artificial intelligence has, as yet, had minimal impact in medical computing. In this evaluation of the potential of AI techniques in the development of a computer-based "Perinatal Consultant," a "top-down" approach to the development of a perinatal knowledge base was taken, using as a source for such a knowledge base a 30-page manuscript of a chapter concerning high-risk pregnancy. The UNIX utility "style" was used to parse sentences and obtain key words and phrases, both as part of a natural language interface and to identify key perinatal concepts. Compared with the "gold standard" of sentences containing key facts as chosen by the experts, a semiautomated method using a nonmedical speller to identify key words and phrases in context functioned with a sensitivity of 79%, i.e., approximately 8 in 10 key sentences were detected as the basis for PROLOG rules and facts for the knowledge base. These encouraging results suggest that functional perinatal expert systems may well be expedited by using programming utilities in conjunction with AI tools and published literature.
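
An illustrative sketch (not the original UNIX "style"-based pipeline) of the evaluation described above: flag sentences containing domain key phrases, then score the flagging against an expert "gold standard" using sensitivity. The key phrases and sentences are invented for the example.

```python
# Keyword-based key-sentence detection scored by sensitivity
# (fraction of expert-chosen sentences the method recovers).

KEY_PHRASES = {"preterm labor", "gestational age", "fetal heart rate"}

def flag_sentences(sentences):
    return [any(k in s.lower() for k in KEY_PHRASES) for s in sentences]

def sensitivity(flags, gold):
    tp = sum(1 for f, g in zip(flags, gold) if f and g)
    fn = sum(1 for f, g in zip(flags, gold) if not f and g)
    return tp / (tp + fn)

sentences = [
    "Preterm labor is a leading cause of neonatal morbidity.",
    "Management depends on gestational age at presentation.",
    "The unit was repainted last year.",
    "Monitoring of contractions is routine.",  # key fact the keywords miss
]
gold = [True, True, False, True]               # expert-chosen key sentences
flags = flag_sentences(sentences)
print(round(sensitivity(flags, gold), 2))  # → 0.67
```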

  15. Artificial intelligence, neural network, and Internet tool integration in a pathology workstation to improve information access

    Science.gov (United States)

    Sargis, J. C.; Gray, W. A.

    1999-03-01

    The APWS allows user-friendly access to several legacy systems which would normally each demand domain expertise for proper utilization. The generalized model, including objects, classes, strategies and patterns, is presented. The core components of the APWS are the Microsoft Windows 95 Operating System, Oracle, Oracle Power Objects, Artificial Intelligence tools, a medical hyperlibrary and a web site. The paper includes a discussion of how automation could be achieved by taking advantage of the expert system, object-oriented programming and intelligent relational database tools within the APWS.

  16. Airline company management: 'Defining of necessary number of employees in airline by using artificial intelligence tools'

    Directory of Open Access Journals (Sweden)

    Petrović Dragan M.

    2015-01-01

    Full Text Available In this paper, a model for preliminary estimation of the number of employees in an airline using artificial intelligence tools is presented. It is assumed that artificial intelligence tools can be applied even to complex tasks such as defining the number of employees in an airline. The results obtained can be used for planning the number of employees, i.e. planning the necessary financial investments in human resources, and may also be useful for a preliminary analysis of airlines that choose to restructure or plan to increase/decrease the number of operations. Results were compared with those obtained by regression analysis.

  17. 4th INNS Symposia Series on Computational Intelligence in Information Systems

    CERN Document Server

    Au, Thien

    2015-01-01

    This book constitutes the refereed proceedings of the Fourth International Neural Network Symposia series on Computational Intelligence in Information Systems, INNS-CIIS 2014, held in Bandar Seri Begawan, Brunei in November 2014. INNS-CIIS aims to provide a platform for researchers to exchange the latest ideas and present the most current research advances in general areas related to computational intelligence and its applications in various domains. The 34 revised full papers presented in this book have been carefully reviewed and selected from 72 submissions. They cover a wide range of topics and application areas in computational intelligence and informatics.  

  18. FTRA 4th International Conference on Mobile, Ubiquitous, and Intelligent Computing

    CERN Document Server

    Adeli, Hojjat; Park, Namje; Woungang, Isaac

    2014-01-01

    MUSIC 2013 will be the most comprehensive text focused on the various aspects of Mobile, Ubiquitous and Intelligent computing. MUSIC 2013 provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of intelligent technologies in mobile and ubiquitous computing environments. MUSIC 2013 is the next edition of the 3rd International Conference on Mobile, Ubiquitous, and Intelligent Computing (MUSIC-12, Vancouver, Canada, 2012), which in turn continued a series of highly successful International Workshops on Multimedia, Communication and Convergence technologies, MCC-11 (Crete, Greece, June 2011) and MCC-10 (Cebu, Philippines, August 2010).

  19. An intelligent tool for the training of nuclear plant operators

    International Nuclear Information System (INIS)

    Cordier, B.

    1990-01-01

    A new type of pedagogical tool has been developed for the training of nuclear power plant operators. This tool combines simulation and an expert system. The first process covered is Steam Generator Tube Rupture (S.G.T.R.). All nuclear power plants will be equipped with this system in 1989 and 1990. After this first experiment, other processes will be developed for this tool

  20. 16th International Conference on Hybrid Intelligent Systems and the 8th World Congress on Nature and Biologically Inspired Computing

    CERN Document Server

    Haqiq, Abdelkrim; Alimi, Adel; Mezzour, Ghita; Rokbani, Nizar; Muda, Azah

    2017-01-01

    This book presents the latest research in hybrid intelligent systems. It includes 57 carefully selected papers from the 16th International Conference on Hybrid Intelligent Systems (HIS 2016) and the 8th World Congress on Nature and Biologically Inspired Computing (NaBIC 2016), held on November 21–23, 2016 in Marrakech, Morocco. HIS - NaBIC 2016 was jointly organized by the Machine Intelligence Research Labs (MIR Labs), USA; Hassan 1st University, Settat, Morocco and University of Sfax, Tunisia. Hybridization of intelligent systems is a promising research field in modern artificial/computational intelligence and is concerned with the development of the next generation of intelligent systems. The conference’s main aim is to inspire further exploration of the intriguing potential of hybrid intelligent systems and bio-inspired computing. As such, the book is a valuable resource for practicing engineers /scientists and researchers working in the field of computational intelligence and artificial intelligence.

  1. Intelligent Systems for Aerospace Engineering - An Overview

    National Research Council Canada - National Science Library

    Krishnakumar, K

    2003-01-01

    Intelligent systems are nature-inspired, mathematically sound, computationally intensive problem-solving tools and methodologies that have become extremely important for advancing the current trends...

  2. A Web-Based Authoring Tool for Algebra-Related Intelligent Tutoring Systems

    Directory of Open Access Journals (Sweden)

    Maria Virvou

    2000-01-01

    Full Text Available This paper describes the development of a web-based authoring tool for Intelligent Tutoring Systems. The tool aims to be useful to teachers and students of domains that make use of algebraic equations. The initial input to the tool is a "description" of a specific domain given by a human teacher. In return the tool provides assistance with the construction of exercises by the human teacher and then monitors the students while they are solving the exercises and provides appropriate feedback. The tool incorporates intelligence in its diagnostic component, which performs diagnosis of students’ errors. It also handles the teaching material in a flexible and individualised way.
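
A hedged sketch of the kind of error diagnosis such a diagnostic component performs: solve a*x + b = c, compare the student's answer with the correct one, and match it against common mis-step patterns. The error catalogue below is an invented illustration, not the paper's rule base.

```python
# Diagnose a student's answer to the linear equation a*x + b = c by
# checking it against the results of known faulty solution procedures.

def diagnose(a, b, c, student_answer, tol=1e-9):
    correct = (c - b) / a
    candidates = {
        "correct": correct,
        "sign error (added b instead of subtracting)": (c + b) / a,
        "forgot to divide by the coefficient": c - b,
    }
    for label, value in candidates.items():
        if abs(student_answer - value) < tol:
            return label
    return "unrecognised error"

# 2x + 3 = 11  →  x = 4; a student answering 8 likely skipped the division.
print(diagnose(2, 3, 11, 8))  # → forgot to divide by the coefficient
```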

  3. The Relationship between Emotional Intelligence and Attitudes toward Computer-Based Instruction of Postsecondary Hospitality Students

    Science.gov (United States)

    Behnke, Carl; Greenan, James P.

    2011-01-01

    This study examined the relationship between postsecondary students' emotional-social intelligence and attitudes toward computer-based instructional materials. Research indicated that emotions and emotional intelligence directly impact motivation, while instructional design has been shown to impact student attitudes and subsequent engagement with…

  4. Instrumentation for Scientific Computing in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics.

    Science.gov (United States)

    1987-10-01

    Instrumentation for Scientific Computing in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics. An instrumentation grant to purchase equipment for support of research in neural networks, information science, artificial intelligence, and applied mathematics. Contract AFOSR 86-0282. Principal Investigator: Stephen

  5. Towards the Development of Web-based Business intelligence Tools

    DEFF Research Database (Denmark)

    Georgiev, Lachezar; Tanev, Stoyan

    2011-01-01

    This paper focuses on using web search techniques in examining the co-creation strategies of technology-driven firms. It does not focus on the co-creation results but describes the implementation of a software tool using data mining techniques to analyze the content on firms’ websites. The tool

  6. Application of Computational Intelligence to Improve Education in Smart Cities

    Science.gov (United States)

    Gaffo, Fernando Henrique; de Barros, Rodolfo Miranda; Mendes, Leonardo de Souza

    2018-01-01

    According to UNESCO, education is a fundamental human right and every nation’s citizens should be granted universal access with equal quality to it. Because this goal is yet to be achieved in most countries, in particular in the developing and underdeveloped countries, it is extremely important to find more effective ways to improve education. This paper presents a model based on the application of computational intelligence (data mining and data science) that leads to the development of the student’s knowledge profile and that can help educators in their decision making for best orienting their students. This model also tries to establish key performance indicators to monitor objectives’ achievement within individual strategic planning assembled for each student. The model uses random forest for classification and prediction, graph description for data structure visualization and recommendation systems to present relevant information to stakeholders. The results presented were built based on the real dataset obtained from a Brazilian private k-9 (elementary school). The obtained results include correlations among key data, a model to predict student performance and recommendations that were generated for the stakeholders. PMID:29346288

  7. Application of Computational Intelligence to Improve Education in Smart Cities.

    Science.gov (United States)

    Gomede, Everton; Gaffo, Fernando Henrique; Briganó, Gabriel Ulian; de Barros, Rodolfo Miranda; Mendes, Leonardo de Souza

    2018-01-18

    According to UNESCO, education is a fundamental human right and every nation's citizens should be granted universal access with equal quality to it. Because this goal is yet to be achieved in most countries, in particular in the developing and underdeveloped countries, it is extremely important to find more effective ways to improve education. This paper presents a model based on the application of computational intelligence (data mining and data science) that leads to the development of the student's knowledge profile and that can help educators in their decision making for best orienting their students. This model also tries to establish key performance indicators to monitor objectives' achievement within individual strategic planning assembled for each student. The model uses random forest for classification and prediction, graph description for data structure visualization and recommendation systems to present relevant information to stakeholders. The results presented were built based on the real dataset obtained from a Brazilian private k-9 (elementary school). The obtained results include correlations among key data, a model to predict student performance and recommendations that were generated for the stakeholders.

  8. Application of Computational Intelligence to Improve Education in Smart Cities

    Directory of Open Access Journals (Sweden)

    Everton Gomede

    2018-01-01

    Full Text Available According to UNESCO, education is a fundamental human right and every nation’s citizens should be granted universal access with equal quality to it. Because this goal is yet to be achieved in most countries, in particular in the developing and underdeveloped countries, it is extremely important to find more effective ways to improve education. This paper presents a model based on the application of computational intelligence (data mining and data science) that leads to the development of the student’s knowledge profile and that can help educators in their decision making for best orienting their students. This model also tries to establish key performance indicators to monitor objectives’ achievement within individual strategic planning assembled for each student. The model uses random forest for classification and prediction, graph description for data structure visualization and recommendation systems to present relevant information to stakeholders. The results presented were built based on the real dataset obtained from a Brazilian private k-9 (elementary school). The obtained results include correlations among key data, a model to predict student performance and recommendations that were generated for the stakeholders.

  9. SNMP-SI: A Network Management Tool Based on Slow Intelligence System Approach

    Science.gov (United States)

    Colace, Francesco; de Santo, Massimo; Ferrandino, Salvatore

    The last decade has witnessed an intense spread of computer networks, a growth further accelerated by the introduction of wireless networks. This growth has significantly increased the problems of network management. Especially in small companies, where no personnel are assigned to these tasks, managing such networks is often complex and malfunctions can have significant impacts on business. A possible solution is the adoption of the Simple Network Management Protocol (SNMP), a standard protocol used to exchange network management information and part of the Transmission Control Protocol/Internet Protocol (TCP/IP) suite. SNMP provides a tool for network administrators to manage network performance, find and solve network problems, and plan for network growth. SNMP has one big disadvantage: its simple design means that the information it deals with is neither detailed nor well organized enough to meet expanding modern networking requirements. Over the past years much effort has been devoted to remedying the shortcomings of SNMP, and new frameworks have been developed; a promising approach involves the use of ontologies. This is the starting point of this paper, where a novel approach to network management based on the Slow Intelligence System methodology and ontology-based techniques is proposed. A Slow Intelligence System is a general-purpose system characterized by its ability to improve performance over time through a process involving enumeration, propagation, adaptation, elimination and concentration. The proposed approach aims to develop a system able to acquire, according to the SNMP standard, information from the various hosts in the managed networks and apply solutions to the problems found. To check the feasibility of this model, first experimental results in a real scenario are shown.
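
The slow-intelligence cycle named in the abstract (enumeration, adaptation, elimination, concentration) can be sketched generically. The remedy names and the scoring function below are invented stand-ins; a real system would score candidates by SNMP measurements taken from the managed hosts.

```python
import random

random.seed(1)

# Hypothetical candidate remedies for a congested link. Names and scores
# are illustrative only, not from the paper.
REMEDIES = ["reduce MTU", "enable QoS", "rate-limit host", "reroute traffic",
            "restart interface", "increase buffer"]

def simulated_improvement(remedy):
    # Stand-in for a real before/after SNMP measurement of link latency.
    base = {"reduce MTU": 0.1, "enable QoS": 0.6, "rate-limit host": 0.4,
            "reroute traffic": 0.8, "restart interface": 0.2,
            "increase buffer": 0.3}[remedy]
    return base + random.uniform(-0.05, 0.05)

def slow_intelligence_cycle(candidates, rounds=3):
    pool = list(candidates)                                   # enumeration
    for _ in range(rounds):
        scored = [(simulated_improvement(c), c) for c in pool]  # adaptation
        scored.sort(reverse=True)
        pool = [c for _, c in scored[:max(1, len(scored) // 2)]]  # elimination
    return pool[0]                                            # concentration

best = slow_intelligence_cycle(REMEDIES)
print("selected remedy:", best)
```

The point of the pattern is that candidates are not judged once: repeated measurement rounds let a slowly improving picture of the network drive the elimination step.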

  10. Applied Computational Intelligence in Engineering and Information Technology Revised and Selected Papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics SACI 2011

    CERN Document Server

    Precup, Radu-Emil; Preitl, Stefan

    2012-01-01

    This book highlights the potential benefits of various applications of computational intelligence techniques. It is structured to include a set of selected and extended papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics, SACI 2011, held in Timisoara, Romania, from 19 to 21 May 2011. After a rigorous review by the Technical Program Committee, only 116 submissions were accepted, leading to a paper acceptance ratio of 65%. A further refinement was made after the symposium, based also on the assessment of presentation quality. In conclusion, this book includes the extended and revised versions of the very best papers of SACI 2011 and a few invited papers authored by prominent specialists. Readers will benefit from gaining knowledge of computational intelligence and of the problems that can be solved in several areas; they will learn which approaches are advisable for solving these problems. A...

  11. Computer Assessed Design – A Vehicle of Architectural Communication and a Design Tool

    OpenAIRE

    Petrovici, Liliana-Mihaela

    2012-01-01

    In comparison with the limits of traditional representation tools, the development of computer graphics constitutes an opportunity to assert architectural values. The differences between the communication codes of architects and the public are diminished; architectural ideas can be represented in a coherent, intelligible and attractive way, so that they have more chances of being materialized according to the thinking of the creator. Concurrently, graphic software has been improving ...

  12. Intelligent tools for building a scientific information platform advanced architectures and solutions

    CERN Document Server

    Skonieczny, Lukasz; Rybinski, Henryk; Kryszkiewicz, Marzena; Niezgodka, Marek

    2013-01-01

    This book is a selection of results obtained within two years of research performed under SYNAT - a nation-wide scientific project aiming at creating an infrastructure for scientific content storage and sharing for academia, education and open knowledge society in Poland. The selection refers to the research in artificial intelligence, knowledge discovery and data mining, information retrieval and natural language processing, addressing the problems of implementing intelligent tools for building a scientific information platform. This book is a continuation and extension of the ideas presented in “Intelligent Tools for Building a Scientific Information Platform” published as volume 390 in the same series in 2012. It is based on the SYNAT 2012 Workshop held in Warsaw. The papers included in this volume present an overview and insight into information retrieval, repository systems, text processing, ontology-based systems, text mining, multimedia data processing and advanced software engineering.

  13. 3rd International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Biswal, Bhabendra; Udgata, Siba; Mandal, JK

    2015-01-01

    Volume 1 contains 95 papers presented at FICTA 2014: Third International Conference on Frontiers in Intelligent Computing: Theory and Applications. The conference was held on 14-15 November 2014 at Bhubaneswar, Odisha, India. This volume contains papers mainly focused on Data Warehousing and Mining, Machine Learning, Mobile and Ubiquitous Computing, AI, E-commerce & Distributed Computing and Soft Computing, Evolutionary Computing, Bio-inspired Computing and its Applications.

  14. Computational intelligence paradigms in economic and financial decision making

    CERN Document Server

    Resta, Marina

    2016-01-01

    The book focuses on a set of cutting-edge research techniques, highlighting the potential of soft computing tools in the analysis of economic and financial phenomena and in providing support for the decision-making process. In the first part the textbook presents a comprehensive and self-contained introduction to the field of self-organizing maps, elastic maps and social network analysis tools and provides necessary background material on the topic, including a discussion of more recent developments in the field. In the second part the focus is on practical applications, with particular attention paid to budgeting problems, market simulations, and decision-making processes, and on how such problems can be effectively managed by developing proper methods to automatically detect certain patterns. The book offers a valuable resource for both students and practitioners with an introductory-level college math background.

  15. Making Friends in Dark Shadows: An Examination of the Use of Social Computing Strategy Within the United States Intelligence Community Since 9/11

    Directory of Open Access Journals (Sweden)

    Andrew Chomik

    2011-01-01

    Full Text Available The tragic events of 9/11/2001 in the United States highlighted failures in communication and cooperation in the U.S. intelligence community. Agencies within the community failed to “connect the dots” by not collaborating in intelligence gathering efforts, which resulted in severe gaps in data sharing that eventually contributed to the terrorist attack on American soil. Since then, and under the recommendation made by the 9/11 Commission Report, the United States intelligence community has made organizational and operational changes to intelligence gathering and sharing, primarily with the creation of the Office of the Director of National Intelligence (ODNI). The ODNI has since introduced a series of web-based social computing tools to be used by all members of the intelligence community, primarily with its closed-access wiki entitled “Intellipedia” and their social networking service called “A-Space”. This paper argues that, while the introduction of these and other social computing tools have been adopted successfully into the intelligence workplace, they have reached a plateau in their use and serve only as complementary tools to otherwise pre-existing information sharing processes. Agencies continue to ‘stove-pipe’ their respective data, a chronic challenge that plagues the community due to bureaucratic policy, technology use and workplace culture. This paper identifies and analyzes these challenges, and recommends improvements in the use of these tools, both in the business processes behind them and the technology itself. These recommendations aim to provide possible solutions for using these social computing tools as part of a more trusted, collaborative information sharing process.

  16. ARGUMENTS ON USING COMPUTER-ASSISTED AUDIT TECHNIQUES (CAAT) AND BUSINESS INTELLIGENCE TO IMPROVE THE WORK OF THE FINANCIAL AUDITOR

    Directory of Open Access Journals (Sweden)

    Ciprian-Costel, MUNTEANU

    2014-11-01

    Full Text Available In the 21st century, one of the most efficient ways to achieve an independent audit and a quality opinion is by using information from the organization's database, mainly documents in electronic format. With the help of Computer-Assisted Audit Techniques (CAAT), the financial auditor analyzes part or even all of the data about a company in reference to other information within or outside the entity. The main purpose of this paper is to show the benefits of evolving from traditional audit techniques and tools to modern and, why not, visionary CAAT, which are supported by business intelligence systems. Given the opportunity to perform their work in IT environments, auditors would start using the tools of business intelligence, a key factor which contributes to making successful business decisions. CAAT enable auditors to test large amounts of data quickly and accurately and therefore increase the confidence they have in their opinion.
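
One classic computer-assisted audit technique (not named in this paper, but a standard illustration of the genre) is checking the leading-digit distribution of transaction amounts against Benford's law. The amounts below are made up for illustration.

```python
import math
from collections import Counter

# Hypothetical transaction amounts; a real CAAT run would pull these
# from the organization's database.
amounts = [1234.5, 187.2, 1050.0, 310.9, 998.0, 145.6, 2210.0, 132.8,
           176.4, 1890.3, 114.2, 365.0, 127.7, 1420.0, 193.5, 210.4]

# Tally leading digits and compare with Benford's expected frequencies,
# log10(1 + 1/d); large deviations flag data for closer inspection.
leading = Counter(str(a)[0] for a in amounts)
print("digit  observed  Benford")
for d in range(1, 10):
    expected = math.log10(1 + 1 / d)
    observed = leading.get(str(d), 0) / len(amounts)
    print(f"  {d}      {observed:.2f}     {expected:.2f}")
```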

  17. [Intelligent systems tools in the diagnosis of acute coronary syndromes: A systemic review].

    Science.gov (United States)

    Sprockel, John; Tejeda, Miguel; Yate, José; Diaztagle, Juan; González, Enrique

    2017-03-27

    Acute myocardial infarction is the leading cause of death from non-communicable disease worldwide. Its diagnosis is a highly complex task, for which modelling through automated methods has been attempted. A systematic review of the literature was performed on diagnostic studies that applied intelligent systems tools to the diagnosis of acute coronary syndromes, using the Medline, Embase, Scopus, IEEE/IET Electronic Library, ISI Web of Science, Latindex and LILACS databases. The review process was conducted independently by 2 reviewers, and discrepancies were resolved through the participation of a third person. The operational characteristics of the studied tools were extracted. A total of 35 references met the inclusion criteria. In 22 (62.8%) cases, neural networks were used. In five studies, the performance of several intelligent systems tools was compared. Thirteen studies sought to diagnose all acute coronary syndromes, and 22 studied only infarctions. In 21 cases, clinical and electrocardiographic aspects were used as input data, and in 10, only electrocardiographic data were used. Most intelligent systems use the clinical context as the reference standard. High rates of diagnostic accuracy were found, with better performance by neural networks and support vector machines compared with statistical pattern-recognition tools and decision trees. Extensive evidence shows that intelligent systems tools achieve a greater degree of accuracy than some clinical algorithms or scales and should thus be considered appropriate tools for supporting diagnostic decisions for acute coronary syndromes. Copyright © 2017 Instituto Nacional de Cardiología Ignacio Chávez. Publicado por Masson Doyma México S.A. All rights reserved.
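
As a toy illustration of the neural-network family of classifiers the review surveys, a single perceptron can already separate synthetic two-feature cases. The features and value ranges below are invented for the sketch, not clinical thresholds.

```python
import random

random.seed(7)

# Hypothetical 2-feature cases: (ST elevation in mm, troponin ratio);
# label 1 = acute coronary syndrome. Ranges are illustrative only.
def make_case(label):
    if label:
        return ([random.uniform(1.5, 4.0), random.uniform(2.0, 8.0)], 1)
    return ([random.uniform(0.0, 1.0), random.uniform(0.0, 1.5)], 0)

train = [make_case(i % 2) for i in range(40)]

# A minimal perceptron: the simplest ancestor of the neural networks
# used in the reviewed studies.
w, b = [0.0, 0.0], 0.0
for _ in range(50):
    for x, y in train:
        pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
        err = y - pred
        w = [wi + 0.1 * err * xi for wi, xi in zip(w, x)]
        b += 0.1 * err

acc = sum((1 if w[0]*x[0] + w[1]*x[1] + b > 0 else 0) == y
          for x, y in train) / len(train)
print(f"training accuracy: {acc:.2f}")
```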

  18. Investigating AI with Basic and Logo. Teaching Your Computer to Be Intelligent.

    Science.gov (United States)

    Mandell, Alan; Lucking, Robert

    1988-01-01

    Discusses artificial intelligence, its definitions, and potential applications. Provides listings of Logo and BASIC versions for programs along with REM statements needed to make modifications for use with Apple computers. (RT)

  19. The Intelligent Safety System: could it introduce complex computing into CANDU shutdown systems

    International Nuclear Information System (INIS)

    Hall, J.A.; Hinds, H.W.; Pensom, C.F.; Barker, C.J.; Jobse, A.H.

    1984-07-01

    The Intelligent Safety System is a computerized shutdown system being developed at the Chalk River Nuclear Laboratories (CRNL) for future CANDU nuclear reactors. It differs from current CANDU shutdown systems in both the algorithm used and the size and complexity of computers required to implement the concept. This paper provides an overview of the project, with emphasis on the computing aspects. Early in the project several needs leading to an introduction of computing complexity were identified, and a computing system that met these needs was conceived. The current work at CRNL centers on building a laboratory demonstration of the Intelligent Safety System, and evaluating the reliability and testability of the concept. Some fundamental problems must still be addressed for the Intelligent Safety System to be acceptable to a CANDU owner and to the regulatory authorities. These are also discussed along with a description of how the Intelligent Safety System might solve these problems.

  20. iPat: intelligent prediction and association tool for genomic research.

    Science.gov (United States)

    Chen, Chunpeng James; Zhang, Zhiwu

    2018-06-01

    The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which is the primary method to identify genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and zero tolerance for data formats and mistakes when inputting keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat. zhiwu.zhang@wsu.edu.
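
The genomic-prediction models iPat wraps (such as rrBLUP) are ridge-type regressions that shrink all marker effects toward zero. A minimal numpy sketch of that idea, with simulated genotypes and phenotypes and an illustrative shrinkage parameter, not iPat's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 200 individuals x 500 SNP markers coded -1/0/1;
# phenotype = 10 true marker effects plus noise. All values invented.
n, m = 200, 500
X = rng.integers(-1, 2, size=(n, m)).astype(float)
beta_true = np.zeros(m)
beta_true[rng.choice(m, 10, replace=False)] = rng.normal(0, 1, 10)
y = X @ beta_true + rng.normal(0, 0.5, n)

# Ridge regression: rrBLUP-style uniform shrinkage of marker effects.
lam = 10.0  # illustrative regularization strength
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(m), X.T @ y)

y_hat = X @ beta_hat
r = np.corrcoef(y, y_hat)[0, 1]
print(f"fit correlation: {r:.2f}")
```

In practice prediction accuracy is judged by cross-validation rather than the in-sample fit shown here.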

  1. Computational Tools for RF Structure Design

    CERN Document Server

    Jensen, E

    2004-01-01

    The Finite Differences Method and the Finite Element Method are the two principally employed numerical methods in modern RF field simulation programs. The basic ideas behind these methods are explained, with regard to available simulation programs. We then go through a list of characteristic parameters of RF structures, explaining how they can be calculated using these tools. With the help of these parameters, we introduce the frequency-domain and the time-domain calculations, leading to impedances and wake-fields, respectively. Subsequently, we present some readily available computer programs, which are in use for RF structure design, stressing their distinctive features and limitations. One final example benchmarks the precision of different codes for calculating the eigenfrequency and Q of a simple cavity resonator.
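
The frequency-domain eigenmode problem mentioned here can be illustrated in one dimension: a finite-difference discretization of an ideal resonator with fixed ends recovers the analytic wavenumbers. This is a sketch of the method, not of any particular RF code.

```python
import numpy as np

# 1-D finite-difference eigenmode sketch: -u'' = k^2 u on (0, L) with
# u(0) = u(L) = 0, the toy analogue of a cavity eigenfrequency solve.
N, L = 200, 1.0                       # interior grid points, cavity length
h = L / (N + 1)
main = np.full(N, 2.0 / h**2)         # tridiagonal FD Laplacian
off = np.full(N - 1, -1.0 / h**2)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

k2 = np.linalg.eigvalsh(A)            # eigenvalues are k^2 of the modes
k1 = np.sqrt(k2[0])                   # lowest mode; exact value is pi/L
print(f"lowest mode: k = {k1:.4f} (exact {np.pi:.4f})")
```

Refining the grid (larger N) shrinks the O(h²) discretization error, the same convergence behavior one checks when benchmarking 3-D codes.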

  2. Entheogens and Existential Intelligence: The Use of Plant Teachers as Cognitive Tools

    Science.gov (United States)

    Tupper, Kenneth W.

    2002-01-01

    In light of recent specific liberalizations in drug laws in some countries, I have investigated the potential of entheogens (i.e., psychoactive plants used as spiritual sacraments) as tools to facilitate existential intelligence. "Plant teachers" from the Americas such as ayahuasca, psilocybin mushrooms, and peyote, and the Indo-Aryan…

  3. A modern artificial intelligence Playware art tool for psychological testing of group dynamics

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2015-01-01

    and the psychological findings. We describe the modern artificial intelligence implementation of this instrument. Between an art piece and a psychological test, at a first cognitive analysis, it seems to be a promising research tool. In the discussion we speculate about potential industrial applications, as well....

  4. VISTA - computational tools for comparative genomics

    Energy Technology Data Exchange (ETDEWEB)

    Frazer, Kelly A.; Pachter, Lior; Poliakov, Alexander; Rubin,Edward M.; Dubchak, Inna

    2004-01-01

    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server at http://www-gsd.lbl.gov/VISTA/ was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, submit their own sequences of interest to several VISTA servers for various types of comparative analysis, and obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate capabilities of the VISTA site by the analysis of a 180 kilobase (kb) interval on human chromosome 5 that encodes for the kinesin family member 3A (KIF3A) protein.
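
The quantity a VISTA-style conservation plot draws along a genome is sliding-window percent identity over an alignment. A minimal sketch with invented sequences (real inputs would be long aligned genomic regions):

```python
def window_identity(seq_a, seq_b, window=5):
    """Percent identity of two aligned sequences in sliding windows —
    the per-position conservation score a VISTA-style plot visualizes."""
    assert len(seq_a) == len(seq_b)
    scores = []
    for i in range(len(seq_a) - window + 1):
        matches = sum(a == b
                      for a, b in zip(seq_a[i:i+window], seq_b[i:i+window]))
        scores.append(100.0 * matches / window)
    return scores

# Toy aligned fragments differing at one position.
human = "ACGTACGTAC"
mouse = "ACGTTCGTAC"
print(window_identity(human, mouse))
```

Windows covering the single mismatch score 80 %, the rest 100 %; peaks above a conservation threshold are the candidate functional elements.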

  5. Electronic circuit design with HEP computational tools

    International Nuclear Information System (INIS)

    Vaz, Mario

    1996-01-01

    CPSPICE is an electronic circuit statistical simulation program developed to run in a parallel environment under the UNIX operating system and the TCP/IP communications protocol, using CPS (Cooperative Processes Software), the SPICE program and the CERNLIB software package. It is part of a set of tools being developed to help electronic engineers design, model and simulate complex systems and circuits for High Energy Physics detectors, based on statistical methods, using the same software and methodology used by HEP physicists for data analysis. CPSPICE simulates electronic circuits by the Monte Carlo method, through several different processes running SPICE simultaneously on UNIX parallel computers or workstation farms. Data transfer between CPS processes for a modified version of SPICE2G6 is done through RAM memory, but can also be done through hard disk files if no source files are available for the simulator, or for bigger simulation output files. Simulation results are written to an HBOOK file as an NTUPLE, to be examined by HBOOK in batch mode or graphically, and analyzed by the statistical procedures available. The HBOOK file can be stored on hard disk for small amounts of data, or on Exabyte tape for large amounts of data. HEP tools also help with circuit or component modeling, like the MINUIT program from CERNLIB, which implements the Nelder and Mead Simplex and Gradient (with or without derivatives) algorithms, and can be used for design optimization. This paper presents the CPSPICE program implementation. The scheme adopted is suitable for parallelizing other electronic circuit simulators. (author)
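
The statistical core of such a tool — many randomized circuit evaluations aggregated into a distribution — can be sketched without SPICE itself. Below, a resistive divider with toleranced components stands in for a full netlist; all component values are invented for illustration.

```python
import random

random.seed(0)

# Monte Carlo tolerance analysis of a resistive divider: the kind of
# repeated randomized run CPSPICE distributes across SPICE processes.
VIN, R1_NOM, R2_NOM, TOL = 10.0, 1000.0, 2000.0, 0.05  # 5 % resistors

def one_trial():
    # Draw resistor values with the tolerance band at roughly 3 sigma.
    r1 = random.gauss(R1_NOM, R1_NOM * TOL / 3)
    r2 = random.gauss(R2_NOM, R2_NOM * TOL / 3)
    return VIN * r2 / (r1 + r2)          # divider output voltage

samples = [one_trial() for _ in range(10_000)]
mean = sum(samples) / len(samples)
spread = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
print(f"Vout = {mean:.3f} V +/- {spread:.3f} V")
```

In the real tool each "trial" is a full SPICE run and the samples land in an HBOOK NTUPLE instead of a Python list, but the statistics are the same.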

  6. The Impact of Business Intelligence Tools on Performance: A User Satisfaction Paradox?

    OpenAIRE

    Bernhard Wieder; Maria-Luise Ossimitz; Peter Chamoni

    2012-01-01

    While Business Intelligence (BI) initiatives have been a top priority of CIOs around the world for several years, accounting for billions of USD of IT investments per annum (IDC), academic research on the actual benefits derived from BI tools and the drivers of these benefits remains sparse. This paper reports the findings of an exploratory, cross-sectional field study investigating the factors that define and drive the benefits associated with the deployment of dedicated BI tools. BI is broadly d...

  7. A least-squares computational "tool kit"

    International Nuclear Information System (INIS)

    Smith, D.L.

    1993-04-01

    The information assembled in this report is intended to offer a useful computational "tool kit" to individuals who are interested in a variety of practical applications for the least-squares method of parameter estimation. The fundamental principles of Bayesian analysis are outlined first and these are applied to development of both the simple and the generalized least-squares conditions. Formal solutions that satisfy these conditions are given subsequently. Their application to both linear and non-linear problems is described in detail. Numerical procedures required to implement these formal solutions are discussed and two utility computer algorithms are offered for this purpose (codes LSIOD and GLSIOD written in FORTRAN). Some simple, easily understood examples are included to illustrate the use of these algorithms. Several related topics are then addressed, including the generation of covariance matrices, the role of iteration in applications of least-squares procedures, the effects of numerical precision and an approach that can be pursued in developing data analysis packages that are directed toward special applications
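
The generalized least-squares condition the report develops has the closed-form solution p̂ = (AᵀV⁻¹A)⁻¹AᵀV⁻¹y, with parameter covariance (AᵀV⁻¹A)⁻¹. A minimal numpy sketch of a straight-line fit with correlated errors (all numbers simulated; this is an illustration of the formula, not the report's FORTRAN codes):

```python
import numpy as np

rng = np.random.default_rng(1)

# Fit y = p0 + p1 * x with a full data covariance matrix V: independent
# variances plus a common systematic term correlating all points.
x = np.linspace(0, 10, 20)
A = np.column_stack([np.ones_like(x), x])   # design matrix
p_true = np.array([2.0, 0.5])
V = 0.04 * np.eye(20) + 0.01                # diag 0.05, off-diag 0.01
y = A @ p_true + rng.multivariate_normal(np.zeros(20), V)

# Generalized least squares: p_hat = (A^T V^-1 A)^-1 A^T V^-1 y
Vinv = np.linalg.inv(V)
cov_p = np.linalg.inv(A.T @ Vinv @ A)       # parameter covariance matrix
p_hat = cov_p @ A.T @ Vinv @ y
print("estimates:", p_hat.round(2))
```

Setting V diagonal recovers the simple (weighted) least-squares case.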

  8. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology...... is used for checking the consistency of a design with respect to the availablity of services and resources. In the second application, a tool for automatically implementing the communication infrastructure of a process network application, the Service Relation Model is used for analyzing the capabilities...

  9. Knowledge Based Artificial Augmentation Intelligence Technology: Next Step in Academic Instructional Tools for Distance Learning

    Science.gov (United States)

    Crowe, Dale; LaPierre, Martin; Kebritchi, Mansureh

    2017-01-01

    With augmented intelligence/knowledge based system (KBS) it is now possible to develop distance learning applications to support both curriculum and administrative tasks. Instructional designers and information technology (IT) professionals are now moving from the programmable systems era that started in the 1950s to the cognitive computing era.…

  10. An Intelligent Tutor for Intrusion Detection on Computer Systems

    National Research Council Canada - National Science Library

    Rowe, Neil C; Schiavo, Sandra

    1998-01-01

    ... critical. We describe a tutor incorporating two programs. The first program uses artificial-intelligence planning methods to generate realistic audit files reporting actions of a variety of simulated users (including intruders...

  11. Snap-drift neural computing for intelligent diagnostic feedback

    OpenAIRE

    Habte, Samson

    2017-01-01

    Information and communication technologies have been playing a crucial role in improving the efficiency and effectiveness of learning and teaching in higher education. Two decades ago, research studies were focused on how to use artificial intelligence techniques to imitate teachers or tutors in delivering learning sessions. Machine learning techniques have been applied in several research studies to construct a student model in the context of intelligent tutoring systems. However, the usage ...

  12. Methods of Computational Intelligence in the Context of Quality Assurance in Foundry Products

    Directory of Open Access Journals (Sweden)

    Rojek G.

    2016-06-01

    Full Text Available One way to ensure the required technical characteristics of castings is the strict control of production parameters affecting the quality of the finished products. If the production process is improperly configured, the resulting defects in castings lead to huge losses. Therefore, from the point of view of economics, it is advisable to use the methods of computational intelligence in the field of quality assurance and adjustment of parameters of future production. At the same time, the development of knowledge in the field of metallurgy, aimed to raise the technical level and efficiency of the manufacture of foundry products, should be followed by the development of information systems to support production processes in order to improve their effectiveness and compliance with the increasingly more stringent requirements of ergonomics, occupational safety, environmental protection and quality. This article is a presentation of artificial intelligence methods used in practical applications related to quality assurance. The problem of control of the production process involves the use of tools such as the induction of decision trees, fuzzy logic, rough set theory, artificial neural networks or case-based reasoning.
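
Of the techniques listed, fuzzy logic lends itself to a compact sketch: a toy rule base mapping process parameters to a defect-risk score. The membership thresholds below are invented, not metallurgical values.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def defect_risk(pour_temp_c, moisture_pct):
    """Toy fuzzy rule base: risk rises when the melt is cool or the
    moulding sand is wet. Thresholds are illustrative only."""
    too_cold = tri(pour_temp_c, 1250, 1300, 1380)
    too_wet = tri(moisture_pct, 3.0, 5.0, 7.0)
    return max(too_cold, too_wet)      # OR of rule activation strengths

print(f"risk: {defect_risk(1320, 4.0):.2f}")
```

A production system would combine many such rules (and defuzzify the result) to recommend parameter adjustments for the next batch.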

  13. Prediction of pork loin quality using online computer vision system and artificial intelligence model.

    Science.gov (United States)

    Sun, Xin; Young, Jennifer; Liu, Jeng-Hung; Newman, David

    2018-06-01

    The objective of this project was to develop a computer vision system (CVS) for objective measurement of pork loin under industry speed requirements. Color images of pork loin samples were acquired using a CVS. Subjective color and marbling scores were determined according to the National Pork Board standards by a trained evaluator. Instrument color measurement and crude fat percentage were used as control measurements. Image features (18 color features; 1 marbling feature; 88 texture features) were extracted from whole pork loin color images. An artificial intelligence prediction model (support vector machine) was established for pork color and marbling quality grades. The results showed that the CVS with support vector machine modeling reached the highest prediction accuracy of 92.5% for measured pork color score and 75.0% for measured pork marbling score. This research shows that the proposed artificial intelligence prediction model with CVS can provide an effective tool for predicting color and marbling in the pork industry at online speeds. Copyright © 2018 Elsevier Ltd. All rights reserved.
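
The pipeline shape — extract color features from an image, then classify into a quality grade — can be sketched with a nearest-centroid classifier standing in for the paper's support vector machine. Images, colors, and grade labels below are all simulated for illustration.

```python
import random

random.seed(3)

def mean_rgb(pixels):
    """Mean R, G, B over a list of 3-tuples — a minimal 'color feature'."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def fake_image(base):
    """Simulated loin image: 100 pixels jittered around a base color."""
    return [tuple(min(255, max(0, c + random.gauss(0, 5))) for c in base)
            for _ in range(100)]

# Train: average the feature vectors of simulated samples per grade.
training = {"pale": (230, 180, 180), "dark": (120, 40, 40)}
CENTROIDS = {}
for label, base in training.items():
    feats = [mean_rgb(fake_image(base)) for _ in range(10)]
    CENTROIDS[label] = mean_rgb(feats)   # centroid of the feature vectors

def classify(img):
    # Nearest centroid in feature space (the real system trains an SVM).
    f = mean_rgb(img)
    return min(CENTROIDS,
               key=lambda l: sum((a - b) ** 2
                                 for a, b in zip(f, CENTROIDS[l])))

print(classify(fake_image((220, 170, 170))))
```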

  14. Embedding Topical Elements of Parallel Programming, Computer Graphics, and Artificial Intelligence across the Undergraduate CS Required Courses

    Directory of Open Access Journals (Sweden)

    James Wolfer

    2015-02-01

    Full Text Available Traditionally, topics such as parallel computing, computer graphics, and artificial intelligence have been taught as stand-alone courses in the computing curriculum. Often these are elective courses, limiting the material to the subset of students choosing to take the course. Recently there has been movement to distribute topics across the curriculum in order to ensure that all graduates have been exposed to concepts such as parallel computing. Previous work described an attempt to systematically weave a tapestry of topics into the undergraduate computing curriculum. This paper reviews that work and expands it with representative examples of assignments, demonstrations, and results, as well as describing how the tools and examples deployed for these classes have a residual effect on classes such as Computer Literacy.

  15. Development and evaluation of intelligent machine tools based on knowledge evolution in M2M environment

    International Nuclear Information System (INIS)

    Kim, Dong Hoon; Song, Jun Yeob; Lee, Jong Hyun; Cha, Suk Keun

    2009-01-01

    In the near future, the foreseen improvement in machine tools will be in the form of knowledge evolution-based intelligent devices. The goal of this study is to develop intelligent machine tools with knowledge-evolution capability in Machine to Machine (M2M) wired and wireless environments. The knowledge evolution-based intelligent machine tools are expected to be capable of gathering knowledge autonomously, producing knowledge, understanding knowledge, reasoning about knowledge, making new decisions, dialoguing with other machines, etc. The concept of the knowledge-evolution intelligent machine originated from the way a human expert operates a machine through sensing, dialogue and decision. The structure of knowledge evolution in M2M and the scheme for a dialogue agent among agent-based modules such as a sensory agent, a dialogue agent and an expert system (decision support agent) are presented in this paper, and work-offset compensation for thermal change and recommendation of cutting conditions are performed on-line for knowledge-evolution verification

  16. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    Full Text Available One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology that allow for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods, and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view of the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; (c) to analyze the activities, methods, and tools associated with each stage of an EI process.

  17. Artificial intelligence tools decision support systems in condition monitoring and diagnosis

    CERN Document Server

    Galar Pascual, Diego

    2015-01-01

    Artificial Intelligence Tools: Decision Support Systems in Condition Monitoring and Diagnosis discusses various white- and black-box approaches to fault diagnosis in condition monitoring (CM). This indispensable resource: Addresses nearest-neighbor-based, clustering-based, statistical, and information theory-based techniques Considers the merits of each technique as well as the issues associated with real-life application Covers classification methods, from neural networks to Bayesian and support vector machines Proposes fuzzy logic to explain the uncertainties associated with diagnostic processes Provides data sets, sample signals, and MATLAB® code for algorithm testing Artificial Intelligence Tools: Decision Support Systems in Condition Monitoring and Diagnosis delivers a thorough evaluation of the latest AI tools for CM, describing the most common fault diagnosis techniques used and the data acquired when these techniques are applied.
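    The nearest-neighbor-based diagnosis techniques the book addresses can be sketched in a few lines (an illustrative stand-in, not code from the book; the feature values and fault labels are invented): each machine state is a feature vector, labelled examples form the reference set, and a new measurement takes the label of its closest neighbor.

```python
# Hedged sketch of nearest-neighbour fault diagnosis: features here are
# (vibration RMS, bearing temperature °C), both values invented.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_neighbour(reference, sample):
    """reference: list of (features, label); returns label of closest."""
    return min(reference, key=lambda r: euclidean(r[0], sample))[1]

reference = [
    ((0.2, 40.0), "healthy"),
    ((0.3, 42.0), "healthy"),
    ((1.5, 70.0), "bearing fault"),
    ((1.8, 75.0), "bearing fault"),
]
print(nearest_neighbour(reference, (1.6, 72.0)))  # → bearing fault
```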

  18. CRISP. Requirements Specifications of Intelligent ICT Simulation Tools for Power Applications

    Energy Technology Data Exchange (ETDEWEB)

    Warmer, C.J.; Kester, J.C.P.; Kamphuis, I.G. [ECN Energy in the Built Environment and Networks, Petten (Netherlands); Carlsson, P [EnerSearch, Malmoe (Sweden); Fontela, M. [Laboratory of Electrical Engineering LEG, Grenoble (France); Gustavsson, R. [Blekinge Institute of Technology BTH, Karlskrona (Sweden)

    2003-10-15

    This report, deliverable D2.1 in the CRISP project, serves as a preparation report for the development of simulation tools and prototype software which will be developed in forthcoming stages of the CRISP project. Application areas for these simulations are: fault detection and diagnosis, supply and demand matching, and intelligent load shedding. The context in which these applications function is the power network with a high degree of distributed generation, including renewables. In order to control a so-called distributed grid we can benefit from a high level of distributed control and intelligence. This requires, on top of the power system network, an information and communication network. We argue that such a network should be seen as an enabler of distributed control and intelligence. The applications, through which control and intelligence are implemented, then form a third network layer, the service-oriented network. Building upon this three-layered network model, we derive in this report the requirements for a simulation tool and experiments that study new techniques for fault detection and diagnostics, and for simulation tools and experiments implementing intelligent load shedding and supply and demand matching scenarios. We also look at future implementation of these services within the three-layered network model and the requirements that follow for the core information and communication network and for the service-oriented network. These requirements, supported by the studies performed in CRISP Workpackage 1, serve as a basis for development of the simulation tools in tasks 2.2 to 2.4.

  19. CRISP. Requirements Specifications of Intelligent ICT Simulation Tools for Power Applications

    International Nuclear Information System (INIS)

    Warmer, C.J.; Kester, J.C.P.; Kamphuis, I.G.; Carlsson, P; Fontela, M.; Gustavsson, R.

    2003-10-01

    This report, deliverable D2.1 in the CRISP project, serves as a preparation report for the development of simulation tools and prototype software which will be developed in forthcoming stages of the CRISP project. Application areas for these simulations are: fault detection and diagnosis, supply and demand matching, and intelligent load shedding. The context in which these applications function is the power network with a high degree of distributed generation, including renewables. In order to control a so-called distributed grid we can benefit from a high level of distributed control and intelligence. This requires, on top of the power system network, an information and communication network. We argue that such a network should be seen as an enabler of distributed control and intelligence. The applications, through which control and intelligence are implemented, then form a third network layer, the service-oriented network. Building upon this three-layered network model, we derive in this report the requirements for a simulation tool and experiments that study new techniques for fault detection and diagnostics, and for simulation tools and experiments implementing intelligent load shedding and supply and demand matching scenarios. We also look at future implementation of these services within the three-layered network model and the requirements that follow for the core information and communication network and for the service-oriented network. These requirements, supported by the studies performed in CRISP Workpackage 1, serve as a basis for development of the simulation tools in tasks 2.2 to 2.4.

  20. Visual intelligence Microsoft tools and techniques for visualizing data

    CERN Document Server

    Stacey, Mark; Jorgensen, Adam

    2013-01-01

    Go beyond design concepts and learn to build state-of-the-art visualizations The visualization experts at Microsoft's Pragmatic Works have created a full-color, step-by-step guide to building specific types of visualizations. The book thoroughly covers the Microsoft toolset for data analysis and visualization, including Excel, and explores best practices for choosing a data visualization design, selecting tools from the Microsoft stack, and building a dynamic data visualization from start to finish. You'll examine different types of visualizations, their strengths and weaknesses, a

  1. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested in whether software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  2. Artificial Intelligent Platform as Decision Tool for Asset Management, Operations and Maintenance.

    Science.gov (United States)

    2018-01-04

    An Artificial Intelligence (AI) system has been developed and implemented for water, wastewater, and reuse plants to improve management of sensors, short- and long-term maintenance plans, and asset and investment management plans. It is based on an integrated approach that captures data from different computer systems and files, adds a layer of intelligence to the data, and serves as a repository of the key current and future operations and maintenance conditions that a plant needs to be aware of. With this information, it is able to simulate the configuration of processes and assets for those conditions to improve or optimize operations, maintenance, and asset management, using the IViewOps (Intelligent View of Operations) model. Based on optimization through model runs, it is able to create output files that feed data to other systems and inform staff of optimal solutions to the conditions experienced or anticipated in the future.

  3. Application of computational intelligence techniques for load shedding in power systems: A review

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Bakar, A.H.A.; Mohamad, Hasmaini

    2013-01-01

    Highlights: • The power system blackout history of the last two decades is presented. • Conventional load shedding techniques, their types and limitations are presented. • Applications of intelligent techniques in load shedding are presented. • Intelligent techniques include ANN, fuzzy logic, ANFIS, genetic algorithm and PSO. • The discussion and comparison between these techniques are provided. - Abstract: Recent blackouts around the world question the reliability of conventional and adaptive load shedding techniques in avoiding such power outages. To address this issue, reliable techniques are required to provide fast and accurate load shedding to prevent collapse in the power system. Computational intelligence techniques, due to their robustness and flexibility in dealing with complex non-linear systems, could be an option in addressing this problem. Computational intelligence includes techniques such as artificial neural networks, genetic algorithms, fuzzy logic control, adaptive neuro-fuzzy inference systems, and particle swarm optimization. Research in these techniques is being undertaken in order to discover means for more efficient and reliable load shedding. This paper provides an overview of these techniques as applied to load shedding in a power system. This paper also compares the advantages of computational intelligence techniques over conventional load shedding techniques. Finally, this paper discusses the limitations of computational intelligence techniques, which restrict their use in real-time load shedding.
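    One of the techniques the review covers, the genetic algorithm, can be illustrated with a toy load-shedding sketch (not from the paper; the feeder loads, deficit, and GA parameters are invented): evolve a bit-vector selecting which feeders to shed so the total shed load covers a generation shortfall with minimal excess.

```python
# Illustrative GA for load-shedding selection; all numbers invented.
import random

loads = [5.0, 8.0, 3.0, 12.0, 6.0, 4.0]   # feeder loads in MW
DEFICIT = 15.0                             # generation shortfall in MW

def fitness(bits):
    """Lower is better: shed at least DEFICIT, with minimal excess."""
    shed = sum(l for l, b in zip(loads, bits) if b)
    if shed < DEFICIT:                     # under-shedding risks collapse
        return 1000.0 + (DEFICIT - shed)
    return shed - DEFICIT                  # penalty for excess shedding

def evolve(pop_size=30, gens=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in loads] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]   # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(loads))
            child = a[:cut] + b[cut:]      # one-point crossover
            if rng.random() < 0.2:         # bit-flip mutation
                i = rng.randrange(len(loads))
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    best = min(pop, key=fitness)
    return best, sum(l for l, b in zip(loads, best) if b)

best, shed_mw = evolve()
print(shed_mw)   # total MW shed; a feasible plan has shed_mw >= DEFICIT
```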

  4. Artificial intelligence

    CERN Document Server

    Hunt, Earl B

    1975-01-01

    Artificial Intelligence provides information pertinent to the fundamental aspects of artificial intelligence. This book presents the basic mathematical and computational approaches to problems in the artificial intelligence field.Organized into four parts encompassing 16 chapters, this book begins with an overview of the various fields of artificial intelligence. This text then attempts to connect artificial intelligence problems to some of the notions of computability and abstract computing devices. Other chapters consider the general notion of computability, with focus on the interaction bet

  5. Automatic welding detection by an intelligent tool pipe inspection

    Science.gov (United States)

    Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.

    2015-07-01

    This work provides a model based on machine learning techniques for weld recognition, using signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets, and its performance was measured with cross-validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
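    The pipeline the abstract describes can be sketched end to end with a simplified stand-in (the paper uses ANN/SVM classifiers; here a moving-average smoother plus a threshold detector replaces them, and the trace values are invented):

```python
# Hedged illustration: smooth the in-line inspection signal, then flag
# weld locations where the smoothed magnitude exceeds a threshold.

def moving_average(signal, window=3):
    """Centered moving average with shrinking edges."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def detect_welds(signal, threshold=2.0, window=3):
    smooth = moving_average(signal, window)
    return [i for i, v in enumerate(smooth) if v > threshold]

# Noisy magnetic-flux trace with a weld signature around index 5
trace = [0.1, 0.3, 0.2, 0.4, 2.8, 3.1, 2.9, 0.3, 0.2, 0.1]
print(detect_welds(trace))  # → [4, 5, 6]
```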

  6. The intelligent clinical laboratory as a tool to increase cancer care management productivity.

    Science.gov (United States)

    Mohammadzadeh, Niloofar; Safdari, Reza

    2014-01-01

    Studies of the causes of cancer, early detection, prevention, and treatment need accurate, comprehensive, and timely cancer data. The clinical laboratory provides important cancer information needed by physicians, which influences clinical decisions regarding treatment, diagnosis, and patient monitoring. Poor communication between health care providers and clinical laboratory personnel can lead to medical errors and wrong decisions in providing cancer care. Because of the key impact of laboratory information on cancer diagnosis and treatment, the quality of the tests, lab reports, and appropriate lab management are very important. A laboratory information management system (LIMS) can play an important role in diagnosis, fast and effective access to cancer data, decreasing redundancy and costs, and facilitating the integration and collection of data from different types of instruments and systems. In spite of its significant advantages, LIMS is limited by factors such as problems in adaptation to new instruments that may change existing work processes. Applying intelligent software alongside existing information systems, in addition to removing these restrictions, has important benefits including adding non-laboratory-generated information to reports, facilitating decision making, and improving the quality and productivity of cancer care services. Laboratory systems must have the flexibility to change and the capability to develop and benefit from intelligent devices. Intelligent laboratory information management systems need to benefit from informatics tools and the latest technologies, such as open source software. The aim of this commentary is to survey the application, opportunities, and necessity of the intelligent clinical laboratory as a tool to increase cancer care management productivity.

  7. An artificial intelligence tool for complex age-depth models

    Science.gov (United States)

    Bradley, E.; Anderson, K. A.; de Vesine, L. R.; Lai, V.; Thomas, M.; Nelson, T. H.; Weiss, I.; White, J. W. C.

    2017-12-01

    CSciBox is an integrated software system for age modeling of paleoenvironmental records. It incorporates an array of data-processing and visualization facilities, ranging from 14C calibrations to sophisticated interpolation tools. Using CSciBox's GUI, a scientist can build custom analysis pipelines by composing these built-in components or adding new ones. Alternatively, she can employ CSciBox's automated reasoning engine, Hobbes, which uses AI techniques to perform an in-depth, autonomous exploration of the space of possible age-depth models and presents the results—both the models and the reasoning that was used in constructing and evaluating them—to the user for her inspection. Hobbes accomplishes this using a rulebase that captures the knowledge of expert geoscientists, which was collected over the course of more than 100 hours of interviews. It works by using these rules to generate arguments for and against different age-depth model choices for a given core. Given a marine-sediment record containing uncalibrated 14C dates, for instance, Hobbes tries CALIB-style calibrations using a choice of IntCal curves, with reservoir age correction values chosen from the 14CHRONO database using the lat/long information provided with the core, and finally composes the resulting age points into a full age model using different interpolation methods. It evaluates each model—e.g., looking for outliers or reversals—and uses that information to guide the next steps of its exploration, and presents the results to the user in human-readable form. The most powerful of CSciBox's built-in interpolation methods is BACON, a Bayesian sedimentation-rate algorithm—a powerful but complex tool that can be difficult to use. Hobbes adjusts BACON's many parameters autonomously to match the age model to the expectations of expert geoscientists, as captured in its rulebase. It then checks the model against the data and iteratively re-calculates until it is a good fit to the data.
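    The final step Hobbes automates, composing calibrated age control points into a full age-depth model, can be sketched with the simplest of CSciBox's interpolation options, linear interpolation (the tie-point values below are invented; the real system also offers BACON and other methods):

```python
# Simplified age-depth model: linear interpolation between dated
# control points. Depths in cm, ages in calendar years BP (invented).

def age_model(control_points, depth):
    """control_points: (depth_cm, age_yr_BP) pairs; returns age at depth."""
    pts = sorted(control_points)
    for (d0, a0), (d1, a1) in zip(pts, pts[1:]):
        if d0 <= depth <= d1:
            frac = (depth - d0) / (d1 - d0)
            return a0 + frac * (a1 - a0)
    raise ValueError("depth outside dated interval")

tie_points = [(0, 0.0), (50, 1200.0), (120, 4500.0)]
print(age_model(tie_points, 85.0))  # → 2850.0
```

    A real age model would also propagate calibration uncertainty and check for reversals, which is exactly the kind of evaluation the abstract says Hobbes performs.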

  8. 16th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    2016-01-01

    This edited book presents scientific results of the 16th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2015), which was held on June 1-3, 2015 in Takamatsu, Japan. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences and exchange new ideas and information in a meaningful way, to present research results on all aspects (theory, applications, and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them.

  9. 17th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    SNPD 2016

    2016-01-01

    This edited book presents scientific results of the 17th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2016), which was held on May 30 - June 1, 2016 in Shanghai, China. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences and exchange new ideas and information in a meaningful way, to present research results on all aspects (theory, applications, and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them.

  10. Open source intelligence: A tool to combat illicit trafficking

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeberg, J [Swedish Armed Forces HQ, Stockholm (Sweden)

    2001-10-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information, as well as other unclassified information that has limited public distribution or access. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media, or to a much more limited customer base such as companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, the term Open Source says nothing about the information itself; it only indicates whether or not the information is classified.

  11. Open source intelligence: A tool to combat illicit trafficking

    International Nuclear Information System (INIS)

    Sjoeberg, J.

    2001-01-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information, as well as other unclassified information that has limited public distribution or access. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media, or to a much more limited customer base such as companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, the term Open Source says nothing about the information itself; it only indicates whether or not the information is classified.

  12. Artificial Intelligence, Evolutionary Computing and Metaheuristics In the Footsteps of Alan Turing

    CERN Document Server

    2013-01-01

    Alan Turing pioneered many research areas, such as artificial intelligence, computability, heuristics and pattern formation. Nowadays, in the information age, it is hard to imagine how the world would be without computers and the Internet. Without Turing's work, especially the core concept of the Turing Machine at the heart of every computer, mobile phone and microchip today, so many things on which we are so dependent would be impossible. 2012 is the Alan Turing year -- a centenary celebration of the life and work of Alan Turing. To celebrate Turing's legacy and follow in the footsteps of this brilliant mind, we take this golden opportunity to review the latest developments in the areas of artificial intelligence, evolutionary computation and metaheuristics, all of which can be traced back to Turing's pioneering work. Topics include the Turing test, Turing machine, artificial intelligence, cryptography, software testing, image processing, neural networks, and nature-inspired algorithms such as the bat algorithm and cuckoo search...

  13. Intelligent front-end sample preparation tool using acoustic streaming.

    Energy Technology Data Exchange (ETDEWEB)

    Cooley, Erika J.; McClain, Jaime L.; Murton, Jaclyn K.; Edwards, Thayne L.; Achyuthan, Komandoor E.; Branch, Darren W.; Clem, Paul Gilbert; Anderson, John Mueller; James, Conrad D.; Smith, Gennifer; Kotulski, Joseph Daniel

    2009-09-01

    We have successfully developed a nucleic acid extraction system based on a microacoustic lysis array coupled to an integrated nucleic acid extraction system, all on a single cartridge. The microacoustic lysing array is based on 36° Y-cut lithium niobate, which couples bulk acoustic waves (BAW) into the microchannels. The microchannels were fabricated using Mylar laminates and fused silica to form acoustic-fluidic interface cartridges. The transducer array consists of four active elements directed for cell lysis and one optional BAW element for mixing on the cartridge. The lysis system was modeled using one-dimensional (1D) transmission line and two-dimensional (2D) FEM models. For input powers required to lyse cells, the flow rate dictated the temperature change across the lysing region. From the computational models, a flow rate of 10 µL/min produced a temperature rise of 23.2 °C, and only 6.7 °C when flowing at 60 µL/min. The measured temperature changes were 5 °C less than the model. The computational models also permitted optimization of the acoustic coupling to the microchannel region and revealed the potential impact of thermal effects if not controlled. Using E. coli, we achieved a lysing efficacy of 49.9 ± 29.92% based on a cell viability assay, with a 757.2% increase in ATP release within 20 seconds of acoustic exposure. A bench-top lysing system required 15-20 minutes operating at up to 58 Watts to achieve the same level of cell lysis. We demonstrate that active mixing on the cartridge was critical to maximize binding and release of nucleic acid to the magnetic beads. Using a sol-gel silica bead matrix filled microchannel, the extraction efficacy was 40%. The cartridge-based magnetic bead system had an extraction efficiency of 19.2%. For an electric field based method that used Nafion films, a nucleic acid extraction efficiency of 66.3% was achieved at 6 volts DC.
For the flow rates we tested (10-50 µL/min), the nucleic acid extraction

  14. A computational intelligence approach to the Mars Precision Landing problem

    Science.gov (United States)

    Birge, Brian Kent, III

    Various proposed Mars missions, such as the Mars Sample Return Mission (MRSR) and the Mars Smart Lander (MSL), require precise re-entry terminal position and velocity states. This is to achieve mission objectives including rendezvous with a previously landed mission, or reaching a particular geographic landmark. The current state-of-the-art footprint is on the order of kilometers. For this research, a Mars precision landing is achieved with a landed footprint of no more than 100 meters, for a set of initial entry conditions representing worst-guess dispersions. Obstacles to reducing the landed footprint include trajectory dispersions due to initial atmospheric entry conditions (entry angle, parachute deployment height, etc.), environment (wind, atmospheric density, etc.), parachute deployment dynamics, unavoidable injection error (propagated error from launch on), etc. Weather and atmospheric models have been developed. Three descent scenarios have been examined. First, terminal re-entry is achieved via a ballistic parachute with concurrent thrusting events while on the parachute, followed by a gravity turn. Second, terminal re-entry is achieved via a ballistic parachute followed by a gravity turn to hover and then thrust vectoring to the desired location. Third, a guided parafoil approach followed by vectored thrusting to reach terminal velocity is examined. The guided parafoil is determined to be the best architecture. The purpose of this study is to examine the feasibility of using a computational intelligence strategy to facilitate precision planetary re-entry, specifically to take an approach that is somewhat more intuitive and less rigid, and see where it leads. The test problems used for all research are variations on proposed Mars landing mission scenarios developed by NASA. A relatively recent method of evolutionary computation is Particle Swarm Optimization (PSO), which can be considered to be in the same general class as Genetic Algorithms.
An improvement over
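    The Particle Swarm Optimization method the abstract introduces can be sketched in bare-bones form (not the study's implementation; the landing-dispersion objective is replaced by a toy sphere function, and the swarm parameters are conventional textbook choices):

```python
# Minimal global-best PSO sketch; all parameters are assumptions.
import random

def pso(objective, dim=2, swarm=20, iters=100, seed=3):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]            # each particle's best position
    gbest = min(pbest, key=objective)[:]   # swarm's best position
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                        # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

def sphere(x):                 # toy stand-in for the landing objective
    return sum(v * v for v in x)

best = pso(sphere)
print(sphere(best))            # small value: swarm converges near origin
```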

  15. Artificial Intelligence: Realizing the Ultimate Promises of Computing

    OpenAIRE

    Waltz, David L.

    1997-01-01

    Artificial intelligence (AI) is the key technology in many of today's novel applications, ranging from banking systems that detect attempted credit card fraud, to telephone systems that understand speech, to software systems that notice when you're having problems and offer appropriate advice. These technologies would not exist today without the sustained federal support of fundamental AI research over the past three decades.

  16. Hidden Hearing Loss and Computational Models of the Auditory Pathway: Predicting Speech Intelligibility Decline

    Science.gov (United States)

    2016-11-28

    Christopher J. Smalt. ... representation of speech intelligibility in noise. The auditory-periphery model of Zilany et al. (JASA 2009, 2014) is used to make predictions of auditory nerve (AN) responses to speech stimuli under a variety of difficult listening conditions. The resulting cochlear neurogram, a spectrogram

  17. Coupling artificial intelligence and numerical computation for engineering design (Invited paper)

    Science.gov (United States)

    Tong, S. S.

    1986-01-01

    The possibility of combining artificial intelligence (AI) systems and numerical computation methods for engineering designs is considered. Attention is given to three possible areas of application involving fan design, controlled vortex design of turbine stage blade angles, and preliminary design of turbine cascade profiles. Among the AI techniques discussed are: knowledge-based systems; intelligent search; and pattern recognition systems. The potential cost and performance advantages of an AI-based design-generation system are discussed in detail.

  18. STAR- A SIMPLE TOOL FOR AUTOMATED REASONING SUPPORTING HYBRID APPLICATIONS OF ARTIFICIAL INTELLIGENCE (UNIX VERSION)

    Science.gov (United States)

    Borchardt, G. C.

    1994-01-01

    The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input
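    The semantic network with value inheritance that the abstract describes can be sketched as follows (a hypothetical illustration, not STAR's actual data model; the class, node names, and attribute values are invented): a node without its own value for a key walks up its parent chain and inherits the first value found.

```python
# Hypothetical sketch of semantic-network value inheritance.

class Node:
    def __init__(self, name, parent=None, **values):
        self.name, self.parent, self.values = name, parent, dict(values)

    def get(self, key):
        """Return this node's value for key, inheriting from ancestors."""
        node = self
        while node is not None:
            if key in node.values:
                return node.values[key]
            node = node.parent      # walk up the is-a hierarchy
        raise KeyError(key)

instrument = Node("instrument", data_type="numeric")
spectrometer = Node("spectrometer", parent=instrument, bands=224)
print(spectrometer.get("bands"))      # → 224 (own value)
print(spectrometer.get("data_type"))  # → numeric (inherited)
```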

  19. STAR- A SIMPLE TOOL FOR AUTOMATED REASONING SUPPORTING HYBRID APPLICATIONS OF ARTIFICIAL INTELLIGENCE (DEC VAX VERSION)

    Science.gov (United States)

    Borchardt, G. C.

    1994-01-01

    The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input

  20. Intelligence

    Science.gov (United States)

    Sternberg, Robert J.

    2012-01-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain—especially with regard to the functioning in the prefrontal cortex—and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret. PMID:22577301

  2. Prospective EFL Teachers' Emotional Intelligence and Tablet Computer Use and Literacy

    Science.gov (United States)

    Herguner, Sinem

    2017-01-01

    The aim of this study was to investigate whether there is a relationship between tablet computer use and literacy, and emotional intelligence of prospective English language teachers. The study used a survey approach. In the study, "Prospective Teachers Tablet Computer Use and Literacy Scale" and an adapted and translated version into…

  3. Intelligent Electric Power Systems with Active-Adaptive Electric Networks: Challenges for Simulation Tools

    Directory of Open Access Journals (Sweden)

    Ufa Ruslan A.

    2015-01-01

    Full Text Available. The motivation for the presented research is based on the need for new methods and tools for adequate simulation of intelligent electric power systems with active-adaptive electric networks (IES), including Flexible Alternating Current Transmission System (FACTS) devices. The key requirements for the simulation were formed. The presented analysis of IES simulation results confirms the need to use a hybrid modelling approach.

  4. Identifying Key Features, Cutting Edge Cloud Resources, and Artificial Intelligence Tools to Achieve User-Friendly Water Science in the Cloud

    Science.gov (United States)

    Pierce, S. A.

    2017-12-01

    Decision making for groundwater systems is becoming increasingly important as shifting water demands increasingly impact aquifers. As buffer systems, aquifers provide room for resilient responses and extend the actual timeframe for hydrological response. Yet the pace of impacts, climate shifts, and degradation of water resources is accelerating. To meet these new drivers, groundwater science is transitioning toward the emerging field of Integrated Water Resources Management, or IWRM. IWRM incorporates a broad array of dimensions, methods, and tools to address problems that tend to be complex. Computational tools and accessible cyberinfrastructure (CI) are needed to cross the chasm between science and society. Fortunately, cloud computing environments, such as the new Jetstream system, are evolving rapidly. While still targeting scientific user groups, systems such as Jetstream offer configurable cyberinfrastructure that enables interactive computing and data analysis resources on demand. The web-based interfaces allow researchers to rapidly customize virtual machines, modify computing architecture, and increase the usability of and access to advanced compute environments for broader audiences. The result enables dexterous configurations, opening up opportunities for IWRM modelers to expand the reach of analyses, the number of case studies, and the quality of engagement with stakeholders and decision makers. The acute need to identify improved IWRM solutions, paired with advanced computational resources, refocuses the attention of IWRM researchers on applications, workflows, and intelligent systems that are capable of accelerating progress. IWRM must address key drivers of community concern, implement transdisciplinary methodologies, and adapt and apply decision support tools in order to effectively support decisions about groundwater resource management. This presentation will provide an overview of advanced computing services in the cloud using integrated groundwater management case studies.

  5. Soft Computing Optimizer For Intelligent Control Systems Design: The Structure And Applications

    Directory of Open Access Journals (Sweden)

    Sergey A. Panfilov

    2003-10-01

    Full Text Available. Soft Computing Optimizer (SCO), a new software tool for the design of robust intelligent control systems, is described. It is based on the hybrid methodology of soft computing and stochastic simulation, and takes as input measured or simulated data about the modeled system. SCO is used to design an optimal fuzzy inference system that approximates the random behavior of the control object with a given accuracy. The task of constructing the fuzzy inference system is reduced to subtasks such as forming the linguistic variables for each input and output variable, creating the rule database, optimizing the rule database, and refining the parameters of the membership functions. Each subtask is solved by a corresponding genetic algorithm with an appropriate fitness function. The result of applying SCO is the Knowledge Base of a Fuzzy Controller, which contains the value information about the developed fuzzy inference system. This information can be downloaded into an actual fuzzy controller to perform online fuzzy control. Simulation results of robust fuzzy control of nonlinear dynamic systems and experimental results of an application to automotive semi-active suspension control are demonstrated.
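
    The per-subtask genetic optimization the abstract describes can be illustrated with a toy example: a genetic algorithm tuning the centre of a single triangular membership function to match target data. The population size, mutation scale, and fitness function are all illustrative choices, not SCO's actual design:

```python
# Toy genetic algorithm tuning the centre c of a triangular membership
# function mu(x) = max(0, 1 - |x - c|/w) to match sampled target data.
import random

random.seed(0)
W = 2.0
TARGET_C = 1.5
xs = [i * 0.1 for i in range(-20, 41)]

def mu(x, c, w=W):
    return max(0.0, 1.0 - abs(x - c) / w)

target = [mu(x, TARGET_C) for x in xs]

def fitness(c):
    # Negative squared error: higher is better.
    return -sum((mu(x, c) - t) ** 2 for x, t in zip(xs, target))

pop = [random.uniform(-2, 4) for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:5]
    # Elitism plus mutated offspring.
    pop = parents + [p + random.gauss(0, 0.3) for p in parents for _ in range(3)]

best = max(pop, key=fitness)
print(round(best, 2))  # converges near the target centre
```

    Elitist selection guarantees the best candidate never gets worse, so with a smooth unimodal fitness the population homes in on the target centre.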

  6. Artificial Intelligence Based Selection of Optimal Cutting Tool and Process Parameters for Effective Turning and Milling Operations

    Science.gov (United States)

    Saranya, Kunaparaju; John Rozario Jegaraj, J.; Ramesh Kumar, Katta; Venkateshwara Rao, Ghanta

    2016-06-01

    With the increased trend toward automation in the modern manufacturing industry, human intervention in routine, repetitive, and data-specific activities of manufacturing is greatly reduced. In this paper, an attempt has been made to reduce human intervention in the selection of the optimal cutting tool and process parameters for metal cutting applications using Artificial Intelligence techniques. Generally, the selection of the appropriate cutting tool and parameters in metal cutting is carried out by an experienced technician or cutting tool expert based on their knowledge or an extensive search of a huge cutting tool database. The proposed approach replaces the existing practice of physically searching for tools in databooks and tool catalogues with an intelligent knowledge-based selection system. This system employs artificial intelligence techniques such as artificial neural networks, fuzzy logic, and genetic algorithms for decision making and optimization. The intelligence-based optimal tool selection strategy was developed and implemented using MathWorks MATLAB Version 7.11.0. The cutting tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail the methodology and strategies employed for selection of the appropriate cutting tool and optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life, and tool cost.
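
    The multi-objective ranking idea (material removal rate, tool life, tool cost) can be sketched with a simple weighted sum over normalized criteria. The catalogue entries and weights below are invented for illustration, and a weighted sum is only one of many possible multi-objective criteria:

```python
# Sketch of multi-objective ranking for cutting-tool selection, weighting
# material removal rate (maximize), tool life (maximize), and tool cost
# (minimize). All entries and weights are illustrative.

tools = {
    "insert_A": {"mrr": 120.0, "life_min": 45.0, "cost": 9.0},
    "insert_B": {"mrr": 95.0,  "life_min": 80.0, "cost": 7.5},
    "insert_C": {"mrr": 150.0, "life_min": 30.0, "cost": 12.0},
}
weights = {"mrr": 0.4, "life_min": 0.4, "cost": 0.2}

def normalise(key, value, maximise=True):
    # Min-max scale a criterion to [0, 1] across the catalogue.
    vals = [t[key] for t in tools.values()]
    lo, hi = min(vals), max(vals)
    x = (value - lo) / (hi - lo)
    return x if maximise else 1.0 - x

def score(t):
    return (weights["mrr"] * normalise("mrr", t["mrr"])
            + weights["life_min"] * normalise("life_min", t["life_min"])
            + weights["cost"] * normalise("cost", t["cost"], maximise=False))

ranked = sorted(tools, key=lambda name: score(tools[name]), reverse=True)
print(ranked[0])  # the tool with the best weighted trade-off
```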

  7. Intelligent Agent Based Semantic Web in Cloud Computing Environment

    OpenAIRE

    Mukhopadhyay, Debajyoti; Sharma, Manoj; Joshi, Gajanan; Pagare, Trupti; Palwe, Adarsha

    2013-01-01

    Considering today's web scenario, there is a need for effective and meaningful search over the web, which is provided by the Semantic Web. Existing search engines are keyword based. They are vulnerable in answering intelligent queries from the user due to the dependence of their results on information available in web pages. Semantic search engines, by contrast, provide efficient and relevant results, as the semantic web is an extension of the current web in which information is given well defined meaning....

  8. Artificial intelligence programming languages for computer aided manufacturing

    Science.gov (United States)

    Rieger, C.; Samet, H.; Rosenberg, J.

    1979-01-01

    Eight Artificial Intelligence programming languages (SAIL, LISP, MICROPLANNER, CONNIVER, MLISP, POP-2, AL, and QLISP) are presented and surveyed, with examples of their use in an automated shop environment. Control structures are compared, and distinctive features of each language are highlighted. A simple programming task is used to illustrate programs in SAIL, LISP, MICROPLANNER, and CONNIVER. The report assumes reader knowledge of programming concepts, but not necessarily of the languages surveyed.

  9. Creating Innovative Solutions for Future Hotel Rooms with Intelligent Multimedia and Pervasive Computing

    Science.gov (United States)

    Sharda, Nalin K.

    Pervasive computing and intelligent multimedia technologies are becoming increasingly important to the modern way of living. However, many of their potential applications have not been fully realized yet. This chapter explores how innovative applications can be developed to meet the needs of the next generation hotels. Futuristic hotel rooms aim to be more than “home-away-from-home,” and as a consequence, offer tremendous opportunities for developing innovative applications of pervasive computing and intelligent multimedia. Next generation hotels will make increased use of technology products to attract new customers. High end TV screens, changeable room ambiance, biometric guest recognition, and electronic check-in facilities are some of the features already being implemented by some hotels. Entirely futuristic hotels in the sea, the stratosphere or the outer space, are also being proposed. All of these provide many novel opportunities for developing innovative solutions using intelligent multimedia and ubiquitous computing.

  10. Intelligent Systems For Aerospace Engineering: An Overview

    Science.gov (United States)

    KrishnaKumar, K.

    2003-01-01

    Intelligent systems are nature-inspired, mathematically sound, computationally intensive problem solving tools and methodologies that have become extremely important for advancing the current trends in information technology. Artificially intelligent systems currently utilize computers to emulate various faculties of human intelligence and biological metaphors. They use a combination of symbolic and sub-symbolic systems capable of evolving human cognitive skills and intelligence, not just systems capable of doing things humans do not do well. Intelligent systems are ideally suited for tasks such as search and optimization, pattern recognition and matching, planning, uncertainty management, control, and adaptation. In this paper, the intelligent system technologies and their application potential are highlighted via several examples.

  11. Intelligent computer aided training systems in the real world: Making the technology accessible to the educational mainstream

    Science.gov (United States)

    Kovarik, Madeline

    1993-01-01

    Intelligent computer aided training systems hold great promise for the application of this technology to mainstream education and training. Yet, this technology, which holds such a vast potential impact for the future of education and training, has had little impact beyond the enclaves of government research labs. This is largely due to the inaccessibility of the technology to those individuals in whose hands it can have the greatest impact, teachers and educators. Simply throwing technology at an educator and expecting them to use it as an effective tool is not the answer. This paper provides a background into the use of technology as a training tool. MindLink, developed by HyperTech Systems, provides trainers with a powerful rule-based tool that can be integrated directly into a Windows application. By embedding expert systems technology it becomes more accessible and easier to master.

  12. Integrating Computational Science Tools into a Thermodynamics Course

    Science.gov (United States)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and model and simulate complex concepts. In order to prepare future engineers to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  13. Computing tools for accelerator design calculations

    International Nuclear Information System (INIS)

    Fischler, M.; Nash, T.

    1984-01-01

    This note is intended as a brief, summary guide for accelerator designers to the new generation of commercial and special processors that allow great increases in computing cost effectiveness. New thinking is required to take best advantage of these computing opportunities, in particular when moving from analytical approaches to tracking simulations. In this paper, we outline the relevant considerations.

  14. Errors and Intelligence in Computer-Assisted Language Learning: Parsers and Pedagogues. Routledge Studies in Computer Assisted Language Learning

    Science.gov (United States)

    Heift, Trude; Schulze, Mathias

    2012-01-01

    This book provides the first comprehensive overview of theoretical issues, historical developments and current trends in ICALL (Intelligent Computer-Assisted Language Learning). It assumes a basic familiarity with Second Language Acquisition (SLA) theory and teaching, CALL and linguistics. It is of interest to upper undergraduate and/or graduate…

  15. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    Science.gov (United States)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
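
    The inductive rule generation the abstract describes can be illustrated with a much simpler stand-in, the classic 1R algorithm: form one rule per attribute value and keep the attribute whose rules misclassify the fewest expert-labeled samples. The telemetry-like data below is invented:

```python
# Minimal 1R-style rule induction from expert-classified samples, a simple
# stand-in for the inductive machine learning the SDB abstract describes.
from collections import Counter, defaultdict

samples = [
    {"valve": "open",   "temp": "high", "label": "fault"},
    {"valve": "open",   "temp": "low",  "label": "ok"},
    {"valve": "closed", "temp": "high", "label": "fault"},
    {"valve": "closed", "temp": "low",  "label": "ok"},
    {"valve": "open",   "temp": "high", "label": "fault"},
]

def one_r(samples, attributes):
    best_attr, best_rules, best_errors = None, None, len(samples) + 1
    for attr in attributes:
        # Count labels seen for each value of this attribute.
        by_value = defaultdict(Counter)
        for s in samples:
            by_value[s[attr]][s["label"]] += 1
        # One rule per value: predict the majority label.
        rules = {v: c.most_common(1)[0][0] for v, c in by_value.items()}
        errors = sum(1 for s in samples if rules[s[attr]] != s["label"])
        if errors < best_errors:
            best_attr, best_rules, best_errors = attr, rules, errors
    return best_attr, best_rules

attr, rules = one_r(samples, ["valve", "temp"])
print(attr, rules)  # the attribute whose rules best explain the labels
```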

  16. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexities into simple formulations, thus largely reducing development efforts. The book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  17. Clinical Data Warehouse: An Effective Tool to Create Intelligence in Disease Management.

    Science.gov (United States)

    Karami, Mahtab; Rahimi, Azin; Shahmirzadi, Ali Hosseini

    Clinical business intelligence tools such as clinical data warehouse enable health care organizations to objectively assess the disease management programs that affect the quality of patients' life and well-being in public. The purpose of these programs is to reduce disease occurrence, improve patient care, and decrease health care costs. Therefore, applying clinical data warehouse can be effective in generating useful information about aspects of patient care to facilitate budgeting, planning, research, process improvement, external reporting, benchmarking, and trend analysis, as well as to enable the decisions needed to prevent the progression or appearance of the illness aligning with maintaining the health of the population. The aim of this review article is to describe the benefits of clinical data warehouse applications in creating intelligence for disease management programs.

  18. Intelligent battery energy management and control for vehicle-to-grid via cloud computing network

    International Nuclear Information System (INIS)

    Khayyam, Hamid; Abawajy, Jemal; Javadi, Bahman; Goscinski, Andrzej; Stojcevski, Alex; Bab-Hadiashar, Alireza

    2013-01-01

    Highlights: • The intelligent battery energy management substantially reduces the interactions of PEVs with parking lots. • The intelligent battery energy management improves the energy efficiency. • The intelligent battery energy management predicts the road load demand for vehicles. - Abstract: Plug-in Electric Vehicles (PEVs) provide new opportunities to reduce fuel consumption and exhaust emission. PEVs need to draw and store energy from an electrical grid to supply propulsive energy for the vehicle. As a result, it is important to know when PEV batteries are available for charging and discharging. Furthermore, battery energy management and control are imperative for PEVs, as the vehicle operation and even the safety of passengers depend on the battery system. Thus, scheduling the grid power electricity with parking lots is needed for efficient charging and discharging of PEV batteries. This paper proposes a new intelligent battery energy management and control scheduling service for charging that utilizes Cloud computing networks. The proposed intelligent vehicle-to-grid scheduling service offers the computational scalability required to make the decisions necessary to allow PEV battery energy management systems to operate efficiently when the number of PEVs and charging devices is large. Experimental analyses of the proposed scheduling service, as compared to a traditional scheduling service, are conducted through simulations. The results show that the proposed intelligent battery energy management scheduling service substantially reduces the required number of interactions of PEVs with parking lots and the grid, as well as predicting the load demand in advance with regard to their limitations. They also show that the intelligent scheduling service using the Cloud computing network is more efficient than the traditional scheduling service network for battery energy management and control.
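
    The scheduling decision at the heart of such a service can be sketched as a greedy, capacity-limited allocator that favors the vehicles with the least slack before departure. The fleet data, capacity limit, and slack heuristic are illustrative only, not the paper's algorithm:

```python
# Toy charging scheduler: each hour, grant charging slots to the vehicles
# with the least slack (deadline minus remaining need), subject to a grid
# capacity limit. All data and parameters are invented for illustration.

CAPACITY_PER_HOUR = 2  # vehicles that may charge simultaneously

# (vehicle id, hours of charge needed, hours until departure)
fleet = [("pev1", 3, 4), ("pev2", 2, 6), ("pev3", 1, 2), ("pev4", 4, 8)]

need = {vid: n for vid, n, _ in fleet}
deadline = {vid: d for vid, _, d in fleet}
schedule = []

for hour in range(8):
    # Vehicles still needing charge and not yet departed, most urgent first.
    pending = [v for v in need if need[v] > 0 and deadline[v] > hour]
    pending.sort(key=lambda v: deadline[v] - need[v])
    charging = pending[:CAPACITY_PER_HOUR]
    for v in charging:
        need[v] -= 1
    schedule.append(charging)

print(all(n == 0 for n in need.values()))  # every vehicle fully charged?
```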

  19. Computer algebra as a research tool in physics

    International Nuclear Information System (INIS)

    Drouffe, J.M.

    1985-04-01

    The progress of computer algebra observed during recent years has certainly had an impact on physics. I want to clarify the role of these new techniques in this application domain and to analyze their present limitations. In Section 1, I briefly describe the use of algebraic manipulation programs at the elementary level. Numerical and symbolic solutions of problems are compared in Section 2. Section 3 is devoted to a prospective view of the use of computer algebra at the highest level, as an ''intelligent'' system. In Section 4, I recall what is required of a system to be used in physics.

  20. Computational Design Tools for Integrated Design

    DEFF Research Database (Denmark)

    Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

    In an architectural conceptual sketching process, where an architect is working with the initial ideas for a design, the process is characterized by three phases: sketching, evaluation and modification. Basically, the architect needs to address three areas in the conceptual sketching phase: aesthetical, functional and technical requirements. The aim of the present paper is to address the problem of a vague or non-existing link between the digital conceptual design tools used by architects and designers and engineering analysis and simulation tools. Based on an analysis of the architectural design process, different digital design methods are related to tasks in an integrated design process.

  1. An Intelligent Computer-aided Training System (CAT) for Diagnosing Adult Illiterates: Integrating NASA Technology into Workplace Literacy

    Science.gov (United States)

    Yaden, David B., Jr.

    1991-01-01

    An important part of NASA's mission involves the secondary application of its technologies in the public and private sectors. One current application being developed is The Adult Literacy Evaluator, a simulation-based diagnostic tool designed to assess the operant literacy abilities of adults having difficulties in learning to read and write. Using Intelligent Computer-Aided Training (ICAT) system technology in addition to speech recognition, closed-captioned television (CCTV), live video and other state-of-the-art graphics and storage capabilities, this project attempts to overcome the negative effects of adult literacy assessment by allowing the client to interact with an intelligent computer system which simulates real-life literacy activities and materials and which measures literacy performance in the actual context of its use. The specific objectives of the project are as follows: (1) to develop a simulation-based diagnostic tool to assess adults' prior knowledge about reading and writing processes in actual contexts of application; (2) to provide a profile of readers' strengths and weaknesses; and (3) to suggest instructional strategies and materials which can be used as a beginning point for remediation. In the first and development phase of the project, descriptions of literacy events and environments are being written and functional literacy documents analyzed for their components. From these descriptions, scripts are being generated which define the interaction between the student, an on-screen guide and the simulated literacy environment.

  2. Computational Tools for Stem Cell Biology.

    Science.gov (United States)

    Bian, Qin; Cahan, Patrick

    2016-12-01

    For over half a century, the field of developmental biology has leveraged computation to explore mechanisms of developmental processes. More recently, computational approaches have been critical in the translation of high throughput data into knowledge of both developmental and stem cell biology. In the past several years, a new subdiscipline of computational stem cell biology has emerged that synthesizes the modeling of systems-level aspects of stem cells with high-throughput molecular data. In this review, we provide an overview of this new field and pay particular attention to the impact that single cell transcriptomics is expected to have on our understanding of development and our ability to engineer cell fate.

  3. Cognitive computing and eScience in health and life science research: artificial intelligence and obesity intervention programs.

    Science.gov (United States)

    Marshall, Thomas; Champagne-Langabeer, Tiffiany; Castelli, Darla; Hoelscher, Deanna

    2017-12-01

    To present research models based on artificial intelligence and discuss the concept of cognitive computing and eScience as disruptive factors in health and life science research methodologies. The paper identifies big data as a catalyst to innovation and the development of artificial intelligence, presents a framework for computer-supported human problem solving and describes a transformation of research support models. This framework includes traditional computer support; federated cognition using machine learning and cognitive agents to augment human intelligence; and a semi-autonomous/autonomous cognitive model, based on deep machine learning, which supports eScience. The paper provides a forward view of the impact of artificial intelligence on our human-computer support and research methods in health and life science research. By augmenting or amplifying human task performance with artificial intelligence, cognitive computing and eScience research models are discussed as novel and innovative systems for developing more effective adaptive obesity intervention programs.

  4. A Benchmarking Analysis of Open-Source Business Intelligence Tools in Healthcare Environments

    Directory of Open Access Journals (Sweden)

    Andreia Brandão

    2016-10-01

    Full Text Available. In recent years, a wide range of Business Intelligence (BI) technologies have been applied to different areas in order to support the decision-making process. BI enables the extraction of knowledge from the data stored. The healthcare industry is no exception, and so BI applications have been under investigation across multiple units of different institutions. Thus, in this article, we analyze some open-source/free BI tools on the market and their applicability in the clinical sphere, taking into consideration the general characteristics of the clinical environment. For this purpose, six BI tools were selected, analyzed, and tested in a practical environment. Then, a comparison metric and a ranking were defined for the tested applications in order to choose the one that best supports the extraction of useful knowledge from clinical data in a healthcare environment. Finally, a pervasive BI platform was developed using a real case in order to prove the tool's viability.

  5. Thermal Error Test and Intelligent Modeling Research on the Spindle of High Speed CNC Machine Tools

    Science.gov (United States)

    Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu

    2018-03-01

    Thermal error is the main factor affecting the accuracy of precision machining. Through experiments, this paper studies thermal error testing and intelligent modeling for the spindle of vertical high speed CNC machine tools, reflecting the current research focus on machine tool thermal error. Several testing devices for thermal error were designed, in which 7 temperature sensors are used to measure the temperature of the machine tool spindle system and 2 displacement sensors are used to detect the thermal error displacement. A thermal error compensation model with a good ability in inversion prediction is established by applying principal component analysis, optimizing the temperature measuring points, extracting the characteristic values closely associated with the thermal error displacement, and using artificial neural network technology.
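
    The two-stage pipeline the abstract describes (select informative temperature points, then fit a displacement model) can be sketched with much simpler stand-ins: correlation-based sensor selection and a least-squares line in place of principal component analysis and a neural network. The sensor data below is synthetic:

```python
# Simplified thermal-error modeling sketch: pick the temperature sensor most
# correlated with measured spindle drift, then fit a least-squares line.
# Stands in for the abstract's PCA + neural network pipeline; data is synthetic.

# Synthetic readings: sensor T1 tracks the thermal drift, T2 is ambient noise.
t1 = [20.0, 24.0, 28.0, 32.0, 36.0]
t2 = [21.0, 20.5, 21.2, 20.8, 21.1]
drift_um = [0.0, 4.1, 7.9, 12.2, 15.8]  # measured spindle displacement (um)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# 1. Keep the sensor most correlated with the observed displacement.
name, xs = max([("T1", t1), ("T2", t2)],
               key=lambda s: abs(pearson(s[1], drift_um)))

# 2. Fit displacement = slope * temperature + intercept by least squares.
n = len(xs)
mx, my = sum(xs) / n, sum(drift_um) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, drift_um))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

print(name, round(slope, 2))
```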

  6. Computational Intelligence based techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Karimi, M.; Bakar, A.H.A.; Mohamad, Hasmaini

    2014-01-01

    Highlights: • Unintentional and intentional islanding, their causes, and solutions are presented. • Remote, passive, active and hybrid islanding detection techniques are discussed. • The limitations of these techniques in accurately detecting islanding are discussed. • The ability of computational intelligence techniques to detect islanding is discussed. • A review of ANN, fuzzy logic control, ANFIS, and decision tree techniques is provided. - Abstract: Accurate and fast islanding detection of distributed generation is highly important for its successful operation in distribution networks. Up to now, various islanding detection techniques based on communication, passive, active and hybrid methods have been proposed. However, each technique suffers from certain demerits that cause inaccuracies in islanding detection. Computational intelligence based techniques, due to their robustness and flexibility in dealing with complex nonlinear systems, are an option that might solve this problem. This paper aims to provide a comprehensive review of computational intelligence based techniques applied to islanding detection of distributed generation. Moreover, the paper compares the accuracies of computational intelligence based techniques with those of existing techniques to provide a handful of information for industry and utility researchers to determine the best method for their respective systems.
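
    As a point of reference, the passive techniques such reviews contrast with computational intelligence methods can be sketched as simple threshold tests on frequency and its rate of change (ROCOF). The thresholds and sample traces below are illustrative only:

```python
# Passive islanding detector sketch: trip when frequency leaves a window or
# the rate of change of frequency (ROCOF) exceeds a threshold. Thresholds
# and sample traces are invented for illustration (50 Hz system assumed).

F_MIN, F_MAX = 49.5, 50.5      # Hz window
ROCOF_LIMIT = 1.0              # Hz/s
DT = 0.1                       # sampling interval, s

def islanded(freqs):
    for i, f in enumerate(freqs):
        if not (F_MIN <= f <= F_MAX):
            return True        # over/under-frequency trip
        if i > 0 and abs(f - freqs[i - 1]) / DT > ROCOF_LIMIT:
            return True        # ROCOF trip
    return False

grid_connected = [50.0, 50.01, 49.99, 50.02]
after_islanding = [50.0, 49.9, 49.6, 49.2]   # frequency drifting down

print(islanded(grid_connected), islanded(after_islanding))
```

    The well-known weakness of such fixed thresholds, as discussed in the review, is the non-detection zone when generation closely matches load, which is what motivates learning-based classifiers.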

  7. A Multidisciplinary Model for Development of Intelligent Computer-Assisted Instruction.

    Science.gov (United States)

    Park, Ok-choon; Seidel, Robert J.

    1989-01-01

    Proposes a schematic multidisciplinary model to help developers of intelligent computer-assisted instruction (ICAI) identify the types of required expertise and integrate them into a system. Highlights include domain types and expertise; knowledge acquisition; task analysis; knowledge representation; student modeling; diagnosis of learning needs;…

  8. Foreword 3rd International Conference on Affective Computing and Intelligent Interaction - ACII 2009

    NARCIS (Netherlands)

    Cohn, Jeffrey; Cohn, Jeffrey; Nijholt, Antinus; Pantic, Maja

    2009-01-01

    It is a pleasure and an honor to have organized the Third International Conference on Affective Computing and Intelligent Interaction (ACII). The conference will be held from 10th – 12th September 2009 in Amsterdam, The Netherlands. The conference series is the premier forum for presenting research

  9. Artificial intelligence and other innovative computer applications in the nuclear industry

    International Nuclear Information System (INIS)

    Majumdar, M.C.; Majumdar, D.; Sackett, J.I.

    1987-01-01

    This book reviews the applications of artificial intelligence and computers in the nuclear industry and chemical plants. The topics discussed are: robot applications and reliability in maintenance of nuclear power plants; advanced information technology and expert systems; knowledge-based alarm systems; emergency planning and response to accidents; and reactor safety assessment.

  10. Price Comparisons on the Internet Based on Computational Intelligence

    Science.gov (United States)

    Kim, Jun Woo; Ha, Sung Ho

    2014-01-01

    Information-intensive Web services such as price comparison sites have recently been gaining popularity. However, most users including novice shoppers have difficulty in browsing such sites because of the massive amount of information gathered and the uncertainty surrounding Web environments. Even conventional price comparison sites face various problems, which suggests the necessity of a new approach to address these problems. Therefore, for this study, an intelligent product search system was developed that enables price comparisons for online shoppers in a more effective manner. In particular, the developed system adopts linguistic price ratings based on fuzzy logic to accommodate user-defined price ranges, and personalizes product recommendations based on linguistic product clusters, which help online shoppers find desired items in a convenient manner. PMID:25268901
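    The linguistic price ratings the abstract describes can be illustrated with triangular fuzzy membership functions. The term names, shapes and price range below are assumptions for illustration, not the system's actual implementation:

```python
# Illustrative sketch of linguistic price ratings via fuzzy logic: a price is
# mapped to degrees of membership in linguistic terms such as "cheap" and
# "expensive", so a user-defined price range need not be a crisp cut-off.

def triangular(x, a, b, c):
    """Triangular membership function peaking at b over support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def linguistic_rating(price, lo, hi):
    """Rate a price against a user-defined range [lo, hi] (assumed terms)."""
    mid = (lo + hi) / 2
    span = hi - lo
    return {
        "cheap": triangular(price, lo - span, lo, mid),
        "moderate": triangular(price, lo, mid, hi),
        "expensive": triangular(price, mid, hi, hi + span),
    }

rating = linguistic_rating(price=80, lo=50, hi=150)
print(rating)  # → {'cheap': 0.4, 'moderate': 0.6, 'expensive': 0.0}
```

    A shopper asking for items "around 50–150" would then see an item priced at 80 rated mostly "moderate" and partly "cheap", rather than being included or excluded by a hard boundary.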

  11. Advances in Intelligent Control Systems and Computer Science

    CERN Document Server

    2013-01-01

    The conception of real-time control networks taking into account, as an integrating approach, both the specific aspects of information and knowledge processing and the dynamic and energetic particularities of physical processes and of communication networks represents one of the newest scientific and technological challenges. The new paradigm of Cyber-Physical Systems (CPS) reflects this tendency and will certainly change the evolution of the technology, with major social and economic impact. This book presents significant results in the field of process control and advanced information and knowledge processing, with applications in the fields of robotics, biotechnology, environment, energy, transportation, etc. It introduces intelligent control concepts and strategies as well as real-time implementation aspects for complex control approaches. One of the sections is dedicated to the complex problem of designing software systems for distributed information processing networks. Problems as complexity an...

  12. Designing an Intelligent Mobile Learning Tool for Grammar Learning (i-MoL)

    Directory of Open Access Journals (Sweden)

    Munir Shuib

    2015-01-01

    Full Text Available English is the most important second language in most non-English speaking countries, including Malaysia. Good English proficiency comes from a good grasp of grammar. To overcome the problems of low English proficiency among Malaysians, it is important to identify the key motivators that could facilitate the process of grammar learning. In this digital age, technology can play a very important role, and mobile technology could be one of them. Thus, this study aims at designing a mobile learning tool, namely the Intelligent Mobile Learning Tool for Grammar Learning (i-MoL), to act as the “on-the-go” grammar learning support via mobile phones. i-MoL helps reinforce grammar learning through mobile phones with game-like applications, inquiry-based activities and flashcard-like information. The intelligent part of i-MoL lies in its ability to map mobile-based grammar learning content to an individual’s preferred learning style based on the Felder-Silverman Learning Style Model (FSLSM). The instructional system design through the ADDIE model was used in this study as a systematic approach in designing a novel and comprehensive mobile learning tool for grammar learning. In terms of implications, this study provides insights on how mobile technologies can be utilized to meet the mobility demand among language learners today.

  13. New evaluation methods for conceptual design selection using computational intelligence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Hong Zhong; Liu, Yu; Li, Yanfeng; Wang, Zhonglai [University of Electronic Science and Technology of China, Chengdu (China); Xue, Lihua [Higher Education Press, Beijing (China)

    2013-03-15

    The conceptual design selection, which aims at choosing the best or most desirable design scheme among several candidates for the subsequent detailed design stage, oftentimes requires a set of tools to conduct design evaluation. Using computational intelligence techniques, such as fuzzy logic, neural networks, genetic algorithms, and physical programming, several design evaluation methods are put forth in this paper to realize the conceptual design selection under different scenarios. Depending on whether an evaluation criterion can be quantified or not, the linear physical programming (LPP) model and the RAOGA-based fuzzy neural network (FNN) model can be utilized to evaluate design alternatives in the conceptual design stage. Furthermore, on the basis of Vanegas and Labib's work, a multi-level conceptual design evaluation model based on the new fuzzy weighted average (NFWA) and the fuzzy compromise decision-making method is developed to solve the design evaluation problem consisting of many hierarchical criteria. The effectiveness of the proposed methods is demonstrated via several illustrative examples.
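    The fuzzy weighted average underlying such evaluation models can be illustrated in simplified form. With crisp weights, a weighted average of triangular fuzzy ratings can be taken component-wise; the paper's NFWA additionally handles fuzzy weights and hierarchical criteria, which this sketch does not attempt:

```python
# Simplified sketch of a fuzzy weighted average over triangular fuzzy numbers
# (a, b, c). The ratings and weights below are invented for illustration.

def fuzzy_weighted_average(scores, weights):
    """scores: list of (a, b, c) triangular fuzzy ratings; weights: crisp."""
    total = sum(weights)
    return tuple(
        sum(w * s[k] for s, w in zip(scores, weights)) / total
        for k in range(3)
    )

# Two evaluation criteria for one design alternative, rated fuzzily,
# with the first criterion weighted twice as heavily as the second.
ratings = [(0.6, 0.7, 0.8), (0.3, 0.5, 0.7)]
weights = [2.0, 1.0]
print(fuzzy_weighted_average(ratings, weights))  # ≈ (0.5, 0.633, 0.767)
```

    The resulting triangular number can then be compared against those of other alternatives, for instance by a defuzzification or compromise decision-making step as in the paper.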

  14. New evaluation methods for conceptual design selection using computational intelligence techniques

    International Nuclear Information System (INIS)

    Huang, Hong Zhong; Liu, Yu; Li, Yanfeng; Wang, Zhonglai; Xue, Lihua

    2013-01-01

    The conceptual design selection, which aims at choosing the best or most desirable design scheme among several candidates for the subsequent detailed design stage, oftentimes requires a set of tools to conduct design evaluation. Using computational intelligence techniques, such as fuzzy logic, neural networks, genetic algorithms, and physical programming, several design evaluation methods are put forth in this paper to realize the conceptual design selection under different scenarios. Depending on whether an evaluation criterion can be quantified or not, the linear physical programming (LPP) model and the RAOGA-based fuzzy neural network (FNN) model can be utilized to evaluate design alternatives in the conceptual design stage. Furthermore, on the basis of Vanegas and Labib's work, a multi-level conceptual design evaluation model based on the new fuzzy weighted average (NFWA) and the fuzzy compromise decision-making method is developed to solve the design evaluation problem consisting of many hierarchical criteria. The effectiveness of the proposed methods is demonstrated via several illustrative examples.

  15. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  16. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  17. A Business intelligence tool for studying value co-creation and innovation

    DEFF Research Database (Denmark)

    Tanev, Stoyan; Ruskov, Petko; Georgiev, Lachezar

    2011-01-01

    Value co-creation is an emerging marketing and innovation paradigm describing a broader opening of the firm to its customers by providing them with the opportunity to become active participants in the design and development of personalized products, services and experiences. However…, there is not yet a fully satisfactory theoretical vision about its distinctive characteristics as compared to more traditional value creation approaches. One of the challenges in studying value co-creation is the lack of business intelligence (BI) tools that can be used in the conceptualization of value co… is the relationship between the degree of firms’ involvement in value co-creation activities and their innovativeness.

  18. Online multiple intelligence teaching tools (On-MITT) for enhancing interpersonal teaching activities

    Science.gov (United States)

    Mohamad, Siti Nurul Mahfuzah; Salam, Sazilah; Bakar, Norasiken; Sui, Linda Khoo Mei

    2014-07-01

    The theories of Multiple Intelligence (MI) used in this paper apply to students with interpersonal intelligence, who are encouraged to work together in cooperative groups where interpersonal interaction is practiced. In this context, students used their knowledge and skills to help the group or a partner to complete the tasks given. Students can interact with each other as they learn, and the process of learning requires their verbal and non-verbal communication skills, co-operation and empathy in the group. Meanwhile, educators can incorporate cooperative learning in groups in the classroom. On-MITT provides various tools to facilitate lecturers in preparing e-content that applies interpersonal intelligence. With minimal information technology (IT) skills, educators can produce creative and interesting teaching activities and teaching materials. The objective of this paper is to develop an On-MITT prototype for interpersonal teaching activities. This paper addresses the initial prototype of this study. An evaluation of On-MITT has been completed by 20 lecturers of Malaysian Polytechnics. A Motivation Survey Questionnaire was used as the instrument to measure four motivation variables: ease of use, enjoyment, usefulness and self-confidence. Based on the findings, On-MITT can facilitate educators in preparing teaching materials that are suitable for interpersonal learners.

  19. AI tools in computer based problem solving

    Science.gov (United States)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life-cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low-cost, high-performance 32-bit workstations executing software identical to that of large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  20. Computational Tools applied to Urban Engineering

    OpenAIRE

    Filho, Armando Carlos de Pina; Lima, Fernando Rodrigues; Amaral, Renato Dias Calado do

    2010-01-01

    This chapter sought to present the main details of three technologies widely used in Urban Engineering: CAD (Computer-Aided Design); GIS (Geographic Information System); and BIM (Building Information Modelling). As can be seen, each of them presents specific characteristics and diverse applications in urban projects, providing better results in relation to the planning, management and maintenance of the systems. In relation to the presented software, it is important to note that the...

  1. A survey on computational intelligence approaches for predictive modeling in prostate cancer

    OpenAIRE

    Cosma, G; Brown, D; Archer, M; Khan, M; Pockley, AG

    2017-01-01

    Predictive modeling in medicine involves the development of computational models which are capable of analysing large amounts of data in order to predict healthcare outcomes for individual patients. Computational intelligence approaches are suitable when the data to be modelled are too complex for conventional statistical techniques to process quickly and efficiently. These advanced approaches are based on mathematical models that have been especially developed for dealing with the uncertainty an...

  2. Computer Aided Design Tools for Extreme Environment Electronics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project aims to provide Computer Aided Design (CAD) tools for radiation-tolerant, wide-temperature-range digital, analog, mixed-signal, and radio-frequency...

  3. Computational Tool for Aerothermal Environment Around Transatmospheric Vehicles, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this Project is to develop a high-fidelity computational tool for accurate prediction of aerothermal environment on transatmospheric vehicles. This...

  4. Potential applications of artificial intelligence in computer-based management systems for mixed waste incinerator facility operation

    International Nuclear Information System (INIS)

    Rivera, A.L.; Singh, S.P.N.; Ferrada, J.J.

    1991-01-01

    The Department of Energy/Oak Ridge Field Office (DOE/OR) operates a mixed waste incinerator facility at the Oak Ridge K-25 Site, designed for the thermal treatment of incinerable liquid, sludge, and solid waste regulated under the Toxic Substances Control Act (TSCA) and the Resource Conservation and Recovery Act (RCRA). Operation of the TSCA Incinerator is highly constrained as a result of regulatory, institutional, technical, and resource availability requirements. This presents an opportunity for applying computer technology as a technical resource for mixed waste incinerator operation, to help promote and sustain a continuous performance improvement process while demonstrating compliance. This paper describes mixed waste incinerator facility performance-oriented tasks that could be assisted by Artificial Intelligence (AI) and the requirements for AI tools that would implement these algorithms in a computer-based system. 4 figs., 1 tab

  5. Editorial: Computational Creativity, Concept Invention, and General Intelligence

    Science.gov (United States)

    Besold, Tarek R.; Kühnberger, Kai-Uwe; Veale, Tony

    2015-12-01

    Over the last decade, computational creativity as a field of scientific investigation and computational systems engineering has seen growing popularity. Still, the levels of development of projects aiming at systems for artistic production or performance and of endeavours addressing creative problem-solving or models of creative cognitive capacities are diverging. While the former have already seen several great successes, the latter still remain in their infancy. This volume collects reports on work trying to close the accrued gap.

  6. Applications of computational tools in biosciences and medical engineering

    CERN Document Server

    Altenbach, Holm

    2015-01-01

     This book presents the latest developments and applications of computational tools related to the biosciences and medical engineering. It also reports the findings of different multi-disciplinary research projects, for example, from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational tools often requires mathematical and experimental methods. Computational tools such as the finite element methods, computer-aided design and optimization as well as visualization techniques such as computed axial tomography open up completely new research fields that combine the fields of engineering and bio/medical. Nevertheless, there are still hurdles since both directions are based on quite different ways of education. Often even the “language” can vary from discipline to discipline.

  7. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  8. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  9. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets...

  10. Computational tool for postoperative evaluation of cochlear implant patients

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Pavan, Ana Luiza M.; Pina, Diana R. de; Altemani, Joao M.C.; Castilho, Arthur M.

    2016-01-01

    The aim of this study was to develop a tool to calculate the insertion depth angle of cochlear implants, from computed tomography exams. The tool uses different image processing techniques, such as thresholding and active contour. Then, we compared the average insertion depth angle of three different implant manufacturers. The developed tool can be used, in the future, to compare the insertion depth angle of the cochlear implant with postoperative response of patient's hearing. (author)
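    The angle computation step of such a tool can be sketched as follows: given the cochlear center and a reference point (e.g. at the round window) obtained from segmented CT slices, the insertion depth angle is the angle accumulated around the center by successive electrode positions. The geometry below is invented for illustration; the segmentation itself (thresholding, active contours) is not reproduced:

```python
import math

# Hypothetical sketch: accumulate the angle swept around the cochlear center
# from a reference direction through each electrode position in turn.

def insertion_depth_angle(center, reference, electrodes):
    """Return the accumulated sweep angle in degrees."""
    def bearing(p):
        return math.atan2(p[1] - center[1], p[0] - center[0])
    total, prev = 0.0, bearing(reference)
    for e in electrodes:
        cur = bearing(e)
        step = cur - prev
        # unwrap each step into (-pi, pi] so the sweep accumulates past 360°
        while step <= -math.pi:
            step += 2 * math.pi
        while step > math.pi:
            step -= 2 * math.pi
        total += step
        prev = cur
    return math.degrees(total)

# Three electrode positions placed every 45 degrees around the center,
# giving a 135-degree sweep from the reference direction.
pts = [(1, 1), (0, 1.4), (-1, 1)]
print(round(insertion_depth_angle((0, 0), (1, 0), pts)))  # → 135
```

    Real cochlear implant arrays often sweep past 360°, which is why each step is unwrapped rather than taking a single absolute angle.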

  11. Competitive intelligence tools used by small and medium-sized enterprises

    Directory of Open Access Journals (Sweden)

    Tshilidzi Eric Nenzhelele

    2015-08-01

    Full Text Available Small and Medium-sized Enterprises (SMEs) contribute highly to the gross domestic product, reduction in unemployment, wealth creation and improvement in the quality of life. Due to technology and globalisation, SMEs now compete with enterprises beyond the borders of their country. To survive in this globally competitive business environment, SMEs seek tools that offer a competitive advantage. Competitive Intelligence (CI) provides a competitive advantage to enterprises that practice it. While CI practice has been widely researched for larger enterprises, there is a lack of research on CI practice pertaining to SMEs. This research establishes the tools used by SMEs in CI practice. The research was quantitative in nature and a self-administered questionnaire was used to collect data from owners/managers of SMEs.

  12. The Virtual UNICOS Process Expert: integration of Artificial Intelligence tools in Control Systems

    CERN Multimedia

    Vilches Calvo, I; Barillere, R

    2009-01-01

    UNICOS is a CERN framework to produce control applications. It provides operators with ways to interact with all process items, from the most simple (e.g. I/O channels) to the most abstract objects (e.g. a part of the plant). This possibility of fine-grained operation is particularly useful to recover from abnormal situations, if operators have the required knowledge. The Virtual UNICOS Process Expert project aims at providing operators with means to handle difficult operation cases for which the intervention of process experts is usually requested. The main idea of the project is to use the openness of UNICOS-based applications to integrate tools (e.g. Artificial Intelligence tools) which will act as Process Experts to analyze complex situations, and to propose and execute smooth recovery procedures.

  13. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics

  14. Computer vision and imaging in intelligent transportation systems

    CERN Document Server

    Bala, Raja; Trivedi, Mohan

    2017-01-01

    Acts as a single source reference providing readers with an overview of how computer vision can contribute to the different applications in the field of road transportation. This book presents a survey of computer vision techniques related to three key broad problems in the roadway transportation domain: safety, efficiency, and law enforcement. The individual chapters present significant applications within these problem domains, each presented in a tutorial manner, describing the motivation for and benefits of the application, and a description of the state of the art.

  15. Emergence, evolution, intelligence; hydroinformatics : a study of distributed and decentralised computing using intelligent agents

    NARCIS (Netherlands)

    Babovic, V.

    1996-01-01

    The computer-controlled operating environments of such facilities as automated factories, nuclear power plants, telecommunication centres and space stations are continually becoming more complex. The situation is similar, if not even more apparent and urgent, in the case of water. Water is not only

  16. Development of a method of continuous improvement of services using the Business Intelligence tools

    Directory of Open Access Journals (Sweden)

    Svetlana V. Kulikova

    2018-01-01

    Full Text Available The purpose of the study was to develop a method of continuous improvement of services using Business Intelligence tools. Materials and methods: the materials used are based on the concept of the Deming Cycle, Business Intelligence methods and technologies, the Agile methodology and SCRUM. Results: the article considers the problem of continuous improvement of services and offers solutions using methods and technologies of Business Intelligence. In this case, the purpose of this technology is to reach a final decision regarding what needs to be improved in the current organization of services. In other words, Business Intelligence helps the product manager to see what is hidden from the “human eye” on the basis of received and processed data. The method is developed on the basis of the concept of the Deming Cycle, Agile methodologies and SCRUM. The article describes the main stages of the development of the method based on the activity of the enterprise. It is necessary to fully build the Business Intelligence system in the enterprise to identify bottlenecks and justify the need for their elimination and, in general, for continuous improvement of the services. This process is represented in DFD notation. The article presents a scheme for the selection of suitable agile methodologies. The proposed concept covers the solution of the stated objectives, including methods of identifying problems through Business Intelligence technology, development of the system for troubleshooting, and analysis of the results of the introduced changes. A technical description of the project is given. Conclusion: following the authors' work, a concept was formed of the method for the continuous improvement of services using Business Intelligence technology, with the specifics of enterprises offering SaaS solutions. It was also found that when using this method, the recommended development methodology is SCRUM. The result of this scientific

  17. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High-throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the computational life sciences, each of which requires computing capaci...

  18. An Intelligent Computer-Based System for Sign Language Tutoring

    Science.gov (United States)

    Ritchings, Tim; Khadragi, Ahmed; Saeb, Magdy

    2012-01-01

    A computer-based system for sign language tutoring has been developed using a low-cost data glove and a software application that processes the movement signals for signs in real-time and uses Pattern Matching techniques to decide if a trainee has closely replicated a teacher's recorded movements. The data glove provides 17 movement signals from…

  19. Brain-Computer Interfacing Embedded in Intelligent and Affective Systems

    NARCIS (Netherlands)

    Nijholt, Antinus

    In this talk we survey recent research views on non-traditional brain-computer interfaces (BCI). That is, interfaces that can process brain activity input, but that are designed for the ‘general population’, rather than for clinical purposes. Control of applications can be made more robust by fusing

  20. New tools to aid in scientific computing and visualization

    International Nuclear Information System (INIS)

    Wallace, M.G.; Christian-Frear, T.L.

    1992-01-01

    In this paper, two computer programs are described which aid in the pre- and post-processing of computer generated data. CoMeT (Computational Mechanics Toolkit) is a customizable, interactive, graphical, menu-driven program that provides the analyst with a consistent user-friendly interface to analysis codes. Trans Vol (Transparent Volume Visualization) is a specialized tool for the scientific three-dimensional visualization of complex solids by the technique of volume rendering. Both tools are described in basic detail along with an application example concerning the simulation of contaminant migration from an underground nuclear repository

  1. Intelligent computer systems in engineering design principles and applications

    CERN Document Server

    Sunnersjo, Staffan

    2016-01-01

    This introductory book discusses how to plan and build useful, reliable, maintainable and cost-efficient computer systems for automated engineering design. The book takes a user perspective and seeks to bridge the gap between texts on principles of computer science and the user manuals for commercial design automation software. The approach taken is top-down, following the path from definition of the design task and clarification of the relevant design knowledge to the development of an operational system well adapted for its purpose. This introductory text for the practicing engineer working in industry covers the most vital aspects of planning such a system. Experiences from applications of automated design systems in practice are reviewed based on a large number of real, industrial cases. The principles behind the most popular methods in design automation are presented with sufficient rigour to give the user confidence in applying them to real industrial problems. This book is also suited for a half semester c...

  2. Computer tools for systems engineering at LaRC

    Science.gov (United States)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured, or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.

  3. [Emotional Intelligence Index: a tool for the routine assessment of mental health promotion programs in schools].

    Science.gov (United States)

    Veltro, Franco; Ialenti, Valentina; Morales García, Manuel Alejandro; Gigantesco, Antonella

    2016-01-01

    After a critical examination of several aspects of evaluating certain dimensions of emotional intelligence through self-assessment tools, the procedure for constructing and validating an Index for its measurement is described, conceived only for the routine assessment of mental health promotion programs in schools that include among their objectives the improvement of emotional intelligence, specifically "outcome-oriented" programs. On the basis of the two most common international tools, 27 items plus 6 control items were listed and discussed in two Focus Groups (FG) of students (face validity). The scale obtained from the FGs was administered to 300 students, and the results were submitted to factor analysis (construct validity). Internal consistency was also evaluated with Cronbach's alpha, and concurrent validity was studied with the Emotional Quotient Inventory, a perceived self-efficacy scale and a stress rating test. From the analysis of the FGs, all the original items were modified, 4 were deleted, and the coding system was reduced from 6 to 4 Likert levels. From the 23 items included in the analysis, five factors emerged (intra-psychic dimension, interpersonal dimension, impulsivity, adaptive coping, sense of self-efficacy), for a total of 15 items. The results of the validation process were very satisfactory for internal consistency (0.72) and concurrent validity. The results are positive: the Index is the shortest routine assessment tool currently available in Italy, requiring on average 3 minutes to complete. Its character as an Index, rather than a questionnaire or interview for clinical use, is emphasized, highlighting its specific use for mental health promotion programs in schools.
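    The internal-consistency figure quoted in the abstract is a Cronbach's alpha, computable from per-item scores as alpha = k/(k-1) · (1 − Σ item variances / variance of total scores). The tiny data set below is invented purely to demonstrate the formula:

```python
# Cronbach's alpha from a matrix of item scores. Using population variance
# throughout is fine: the /n factors cancel in the variance ratio.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: list of per-item score lists (one inner list per item)."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]   # total score per respondent
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

scores = [  # 3 items rated by 4 respondents on a 4-point Likert scale
    [3, 4, 2, 3],
    [3, 3, 2, 4],
    [2, 4, 1, 3],
]
print(round(cronbach_alpha(scores), 2))  # → 0.86
```

    A value of 0.72 on the real 15-item scale, as reported above, is conventionally regarded as acceptable internal consistency for a short screening index.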

  4. Artificial intelligence in pharmaceutical product formulation: neural computing

    Directory of Open Access Journals (Sweden)

    Svetlana Ibrić

    2009-10-01

Full Text Available The properties of a formulation are determined not only by the ratios in which the ingredients are combined but also by the processing conditions. Although the relationships between ingredient levels, processing conditions, and product performance may be known anecdotally, they can rarely be quantified. In the past, formulators tended to use statistical techniques to model their formulations, relying on response surfaces to provide a mechanism for optimization. However, optimization by such a method can be misleading, especially if the formulation is complex. More recently, advances in mathematics and computer science have led to the development of alternative modeling and data mining techniques which work with a wider range of data sources: neural networks (an attempt to mimic the processing of the human brain), genetic algorithms (an attempt to mimic the evolutionary process by which biological systems self-organize and adapt), and fuzzy logic (an attempt to mimic the ability of the human brain to draw conclusions and generate responses based on incomplete or imprecise information). In this review the current technology will be examined, as well as its application in pharmaceutical formulation and processing. The challenges, benefits and future possibilities of neural computing will be discussed.
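As an illustration of the genetic-algorithm idea mentioned above, the sketch below evolves ingredient ratios toward a toy "product performance" optimum. The fitness surface and all parameters are invented for the example; a real formulation study would score candidates against measured or modeled responses.

```python
import random

random.seed(0)

def fitness(ratios):
    # Toy "product performance" surface with a known optimum at
    # (0.5, 0.3, 0.2); a real study would use measured responses.
    target = (0.5, 0.3, 0.2)
    return -sum((r - t) ** 2 for r, t in zip(ratios, target))

def normalize(v):
    # Ingredient ratios must sum to 1.
    s = sum(v)
    return tuple(x / s for x in v)

def random_formulation():
    return normalize([random.random() for _ in range(3)])

def mutate(ind, sigma=0.1):
    return normalize([max(1e-6, x + random.gauss(0, sigma)) for x in ind])

def crossover(a, b):
    return normalize([(x + y) / 2 for x, y in zip(a, b)])

pop = [random_formulation() for _ in range(30)]
for generation in range(100):
    pop.sort(key=fitness, reverse=True)      # best formulations first
    parents = pop[:10]                       # truncation selection
    children = [crossover(random.choice(parents), random.choice(parents))
                for _ in range(15)]
    mutants = [mutate(random.choice(parents)) for _ in range(15)]
    pop = parents[:5] + children + mutants   # elitism: keep best 5

best = max(pop, key=fitness)
print(best)  # close to the assumed optimum (0.5, 0.3, 0.2)
```

Selection, crossover, and mutation are the same ingredients a formulation GA would use; only the fitness function changes.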

  5. 13th International Symposium on Distributed Computing and Artificial Intelligence 2016

    CERN Document Server

    Semalat, Ali; Bocewicz, Grzegorz; Sitek, Paweł; Nielsen, Izabela; García, Julián; Bajo, Javier

    2016-01-01

    The 13th International Symposium on Distributed Computing and Artificial Intelligence 2016 (DCAI 2016) is a forum to present applications of innovative techniques for studying and solving complex problems. The exchange of ideas between scientists and technicians from both the academic and industrial sector is essential to facilitate the development of systems that can meet the ever-increasing demands of today’s society. The present edition brings together past experience, current work and promising future trends associated with distributed computing, artificial intelligence and their application in order to provide efficient solutions to real problems. This symposium is organized by the University of Sevilla (Spain), Osaka Institute of Technology (Japan), and the Universiti Teknologi Malaysia (Malaysia).

  6. Finite-element-model updating using computational intelligence techniques applications to structural dynamics

    CERN Document Server

    Marwala, Tshilidzi

    2010-01-01

    Finite element models (FEMs) are widely used to understand the dynamic behaviour of various systems. FEM updating allows FEMs to be tuned better to reflect measured data and may be conducted using two different statistical frameworks: the maximum likelihood approach and Bayesian approaches. Finite Element Model Updating Using Computational Intelligence Techniques applies both strategies to the field of structural mechanics, an area vital for aerospace, civil and mechanical engineering. Vibration data is used for the updating process. Following an introduction a number of computational intelligence techniques to facilitate the updating process are proposed; they include: • multi-layer perceptron neural networks for real-time FEM updating; • particle swarm and genetic-algorithm-based optimization methods to accommodate the demands of global versus local optimization models; • simulated annealing to put the methodologies into a sound statistical basis; and • response surface methods and expectation m...
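The particle-swarm step of the updating process can be sketched as follows: a plain global-best PSO tunes the stiffness parameters of a toy two-mode model until its natural frequencies match "measured" ones. The model, data, and PSO constants are illustrative assumptions, not taken from the book.

```python
import math
import random

random.seed(1)

# "Measured" natural frequencies (rad/s) of a toy two-mode structure.
measured = [10.0, 25.0]
masses = [2.0, 1.0]

def model_frequencies(k):
    # Toy physics model: two uncoupled modes, omega_i = sqrt(k_i / m_i).
    return [math.sqrt(ki / mi) for ki, mi in zip(k, masses)]

def objective(k):
    # Discrepancy between model and measured frequencies.
    return sum((wm - w) ** 2
               for wm, w in zip(model_frequencies(k), measured))

# Plain global-best PSO over the two stiffness parameters.
n, dim = 20, 2
pos = [[random.uniform(1, 1000) for _ in range(dim)] for _ in range(n)]
vel = [[0.0] * dim for _ in range(n)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=objective)

for _ in range(200):
    for i in range(n):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d]                       # inertia
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
            pos[i][d] = max(1e-3, pos[i][d] + vel[i][d])  # keep stiffness > 0
        if objective(pos[i]) < objective(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=objective)

print(gbest)  # updated stiffnesses, near k_i = m_i * omega_i^2
```

In a real FEM update the objective would call the finite element solver instead of the two-line toy model, which is exactly why the book pairs global optimizers with surrogate techniques.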

  7. From curve fitting to machine learning an illustrative guide to scientific data analysis and computational intelligence

    CERN Document Server

    Zielesny, Achim

    2016-01-01

    This successful book provides in its second edition an interactive and illustrative guide from two-dimensional curve fitting to multidimensional clustering and machine learning with neural networks or support vector machines. Along the way topics like mathematical optimization or evolutionary algorithms are touched. All concepts and ideas are outlined in a clear cut manner with graphically depicted plausibility arguments and a little elementary mathematics. The major topics are extensively outlined with exploratory examples and applications. The primary goal is to be as illustrative as possible without hiding problems and pitfalls but to address them. The character of an illustrative cookbook is complemented with specific sections that address more fundamental questions like the relation between machine learning and human intelligence. All topics are completely demonstrated with the computing platform Mathematica and the Computational Intelligence Packages (CIP), a high-level function library developed with M...
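The book's starting point, two-dimensional curve fitting, reduces to a closed-form least-squares computation. A minimal sketch, with made-up sample data and in Python rather than the book's Mathematica/CIP:

```python
# Ordinary least-squares fit of a straight line y = a + b*x, the
# starting point of the "curve fitting to machine learning" arc.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]   # noisy samples of y ≈ 1 + 2x (invented)

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
# Closed-form slope and intercept from the normal equations.
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x
print(round(a, 2), round(b, 2))  # → 1.04 1.99
```

Everything later in the book (clustering, neural networks, SVMs) generalizes this same idea: choose a model family and minimize a fitting error over its parameters.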

  8. Condition Monitoring Using Computational Intelligence Methods Applications in Mechanical and Electrical Systems

    CERN Document Server

    Marwala, Tshilidzi

    2012-01-01

Condition monitoring uses the observed operating characteristics of a machine or structure to diagnose trends in the signal being monitored and to predict the need for maintenance before a breakdown occurs. This reduces the risk, inherent in a fixed maintenance schedule, of performing maintenance needlessly early or of having a machine fail before maintenance is due, either of which can be expensive, with the latter also posing a risk of serious accident, especially in systems like aeroengines in which a catastrophic failure would put lives at risk. The technique also measures responses from the whole of the system under observation, so it can detect the effects of faults that might lie deep within a system, hidden from traditional methods of inspection. Condition Monitoring Using Computational Intelligence Methods promotes the various approaches gathered under the umbrella of computational intelligence to show how condition monitoring can be used to avoid equipment failures and lengthen its useful life, m...

  9. The NASA Program Management Tool: A New Vision in Business Intelligence

    Science.gov (United States)

    Maluf, David A.; Swanson, Keith; Putz, Peter; Bell, David G.; Gawdiak, Yuri

    2006-01-01

    This paper describes a novel approach to business intelligence and program management for large technology enterprises like the U.S. National Aeronautics and Space Administration (NASA). Two key distinctions of the approach are that 1) standard business documents are the user interface, and 2) a "schema-less" XML database enables flexible integration of technology information for use by both humans and machines in a highly dynamic environment. The implementation utilizes patent-pending NASA software called the NASA Program Management Tool (PMT) and its underlying "schema-less" XML database called Netmark. Initial benefits of PMT include elimination of discrepancies between business documents that use the same information and "paperwork reduction" for program and project management in the form of reducing the effort required to understand standard reporting requirements and to comply with those reporting requirements. We project that the underlying approach to business intelligence will enable significant benefits in the timeliness, integrity and depth of business information available to decision makers on all organizational levels.

  10. Automated design of analog and high-frequency circuits a computational intelligence approach

    CERN Document Server

    Liu, Bo; Fernández, Francisco V

    2014-01-01

    Computational intelligence techniques are becoming more and more important for automated problem solving nowadays. Due to the growing complexity of industrial applications and the increasingly tight time-to-market requirements, the time available for thorough problem analysis and development of tailored solution methods is decreasing. There is no doubt that this trend will continue in the foreseeable future. Hence, it is not surprising that robust and general automated problem solving methods with satisfactory performance are needed.

  11. Artificial intelligence and tutoring systems computational and cognitive approaches to the communication of knowledge

    CERN Document Server

    Wenger, Etienne

    2014-01-01

    Artificial Intelligence and Tutoring Systems: Computational and Cognitive Approaches to the Communication of Knowledge focuses on the cognitive approaches, methodologies, principles, and concepts involved in the communication of knowledge. The publication first elaborates on knowledge communication systems, basic issues, and tutorial dialogues. Concerns cover natural reasoning and tutorial dialogues, shift from local strategies to multiple mental models, domain knowledge, pedagogical knowledge, implicit versus explicit encoding of knowledge, knowledge communication, and practical and theoretic

  12. Computational intelligence for qualitative coaching diagnostics: Automated assessment of tennis swings to improve performance and safety

    OpenAIRE

    Bačić, Boris; Hume, Patria

    2017-01-01

Coaching technology, wearables and exergames can provide quantitative feedback based on measured activity, but there is little evidence of qualitative feedback to aid technique improvement. To achieve personalised qualitative feedback, we demonstrated a proof-of-concept prototype combining kinesiology and computational intelligence that could help improve tennis swing technique. Three-dimensional tennis motion data were acquired from multi-camera video (22 backhands and 21 forehands, includ...

  13. Analysis of Changes in Market Shares of Commercial Banks Operating in Turkey Using Computational Intelligence Algorithms

    OpenAIRE

    Amasyali, M. Fatih; Demırhan, Ayse; Bal, Mert

    2014-01-01

    This paper aims to model the change in market share of 30 domestic and foreign banks, which have been operating between the years 1990 and 2009 in Turkey by taking into consideration 20 financial ratios of those banks. Due to the fragile structure of the banking sector in Turkey, this study plays an important role for determining the changes in market share of banks and taking the necessary measures promptly. For this reason, computational intelligence methods have been used in the study. Acc...

  14. A cyber kill chain based taxonomy of banking Trojans for evolutionary computational intelligence

    OpenAIRE

    Kiwia, D; Dehghantanha, A; Choo, K-KR; Slaughter, J

    2017-01-01

    Malware such as banking Trojans are popular with financially-motivated cybercriminals. Detection of banking Trojans remains a challenging task, due to the constant evolution of techniques used to obfuscate and circumvent existing detection and security solutions. Having a malware taxonomy can facilitate the design of mitigation strategies such as those based on evolutionary computational intelligence. Specifically, in this paper, we propose a cyber kill chain based taxonomy of banking Trojans...

  15. A methodology for the design of experiments in computational intelligence with multiple regression models.

    Science.gov (United States)

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
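The core of such a comparison, repeated resampling of model errors followed by a significance test, can be sketched as follows. This is a generic illustration with two toy models and a hand-computed paired t statistic, not the RRegrs implementation:

```python
import math
import random

random.seed(2)

# Synthetic dataset: y = 3*x + noise (invented for the example).
data = [(x, 3 * x + random.gauss(0, 1))
        for x in [random.uniform(0, 10) for _ in range(100)]]

def rmse(model, test):
    return math.sqrt(sum((model(x) - y) ** 2 for x, y in test) / len(test))

def fit_mean(train):
    # Baseline model: always predict the training mean.
    m = sum(y for _, y in train) / len(train)
    return lambda x: m

def fit_linear(train):
    # Ordinary least-squares line.
    mx = sum(x for x, _ in train) / len(train)
    my = sum(y for _, y in train) / len(train)
    b = sum((x - mx) * (y - my) for x, y in train) / \
        sum((x - mx) ** 2 for x, _ in train)
    return lambda x: (my - b * mx) + b * x

# Repeated random train/test splits; record the per-split RMSE difference.
diffs = []
for _ in range(30):
    random.shuffle(data)
    train, test = data[:70], data[70:]
    diffs.append(rmse(fit_mean(train), test) - rmse(fit_linear(train), test))

# Paired t statistic on the per-split RMSE differences.
mean_d = sum(diffs) / len(diffs)
sd = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (len(diffs) - 1))
t = mean_d / (sd / math.sqrt(len(diffs)))
print(t > 2.05)  # True: the linear model is significantly better here
```

The paper's point is that the final ranking of models should rest on a test like this rather than on a single error number per model.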

  16. A methodology for the design of experiments in computational intelligence with multiple regression models

    Directory of Open Access Journals (Sweden)

    Carlos Fernandez-Lozano

    2016-12-01

Full Text Available The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  17. Wearable computer for mobile augmented-reality-based controlling of an intelligent robot

    Science.gov (United States)

    Turunen, Tuukka; Roening, Juha; Ahola, Sami; Pyssysalo, Tino

    2000-10-01

An intelligent robot can be utilized to perform tasks that are either hazardous or unpleasant for humans. Such tasks include working in disaster areas or in conditions that are, for example, too hot. An intelligent robot can work on its own to some extent, but in some cases the aid of humans is needed. This requires means for controlling the robot from somewhere else, i.e. teleoperation. Mobile augmented reality can be utilized as a user interface to the environment, as it enhances the user's perception of the situation compared to other interfacing methods and allows the user to perform other tasks while controlling the intelligent robot. Augmented reality is a method that combines virtual objects with the user's perception of the real world. As computer technology evolves, it is possible to build very small devices that have sufficient capabilities for augmented reality applications. We have evaluated the existing wearable computers and mobile augmented reality systems to build a prototype of a future mobile terminal, the CyPhone. A wearable computer with sufficient system resources for applications, wireless communication media with sufficient throughput, and enough interfaces for peripherals has been built at the University of Oulu. It is self-sufficient in energy, with enough operating time for the applications to be useful, and uses accurate positioning systems.

  18. Effect of water depth on the performance of intelligent computing models in predicting wave transmission of floating pipe breakwater.

    Digital Repository Service at National Institute of Oceanography (India)

    Patil, S.G.; Mandal, S.; Hegde, A.V.

Understanding the physics of a complex system plays an important role in the selection of data for training intelligent computing models. Based on the physics of the wave transmission of Horizontally Interlaced Multilayer Moored Floating Pipe Breakwater...

  19. INFORMATION ARCHITECTURE ANALYSIS USING BUSINESS INTELLIGENCE TOOLS BASED ON THE INFORMATION NEEDS OF EXECUTIVES

    Directory of Open Access Journals (Sweden)

    Fabricio Sobrosa Affeldt

    2013-08-01

    Full Text Available Devising an information architecture system that enables an organization to centralize information regarding its operational, managerial and strategic performance is one of the challenges currently facing information technology. The present study aimed to analyze an information architecture system developed using Business Intelligence (BI technology. The analysis was performed based on a questionnaire enquiring as to whether the information needs of executives were met during the process. A theoretical framework was applied consisting of information architecture and BI technology, using a case study methodology. Results indicated that the transaction processing systems studied did not meet the information needs of company executives. Information architecture using data warehousing, online analytical processing (OLAP tools and data mining may provide a more agile means of meeting these needs. However, some items must be included and others modified, in addition to improving the culture of information use by company executives.

  20. Managing Sustainability with the Support of Business Intelligence Methods and Tools

    Science.gov (United States)

    Petrini, Maira; Pozzebon, Marlei

    In this paper we explore the role of business intelligence (BI) in helping to support the management of sustainability in contemporary firms. The concepts of sustainability and corporate social responsibility (CSR) are among the most important themes to have emerged in the last decade at the global level. We suggest that BI methods and tools have an important but not yet well studied role to play in helping organizations implement and monitor sustainable and socially responsible business practices. Using grounded theory, the main contribution of our study is to propose a conceptual model that seeks to support the process of definition and monitoring of socio-environmental indicators and the relationship between their management and business strategy.

  1. A computer tool to support in design of industrial Ethernet.

    Science.gov (United States)

    Lugli, Alexandre Baratella; Santos, Max Mauro Dias; Franco, Lucia Regina Horta Rodrigues

    2009-04-01

This paper presents a computer tool to support the design and development of an industrial Ethernet network, verifying the physical layer (cable resistance and capacitance, scan time, network power supply under the Power over Ethernet (PoE) concept, and wireless links) and the occupation rate (the amount of information transmitted on the network versus the controller's network scan time). These checks are performed without a single physical element installed in the network, using simulation alone. The tool's software presents a detailed view of the network to the user, flags possible network problems, and offers an extremely friendly environment.
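The occupation-rate check described above can be approximated with a back-of-the-envelope calculation. The traffic model below (one frame per device per scan cycle) and the helper name are assumptions for illustration; the frame-overhead figure is the standard Ethernet value:

```python
# Rough occupation-rate estimate for a cyclic industrial Ethernet network:
# the share of each controller scan cycle consumed by frame transmission.
# Assumes one frame per device per cycle, which is a simplification.

FRAME_OVERHEAD_BYTES = 38   # preamble + header + FCS + interframe gap

def occupation_rate(devices, payload_bytes, bitrate_bps, scan_time_s):
    frame_bits = (payload_bytes + FRAME_OVERHEAD_BYTES) * 8
    tx_time = devices * frame_bits / bitrate_bps   # seconds per cycle
    return tx_time / scan_time_s

# 50 devices, 100-byte payloads, 100 Mbit/s, 1 ms scan time:
print(round(occupation_rate(50, 100, 100e6, 1e-3), 3))  # → 0.552
```

A value above 1.0 would mean the requested traffic cannot fit into the scan cycle at all, which is the kind of problem the tool is meant to flag before any hardware is installed.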

  2. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  3. Computational intelligence techniques for biological data mining: An overview

    Science.gov (United States)

    Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari

    2014-10-01

Computational techniques have been successfully utilized for highly accurate analysis and modeling of the multifaceted, raw biological data gathered from various genome sequencing projects. These techniques are proving much more effective in overcoming the limitations of traditional in-vitro experiments on the constantly increasing sequence data. The most critical problems that have caught researchers' attention include, but are not limited to: accurate structure and function prediction of unknown proteins, protein subcellular localization prediction, finding protein-protein interactions, protein fold recognition, analysis of microarray gene expression data, etc. To solve these problems, various classification and clustering techniques using machine learning have been extensively used in the published literature. These techniques include neural network algorithms, genetic algorithms, fuzzy ARTMAP, K-Means, K-NN, SVM, rough set classifiers, decision trees, and HMM based algorithms. Major difficulties in applying the above algorithms include the limitations of previous feature encoding and selection methods in extracting the best features, increasing classification accuracy, and decreasing the running-time overheads of the learning algorithms. This research is potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.
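As a minimal example of one listed technique, the sketch below applies K-NN to toy feature vectors standing in for protein encodings; the features, labels, and class names are invented for illustration:

```python
import math
from collections import Counter

# Toy feature vectors (e.g. two amino-acid composition fractions) with
# invented class labels; real studies use encodings derived from sequences.
train = [
    ([0.9, 0.1], "membrane"), ([0.8, 0.2], "membrane"),
    ([0.7, 0.3], "membrane"), ([0.2, 0.8], "soluble"),
    ([0.1, 0.9], "soluble"),  ([0.3, 0.7], "soluble"),
]

def knn_predict(x, k=3):
    # Majority vote among the k nearest training points (Euclidean).
    neighbours = sorted(train, key=lambda item: math.dist(item[0], x))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

print(knn_predict([0.75, 0.25]))  # → membrane
print(knn_predict([0.15, 0.85]))  # → soluble
```

The feature-encoding difficulty the abstract highlights is exactly the choice of what goes into those vectors; the classifier itself is the easy part.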

  4. Brain computer interfaces as intelligent sensors for enhancing human-computer interaction

    NARCIS (Netherlands)

    Poel, M.; Nijboer, F.; Broek, E.L. van den; Fairclough, S.; Nijholt, A.

    2012-01-01

BCIs are traditionally conceived as a way to control apparatus, an interface that allows you to "act on" external devices as a form of input control. We propose an alternative use of BCIs, that of monitoring users as an additional intelligent sensor to enrich traditional means of interaction. This

  5. Brain computer interfaces as intelligent sensors for enhancing human-computer interaction

    NARCIS (Netherlands)

    Poel, Mannes; Nijboer, Femke; van den Broek, Egon; Fairclough, Stephen; Morency, Louis-Philippe; Bohus, Dan; Aghajan, Hamid; Nijholt, Antinus; Cassell, Justine; Epps, Julien

    2012-01-01

    BCIs are traditionally conceived as a way to control apparatus, an interface that allows you to "act on" external devices as a form of input control. We propose an alternative use of BCIs, that of monitoring users as an additional intelligent sensor to enrich traditional means of interaction. This

  6. A Quasiphysics Intelligent Model for a Long Range Fast Tool Servo

    Science.gov (United States)

    Liu, Qiang; Zhou, Xiaoqin; Lin, Jieqiong; Xu, Pengzi; Zhu, Zhiwei

    2013-01-01

Accurately modeling the dynamic behaviors of a fast tool servo (FTS) is one of the key issues in the ultraprecision positioning of the cutting tool. Herein, a quasiphysics intelligent model (QPIM) integrating a linear physics model (LPM) and a radial basis function (RBF) based neural model (NM) is developed to accurately describe the dynamic behaviors of a voice coil motor (VCM) actuated long range fast tool servo (LFTS). To identify the parameters of the LPM, a novel Opposition-based Self-adaptive Replacement Differential Evolution (OSaRDE) algorithm is proposed, which has been shown to converge faster without compromising solution quality and to outperform similar evolutionary algorithms considered for comparison. The modeling errors of the LPM and the QPIM are investigated by experiments. The modeling error of the LPM presents an obvious trend component, about ±1.15% of the full span range, verifying the efficiency of the proposed OSaRDE algorithm for system identification. As for the QPIM, the trend component in the residual error of the LPM can be well suppressed, and the error of the QPIM remains at noise level. All the results verify the efficiency and superiority of the proposed modeling and identification approaches. PMID:24163627
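The differential-evolution machinery underlying OSaRDE can be illustrated with the plain DE/rand/1/bin variant identifying two parameters of a toy linear plant. The plant, data, and DE constants are assumptions, and none of the OSaRDE extensions (opposition, self-adaptation, replacement) are included:

```python
import random

random.seed(3)

# Simulated response of a linear plant y = K*u + c with true K=2.5, c=0.4;
# DE searches for (K, c) minimizing the squared output error.
samples = [(u, 2.5 * u + 0.4) for u in [0.0, 0.5, 1.0, 1.5, 2.0]]

def cost(theta):
    K, c = theta
    return sum((K * u + c - y) ** 2 for u, y in samples)

NP, F, CR = 15, 0.6, 0.9   # population size, scale factor, crossover rate
pop = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(NP)]

for _ in range(100):
    for i in range(NP):
        # DE/rand/1 mutation from three distinct other individuals,
        # with binomial crossover against the target vector.
        a, b, r = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        trial = [a[d] + F * (b[d] - r[d]) if random.random() < CR
                 else pop[i][d]
                 for d in range(2)]
        if cost(trial) < cost(pop[i]):   # greedy one-to-one selection
            pop[i] = trial

best = min(pop, key=cost)
print(best)  # close to the true parameters (2.5, 0.4)
```

The paper's contribution is in how the mutation strategy and control parameters adapt during the run; the surrounding loop is the same.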

  7. A New Tool for Intelligent Parallel Processing of Radar/SAR Remotely Sensed Imagery

    Directory of Open Access Journals (Sweden)

    A. Castillo Atoche

    2013-01-01

Full Text Available A novel parallel tool for large-scale image enhancement/reconstruction and postprocessing of radar/SAR sensor systems is addressed. The proposed parallel tool performs the following intelligent processing steps: image formation, for the application of different system-level effects of image degradation with a particular remote sensing (RS) system and simulation of random noising effects; enhancement/reconstruction employing nonparametric robust high-resolution techniques; and image postprocessing using the fuzzy anisotropic diffusion technique, which incorporates a better edge-preserving noise removal effect and a faster diffusion process. This innovative tool allows the processing of high-resolution images provided by different radar/SAR sensor systems, as required by RS end-users for environmental monitoring, risk prevention, and resource management. To verify the performance of the proposed parallel framework, the processing steps are developed and specifically tested on graphics processing units (GPUs), achieving considerable speedups compared to the serial version of the same techniques implemented in the C language.

  8. Integrating Symbolic and Statistical Methods for Testing Intelligent Systems Applications to Machine Learning and Computer Vision

    Energy Technology Data Exchange (ETDEWEB)

    Jha, Sumit Kumar [University of Central Florida, Orlando; Pullum, Laura L [ORNL; Ramanathan, Arvind [ORNL

    2016-01-01

Embedded intelligent systems ranging from tiny implantable biomedical devices to large swarms of autonomous unmanned aerial systems are becoming pervasive in our daily lives. While we depend on the flawless functioning of such intelligent systems, and often take their behavioral correctness and safety for granted, it is notoriously difficult to generate test cases that expose subtle errors in the implementations of machine learning algorithms. Hence, the validation of intelligent systems is usually achieved by studying their behavior on representative data sets, using methods such as cross-validation and bootstrapping. In this paper, we present a new testing methodology for studying the correctness of intelligent systems. Our approach uses symbolic decision procedures coupled with statistical hypothesis testing to generate test cases that expose errors in intelligent systems. We also use our algorithm to analyze the robustness of a human detection algorithm built using the OpenCV open-source computer vision library. We show that the human detection implementation can fail to detect humans in perturbed video frames even when the perturbations are so small that the corresponding frames look identical to the naked eye.

  9. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  10. Combination of artificial intelligence and procedural language programs in a computer application system supporting nuclear reactor operations

    International Nuclear Information System (INIS)

    Town, G.G.; Stratton, R.C.

    1985-01-01

    A computer application system is described which provides nuclear reactor power plant operators with an improved decision support system. This system combines traditional computer applications such as graphics display with artificial intelligence methodologies such as reasoning and diagnosis so as to improve plant operability. This paper discusses the issues, and a solution, involved with the system integration of applications developed using traditional and artificial intelligence languages

  11. Combination of artificial intelligence and procedural language programs in a computer application system supporting nuclear reactor operations

    International Nuclear Information System (INIS)

    Stratton, R.C.; Town, G.G.

    1985-01-01

A computer application system is described which provides nuclear reactor power plant operators with an improved decision support system. This system combines traditional computer applications such as graphics display with artificial intelligence methodologies such as reasoning and diagnosis so as to improve plant operability. This paper discusses the issues, and a solution, involved with the system integration of applications developed using traditional and artificial intelligence languages

  12. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Vijayakumar, K; Panigrahi, Bijaya; Das, Swagatam

    2017-01-01

The volume is a collection of high-quality peer-reviewed research papers presented at the International Conference on Artificial Intelligence and Evolutionary Computation in Engineering Systems (ICAIECES 2016), held at SRM University, Chennai, Tamil Nadu, India. This conference is an international forum for industry professionals and researchers to deliberate and state their research findings, discuss the latest advancements, and explore future directions in the emerging areas of engineering and technology. The book presents original work and novel ideas, information, techniques and applications in the field of communication, computing and power technologies.

  13. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Bhaskar, M; Panigrahi, Bijaya; Das, Swagatam

    2016-01-01

The book is a collection of high-quality peer-reviewed research papers presented at the first International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems (ICAIECES-2015), held at Velammal Engineering College (VEC), Chennai, India during 22–23 April 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the field of Communication, Computing and Power Technologies.

  14. OPTHYLIC: An Optimised Tool for Hybrid Limits Computation

    Science.gov (United States)

    Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée

    2018-05-01

    A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
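
    The simple counting-experiment case described above can be illustrated with a minimal sketch of the CLs construction for a single Poisson channel without uncertainties. This is not OPTHYLIC's actual code; the function names and bisection bounds are assumptions:

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu), summed directly (small n only)."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def cls(s, b, n_obs):
    """CLs = CL_{s+b} / CL_b for a single counting channel."""
    return poisson_cdf(n_obs, s + b) / poisson_cdf(n_obs, b)

def upper_limit(b, n_obs, alpha=0.05):
    """Smallest signal rate s with CLs(s) <= alpha, found by bisection."""
    lo, hi = 0.0, 100.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if cls(mid, b, n_obs) > alpha:   # CLs decreases monotonically in s
            lo = mid
        else:
            hi = mid
    return hi
```

    For b = 0 and n_obs = 0 this reduces to solving exp(-s) = 0.05, i.e. s = ln 20 ≈ 3.0, the familiar zero-background limit.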

  15. Data driven model generation based on computational intelligence

    Science.gov (United States)

    Gemmar, Peter; Gronz, Oliver; Faust, Christophe; Casper, Markus

    2010-05-01

    The simulation of discharges at a local gauge or the modeling of large-scale river catchments is effectively involved in estimation and decision tasks of hydrological research and practical applications like flood prediction or water resource management. However, modeling such processes using analytical or conceptual approaches is made difficult by both the complexity of process relations and the heterogeneity of processes. It has been shown many times that unknown or assumed process relations can in principle be described by computational methods, and that system models can be derived automatically from observed behavior or measured process data. This study describes the development of hydrological process models using computational methods, including Fuzzy logic and artificial neural networks (ANN), in a comprehensive and automated manner. Methods: We consider a closed concept for the data-driven development of hydrological models based on measured (experimental) data. The concept is centered on a Fuzzy system using rules of Takagi-Sugeno-Kang type, which formulate the input-output relation in a generic structure like R_i: IF q(t) = low AND ... THEN q(t+Δt) = a_i0 + a_i1 q(t) + a_i2 p(t-Δt_i1) + a_i3 p(t+Δt_i2) + .... The rule's premise part (IF) describes process states involving available process information, e.g. the actual outlet q(t) is low, where low is one of several Fuzzy sets defined over the variable q(t). The rule's conclusion (THEN) estimates the expected outlet q(t+Δt) by a linear function over selected system variables, e.g. the actual outlet q(t) and previous and/or forecasted precipitation p(t±Δt_ik). In the case of river catchment modeling we use head gauges, tributary and upriver gauges in the conclusion part as well. In addition, we consider temperature and temporal (season) information in the premise part. By creating a set of rules R = {R_i | i = 1,...,N}, the space of process states can be covered as concisely as necessary. Model adaptation is achieved by finding an optimal set A = (a_ij) of conclusion
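
    The Takagi-Sugeno-Kang inference scheme described above can be illustrated in miniature: each rule's premise membership weights a linear conclusion, and the prediction is the weighted average. The membership functions and coefficients below are hypothetical, not the study's calibrated model:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def tsk_predict(q, p, rules):
    """TSK inference: weighted average of linear conclusions a0 + a1*q + a2*p,
    with weights given by each rule's premise membership."""
    num = den = 0.0
    for member, (a0, a1, a2) in rules:
        w = member(q)
        num += w * (a0 + a1 * q + a2 * p)
        den += w
    return num / den if den else 0.0

# hypothetical two-rule model over the outlet q(t): "low" and "high" states
rules = [
    (lambda q: tri(q, -1.0, 0.0, 5.0), (0.1, 0.9, 0.2)),   # IF q(t) is low
    (lambda q: tri(q, 0.0, 5.0, 11.0), (1.0, 0.8, 0.5)),   # IF q(t) is high
]
```

    With overlapping fuzzy sets, the output blends smoothly between the rules' linear models as q(t) moves between states.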

  16. A portable software tool for computing digitally reconstructed radiographs

    International Nuclear Information System (INIS)

    Chaney, Edward L.; Thorn, Jesse S.; Tracton, Gregg; Cullip, Timothy; Rosenman, Julian G.; Tepper, Joel E.

    1995-01-01

    Purpose: To develop a portable software tool for fast computation of digitally reconstructed radiographs (DRR) with a friendly user interface and versatile image format and display options. To provide a means for interfacing with commercial and custom three-dimensional (3D) treatment planning systems. To make the tool freely available to the Radiation Oncology community. Methods and Materials: A computer program for computing DRRs was enhanced with new features and rewritten to increase computational efficiency. A graphical user interface was added to improve ease of data input and DRR display. Installer, programmer, and user manuals were written, and installation test data sets were developed. The code conforms to the specifications of the Cooperative Working Group (CWG) of the National Cancer Institute (NCI) Contract on Radiotherapy Treatment Planning Tools. Results: The interface allows the user to select DRR input data and image formats primarily by point-and-click mouse operations. Digitally reconstructed radiograph formats are predefined by configuration files that specify 19 calculation parameters. Enhancements include improved contrast resolution for visualizing surgical clips, an extended source model to simulate the penumbra region in a computed port film, and the ability to easily modify the CT numbers of objects contoured on the planning computed tomography (CT) scans. Conclusions: The DRR tool can be used with 3D planning systems that lack this functionality, or perhaps improve the quality and functionality of existing DRR software. The tool can be interfaced to 3D planning systems that run on most modern graphics workstations, and can also function as a stand-alone program

  17. Intelligent Information Retrieval: Diagnosing Information Need. Part II. Uncertainty Expansion in a Prototype of a Diagnostic IR Tool.

    Science.gov (United States)

    Cole, Charles; Cantero, Pablo; Sauve, Diane

    1998-01-01

    Outlines a prototype of an intelligent information-retrieval tool to facilitate information access for an undergraduate seeking information for a term paper. Topics include diagnosing the information need, Kuhlthau's information-search-process model, Shannon's mathematical theory of communication, and principles of uncertainty expansion and…

  18. On Computational Fluid Dynamics Tools in Architectural Design

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Hougaard, Mads; Stærdahl, Jesper Winther

    engineering computational fluid dynamics (CFD) simulation program ANSYS CFX and a CFD based representative program RealFlow are investigated. These two programs represent two types of CFD based tools available for use during phases of an architectural design process. However, as outlined in two case studies...

  19. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  20. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  1. Computational intelligence for the Balanced Scorecard: studying performance trends of hemodialysis clinics.

    Science.gov (United States)

    Cattinelli, Isabella; Bolzoni, Elena; Chermisi, Milena; Bellocchio, Francesco; Barbieri, Carlo; Mari, Flavio; Amato, Claudia; Menzer, Marcus; Stopper, Andrea; Gatti, Emanuele

    2013-07-01

    The Balanced Scorecard (BSC) is a general, widely employed instrument for enterprise performance monitoring based on the periodic assessment of strategic Key Performance Indicators that are scored against preset targets. The BSC is currently employed as an effective management support tool within Fresenius Medical Care (FME) and is routinely analyzed via standard statistical methods. More recently, the application of computational intelligence techniques (namely, self-organizing maps) to BSC data has been proposed as a way to enhance the quantity and quality of information that can be extracted from it. In this work, additional methods are presented to analyze the evolution of clinic performance over time. Performance evolution is studied at the single-clinic level by computing two complementary indexes that measure the proportion of time spent within performance clusters and improving/worsening trends. Self-organizing maps are used in conjunction with these indexes to identify the specific drivers of the observed performance. The performance evolution for groups of clinics is modeled under a probabilistic framework by resorting to Markov chain properties. These allow a study of the probability of transitioning between performance clusters as time progresses for the identification of the performance level that is expected to become dominant over time. We show the potential of the proposed methods through illustrative results derived from the analysis of BSC data of 109 FME clinics in three countries. We were able to identify the performance drivers for specific groups of clinics and to distinguish between countries whose performances are likely to improve from those where a decline in performance might be expected. According to the stationary distribution of the Markov chain, the expected trend is best in Turkey (where the highest performance cluster has the highest probability, P=0.46), followed by Portugal (where the second best performance cluster dominates
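
    The Markov-chain step described above (identifying the performance cluster expected to dominate over time) reduces to computing the stationary distribution of the cluster-transition matrix. A sketch with an invented 3-cluster matrix, not the paper's estimated probabilities:

```python
def stationary(P, iters=200):
    """Stationary distribution of a row-stochastic transition matrix,
    computed by repeated application of pi <- pi * P (power iteration)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# hypothetical transition matrix between 3 performance clusters (rows sum to 1)
P = [[0.7, 0.2, 0.1],
     [0.3, 0.5, 0.2],
     [0.2, 0.3, 0.5]]
pi = stationary(P)   # long-run share of time spent in each cluster
```

    The largest component of pi identifies the cluster expected to dominate, mirroring how the paper reads off, e.g., P = 0.46 for Turkey's highest-performance cluster.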

  2. Cloud Computing as a Tool for Improving Business Competitiveness

    Directory of Open Access Journals (Sweden)

    Wišniewski Michał

    2014-08-01

    Full Text Available This article organizes knowledge on cloud computing, presenting the classification of deployment models, characteristics, and service models. The author, looking at the problem from the entrepreneur's perspective, draws attention to the differences in benefits depending on the cloud computing deployment model and considers an effective way of selecting cloud computing services according to the specificity of the organization. Within this work, the thesis was considered that, in economic terms, cloud computing is not always the best solution for an organization. This raises the question, "What kind of tools should be used to estimate the usefulness of the cloud computing service model in the enterprise?"

  3. Computing tools for implementing standards for single-case designs.

    Science.gov (United States)

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.

  4. An applied artificial intelligence approach towards assessing building performance simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Yezioro, Abraham [Faculty of Architecture and Town Planning, Technion IIT (Israel); Dong, Bing [Center for Building Performance and Diagnostics, School of Architecture, Carnegie Mellon University (United States); Leite, Fernanda [Department of Civil and Environmental Engineering, Carnegie Mellon University (United States)

    2008-07-01

    With the development of modern computer technology, a large number of building energy simulation tools are available on the market. When choosing which simulation tool to use in a project, the user must consider the tool's accuracy and reliability, considering the building information they have at hand, which will serve as input for the tool. This paper presents an approach to comparing building performance simulation results with actual measurements, using artificial neural networks (ANN) for predicting building energy performance. Training and testing of the ANN were carried out with energy consumption data acquired over 1 week in the case building, called the Solar House. The predicted results show a good fit with the mathematical model, with a mean absolute error of 0.9%. Moreover, four building simulation tools were selected in this study in order to compare their results with the ANN-predicted energy consumption: Energy-10, the Green Building Studio web tool, eQuest, and EnergyPlus. The results showed that the more detailed simulation tools have the best simulation performance in terms of heating and cooling electricity consumption, within 3% mean absolute error. (author)

  5. A long-term risk management tool for electricity markets using swarm intelligence

    International Nuclear Information System (INIS)

    Azevedo, F.; Vale, Z.A.; Khodr, H.M.; Oliveira, P.B. Moura

    2010-01-01

    This paper addresses the optimal involvement in derivatives electricity markets of a power producer hedging against pool price volatility. To achieve this aim, a swarm intelligence meta-heuristic optimization technique for a long-term risk management tool is proposed. This tool investigates the long-term risk-hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward contracts) and financial (options contracts) settlement. The producer's risk preference is formulated as a utility function (U) expressing the trade-off between the expectation and the variance of the return. The variance and expectation of the return are based on a forecasted scenario interval determined by a long-term price range forecasting model. This model makes use of particle swarm optimization (PSO) to find the parameters that achieve the best forecasting results. On the other hand, the price estimation depends on load forecasting, so this work also presents a regressive long-term load forecast model that makes use of PSO to find the best parameters, as in the price estimation. The performance of the PSO technique has been evaluated by comparison with a Genetic Algorithm (GA) based approach. A case study is presented and the results are discussed taking into account real price and load historical data from the mainland Spanish electricity market, demonstrating the effectiveness of the methodology in handling this type of problem. Finally, conclusions are duly drawn. (author)
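
    The particle swarm component can be illustrated generically. The sketch below is a textbook global-best PSO minimising a toy one-dimensional objective under assumed parameter values; it is not the paper's forecasting or hedging model:

```python
import random

def pso(f, lo, hi, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimise f on [lo, hi] with a basic global-best particle swarm."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]   # positions
    v = [0.0] * n_particles                                 # velocities
    pbest = x[:]                                            # personal bests
    pval = [f(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g], pval[g]                         # global best
    for _ in range(iters):
        for i in range(n_particles):
            # inertia + cognitive pull (own best) + social pull (swarm best)
            v[i] = (w * v[i]
                    + c1 * rng.random() * (pbest[i] - x[i])
                    + c2 * rng.random() * (gbest - x[i]))
            x[i] = min(hi, max(lo, x[i] + v[i]))            # clamp to bounds
            fx = f(x[i])
            if fx < pval[i]:
                pbest[i], pval[i] = x[i], fx
                if fx < gval:
                    gbest, gval = x[i], fx
    return gbest, gval

best_x, best_f = pso(lambda x: (x - 3.0) ** 2, -10.0, 10.0)
```

    In the paper's setting, f would be the forecasting error of the price or load model as a function of its parameters.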

  6. Cellular computational generalized neuron network for frequency situational intelligence in a multi-machine power system.

    Science.gov (United States)

    Wei, Yawei; Venayagamoorthy, Ganesh Kumar

    2017-09-01

    To prevent a large interconnected power system from a cascading failure, brownout, or even blackout, grid operators require access to faster-than-real-time information to make appropriate just-in-time control decisions. However, the communication and computational system limitations of the currently used supervisory control and data acquisition (SCADA) system can only deliver delayed information. In contrast, the deployment of synchrophasor measurement devices makes it possible to capture and visualize, in near-real-time, grid operational data with extra granularity. In this paper, a cellular computational network (CCN) approach for frequency situational intelligence (FSI) in a power system is presented. The distributed and scalable computing unit of the CCN framework makes it particularly flexible for customization for a particular set of prediction requirements. Two soft-computing algorithms have been implemented in the CCN framework: a cellular generalized neuron network (CCGNN) and a cellular multi-layer perceptron network (CCMLPN), for purposes of providing multi-timescale frequency predictions, ranging from 16.67 ms to 2 s. These two developed CCGNN and CCMLPN systems were then implemented on two different scales of power systems, one of which included a large photovoltaic plant. A real-time power system simulator within the Real-Time Power and Intelligent Systems (RTPIS) laboratory at Clemson, SC, was then used to derive typical FSI results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools to solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using engineering tools such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of the graphical documentation of a power system. In the fourth chapter, the application of software tools in project management in power systems ...

  8. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Directory of Open Access Journals (Sweden)

    Danli Wang

    2014-01-01

    Full Text Available Games and creation are activities with good potential for developing computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity.

  9. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Science.gov (United States)

    Wang, Danli; Liu, Zhen

    2014-01-01

    Games and creation are activities with good potential for developing computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  10. Introducing artificial intelligence into structural optimization programs

    International Nuclear Information System (INIS)

    Jozwiak, S.F.

    1987-01-01

    Artificial Intelligence (AI) is defined as the branch of computer science concerned with the study of the ideas that enable computers to be intelligent. The main purpose of applying AI in engineering is to develop computer programs which function better as tools for engineers and designers. Many computer programs today have properties which make them inconvenient for their final users, and the research carried out within the field of AI provides tools and techniques so that these restrictions can be removed. The continuous progress in computer technology has led to the development of efficient computer systems which can be applied to more than simply solving sets of equations. (orig.)

  11. Elementary mathematical and computational tools for electrical and computer engineers using Matlab

    CERN Document Server

    Manassah, Jamal T

    2013-01-01

    Ideal for use as a short-course textbook and for self-study, Elementary Mathematical and Computational Tools for Electrical and Computer Engineers Using MATLAB fills that gap. Accessible after just one semester of calculus, it introduces the many practical analytical and numerical tools that are essential to success both in future studies and in professional life. Sharply focused on the needs of the electrical and computer engineering communities, the text provides a wealth of relevant exercises and design problems. Changes in MATLAB's version 6.0 are included in a special addendum.

  12. Computational intelligence in wireless sensor networks recent advances and future challenges

    CERN Document Server

    Falcon, Rafael; Koeppen, Mario

    2017-01-01

    This book emphasizes the increasingly important role that Computational Intelligence (CI) methods are playing in solving a myriad of entangled Wireless Sensor Network (WSN) related problems. The book serves as a guide for surveying several state-of-the-art WSN scenarios in which CI approaches have been employed. The reader finds in this book how CI has contributed to solving a wide range of challenging problems, ranging from balancing the cost and accuracy of heterogeneous sensor deployments to recovering from real-time sensor failures to detecting attacks launched by malicious sensor nodes and enacting CI-based security schemes. Network managers, industry experts, academicians, and practitioners alike (mostly in computer engineering, computer science, or applied mathematics) benefit from the spectrum of successful applications reported in this book. Senior undergraduate or graduate students may discover in this book some problems well suited for their own research endeavors. USP: Presents recent advances and fu...

  13. Proceedings of the Third International Conference on Intelligent Human Computer Interaction

    CERN Document Server

    Pokorný, Jaroslav; Snášel, Václav; Abraham, Ajith

    2013-01-01

    The Third International Conference on Intelligent Human Computer Interaction 2011 (IHCI 2011) was held at Charles University, Prague, Czech Republic from August 29 - August 31, 2011. This conference was the third in the series, following IHCI 2009 and IHCI 2010, held in January at IIIT Allahabad, India. Human computer interaction is a fast-growing research area and an attractive subject of interest for both academia and industry. There are many interesting and challenging topics that need to be researched and discussed. This book aims to provide excellent opportunities for the dissemination of interesting new research and discussion of the presented topics. It can be useful for researchers working on various aspects of human computer interaction. Topics covered in this book include user interface and interaction, theoretical background and applications of HCI, and also data mining and knowledge discovery as a support of HCI applications.

  14. Field-programmable custom computing technology architectures, tools, and applications

    CERN Document Server

    Luk, Wayne; Pocek, Ken

    2000-01-01

    Field-Programmable Custom Computing Technology: Architectures, Tools, and Applications brings together in one place important contributions and up-to-date research results in this fast-moving area. In seven selected chapters, the book describes the latest advances in architectures, design methods, and applications of field-programmable devices for high-performance reconfigurable systems. The contributors to this work were selected from the leading researchers and practitioners in the field. It will be valuable to anyone working or researching in the field of custom computing technology. It serves as an excellent reference, providing insight into some of the most challenging issues being examined today.

  15. Narrative theories as computational models: reader-oriented theory and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Galloway, P.

    1983-12-01

    In view of the rapid development of reader-oriented theory and its interest in dynamic models of narrative, the author speculates in a serious way about what such models might look like in computational terms. Researchers in artificial intelligence (AI) have already begun to develop models of story understanding, as the emphasis in AI research has shifted toward natural language understanding and as AI has allied itself with cognitive psychology and linguistics to become cognitive science. Research in AI and in narrative theory share many common interests and problems, and both fields might benefit from an exchange of ideas. 11 references.

  16. Solution of Fractional Order System of Bagley-Torvik Equation Using Evolutionary Computational Intelligence

    Directory of Open Access Journals (Sweden)

    Muhammad Asif Zahoor Raja

    2011-01-01

    Full Text Available A stochastic technique has been developed for the solution of the fractional-order system represented by the Bagley-Torvik equation. The mathematical model of the equation was developed with the help of feed-forward artificial neural networks. The training of the networks was performed with evolutionary computational intelligence based on a genetic algorithm hybridized with a pattern search technique. The designed scheme was successfully applied to different forms of the equation. Results are compared with standard approximate analytic solutions, stochastic numerical solvers, and exact solutions.
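
    The evolutionary training loop can be illustrated in miniature. The sketch below is a generic real-coded genetic algorithm (truncation selection, arithmetic crossover, Gaussian mutation) minimising a toy objective; it is not the paper's hybrid GA/pattern-search trainer, and all parameter values are assumptions:

```python
import random

def ga_minimise(f, lo, hi, pop_size=40, gens=80, mut=0.3, seed=2):
    """Minimise f on [lo, hi] with a basic real-coded genetic algorithm."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b)               # arithmetic crossover
            if rng.random() < mut:              # occasional Gaussian mutation
                child += rng.gauss(0.0, 0.1 * (hi - lo))
            children.append(min(hi, max(lo, child)))
        pop = parents + children
    return min(pop, key=f)

best = ga_minimise(lambda x: (x - 1.5) ** 2, -5.0, 5.0)
```

    In the paper's setting, the "individual" would be the ANN weight vector and f the residual of the trained network against the Bagley-Torvik equation.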

  17. A review on economic emission dispatch problems using quantum computational intelligence

    Science.gov (United States)

    Mahdi, Fahad Parvez; Vasant, Pandian; Kallimani, Vish; Abdullah-Al-Wadud, M.

    2016-11-01

    Economic emission dispatch (EED) problems are among the most crucial problems in power systems. Growing energy demand, limited natural resources, and global warming have made this topic a center of discussion and research. This paper reviews the use of Quantum Computational Intelligence (QCI) in solving economic emission dispatch problems. QCI techniques like the Quantum Genetic Algorithm (QGA) and the Quantum Particle Swarm Optimization (QPSO) algorithm are discussed here. This paper will encourage researchers to use more QCI-based algorithms to obtain better optimal results for solving EED problems.

  18. An extended Intelligent Water Drops algorithm for workflow scheduling in cloud computing environment

    Directory of Open Access Journals (Sweden)

    Shaymaa Elsherbiny

    2018-03-01

    Full Text Available Cloud computing is emerging as a high-performance computing environment with a large-scale, heterogeneous collection of autonomous systems and a flexible computational architecture. Many resource management methods may enhance the efficiency of the whole cloud computing system, and the key part of cloud computing resource management is resource scheduling. Optimized scheduling of tasks on the cloud virtual machines is an NP-hard problem, and many algorithms have been presented to solve it. The variations among these schedulers are due to the fact that their scheduling strategies are adapted to the changing environment and the types of tasks. The focus of this paper is on workflow scheduling in cloud computing, which has been gaining a lot of attention recently because workflows have emerged as a paradigm to represent complex computing problems. We propose a novel algorithm extending the nature-inspired Intelligent Water Drops (IWD) algorithm that optimizes the scheduling of workflows on the cloud. The proposed algorithm is implemented and embedded within a workflow simulation toolkit and tested in different simulated cloud environments with different cost models. Our algorithm showed noticeable enhancements over the classical workflow scheduling algorithms. We compared the proposed IWD-based algorithm with other well-known scheduling algorithms, including MIN-MIN, MAX-MIN, Round Robin, FCFS, MCT, PSO, and C-PSO; the proposed algorithm presented noticeable enhancements in performance and cost in most situations.
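
    Among the baseline schedulers the abstract lists, MCT (minimum completion time) is the easiest to sketch: each task goes to the virtual machine that would finish it earliest. The task lengths and VM speeds below are invented for illustration:

```python
def mct_schedule(tasks, vm_speeds):
    """MCT heuristic: assign each task (given as a length) to the VM
    that completes it earliest, given each VM's current ready time."""
    finish = [0.0] * len(vm_speeds)   # current ready time per VM
    assignment = []
    for length in tasks:
        # pick the VM minimising ready time + execution time
        best = min(range(len(vm_speeds)),
                   key=lambda i: finish[i] + length / vm_speeds[i])
        finish[best] += length / vm_speeds[best]
        assignment.append(best)
    return assignment, max(finish)    # per-task placements and makespan

placements, makespan = mct_schedule([4.0, 4.0, 2.0], [1.0, 2.0])
```

    Metaheuristics such as PSO or the IWD extension search over whole assignment vectors instead of committing greedily task by task, which is why they can beat MCT on cost and makespan.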

  19. Applying knowledge engineering tools for the personal computer to the operation and maintenance of radiopharmaceutical production systems

    International Nuclear Information System (INIS)

    Alexoff, D.L.

    1990-01-01

    A practical consequence of over three decades of Artificial Intelligence (AI) research has been the emergence of personal computer-based AI programming tools. A special class of this microcomputer-based software, called expert system shells, is now applied routinely outside the realm of classical AI to solve many types of problems, particularly in analytical chemistry. These AI tools offer not only some of the advantages inherent to symbolic programming languages but, just as significantly, they bring with them advanced program development environments which can facilitate software development and maintenance. Exploitation of this enhanced programming environment was a major motivation for using an AI tool. The goal of this work is to evaluate the use of an example-based expert system shell (1st Class FUSION, 1st Class Expert Systems, Inc.) as a programming tool for developing software useful for automated radiopharmaceutical production

  20. Understanding organometallic reaction mechanisms and catalysis: experimental and computational tools

    CERN Document Server

    Ananikov, Valentin P

    2014-01-01

    Exploring and highlighting the new horizons in the studies of reaction mechanisms that open joint application of experimental studies and theoretical calculations is the goal of this book. The latest insights and developments in the mechanistic studies of organometallic reactions and catalytic processes are presented and reviewed. The book adopts a unique approach, exemplifying how to use experiments, spectroscopy measurements, and computational methods to reveal reaction pathways and molecular structures of catalysts, rather than concentrating solely on one discipline. The result is a deeper

  1. Intelligent Computer-Assisted Instruction: A Review and Assessment of ICAI Research and Its Potential for Education.

    Science.gov (United States)

    Dede, Christopher J.; And Others

    The first of five sections in this report places intelligent computer-assisted instruction (ICAI) in its historical context through discussions of traditional computer-assisted instruction (CAI) linear and branching programs; TICCIT and PLATO IV, two CAI demonstration projects funded by the National Science Foundation; generative programs, the…

  2. Artificial Intelligence Tools for Scaling Up of High Shear Wet Granulation Process.

    Science.gov (United States)

    Landin, Mariana

    2017-01-01

    The results presented in this article demonstrate the potential of artificial intelligence tools for predicting the endpoint of the granulation process in high-speed mixer granulators of different scales, from 25 L to 600 L. The combination of neurofuzzy logic and gene expression programming technologies allowed the modeling of the impeller power as a function of operation conditions and wet granule properties, establishing the critical variables that affect the response and obtaining a unique experimental polynomial equation (transparent model) of high predictability (R² > 86.78%) for all equipment sizes. Gene expression programming allowed the modeling of the granulation process for granulators of similar and dissimilar geometries and can be improved by implementing additional characteristics of the process, such as composition variables or operation parameters (e.g., batch size, chopper speed). The principles and the methodology proposed here can be applied to understand and control the manufacturing process using any other granulation equipment, including continuous granulation processes. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  3. Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools

    Science.gov (United States)

    Villar, Alejandro; Zarrabeitia, María T.; Fdez-Arroyabe, Pablo; Santurtún, Ana

    2018-06-01

    Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain, provided by 20 different national and regional entities in a diverse array of file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.
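The ETL + OLAP tandem the article describes can be sketched in a few lines of pandas. All column names, date formats, and numbers below are invented stand-ins for the heterogeneous provider extracts, not the authors' actual schemas:

```python
import pandas as pd

# Extract: two hypothetical providers deliver the same day's data with
# different column names and date formats.
medical = pd.DataFrame({
    "prov": ["Cantabria", "Madrid"],
    "date": ["2015-01-01", "2015-01-01"],
    "admissions": [12, 80],
})
air = pd.DataFrame({
    "PROVINCE": ["Cantabria", "Madrid"],
    "DAY": ["01/01/2015", "01/01/2015"],
    "no2_ugm3": [18.0, 42.5],
})

# Transform: normalize each source onto one shared schema.
def normalize(df, colmap, datefmt=None):
    out = df.rename(columns=colmap)
    out["date"] = pd.to_datetime(out["date"], format=datefmt)
    return out

# Load: integrate into a single local store keyed on shared dimensions.
store = normalize(medical, {"prov": "province"}).merge(
    normalize(air, {"PROVINCE": "province", "DAY": "date"}, "%d/%m/%Y"),
    on=["province", "date"],
)

# OLAP-style roll-up along the province dimension of the resulting cube.
cube = store.pivot_table(index="province",
                         values=["admissions", "no2_ugm3"], aggfunc="mean")
```

A production ETL pipeline would add per-source validation and incremental loading; the point here is only that once the sources share a schema, multidimensional aggregation becomes a one-liner.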

  4. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    International Nuclear Information System (INIS)

    Ahmadi, Rouhollah; Khamehchi, Ehsan

    2013-01-01

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.
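The core idea of mapping different data types into one intermediate space can be illustrated with a tiny fuzzy-membership sketch. The facies names, gamma-ray centers, and widths below are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Map a continuous secondary variable (a well-log reading) into the space of
# fuzzy membership degrees over two facies classes; discrete primary data
# already live in that space as one-hot corners.
centers = {"shale": 80.0, "sand": 40.0}   # hypothetical gamma-ray means

def to_intermediate(log_value, width=15.0):
    # Gaussian membership of the reading in each facies class, normalized
    m = np.array([np.exp(-0.5 * ((log_value - c) / width) ** 2)
                  for c in centers.values()])
    return m / m.sum()

primary_sand = np.array([0.0, 1.0])   # hard datum: a cell known to be sand
soft = to_intermediate(45.0)          # secondary datum mapped into the space

# Consistency between a simulated facies and a secondary datum can now be
# scored by a plain distance in the shared space, with no forward filter.
mismatch = float(np.linalg.norm(soft - primary_sand))
```

In the paper the mapping is learned (neural networks or a fuzzy inference system) rather than hand-specified, but the role of the intermediate space is the same.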

  5. Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools.

    Science.gov (United States)

    Villar, Alejandro; Zarrabeitia, María T; Fdez-Arroyabe, Pablo; Santurtún, Ana

    2018-03-07

    Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain, provided by 20 different national and regional entities in a diverse array of file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.

  6. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com [Amirkabir University of Technology, PhD Student at Reservoir Engineering, Department of Petroleum Engineering (Iran, Islamic Republic of); Khamehchi, Ehsan [Amirkabir University of Technology, Faculty of Petroleum Engineering (Iran, Islamic Republic of)

    2013-12-15

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  7. Evaluating the Impact of Business Intelligence Tools on Organizational Performance in Food and Groceries Retail

    Directory of Open Access Journals (Sweden)

    Sailaja Venuturumilli

    2016-01-01

    Full Text Available While retailers are spending a significant portion of their information technology (IT) budgets on BI and related technology in order to handle the ever-increasing volumes of data, the actual benefits derived from these tools need to be explored. The study focuses on organized food and groceries retail, explores the benefits of business intelligence (BI), and hypothesizes a structural causal relationship among its intrinsic attributes and their impact on organizational performance. A focus group of selected senior marketing employees was used to develop and validate the research model. Based on findings from the literature survey and focus group, a survey instrument was developed to empirically validate the research model. Data collected from senior marketing executives and managers from six organized food and groceries retailers was analyzed using exploratory factor analysis, confirmatory factor analysis, and structural equation modeling. Five major categories of BI benefits were identified: (1) access to data quality, (2) improved managerial effectiveness, (3) improved operational effectiveness, (4) improved customer orientation, and (5) improved organizational efficiency. From the structural causal relationship analysis, a significant relationship was found between the intrinsic attributes and benefits of BI and data quality. The structural equation model also suggests a significant relationship between BI and data quality on organizational performance.

  8. Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools

    Science.gov (United States)

    Villar, Alejandro; Zarrabeitia, María T.; Fdez-Arroyabe, Pablo; Santurtún, Ana

    2018-03-01

    Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain provided by 20 different national and regional entities in a diverse array for file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.

  9. CIMS: A Context-Based Intelligent Multimedia System for Ubiquitous Cloud Computing

    Directory of Open Access Journals (Sweden)

    Abhilash Sreeramaneni

    2015-06-01

    Full Text Available Mobile users spend a tremendous amount of time surfing multimedia contents over the Internet to pursue their interests. A resource-constrained smart device demands more intensive computing tasks and lessens the battery life. To address the resource limitations (i.e., memory, lower maintenance cost, easier access, and computing tasks) in mobile devices, mobile cloud computing is needed. Several approaches have been proposed to confront the challenges of mobile cloud computing, but difficulties still remain. However, in the coming years, collecting context, processing it, and interchanging the results over a heavily loaded network will entail vast computation and reduce battery life in mobiles. In this paper, we propose a “context-based intelligent multimedia system” (CIMS) for ubiquitous cloud computing. The main goal of this research is to lessen the computing load, storage complexity, and battery drain for mobile users by using pervasive cloud computing. Moreover, to reduce the computing and storage concerns on mobiles, the cloud server collects several groups of user profiles with similarities by executing K-means clustering on users' data (context and multimedia contents). The distribution process conveys real-time notifications to smartphone users according to what is stated in their profiles. We considered a mobile cloud offloading system, which decides the offloading actions to/from cloud servers. Context-aware decision-making (CAD) customizes the mobile device performance with different specifications such as short response time and lower energy consumption. The analysis shows that our CIMS takes advantage of cost-effective features to produce high-quality information for mobile (or smart) device users in real time. Moreover, our CIMS lessens the computation and storage complexities for mobile users as well as cloud servers. Simulation analysis suggests that our approach is more efficient than existing domains.
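The profile-grouping step on the cloud side can be sketched with a minimal k-means implementation. The context features and numbers below are invented for illustration and are not the authors' feature set:

```python
import numpy as np

# Hypothetical user context vectors: [news interest, sports interest,
# video hours/day], drawn as two clearly separated user groups.
rng = np.random.default_rng(0)
users = np.vstack([
    rng.normal([0.9, 0.1, 3.0], 0.05, size=(10, 3)),   # news-heavy users
    rng.normal([0.1, 0.9, 1.0], 0.05, size=(10, 3)),   # sports-heavy users
])

def kmeans(X, k, iters=20, seed=0):
    r = np.random.default_rng(seed)
    centers = [X[r.integers(len(X))]]
    while len(centers) < k:      # farthest-point init keeps clusters non-empty
        d = np.min([((X - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):       # Lloyd iterations: assign, then re-center
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(users, k=2)   # one notification group per cluster
```

With the groups in hand, the server distributes one notification per cluster instead of one per user, which is where the computing and network savings come from.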

  10. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  11. Assessing Speech Intelligibility in Children with Hearing Loss: Toward Revitalizing a Valuable Clinical Tool

    Science.gov (United States)

    Ertmer, David J.

    2011-01-01

    Background: Newborn hearing screening, early intervention programs, and advancements in cochlear implant and hearing aid technology have greatly increased opportunities for children with hearing loss to become intelligible talkers. Optimizing speech intelligibility requires that progress be monitored closely. Although direct assessment of…

  12. A Comparative Analysis of the Use of Competitive Intelligence Tools in a Multinational Corporation

    Science.gov (United States)

    Breese-Vitelli, Jennifer

    2011-01-01

    With the growth of the global economy, organizations large and small are increasingly recognizing that competitive intelligence (CI) is essential to compete in industry. Competitive intelligence is used to gain an advantage in commerce and is useful for analyzing a company's strategic industry position. To remain current and profitable,…

  13. Laser Fluence Recognition Using Computationally Intelligent Pulsed Photoacoustics Within the Trace Gases Analysis

    Science.gov (United States)

    Lukić, M.; Ćojbašić, Ž.; Rabasović, M. D.; Markushev, D. D.; Todorović, D. M.

    2017-11-01

    In this paper, the possibilities of computational intelligence applications for trace gas monitoring are discussed. For this, pulsed infrared photoacoustics is used to investigate SF6-Ar mixtures in a multiphoton regime, assisted by artificial neural networks. Feedforward multilayer perceptron networks are applied in order to recognize both the spatial characteristics of the laser beam and the values of the laser fluence Φ from a given photoacoustic signal. Neural networks are trained in an offline batch training regime to simultaneously estimate four parameters from theoretical or experimental photoacoustic signals: the laser beam spatial profile R(r), the vibrational-to-translational relaxation time τ_{V-T}, the distance r* from the laser beam to the absorbing molecules in the photoacoustic cell, and the laser fluence Φ. The results presented in this paper show that neural networks can estimate an unknown laser beam spatial profile and the parameters of photoacoustic signals in real time and with high precision. Real-time operation, high accuracy, and applicability to higher radiation intensities over a wide range of laser fluences are factors that make the computational intelligence approach efficient and powerful for the in situ measurement of atmospheric pollutants.
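The inverse-estimation idea can be illustrated with a schematic stand-in, not the authors' trained networks: a one-hidden-layer perceptron learns to read a decay-time parameter off a sampled pulse, analogous to estimating τ_{V-T} or fluence from a photoacoustic signal. The forward model, network sizes, and training settings are all invented:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 32)

def signal(tau):                              # toy decaying-pulse model
    return np.exp(-t / tau)

taus = rng.uniform(0.1, 0.9, 512)
X = np.array([signal(tau) for tau in taus])   # inputs: sampled signals
y = taus.reshape(-1, 1)                       # targets: the parameter

W1 = rng.normal(0, 0.5, (32, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

def mse():
    return float(((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())

mse_before = mse()
lr = 0.05
for _ in range(2000):                         # plain batch gradient descent
    h = np.tanh(X @ W1 + b1)
    err = (h @ W2 + b2) - y
    dh = (err @ W2.T) * (1.0 - h ** 2)        # backprop through tanh
    W2 -= lr * (h.T @ err) / len(X); b2 -= lr * err.mean(0)
    W1 -= lr * (X.T @ dh) / len(X); b1 -= lr * dh.mean(0)
mse_after = mse()

# Estimate the parameter for an unseen signal with tau = 0.5
est = (np.tanh(signal(0.5) @ W1 + b1) @ W2 + b2).item()
```

Once trained, inference is a handful of matrix products, which is what makes the real-time operation described in the abstract plausible.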

  14. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Directory of Open Access Journals (Sweden)

Nathan Gould

    2014-10-01

    Full Text Available Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.
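The first-generation, single-objective approach the review describes can be sketched directly, with one extra rationally defined constraint layered on top: back-translate a protein using each amino acid's most frequent codon, skipping a codon choice that would create a forbidden restriction site. The usage table is a tiny invented subset, not a real organism's codon table:

```python
usage = {
    "M": {"ATG": 1.00},
    "K": {"AAA": 0.74, "AAG": 0.26},
    "F": {"TTT": 0.57, "TTC": 0.43},
    "*": {"TAA": 0.61, "TGA": 0.30, "TAG": 0.09},
}

def optimize(protein, forbidden=("GAATTC",)):   # e.g. avoid an EcoRI site
    seq = ""
    for aa in protein:
        # Greedy heuristic: try codons from most to least frequent; a real
        # tool would search or backtrack when every choice creates a site.
        for codon in sorted(usage[aa], key=usage[aa].get, reverse=True):
            if not any(site in seq + codon for site in forbidden):
                seq += codon
                break
    return seq

dna = optimize("MKF*")   # "ATGAAATTTTAA"
```

The greedy pass shows why multi-objective design is hard: each added objective can invalidate the locally best codon, and the tools surveyed differ mainly in how they search past such conflicts.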

  15. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Energy Technology Data Exchange (ETDEWEB)

    Gould, Nathan [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States); Hendy, Oliver [Department of Biology, The College of New Jersey, Ewing, NJ (United States); Papamichail, Dimitris, E-mail: papamicd@tcnj.edu [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States)

    2014-10-06

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  16. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    International Nuclear Information System (INIS)

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  17. Natural language processing tools for computer assisted language learning

    Directory of Open Access Journals (Sweden)

    Vandeventer Faltin, Anne

    2003-01-01

    Full Text Available This paper illustrates the usefulness of natural language processing (NLP) tools for computer assisted language learning (CALL) through the presentation of three NLP tools integrated within a CALL software for French. These tools are (i) a sentence structure viewer; (ii) an error diagnosis system; and (iii) a conjugation tool. The sentence structure viewer helps language learners grasp the structure of a sentence, by providing lexical and grammatical information. This information is derived from a deep syntactic analysis. Two different outputs are presented. The error diagnosis system is composed of a spell checker, a grammar checker, and a coherence checker. The spell checker makes use of alpha-codes, phonological reinterpretation, and some ad hoc rules to provide correction proposals. The grammar checker employs constraint relaxation and phonological reinterpretation as diagnosis techniques. The coherence checker compares the underlying "semantic" structures of a stored answer and of the learners' input to detect semantic discrepancies. The conjugation tool is a resource with enhanced capabilities when put on an electronic format, enabling searches from inflected and ambiguous verb forms.
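Just the correction-proposal step of such a spell checker can be sketched by ranking lexicon entries by Levenshtein distance to the learner's input. The article's system also uses alpha-codes and phonological reinterpretation, which are not reproduced here; the mini French lexicon is invented:

```python
def edit_distance(a, b):
    # Classic dynamic-programming Levenshtein distance, one row at a time.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

lexicon = ["manger", "mangé", "mangeons", "maison", "mais"]

def proposals(word, k=3):
    # Rank the whole lexicon by distance and keep the k closest candidates.
    return sorted(lexicon, key=lambda w: edit_distance(word, w))[:k]

best = proposals("mangre")   # "manger" should rank first
```

Phonological reinterpretation would add candidates that sound like the input (e.g. homophone forms) before ranking, which is why the real system catches errors plain edit distance misses.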

  18. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; rather, it should be part of an iterative design cycle, with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research, an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  19. RATIO_TOOL - SOFTWARE FOR COMPUTING IMAGE RATIOS

    Science.gov (United States)

    Yates, G. L.

    1994-01-01

    Geological studies analyze spectral data in order to gain information on surface materials. RATIO_TOOL is an interactive program for viewing and analyzing large multispectral image data sets that have been created by an imaging spectrometer. While the standard approach to classification of multispectral data is to match the spectrum for each input pixel against a library of known mineral spectra, RATIO_TOOL uses ratios of spectral bands in order to spot significant areas of interest within a multispectral image. Each image band can be viewed iteratively, or a selected image band of the data set can be requested and displayed. When the image ratios are computed, the result is displayed as a gray scale image. At this point a histogram option helps in viewing the distribution of values. A thresholding option can then be used to segment the ratio image result into two to four classes. The segmented image is then color coded to indicate threshold classes and displayed alongside the gray scale image. RATIO_TOOL is written in C language for Sun series computers running SunOS 4.0 and later. It requires the XView toolkit and the OpenWindows window manager (version 2.0 or 3.0). The XView toolkit is distributed with OpenWindows. A color monitor is also required. The standard distribution medium for RATIO_TOOL is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation is included on the program media. RATIO_TOOL was developed in 1992 and is a copyrighted work with all copyright vested in NASA. Sun, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
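RATIO_TOOL's core computation can be reproduced schematically on a synthetic two-band "image" (all values invented): form a band ratio, then segment the gray-scale result into up to four threshold classes for color coding:

```python
import numpy as np

# Two synthetic spectral bands of a tiny 4x4 scene.
rng = np.random.default_rng(2)
band_a = rng.uniform(0.2, 1.0, (4, 4))
band_b = rng.uniform(0.2, 1.0, (4, 4))

ratio = band_a / band_b                    # the ratio image (gray scale)

# Segment into up to four classes, as the tool's thresholding option does;
# the cut points here are arbitrary, not values used by RATIO_TOOL.
thresholds = [0.75, 1.0, 1.5]
classes = np.digitize(ratio, thresholds)   # per-pixel class codes 0..3
```

The class codes would then drive a color lookup table displayed alongside the gray-scale ratio image.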

  20. Computers and the internet: tools for youth empowerment.

    Science.gov (United States)

    Valaitis, Ruta K

    2005-10-04

    Youth are often disenfranchised in their communities and may feel they have little voice. Since computers are an important aspect of youth culture, they may offer solutions to increasing youth participation in communities. This qualitative case study investigated the perceptions of 19 (predominantly female) inner-city school youth about their use of computers and the Internet in a school-based community development project. Youth working with public health nurses in a school-based community development project communicated with local community members using computer-mediated communication, surveyed peers online, built websites, searched for information online, and prepared project materials using computers and the Internet. Participant observation, semistructured interviews, analysis of online messages, and online- and paper-based surveys were used to gather data about youth's and adults' perceptions and use of the technologies. The constant comparison method and between-method triangulation were used in the analysis to verify the existence of themes. Not all youth were interested in working with computers. Some electronic messages from adults were perceived to be critical, and writing to adults was intimidating for some youth. In addition, technical problems were experienced. Despite these barriers, most youth perceived that using computers and the Internet reduced their anxiety concerning communication with adults, increased their control when dealing with adults, raised their perception of their social status, increased participation within the community, supported reflective thought, increased efficiency, and improved their access to resources. Overall, youth perceived computers and the Internet to be empowering tools, and they should be encouraged to use such technology to support them in community initiatives.

  1. The International Conference on Intelligent Biology and Medicine (ICIBM) 2016: from big data to big analytical tools.

    Science.gov (United States)

    Liu, Zhandong; Zheng, W Jim; Allen, Genevera I; Liu, Yin; Ruan, Jianhua; Zhao, Zhongming

    2017-10-03

    The 2016 International Conference on Intelligent Biology and Medicine (ICIBM 2016) was held on December 8-10, 2016 in Houston, Texas, USA. ICIBM included eight scientific sessions, four tutorials, one poster session, four highlighted talks and four keynotes that covered topics on 3D genomics structural analysis, next generation sequencing (NGS) analysis, computational drug discovery, medical informatics, cancer genomics, and systems biology. Here, we present a summary of the nine research articles selected from the ICIBM 2016 program for publication in BMC Bioinformatics.

  2. Research on application of intelligent computation based LUCC model in urbanization process

    Science.gov (United States)

    Chen, Zemin

    2007-06-01

    Global change research is an interdisciplinary and comprehensive research activity carried out through international cooperation; it arose in the 1980s and is among the broadest programs in scope. The interaction between land use and cover change (LUCC), as a research field at the crossing of natural science and social science, has become one of the core subjects of global change study as well as its front edge and hot point. It is necessary to develop research on land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate, and analyze the dynamic behaviors of urban development change and to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategies. The effect of urbanization on land use and cover change is mainly embodied in changes to the quantity structure and space structure of urban space, and the LUCC model of the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the writer systematically analyzes the research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automaton model of complexity science and multi-agent theory; expands the Markov model, the traditional CA model, and the Agent model; introduces complexity science theory and intelligent computation theory into the LUCC research model to build an intelligent computation-based LUCC model for analog research on land use and cover change in urbanization research; and performs case research. The concrete contents are as follows: 1. Complexity of LUCC research in urbanization process.
    Analyze urbanization process in combination with the contents
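The cellular-automaton component of such a LUCC model can be sketched with a toy urban-growth rule: a non-urban cell converts when enough of its Moore neighbors are already urban. The rule, grid, and threshold below are invented for illustration, not calibrated to any city:

```python
import numpy as np

def step(grid, threshold=3):
    # Count urban cells in the 8-cell Moore neighborhood via shifted views
    # of a zero-padded copy of the grid.
    p = np.pad(grid, 1)
    neighbors = (p[:-2, :-2] + p[:-2, 1:-1] + p[:-2, 2:] +
                 p[1:-1, :-2] +               p[1:-1, 2:] +
                 p[2:, :-2] + p[2:, 1:-1] + p[2:, 2:])
    # Transition rule: a non-urban cell (0) urbanizes (1) at the threshold.
    return np.where((grid == 0) & (neighbors >= threshold), 1, grid)

grid = np.zeros((7, 7), dtype=int)
grid[3, 2:5] = 1                     # seed: an initial urban strip
for _ in range(3):
    grid = step(grid)
urban_area = int(grid.sum())         # urban land spreads outward each step
```

A full model in the spirit of the paper would couple such CA transition rules with agents (developers, planners) and Markov-style transition probabilities between land-use classes.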

  3. IQARIS : a tool for the intelligent querying, analysis, and retrieval from information systems

    International Nuclear Information System (INIS)

    Hummel, J. R.; Silver, R. B.

    2002-01-01

    Information glut is one of the primary characteristics of the electronic age. Managing such large volumes of information (e.g., keeping track of the types, where they are, their relationships, who controls them, etc.) can be done efficiently with an intelligent, user-oriented information management system. The purpose of this paper is to describe a concept for managing information resources based on an intelligent information technology system developed by the Argonne National Laboratory for managing digital libraries. The Argonne system, Intelligent Query (IQ), enables users to query digital libraries and view the holdings that match the query from different perspectives.
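The style of querying described can be sketched as an inverted index over holding metadata, queried by keyword, with the hits then viewed from another perspective (grouped by who controls them). The records below are invented examples, not IQ's actual data model:

```python
from collections import defaultdict

holdings = [            # (id, type, controller, keywords)
    (1, "report",  "library-A", {"reactor", "safety"}),
    (2, "dataset", "library-A", {"climate", "model"}),
    (3, "report",  "library-B", {"reactor", "fuel"}),
]

# Build the inverted index: keyword -> set of holding ids.
index = defaultdict(set)
for rec_id, _, _, keywords in holdings:
    for kw in keywords:
        index[kw].add(rec_id)

def query(*terms):
    # AND query: holdings matching every requested keyword.
    ids = set.intersection(*(index[t] for t in terms))
    return [h for h in holdings if h[0] in ids]

# A second perspective on the same result set: group hits by controller.
by_controller = defaultdict(list)
for h in query("reactor"):
    by_controller[h[2]].append(h[0])
```

The "intelligent" layer in a system like IQ sits above this: interpreting user intent and choosing which perspective (type, location, controller, relationship) to present.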

  4. Hardware replacements and software tools for digital control computers

    International Nuclear Information System (INIS)

    Walker, R.A.P.; Wang, B-C.; Fung, J.

    1996-01-01

    computers which use 'Varian' technology. A new software program, Desk Top Tools, permits the designer greater flexibility in digital control computer software design and testing. This software development allows the user to emulate control of the CANDU reactor system by system. All discussions will highlight the ability of the replacements and the new developments to enhance the operation of the existing and 'repeat' plant digital control computers and will explore future applications of these developments. Examples of current use of all replacement components and software are provided. (author)

  5. Medical applications of artificial intelligence

    CERN Document Server

    Agah, Arvin

    2013-01-01

    Enhanced, more reliable, and better understood than in the past, artificial intelligence (AI) systems can make providing healthcare more accurate, affordable, accessible, consistent, and efficient. However, AI technologies have not been as well integrated into medicine as predicted. In order to succeed, medical and computational scientists must develop hybrid systems that can effectively and efficiently integrate the experience of medical care professionals with capabilities of AI systems. After providing a general overview of artificial intelligence concepts, tools, and techniques, Medical Ap

  6. An Interactive Computer Tool for Teaching About Desalination and Managing Water Demand in the US

    Science.gov (United States)

    Ziolkowska, J. R.; Reyes, R.

    2016-12-01

    This paper presents an interactive tool for geospatially and temporally analyzing desalination developments and trends in the US over the period 1950-2013, desalination's current contribution to satisfying water demand, and its future potential. The computer tool is open access and can be used by anyone with an Internet connection, thus facilitating interactive learning about water resources. The tool can also be used by stakeholders and policy makers for decision-making support and in designing sustainable water management strategies. Desalination technology has been acknowledged as a solution for sustainable water demand management across many sectors, including municipalities, industry, agriculture, power generation, and other users. Desalination has been applied successfully in the US and many countries around the world since the 1950s. As of 2013, around 1,336 desalination plants were operating in the US alone, with a daily production capacity of 2 BGD (billion gallons per day) (GWI, 2013). Despite a steady increase in the number of new desalination plants and growing production capacity, the costs of desalination remain prohibitive in many regions. At the same time, the technology offers a tremendous potential for `enormous supply expansion that exceeds all likely demands' (Chowdhury et al., 2013). The model and tool are based on data from Global Water Intelligence (GWI, 2013). The analysis shows that more than 90% of all the plants in the US are small-scale plants with a capacity below 4.31 MGD. Most of the plants (and especially the larger plants) are located on the US East Coast, as well as in California, Texas, Oklahoma, and Florida. The models and the tool provide information about the economic feasibility of potential new desalination plants based on access to feed water, energy sources, water demand, and the experiences of other plants in the region.

  7. A general-purpose development environment for intelligent computer-aided training systems

    Science.gov (United States)

    Savely, Robert T.

    1990-01-01

    Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.

  8. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects, are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks....

  9. 3D data processing with advanced computer graphics tools

    Science.gov (United States)

    Zhang, Song; Ekstrand, Laura; Grieve, Taylor; Eisenmann, David J.; Chumbley, L. Scott

    2012-09-01

    Often, the 3-D raw data from an optical profilometer contain spiky noise and an irregular grid, which make the data difficult to analyze and, because of their enormously large size, difficult to store. This paper addresses these two issues for an optical profilometer by substantially reducing the spiky noise in the 3-D raw data and by rapidly re-sampling the raw data into regular grids at any pixel size and any orientation using advanced computer graphics tools. Experimental results are presented to demonstrate the effectiveness of the proposed approach.
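The abstract does not detail its noise-reduction algorithm; a common, simple heuristic for removing spiky outliers is a median filter, sketched below on a 1-D profile (the function name, sample data and threshold are illustrative assumptions, not the paper's method):

```python
from statistics import median

def despike(samples, window=5, threshold=5.0):
    """Replace values that deviate strongly from the local median
    (a simple spike-removal heuristic, not the paper's algorithm)."""
    half = window // 2
    out = list(samples)
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        m = median(samples[lo:hi])
        if abs(samples[i] - m) > threshold:
            out[i] = m  # spike: fall back to the local median
    return out

profile = [1.0, 1.1, 0.9, 42.0, 1.0, 1.2, -37.0, 1.1]
print(despike(profile))  # the 42.0 and -37.0 spikes are suppressed
```

A real profilometer pipeline would apply the same idea in 2-D and tune the window and threshold to the sensor's noise statistics.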

  10. Computational Intelligence-Assisted Understanding of Nature-Inspired Superhydrophobic Behavior.

    Science.gov (United States)

    Zhang, Xia; Ding, Bei; Cheng, Ran; Dixon, Sebastian C; Lu, Yao

    2018-01-01

    In recent years, state-of-the-art computational modeling of physical and chemical systems has shown itself to be an invaluable resource in the prediction of the properties and behavior of functional materials. However, construction of a useful computational model for novel systems in both academic and industrial contexts often requires a great depth of physicochemical theory and/or a wealth of empirical data, and a shortage in the availability of either frustrates the modeling process. In this work, computational intelligence is instead used, including artificial neural networks and evolutionary computation, to enhance our understanding of nature-inspired superhydrophobic behavior. The relationships between experimental parameters (water droplet volume, weight percentage of nanoparticles used in the synthesis of the polymer composite, and distance separating the superhydrophobic surface and the pendant water droplet in adhesive force measurements) and multiple objectives (water droplet contact angle, sliding angle, and adhesive force) are built and weighted. The obtained optimal parameters are consistent with the experimental observations. This new approach to materials modeling has great potential to be applied more generally to aid design, fabrication, and optimization for myriad functional materials.

  11. Early detection and identification of anomalies in chemical regime based on computational intelligence techniques

    International Nuclear Information System (INIS)

    Figedy, Stefan; Smiesko, Ivan

    2012-01-01

    This article provides brief information about the fundamental features of a newly-developed diagnostic system for the early detection and identification of anomalies arising in the water chemistry regime of the primary and secondary circuits of the VVER-440 reactor. This system, called SACHER (System of Analysis of CHEmical Regime), was installed within the major modernization project at the NPP-V2 Bohunice in the Slovak Republic. The SACHER system has been fully developed in the MATLAB environment. It is based on computational intelligence techniques and inserts various intelligent data processing modules for clustering, diagnosis, prediction, signal validation, etc., into the overall chemical information system. SACHER essentially assists chemists in identifying the current situation regarding anomalies arising in the primary and secondary circuit water chemistry. The system is to be used for diagnostics and data handling; however, it is not intended to fully replace the presence of experienced chemists in deciding upon corrective actions. (author)

  12. A novel framework for diagnosing automatic tool changer and tool life based on cloud computing

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2016-03-01

    Full Text Available Tool change is one of the most frequently performed machining processes, and improper percussion while the tool's position is changed can damage the spindle bearing. A spindle malfunction can cause problems such as a dropped knife or bias in a machined hole. The measures currently taken to avoid such issues on available machine tools only involve determining whether the clamping knife's state is correct using a spindle and the air-adhesion method, which is also used to satisfy the high precision required of mechanical components. This approach therefore cannot be used with every type of machine tool, and improper tapping of the spindle during an automatic tool change cannot be detected. This study therefore proposes a new diagnostic framework that combines cloud computing and vibration sensors, in which tool changes are automatically diagnosed by an architecture that identifies abnormalities, thereby enhancing the reliability and productivity of the machine and equipment.

  13. A review of Computational Intelligence techniques in coral reef-related applications

    NARCIS (Netherlands)

    Salcedo-Sanz, S.; Cuadra, L.; Vermeij, M.J.A.

    Studies on coral reefs increasingly combine aspects of science and technology to understand the complex dynamics and processes that shape these benthic ecosystems. Recently, the use of advanced computational algorithms has entered coral reef science as new powerful tools that help solve complex

  14. Evaluation of trade influence on economic growth rate by computational intelligence approach

    Science.gov (United States)

    Sokolov-Mladenović, Svetlana; Milovančević, Milos; Mladenović, Igor

    2017-01-01

    In this study, the influence of trade parameters on economic growth forecasting accuracy was analyzed. A computational intelligence method was used for the analysis, since such methods can handle highly nonlinear data. It is known that economic growth can be modeled on the basis of different trade parameters. In this study five input parameters were considered: trade in services, exports of goods and services, imports of goods and services, trade, and merchandise trade. All these parameters were calculated as percentages of gross domestic product (GDP). The main goal was to select the parameters with the greatest impact on the economic growth percentage. GDP was used as the economic growth indicator. Results show that the imports of goods and services have the highest influence on economic growth forecasting accuracy.

  15. Intelligent Continuous Double Auction method For Service Allocation in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Nima Farajian

    2013-10-01

    Full Text Available Market-oriented approaches are effective for resource management because they regulate supply and demand, and they suit cloud environments in which computing resources, whether software or hardware, are virtualized and allocated as services from providers to users. In this paper a continuous double auction method for efficient cloud service allocation is presented in which (i) consumers can order various resources (services) for workflows and co-allocation; (ii) consumers and providers make bid and ask prices based on deadline and workload time, and providers can additionally trade off between utilization time and the price of bids; and (iii) auctioneers can intelligently find optimal matchings by sharing and merging resources, which results in more trades. Experimental results show that the proposed method is efficient in terms of successful allocation rate and resource utilization.
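The core of any double auction is matching the highest bids against the lowest asks. The sketch below shows a minimal greedy clearing rule (midpoint pricing over a single resource type); it is a textbook illustration under assumed prices, not the paper's intelligent matching algorithm with resource sharing and merging:

```python
def match_double_auction(bids, asks):
    """Greedy clearing of a double auction: the highest bids are
    matched with the lowest asks while the bid covers the ask;
    each trade clears at the midpoint price."""
    bids = sorted(bids, reverse=True)  # buyers, best (highest) first
    asks = sorted(asks)                # sellers, best (lowest) first
    trades = []
    while bids and asks and bids[0] >= asks[0]:
        bid, ask = bids.pop(0), asks.pop(0)
        trades.append((bid + ask) / 2)  # midpoint clearing price
    return trades

# Hypothetical prices for one VM-hour: three buyers, three providers.
print(match_double_auction([1.2, 0.9, 0.7], [0.8, 1.0, 1.5]))
```

Here only the 1.2 bid covers an ask (0.8), so a single trade clears at their midpoint; the remaining orders stay unmatched, which is exactly the inefficiency the paper's merging of resources aims to reduce.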

  16. Solving Multi-Pollutant Emission Dispatch Problem Using Computational Intelligence Technique

    Directory of Open Access Journals (Sweden)

    Nur Azzammudin Rahmat

    2016-06-01

    Full Text Available Economic dispatch is a crucial process conducted by utilities to correctly determine the amount of power to be generated and distributed to consumers. During the process, the utilities also consider pollutant emissions as a consequence of fossil-fuel consumption. Fossil fuels include petroleum, coal, and natural gas; each has its unique chemical composition of pollutants, i.e. sulphur oxides (SOx), nitrogen oxides (NOx) and carbon oxides (COx). This paper presents a multi-pollutant emission dispatch problem solved using a computational intelligence technique. In this study, a novel emission dispatch technique is formulated to determine the pollutant levels. It utilizes a pre-developed optimization technique termed differential evolution immunized ant colony optimization (DEIANT) for the emission dispatch problem. The optimization results indicated a high COx level, regardless of the type of fossil fuel consumed.

  17. ECG Signal Processing, Classification and Interpretation A Comprehensive Framework of Computational Intelligence

    CERN Document Server

    Pedrycz, Witold

    2012-01-01

    Electrocardiogram (ECG) signals are among the most important sources of diagnostic information in healthcare so improvements in their analysis may also have telling consequences. Both the underlying signal technology and a burgeoning variety of algorithms and systems developments have proved successful targets for recent rapid advances in research. ECG Signal Processing, Classification and Interpretation shows how the various paradigms of Computational Intelligence, employed either singly or in combination, can produce an effective structure for obtaining often vital information from ECG signals. Neural networks do well at capturing the nonlinear nature of the signals, information granules realized as fuzzy sets help to confer interpretability on the data and evolutionary optimization may be critical in supporting the structural development of ECG classifiers and models of ECG signals. The contributors address concepts, methodology, algorithms, and case studies and applications exploiting the paradigm of Comp...

  18. Recent developments in spatial analysis spatial statistics, behavioural modelling, and computational intelligence

    CERN Document Server

    Getis, Arthur

    1997-01-01

    In recent years, spatial analysis has become an increasingly active field, as evidenced by the establishment of educational and research programs at many universities. Its popularity is due mainly to new technologies and the development of spatial data infrastructures. This book illustrates some recent developments in spatial analysis, behavioural modelling, and computational intelligence. World-renowned spatial analysts explain and demonstrate their new and insightful models and methods. The applications are in areas of societal interest such as the spread of infectious diseases, migration behaviour, and retail and agricultural location strategies. In addition, there is emphasis on the use of new technologies for the analysis of spatial data through the application of neural network concepts.

  19. BUSINESS INTELLIGENCE

    OpenAIRE

    Bogdan Mohor Dumitrita

    2011-01-01

    The purpose of this work is to present business intelligence systems. These systems can be extremely complex and important in modern market competition. Their effectiveness is also reflected in their price, so we have to explore their financial potential before investing. The systems have a 20-year history, and during that time many such tools have been developed, but they are rarely still in use. A business intelligence system consists of three main areas: the Data Warehouse, ETL tools and tools f...

  20. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    Science.gov (United States)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
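The three components above can be made concrete with a toy two-variable graph. The example below (variable names and probability tables are invented for illustration) shows factors multiplying into a joint distribution and an observation being assimilated via Bayes' rule, i.e. the posterior of component 2:

```python
# A tiny two-node probabilistic graph: Rain -> WetGround.
# Each factor is a local probability table; their product is the joint.
p_rain = {0: 0.8, 1: 0.2}                       # factor 1: P(Rain)
p_wet_given_rain = {(0, 0): 0.9, (0, 1): 0.1,   # factor 2: P(Wet | Rain)
                    (1, 0): 0.2, (1, 1): 0.8}

def joint(rain, wet):
    """Product of the factors = joint distribution P(Rain, Wet)."""
    return p_rain[rain] * p_wet_given_rain[(rain, wet)]

# Assimilate the observation Wet = 1:
# P(Rain | Wet=1) = joint(rain, 1) / sum_r joint(r, 1)
evidence = sum(joint(r, 1) for r in (0, 1))
posterior = {r: joint(r, 1) / evidence for r in (0, 1)}
print(round(posterior[1], 3))  # → 0.667, probability of rain given wet ground
```

In a hydrological model the variables would be parameters and states and the factors process equations plus error models; the graph structure is what lets general-purpose algorithms (component 3) compute such posteriors efficiently instead of by brute-force enumeration.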

  1. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    Science.gov (United States)

    Schovancová, J.; Campana, S.; Di Girolamo, A.; Jézéquel, S.; Ueda, I.; Wenaus, T.; Atlas Collaboration

    2014-06-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and the ATLAS Management, for long-term trends and accounting information about ATLAS Distributed Computing resources. During LHC Run I a significant development effort was invested in the standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements and the re-usability of the visualization components across the different tools. A rich family of filtering and searching options enhancing the available user interfaces comes naturally with the separation of the data and visualization layers. With a variety of reliable monitoring data accessible through standardized interfaces, automating actions under well-defined conditions that correlate multiple data sources has become feasible. In this contribution we also discuss the automated exclusion of degraded resources and their automated recovery in various activities.

  2. HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies

    Science.gov (United States)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.

    2017-10-01

    PanDA, the Production and Distributed Analysis Workload Management System, has been developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of the projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed and made available to bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month even when run on a powerful computer resource. In this paper we describe the adaptation of the PALEOMIX pipeline to run on a distributed computing environment powered by PanDA. To run the pipeline we split input files into chunks which are processed separately on different nodes as separate PALEOMIX inputs, and finally merge the output files; this is very similar to what ATLAS does to process and simulate data. We dramatically decreased the total walltime thanks to automated job (re)submission and brokering within PanDA. Using software tools developed initially for HEP and the Grid can reduce payload execution time for mammoth DNA samples from weeks to days.
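The split/process/merge pattern described above can be sketched in a few lines. The example below is a stand-in: the "pipeline stage" is a trivial motif count and the parallelism is local threads, whereas in the paper PALEOMIX stages run as PanDA jobs brokered across worker nodes:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for one pipeline stage: count reads that
# contain the motif "GC" in a chunk (real PALEOMIX stages are far heavier).
def process_chunk(reads):
    return sum(1 for r in reads if "GC" in r)

def run_distributed(reads, n_chunks=4):
    """Split the input into chunks, process the chunks in parallel
    (threads here; PanDA brokers chunks across nodes), then merge
    the partial results -- the same split/merge pattern the paper
    applies to genome-sequencing inputs."""
    size = max(1, len(reads) // n_chunks)
    chunks = [reads[i:i + size] for i in range(0, len(reads), size)]
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(process_chunk, chunks))
    return sum(partials)  # merge step

reads = ["ACGT", "GGCC", "TTAA", "GCGC", "ATAT", "CGCG"]
print(run_distributed(reads))  # → 3
```

The pattern works because the chunks are independent, so failed chunks can be resubmitted individually, which is what makes PanDA's automated (re)submission effective here.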

  3. Computational intelligence approach for NOx emissions minimization in a coal-fired utility boiler

    International Nuclear Information System (INIS)

    Zhou Hao; Zheng Ligang; Cen Kefa

    2010-01-01

    The current work presents a computational intelligence approach for minimizing NOx emissions in a 300 MW dual-furnace coal-fired utility boiler. The fundamental idea behind this work comprises NOx emissions characteristics modeling and NOx emissions optimization. First, an objective function estimating NOx emissions characteristics from nineteen operating parameters of the studied boiler was represented by a support vector regression (SVR) model. Second, four levels of primary air velocities (PA) and six levels of secondary air velocities (SA) were regulated using particle swarm optimization (PSO) so as to achieve low-NOx combustion. To reduce the time demand, a more flexible stopping condition was used to improve the computational efficiency without loss of quality in the optimization results. The results showed that the proposed approach provided an effective way to reduce NOx emissions from 399.7 ppm to 269.3 ppm, which was much better than a genetic algorithm (GA) based method and slightly better than an ant colony optimization (ACO) based approach reported in earlier work. The main advantage of PSO was that the computational cost, typically less than 25 s on a PC system, is much lower than that required for ACO. This means the proposed approach is more applicable to online and real-time applications for NOx emissions minimization in actual power plant boilers.
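The model-then-optimize structure above can be sketched compactly: a surrogate emissions model is treated as a black-box objective and PSO searches the operating parameters within bounds. In the sketch below the quadratic `predicted_nox` is an invented stand-in for the trained SVR model, and the two decision variables stand in for the nineteen real operating parameters:

```python
import random

# Toy surrogate for the trained SVR emissions model: predicted NOx (ppm)
# as a function of two air-velocity settings (made-up quadratic bowl).
def predicted_nox(x):
    pa, sa = x
    return 270 + (pa - 22.0) ** 2 + 2.0 * (sa - 31.0) ** 2

def pso(f, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over box bounds."""
    rng = random.Random(0)
    dim = len(bounds)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]          # personal best positions
    gbest = min(pbest, key=f)[:]        # global best position
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - x[d])
                            + c2 * r2 * (gbest[d] - x[d]))
                # move and clamp to the operating bounds
                x[d] = min(max(x[d] + vs[i][d], bounds[d][0]), bounds[d][1])
            if f(x) < f(pbest[i]):
                pbest[i] = x[:]
                if f(x) < f(gbest):
                    gbest = x[:]
    return gbest

best = pso(predicted_nox, bounds=[(15.0, 30.0), (20.0, 40.0)])
print(best, predicted_nox(best))  # converges near (22, 31), ~270 ppm
```

Because each PSO iteration needs only cheap surrogate-model evaluations rather than boiler experiments, the search stays fast enough for the near-real-time use the paper targets.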

  4. Computational intelligence-based polymerase chain reaction primer selection based on a novel teaching-learning-based optimisation.

    Science.gov (United States)

    Cheng, Yu-Huei

    2014-12-01

    Specific primers play an important role in polymerase chain reaction (PCR) experiments, and therefore it is essential to find specific primers of outstanding quality. Unfortunately, many PCR constraints must be inspected simultaneously, which makes specific primer selection difficult and time-consuming. This paper introduces a novel computational intelligence-based method, Teaching-Learning-Based Optimisation (TLBO), to select specific and feasible primers. Runs with specified PCR product lengths of 150-300 bp and 500-800 bp and three melting-temperature formulae (Wallace's formula, Bolton and McCarthy's formula and SantaLucia's formula) were performed. The authors calculate the optimal frequency to estimate the quality of primer selection based on a total of 500 runs for 50 random nucleotide sequences of 'Homo species' retrieved from the National Center for Biotechnology Information. The method was then fairly compared with the genetic algorithm (GA) and memetic algorithm (MA) for primer selection in the literature. The results show that the method easily found suitable primers satisfying the set primer constraints and performed better than the GA and the MA. Furthermore, the method was also compared with the commonly used tool Primer3 with respect to method type, primer presentation, parameter settings, speed and memory usage. In conclusion, it is an interesting primer selection method and a valuable tool for automatic high-throughput analysis. In the future, the use of the primers in the wet lab needs to be validated carefully to increase the reliability of the method.
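One of the PCR constraints the optimiser must check is melting temperature. Wallace's formula, the simplest of the three formulae the study compares, is Tm = 2(A+T) + 4(G+C) degrees Celsius for short oligonucleotides; a minimal implementation (the example primer sequence is arbitrary):

```python
def wallace_tm(primer):
    """Wallace rule for short oligos: Tm = 2*(A+T) + 4*(G+C), in Celsius.
    One of several constraints a primer-selection optimiser evaluates."""
    primer = primer.upper()
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc

print(wallace_tm("AGCGTCAGTCTTAGCAT"))  # → 50
```

In a full primer-selection run, scores like this are combined with GC content, product length and dimer checks into the fitness that TLBO (or a GA/MA) optimises.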

  5. The application of a Business Intelligence tool for strategic planning in a higher education institution: a case study of the University of the Witwatersrand

    Directory of Open Access Journals (Sweden)

    Vincent Nyalungu

    2011-07-01

    Full Text Available This article presents a discussion of the importance of business intelligence (BI) and the role that a specific BI tool, Business Intelligence Enterprise Edition, plays in the strategic decision-making processes of an organisation. The University of the Witwatersrand, often referred to as Wits, was used as a case study. The main objective of a business intelligence tool is to improve the quality and timeliness of the data input to the organisational decision-making process. The quality of the data, which is an organisational asset, is therefore of the utmost importance. Approaches for deriving business intelligence from corporate information and knowledge management were also assessed. A questionnaire was administered among key informants within the university in order to address some of the pertinent issues at higher education institutions. In addition, the role of a data warehouse within the business intelligence framework is presented. The paper covers a wide range of disciplines, from information technology and knowledge management to decision sciences. The article also presents a proposed framework to be used in line with best practices in the implementation of business intelligence solutions. Keywords: Business Intelligence (BI), Business Intelligence Enterprise Edition (BIEE), Data Warehouse, Strategic Decision Making, Strategic Planning, Higher Education Institutions and Knowledge Management. Disciplines: Information Technology, Knowledge Management, Management Sciences, Decision Sciences & Management

  6. A comprehensive review of the use of computational intelligence methods in mineral exploration

    Directory of Open Access Journals (Sweden)

    Habibollah Bazdar

    2017-11-01

    Full Text Available Introduction: Mineral exploration is a process in which it is decided, at the end of each stage, whether or not continuing exploration will be cost-effective. This decision depends on many factors, including technical, economic, social and other related factors. All new methods used in mineral exploration are meant to simplify this decision making. In recent years, advanced computational intelligence methods have been used for modeling in many disciplines of science, including mineral exploration. Although the results of applying these methods show good performance, it remains essential to determine the mineral potential in terms of geology, mineralogy, petrology and other factors for a final decision. The purpose of this paper is to provide a comprehensive survey of mineral exploration research and the different applications of computational intelligence techniques in this respect over the last decades. Materials and methods: Artificial neural networks and their application in mineral exploration: An artificial neural network (ANN) is a series of connections between units or nodes that try to function like the neurons of the human brain (Jorjani et al., 2008). The network's processing capability comes from the connections between the units, whose weights originate from learning or are predetermined (Monjezi and Dehghani, 2008). The ANN method has been applied in different branches of mining exploration over the last decades (Brown et al., 2000; Leite and de Souza Filho, 2009; Porwal et al., 2003). Support vector machines (SVM) and their application in mineral exploration: An SVM uses a set of examples with known class information to build a linear hyperplane separating samples of different classes. This initial dataset is known as a training set, and every sample within it is characterized by features upon which the classification is based (Smirnoff et al., 2008). 
The SVM classifier is a

  7. Daddy’s Car: Artificial Intelligence as a Creative Tool for Copyright

    Directory of Open Access Journals (Sweden)

    Jaime Alberto Díaz Limón

    2016-12-01

    Full Text Available On September 19th of this year, Sony CSL, a software development company, announced to the world the creation of the first musical work whose ownership belongs to an Artificial Intelligence. This paper analyzes the legal consequences of such a statement and its conceptual and legal limits within the copyright universe (grounded in international treaties), in order to assess whether we are in the presence of a new legal-authorial figure that invites us to reconsider the subjects of protection in our laws, or whether the applicable norms may resolve these hypotheses in favor of Artificial Intelligence instead of juridical persons.

  8. Acquaintance to Artificial Neural Networks and use of artificial intelligence as a diagnostic tool for tuberculosis: A review.

    Science.gov (United States)

    Dande, Payal; Samant, Purva

    2018-01-01

    Tuberculosis [TB] has afflicted numerous nations of the world. As per a report by the World Health Organization [WHO], an estimated 1.4 million TB deaths occurred in 2015, with an additional 0.4 million deaths resulting from TB disease among people living with HIV. Most TB deaths can be prevented if the disease is detected at an early stage. The existing diagnostic processes, such as blood tests or sputum tests, are not only tedious but also take a long time to analyze, and they cannot differentiate between the different drug-resistant stages of TB. The search for newer, prompter methods of disease detection has been aided by the latest Artificial Intelligence [AI] tools. The Artificial Neural Network [ANN] is one of the important tools being used widely in the diagnosis and evaluation of medical conditions. This review aims at providing a brief introduction to the various AI tools used in TB detection and gives a detailed description of the use of ANNs as an efficient diagnostic technique. The paper also provides a critical assessment of ANNs and the existing techniques for the diagnosis of TB. Researchers and practitioners in the field are looking forward to using ANNs and other upcoming AI tools, such as fuzzy logic, genetic algorithms and artificial intelligence simulation, as promising current and future technology tools for tackling the global menace of tuberculosis. The latest advancements in the diagnostic field include the combined use of ANNs with various other AI tools such as fuzzy logic, which has led to an increase in the efficacy and specificity of diagnostic techniques. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Atomdroid: a computational chemistry tool for mobile platforms.

    Science.gov (United States)

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use on mobile platforms, the first built specifically for these devices. The software is designed to run on the Android operating system and is compatible with several modern tablet PCs and smartphones available on the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages on mobile platforms. Benchmark calculations show that, through efficient implementation techniques, even hand-held devices can be used to simulate mid-sized systems using force fields.
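
    A Metropolis Monte Carlo loop of the kind such a molecular mechanics code runs can be sketched as follows. The Lennard-Jones pair potential, reduced units and parameters here are illustrative assumptions, not Atomdroid's actual force field or code.

```python
import math
import random

random.seed(1)

def lj(r2, eps=1.0, sig=1.0):
    """Lennard-Jones pair energy from a squared distance (reduced units)."""
    s6 = (sig * sig / r2) ** 3
    return 4.0 * eps * (s6 * s6 - s6)

def total_energy(coords):
    e = 0.0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            r2 = sum((a - b) ** 2 for a, b in zip(coords[i], coords[j]))
            e += lj(r2)
    return e

# A few atoms on a loose square grid (reduced units).
coords = [[float(i), float(j), 0.0] for i in range(2) for j in range(2)]
beta, step = 2.0, 0.1     # inverse temperature, max displacement
e_old = total_energy(coords)

for sweep in range(2000):
    i = random.randrange(len(coords))
    old = coords[i][:]
    coords[i] = [c + random.uniform(-step, step) for c in old]
    e_new = total_energy(coords)
    # Metropolis criterion: always accept downhill, sometimes uphill.
    if e_new > e_old and random.random() >= math.exp(-beta * (e_new - e_old)):
        coords[i] = old           # reject the move
    else:
        e_old = e_new             # accept the move

print(f"final energy: {e_old:.3f}")
```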

  10. Constructing Bridges between Computational Tools in Heterogeneous and Homogeneous Catalysis

    KAUST Repository

    Falivene, Laura; Kozlov, Sergey M.; Cavallo, Luigi

    2018-01-01

    Better catalysts are needed to address numerous challenges faced by humanity. In this perspective, we review concepts and tools in theoretical and computational chemistry that can help to accelerate the rational design of homogeneous and heterogeneous catalysts. In particular, we focus on the following three topics: 1) identification of the key intermediates and transition states in a reaction using the energetic span model, 2) disentanglement of the factors influencing the relative stability of the key species using energy decomposition analysis and the activation strain model, and 3) discovery of new catalysts using volcano relationships. To facilitate wider use of these techniques across different areas, we illustrate their potential and pitfalls when applied to the study of homogeneous and heterogeneous catalysts.
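
    The energetic span model mentioned in topic 1 reduces, for a single catalytic cycle, to a maximisation over transition-state/intermediate pairs. A minimal sketch with invented free energies (kcal/mol) follows; the values are illustrative, not data from this perspective.

```python
# Free energies along one catalytic cycle: intermediates and transition
# states alternate (I0, T0, I1, T1, ...). All values are invented.
I = [0.0, -5.0, -2.0]        # intermediate free energies
T = [12.0, 8.0, 10.0]        # transition-state free energies
dG_r = -8.0                  # free energy of the overall reaction

def energetic_span(I, T, dG_r):
    """Kozuch-Shaik energetic span dE of a catalytic cycle.

    dE = T_i - I_j          if the TS comes after the intermediate,
    dE = T_i - I_j + dG_r   if it comes before (previous turnover).
    """
    spans = []
    for i, Ti in enumerate(T):
        for j, Ij in enumerate(I):
            spans.append(Ti - Ij if i >= j else Ti - Ij + dG_r)
    return max(spans)

print(energetic_span(I, T, dG_r))   # → 15.0 (TDTS = T2, TDI = I1)
```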

  11. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically sounder methods for fracture mechanics analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which restrict its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multi-material structures and ascending loading paths (especially warm prestressing, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of the ductile failure behaviour of cracked structures. (author)
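
    The Gurson model referred to above is a porous-plasticity yield criterion that can be evaluated directly. A minimal sketch of the Gurson-Tvergaard yield function follows; the q1, q2 constants and the stress values are generic illustrative choices, not the parameters calibrated in this work.

```python
import math

def gurson_phi(sig_eq, sig_m, sig_y, f, q1=1.5, q2=1.0):
    """Gurson-Tvergaard yield function; yielding when phi >= 0.

    sig_eq : von Mises equivalent stress
    sig_m  : mean (hydrostatic) stress
    sig_y  : matrix flow stress
    f      : void volume fraction
    """
    return ((sig_eq / sig_y) ** 2
            + 2.0 * q1 * f * math.cosh(1.5 * q2 * sig_m / sig_y)
            - 1.0 - (q1 * f) ** 2)

# With no voids (f = 0) the criterion reduces to von Mises:
print(gurson_phi(sig_eq=300.0, sig_m=100.0, sig_y=300.0, f=0.0))       # → 0.0
# A porous material yields earlier under the same stress state:
print(gurson_phi(sig_eq=300.0, sig_m=100.0, sig_y=300.0, f=0.05) > 0)  # → True
```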

  12. Translation Memory and Computer Assisted Translation Tool for Medieval Texts

    Directory of Open Access Journals (Sweden)

    Törcsvári Attila

    2013-05-01

    Full Text Available Translation memories (TMs), as part of Computer Assisted Translation (CAT) tools, support translators in reusing portions of previously translated text. Fencing books are good candidates for using TMs due to their high number of repeated terms. Medieval texts suffer from a number of drawbacks that make even a “simple” rewording into the modern version of the same language hard. The difficulties analyzed are: lack of systematic spelling, unusual word order, and typos in the original. A hypothesis is made and verified that even simple modernization increases legibility, that it is feasible, and that it is worthwhile to apply translation memories due to the numerous and sometimes extremely long repeated terms. Therefore, methods and algorithms are presented for 1. the automated transcription of medieval texts (when a limited training set is available), and 2. the collection of repeated patterns. The efficiency of the algorithms is analyzed in terms of recall and precision.
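
    The "collection of repeated patterns" step can be sketched as a repeated word n-gram count. The tokenisation, thresholds and sample sentence below are illustrative assumptions, not the paper's algorithm or corpus.

```python
from collections import Counter

def repeated_ngrams(tokens, n_min=2, n_max=5, min_count=2):
    """Collect word n-grams that occur at least min_count times."""
    counts = Counter(
        tuple(tokens[i:i + n])
        for n in range(n_min, n_max + 1)
        for i in range(len(tokens) - n + 1)
    )
    return {ng: c for ng, c in counts.items() if c >= min_count}

# An invented modernised fencing-manual fragment:
text = ("with the long edge of the sword strike with the long edge "
        "and step with the right foot")
found = repeated_ngrams(text.split())
print(found[("with", "the", "long", "edge")])   # → 2
```

    Phrases collected this way become candidate TM segments, so long repeated technical terms need only be modernised once.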

  14. The Security Challenges in the IoT Enabled Cyber-Physical Systems and Opportunities for Evolutionary Computing & Other Computational Intelligence

    OpenAIRE

    He, H.; Maple, C.; Watson, T.; Tiwari, A.; Mehnen, J.; Jin, Y.; Gabrys, Bogdan

    2016-01-01

    The Internet of Things (IoT) has given rise to the fourth industrial revolution (Industrie 4.0), and it brings great benefits by connecting people, processes and data. However, cybersecurity has become a critical challenge in IoT-enabled cyber-physical systems, from connected supply chains and the Big Data produced by huge numbers of IoT devices, to industrial control systems. Evolutionary computation, combined with other computational intelligence, will play an important role in cybersecurity, such as ...

  15. Intelligent techniques applied to the identification of fraudulent industrial consumers of electricity; Tecnicas inteligentes aplicadas na identificacao de consumidores industriais fraudadores de energia eletrica

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Caio C.O.; Souza, Andre N. de; Pereira, Lucas I.; Gastaldello, Danilo S. [Universidade Estadual Paulista (UNESP), Bauru, SP (Brazil). Dept. de Engenharia Eletrica], Emails: caioramos@gmail.com, andrejau@feb.unesp.br, ra510611@feb.unesp.br, danilosg@feb.unesp.br; Zago, Maria G. [Universidade de Sao Paulo (EP/USP), SP (Brazil) Escola Politecnica], Email: mgzago@usp.br; Papa, Joao P. [Universidade Estadual Paulista (UNESP), Bauru, SP (Brazil). Dept. da Computacao], Email: papa.joaopaulo@gmail.com

    2009-07-01

    The development of a computational intelligence tool based on neural networks to identify commercial losses or fraud (energy theft), using information from an electric utility database, is presented.

  16. TRAC, a collaborative computer tool for tracer-test interpretation

    Directory of Open Access Journals (Sweden)

    Fécamp C.

    2013-05-01

    Full Text Available Artificial tracer tests are widely used by consulting engineers for demonstrating water circulation, proving the existence of leakage, or estimating groundwater velocity. However, the interpretation of such tests is often very basic, with the result that decision makers and professionals commonly face unreliable results through hasty and empirical interpretation. There is thus an increasing need for a reliable interpretation tool, compatible with the latest operating systems and available in several languages. BRGM, the French Geological Survey, has developed a project together with hydrogeologists from various other organizations to build software assembling several analytical solutions in order to cope with various field contexts. This computer program, called TRAC, is very light and simple, allowing the user to add his own analytical solution if the formula is not yet included. It aims at collaborative improvement by sharing the tool and the solutions. TRAC can be used for interpreting data recovered from a tracer test as well as for simulating the transport of a tracer in the saturated zone (for the time being). Calibration for a site operation takes into account the hydrodynamic and hydrodispersive features of the groundwater flow as well as the amount, nature and injection mode of the artificial tracer. The software is available in French, English and Spanish, and the latest version can be downloaded from the web site http://trac.brgm.fr.
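
    One classic analytical solution of the kind such a tool assembles is the Ogata-Banks solution for one-dimensional advection-dispersion with continuous injection. A minimal sketch follows; the parameter values are invented, and this is not TRAC's code.

```python
import math

def ogata_banks(x, t, v, D, c0=1.0):
    """Relative concentration C/c0 at distance x and time t for
    continuous injection at x = 0 (Ogata-Banks analytical solution).

    x : distance from injection point [m]
    t : time [s]
    v : mean pore velocity [m/s]
    D : longitudinal dispersion coefficient [m2/s]
    """
    s = 2.0 * math.sqrt(D * t)
    term1 = math.erfc((x - v * t) / s)
    term2 = math.exp(v * x / D) * math.erfc((x + v * t) / s)
    return 0.5 * c0 * (term1 + term2)

# Breakthrough curve 10 m downstream for a hypothetical tracer test:
for t in (3600.0, 7200.0, 14400.0):
    print(t, ogata_banks(x=10.0, t=t, v=1e-3, D=1e-3))
```

    Fitting v and D so that such a curve matches the measured tracer restitution is exactly the kind of calibration the abstract describes.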

  17. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  18. Improving Strategic Planning and Implementation in Universities through Competitive Intelligence Tools: A Means to Gaining Relevance

    Science.gov (United States)

    Hughes, Stephanie; White, Rebecca J.

    2005-01-01

    Institutions of higher education can use competitive intelligence (CI) techniques to become more relevant to their communities. In Stepping Forward as Stewards of Place the American Association of State Colleges and Universities (AASCU) provides a model for public engagement that emphasizes internal strategic planning, implementation, and…

  19. Implementing dashboards as a business intelligence tool in the forest inventory and analysis program

    Science.gov (United States)

    Scott A. Pugh; Randall S. Morin; Barbara A. Johnson

    2015-01-01

    Today is the era of “big data” where businesses have access to enormous amounts of often complex and sometimes unwieldy data. Businesses are using business intelligence (BI) systems to transform this data into useful information for management decisions. BI systems integrate applications, processes, data, and people to deliver prompt and robust analyses. A number of...

  20. An Artificial Intelligence Tutor: A Supplementary Tool for Teaching and Practicing Braille

    Science.gov (United States)

    McCarthy, Tessa; Rosenblum, L. Penny; Johnson, Benny G.; Dittel, Jeffrey; Kearns, Devin M.

    2016-01-01

    Introduction: This study evaluated the usability and effectiveness of an artificial intelligence Braille Tutor designed to supplement the instruction of students with visual impairments as they learned to write braille contractions. Methods: A mixed-methods design was used, which incorporated a single-subject, adapted alternating treatments design…