WorldWideScience

Sample records for soft computing approaches

  1. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general, with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG Activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis, (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The findings elaborated in the chapters showcase the hybrid soft computing paradigm in the field of intelligent computing.

  2. Optimising customer support in contact centres using soft computing approach

    OpenAIRE

    Shah, Satya Ramesh; Roy, Rajkumar; Tiwari, Ashutosh

    2006-01-01

    This paper describes the research and development of a methodology for optimising customer support in contact centres (CC) using a soft computing approach. The methodology provides the categorisation of customers and customer service advisors (CSA) within the CC. The current contact centre environment suffers from high staff turnover and a lack of trained staff in the right place for the right kind of customer. The business needs to assign any available advisor to a ...

  3. Microarray-Based Cancer Prediction Using Soft Computing Approach

    Directory of Open Access Journals (Sweden)

    Xiaosheng Wang

    2009-01-01

    One of the difficulties in using gene expression profiles to predict cancer is how to effectively select a few informative genes to construct accurate prediction models from thousands or tens of thousands of genes. We screen highly discriminative genes and gene pairs to create simple prediction models involving single genes or gene pairs on the basis of a soft computing approach and rough set theory. Accurate cancer prediction is obtained when we apply the simple prediction models to four cancer gene expression datasets: CNS tumor, colon tumor, lung cancer and DLBCL. Some genes closely correlated with the pathogenesis of specific or general cancers are identified. In contrast with other models, our models are simple, effective and robust. Meanwhile, our models are interpretable, for they are based on decision rules. Our results demonstrate that very simple models may perform well on molecular cancer prediction, and important gene markers of cancer can be detected if the gene selection approach is chosen reasonably.
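
    The decision-rule models described above can be pictured as simple threshold tests on one gene (or a pair of genes) chosen for maximal class separation. The sketch below is a simplified stand-in for the paper's rough-set-based selection, not the authors' actual procedure; it learns the single best one-gene threshold rule from a toy expression matrix:

```python
import numpy as np

def best_threshold_rule(X, y):
    """Pick the (gene, threshold) pair whose one-gene rule
    'class 1 if expression > t' best separates the labels."""
    best = (None, None, -1.0)          # (gene index, threshold, accuracy)
    for g in range(X.shape[1]):
        values = np.unique(X[:, g])
        # candidate cut points midway between consecutive expression values
        for t in (values[:-1] + values[1:]) / 2:
            pred = (X[:, g] > t).astype(int)
            # accept the rule or its negation, whichever fits better
            acc = max((pred == y).mean(), ((1 - pred) == y).mean())
            if acc > best[2]:
                best = (g, float(t), acc)
    return best

# toy data: 8 samples x 5 genes; gene 2 is informative by construction
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 5))
y = (X[:, 2] > 0).astype(int)
print(best_threshold_rule(X, y))       # expected to recover gene 2
```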

  4. Role of Soft Computing Approaches in HealthCare Domain: A Mini Review.

    Science.gov (United States)

    Gambhir, Shalini; Malik, Sanjay Kumar; Kumar, Yugal

    2016-12-01

    In the present era, soft computing approaches play a vital role in solving different kinds of problems and provide promising solutions. Due to their popularity, soft computing approaches have also been applied to healthcare data for effectively diagnosing diseases and obtaining better results than traditional approaches. Soft computing approaches have the ability to adapt themselves to the problem domain. Another aspect is a good balance between exploration and exploitation processes. These aspects make soft computing approaches powerful, reliable and efficient, and thus suitable and competent for healthcare data. The first objective of this review paper is to identify the various soft computing approaches used for diagnosing and predicting diseases. The second objective is to identify the various diseases to which these approaches are applied. The third objective is to categorize the soft computing approaches for clinical support systems. In the literature, it is found that a large number of soft computing approaches have been applied for effectively diagnosing and predicting diseases from healthcare data; some of these are particle swarm optimization, genetic algorithms, artificial neural networks and support vector machines. A detailed discussion of these approaches is presented in the literature section. This work summarizes the various soft computing approaches used in the healthcare domain over the last decade. These approaches are categorized into five different categories based on methodology: classification model based systems, expert systems, fuzzy and neuro-fuzzy systems, rule based systems and case based systems. Many techniques are discussed in the above-mentioned categories, and all discussed techniques are also summarized in tables. This work also focuses on the accuracy rates of soft computing techniques, and tabular information is provided for ...

  5. A Soft Computing Approach to Kidney Diseases Evaluation.

    Science.gov (United States)

    Neves, José; Martins, M Rosário; Vilhena, João; Neves, João; Gomes, Sabino; Abelha, António; Machado, José; Vicente, Henrique

    2015-10-01

    Kidney renal failure means that one's kidneys have unexpectedly stopped functioning, i.e., once chronic disease is exposed, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome has to be diagnosed. Although the patient's history and physical examination may denote good practice, some key information has to be obtained from the valuation of the glomerular filtration rate and the analysis of serum biomarkers. Indeed, chronic kidney sickness depicts anomalous kidney function and/or its makeup, i.e., there is evidence that treatment may avoid or delay its progression, either by reducing or preventing the development of some associated complications, namely hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, an early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to do with traditional methodologies and existing tools for problem solving. Hence, in this work, we focus on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures based on Logic Programming, that allows one to consider incomplete, unknown, and even contradictory information, complemented with an approach to computing centered on Artificial Neural Networks, in order to weigh the Degree-of-Confidence that one has in such a happening. The present study involved 558 patients with an average age of 51.7 years, and chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed a good performance in the diagnosis of chronic kidney disease, since the ...

  6. Engineering applications of soft computing

    CERN Document Server

    Díaz-Cortés, Margarita-Arimatea; Rojas, Raúl

    2017-01-01

    This book bridges the gap between Soft Computing techniques and their applications to complex engineering problems. In each chapter we endeavor to explain the basic ideas behind the proposed applications in an accessible format for readers who may not possess a background in some of the fields. Therefore, engineers or practitioners who are not familiar with Soft Computing methods will appreciate that the techniques discussed go beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. At the same time, the book will show members of the Soft Computing community how engineering problems are now being solved and handled with the help of intelligent approaches. Highlighting new applications and implementations of Soft Computing approaches in various engineering contexts, the book is divided into 12 chapters. Further, it has been structured so that each chapter can be read independently of the others.

  7. PhysioSoft--an approach in applying computer technology in biofeedback procedures.

    Science.gov (United States)

    Havelka, Mladen; Havelka, Juraj; Delimar, Marko

    2009-09-01

    The paper presents a description of an original biofeedback computer program called PhysioSoft. It has been designed on the basis of experience in the development of biofeedback techniques by an interdisciplinary team of experts from the Department of Health Psychology of the University of Applied Health Studies, the Faculty of Electrical Engineering and Computing, University of Zagreb, and "Mens Sana", a private biofeedback practice in Zagreb. Interest in the possibility of producing direct and voluntary effects on autonomic body functions has increased in proportion to the abandonment of the Cartesian model of the body-mind relationship. The psychosomatic approach and studies carried out in the 1950s, together with research on conditioned and operant learning, demonstrated the close interdependence between the physical and the mental, and also the possibility of training individuals to consciously act on their autonomic physiological functions. This new knowledge resulted in the development of biofeedback techniques around the 1970s and has been the basis of many studies indicating the significance of biofeedback techniques in clinical practice concerned with many symptoms of health disorders. The digitalization of biofeedback instruments and the development of user-friendly computer software enable the use of biofeedback at the individual level as an efficient procedure for a patient's active approach to self-care of their own health. As new user-friendly computer software makes biofeedback instruments widely accessible, the authors have designed the PhysioSoft computer program as a contribution to the development and broad use of biofeedback.

  8. Evaluating six soft approaches

    DEFF Research Database (Denmark)

    Sørensen, Lene; Vidal, Rene Victor Valqui

    2006-01-01

    ...’s interactive planning principles to be supported by soft approaches in carrying out the principles in action. These six soft approaches are suitable for supporting various steps of the strategy development and planning process. These are the SWOT analysis, the Future Workshop, the Scenario methodology...

  9. Evaluating six soft approaches

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Vidal, Rene Victor Valqui

    2008-01-01

    ...'s interactive planning principles to be supported by soft approaches in carrying out the principles in action. These six soft approaches are suitable for supporting various steps of the strategy development and planning process. These are the SWOT analysis, the Future Workshop, the Scenario methodology...

  10. Towards Soft Computing Agents

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman; Krušina, Pavel; Petrová, Zuzana

    2000-01-01

    Roč. 10, č. 5 (2000), s. 859-868 ISSN 1210-0552. [SOFSEM 2000 Workshop on Soft Computing. Milovy, 27.11.2000-28.11.2000] R&D Projects: GA ČR GA201/00/1489; GA ČR GA201/99/P057 Institutional research plan: AV0Z1030915 Keywords: hybrid systems * intelligent agents Subject RIV: BA - General Mathematics

  11. Soft computing model on genetic diversity and pathotype differentiation of pathogens: A novel approach

    OpenAIRE

    Hüseyin Gürüler; Musa Peker; Ömür Baysal

    2015-01-01

    Background: Identifying and validating biomarkers' scores of polymorphic bands is important for studies related to the molecular diversity of pathogens. Although these validations provide more relevant results, the experiments are very complex and time-consuming. Besides the rapid identification of plant pathogens causing disease, assessing genetic diversity and pathotype formation using automated soft computing methods is advantageous in terms of following the genetic variation of pathogens on pla...

  12. Seismic Response Prediction of Buildings with Base Isolation Using Advanced Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Modeling the response of structures under seismic loads is an important factor in Civil Engineering, as it crucially affects the design and management of structures, especially in high-risk areas. In this study, novel applications of advanced soft computing techniques are utilized for predicting the behavior of centrically braced frame (CBF) buildings with a lead-rubber bearing (LRB) isolation system under ground motion effects. These techniques include the least square support vector machine (LSSVM), wavelet neural networks (WNN), and the adaptive neuro-fuzzy inference system (ANFIS), along with wavelet denoising. The simulation of a 2D frame model and eight ground motions is considered in this study to evaluate the prediction models. The comparison results indicate that the least square support vector machine is superior to the other techniques in estimating the behavior of smart structures.
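
    For orientation on the first of these techniques: an LSSVM is trained by solving one linear system in the dual variables instead of a quadratic program. Below is a minimal NumPy sketch under common LSSVM conventions (RBF kernel, regularization parameter gamma), fitted to toy data rather than the study's seismic records:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LSSVM dual system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]              # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# toy usage: learn a noisy sine response
X = np.linspace(0, 6, 40).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * np.random.default_rng(1).normal(size=40)
b, alpha = lssvm_fit(X, y)
print(lssvm_predict(X, b, alpha, np.array([[1.5]])))   # roughly sin(1.5)
```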

  13. Soft computing based on hierarchical evaluation approach and criteria interdependencies for energy decision-making problems: A case study

    International Nuclear Information System (INIS)

    Gitinavard, Hossein; Mousavi, S. Meysam; Vahdani, Behnam

    2017-01-01

    In numerous real-world energy decision problems, decision makers encounter complex environments in which imprecise data and uncertain information make it difficult to reach an appropriate decision. In this paper, a new soft computing group decision-making approach is introduced, based on a novel compromise ranking method and interval-valued hesitant fuzzy sets (IVHFSs), for energy decision-making problems under multiple criteria. In the proposed approach, the assessment information is provided by energy experts or decision makers in terms of interval-valued hesitant fuzzy elements under incomplete criteria weights. In this respect, a new ranking index based on the interval-valued hesitant fuzzy Hamming distance measure is presented to prioritize energy candidates, and criteria weights are computed with an extended maximizing deviation method that considers the experts' judgments about the relative importance of each criterion. Also, a decision making trial and evaluation laboratory (DEMATEL) method is extended to an IVHF environment to compute the interdependencies between and within the selected criteria in the hierarchical structure. Accordingly, to demonstrate the applicability of the presented approach, a case study and a practical example are provided, regarding the hierarchical structure and criteria interdependencies, for renewable energy and energy policy selection problems. The obtained computational results are compared with a fuzzy decision-making method from the recent literature on several comparison parameters to show the advantages and constraints of the proposed approach. Finally, a sensitivity analysis is prepared to indicate the effects of different criteria weights on the ranking results, and thus the robustness or sensitivity of the proposed soft computing approach versus the relative importance of criteria. - Highlights: • Introducing a novel interval-valued hesitant fuzzy compromise ranking method. ...
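
    For reference, one common textbook form of the Hamming distance between two interval-valued hesitant fuzzy elements averages the absolute differences of the interval bounds over equal-length, sorted sets of membership intervals. The sketch below assumes that standard form, which may differ in detail from the exact measure used in the paper:

```python
def ivhf_hamming(h1, h2):
    """Hamming distance between two interval-valued hesitant fuzzy elements.

    h1, h2: lists of (lower, upper) membership intervals, assumed sorted
    and of equal length (the shorter element is usually extended by
    repeating a bound before calling this).
    """
    assert len(h1) == len(h2), "extend the shorter element first"
    l = len(h1)
    return sum(abs(a[0] - b[0]) + abs(a[1] - b[1]) for a, b in zip(h1, h2)) / (2 * l)

# toy usage: two assessments of one energy candidate on one criterion
print(ivhf_hamming([(0.2, 0.4), (0.5, 0.7)], [(0.3, 0.5), (0.4, 0.8)]))  # 0.1
```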

  14. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

    Single and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product. It also introduces successful applications of soft computing techniques to the many hard problems encountered during the design of embedded hardware. Reconfigurable em...

  15. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. Subsequent years brought new areas of interest covering technical informatics related to soft computing and more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  16. Node Voltage Improvement by Capacitor Placement in Distribution Network : A Soft Computing Approach

    OpenAIRE

    SHWETA SARKAR,; SANDEEP CHAKRAVORTY

    2010-01-01

    This paper deals with a genetic algorithm based approach for determining the optimum placement location of capacitors in a radial distribution system, which is obtained after optimum reconfiguration. Reduction of total losses in a distribution system is essential to improve the overall efficiency of power delivery. This can be achieved by placing capacitors of optimal value at proper locations in radial distribution systems. The proposed methodology is a genetic approach based algorithm. Th...
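
    To make the genetic approach concrete, the sketch below evolves (bus, capacitor size) pairs against a stand-in loss function. The bus list, the size options, and the quadratic loss model are invented for illustration; a real study would score each candidate with a power-flow computation of the actual feeder:

```python
import random

BUSES = list(range(1, 11))       # candidate buses 1..10 (illustrative)
SIZES = [150, 300, 450, 600]     # capacitor sizes in kvar (illustrative)

def losses(bus, size):
    """Stand-in for a load-flow loss evaluation (kW); lower is better.
    Here bus 7 with 450 kvar is optimal by construction."""
    return 100 + 0.5 * (bus - 7) ** 2 + ((size - 450) / 150) ** 2

def fitness(ind):
    return -losses(*ind)         # the GA maximizes fitness

def ga(pop_size=20, generations=40, p_mut=0.2):
    pop = [(random.choice(BUSES), random.choice(SIZES)) for _ in range(pop_size)]
    for _ in range(generations):
        # tournament selection: winner of each random pair becomes a parent
        parents = [max(random.sample(pop, 2), key=fitness) for _ in range(pop_size)]
        # crossover: children swap the bus/size genes of parent pairs
        pairs = list(zip(parents[::2], parents[1::2]))
        children = [(a[0], b[1]) for a, b in pairs] + [(b[0], a[1]) for a, b in pairs]
        # mutation: occasionally replace an individual with a fresh draw
        pop = [(random.choice(BUSES), random.choice(SIZES))
               if random.random() < p_mut else c for c in children]
    return max(pop, key=fitness)

print(ga())                      # expected to converge near (7, 450)
```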

  17. Soft computing for business intelligence

    CERN Document Server

    Pérez, Rafael; Cobo, Angel; Marx, Jorge; Valdés, Ariel

    2014-01-01

    The book Soft Computing for Business Intelligence is the remarkable output of a program based on the idea of joint trans-disciplinary research, as supported by the Eureka Iberoamerica Network and the University of Oldenburg. It contains twenty-seven papers allocated to three sections: Soft Computing, Business Intelligence and Knowledge Discovery, and Knowledge Management and Decision Making. Although the contents touch different domains, they are similar insofar as they follow the BI principle “Observation and Analysis” while keeping a practically oriented theoretical eye on sound methodologies, like Fuzzy Logic, Compensatory Fuzzy Logic (CFL), Rough Sets and other soft computing elements. The book tears down the traditional focus on business and extends Business Intelligence techniques in an impressive way to a broad range of fields like medicine, environment, wind farming, social collaboration and interaction, car sharing and sustainability.

  18. Advance Trends in Soft Computing

    CERN Document Server

    Kreinovich, Vladik; Kacprzyk, Janusz; WCSC 2013

    2014-01-01

    This book is the proceedings of the 3rd World Conference on Soft Computing (WCSC), which was held in San Antonio, TX, USA, on December 16-18, 2013. It presents state-of-the-art theory and applications of soft computing together with an in-depth discussion of current and future challenges in the field, providing readers with a 360 degree view on soft computing. Topics range from fuzzy sets to fuzzy logic, fuzzy mathematics, neuro-fuzzy systems, fuzzy control, decision making in fuzzy environments, image processing and many more. The book is dedicated to Lotfi A. Zadeh, a renowned specialist in signal analysis and control systems research who proposed the idea of fuzzy sets, in which an element may have a partial membership, in the early 1960s, followed by the idea of fuzzy logic, in which a statement can be true only to a certain degree, with degrees described by numbers in the interval [0,1]. The performance of fuzzy systems can often be improved with the help of optimization techniques, e.g. evolutionary co...

  19. Modeling Academic Performance Evaluation Using Soft Computing Techniques: A Fuzzy Logic Approach

    OpenAIRE

    Ramjeet Singh Yadav; Vijendra Pratap Singh

    2011-01-01

    We have proposed a Fuzzy Expert System (FES) for student academic performance evaluation based on fuzzy logic techniques. A suitable fuzzy inference mechanism and the associated rules have been discussed. The paper introduces the principles behind fuzzy logic and illustrates how these principles could be applied by educators to evaluate student academic performance. Several approaches using fuzzy logic techniques have been proposed to provide a practical method for evaluating student academic performanc...
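
    To illustrate the kind of inference such a fuzzy expert system performs, the sketch below maps a numeric score to fuzzy grade memberships via triangular membership functions and defuzzifies with a weighted average. The breakpoints and grade weights are invented assumptions, not the paper's rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# fuzzy grade sets over a 0-100 score scale (assumed breakpoints)
GRADES = {
    "poor":      lambda s: tri(s, -1, 0, 45),
    "average":   lambda s: tri(s, 35, 55, 75),
    "good":      lambda s: tri(s, 60, 75, 90),
    "excellent": lambda s: tri(s, 80, 100, 101),
}
WEIGHTS = {"poor": 25, "average": 55, "good": 75, "excellent": 95}

def evaluate(score):
    mu = {g: f(score) for g, f in GRADES.items()}
    total = sum(mu.values())
    # weighted-average defuzzification of the activated grades
    return sum(mu[g] * WEIGHTS[g] for g in mu) / total, mu

index, memberships = evaluate(68)
print(round(index, 1), memberships)   # partially 'average' and 'good'
```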

  1. A Soft Computing Approach to Crack Detection and Impact Source Identification with Field-Programmable Gate Array Implementation

    Directory of Open Access Journals (Sweden)

    Arati M. Dixit

    2013-01-01

    The real-time nondestructive testing (NDT) of crack detection and impact source identification (CDISI) has attracted researchers from diverse areas, as is apparent from the current literature. CDISI has usually been performed by visual assessment of waveforms generated by a standard data acquisition system. In this paper we suggest an automation of CDISI for metal armor plates using a soft computing approach, developing a fuzzy inference system to deal effectively with this problem. It is also advantageous to develop a chip that can contribute towards real-time CDISI. The objective of this paper is to report on efforts to develop an automated CDISI procedure and to formulate a technique such that the proposed method can be easily implemented on a chip. The CDISI fuzzy inference system is developed using MATLAB's fuzzy logic toolbox. A VLSI circuit for CDISI is developed on the basis of the fuzzy logic model using Verilog, a hardware description language (HDL). The Xilinx ISE WebPACK 9.1i is used for design, synthesis, implementation, and verification. The CDISI field-programmable gate array (FPGA) implementation is done using Xilinx's Spartan 3 FPGA. SynaptiCAD's Verilog simulators (VeriLogger PRO and ModelSim) are used as the software simulation and debug environment.
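
    One detail worth noting about such an FPGA port: hardware fuzzy engines usually replace floating-point membership math with fixed-point integers. The sketch below shows that idea with 8-bit memberships and min-max Mamdani-style rule evaluation; the feature names ('amplitude', 'ring-down count') and all breakpoints are invented for illustration, not taken from the paper:

```python
# memberships scaled to 0..255 so the same logic maps onto 8-bit hardware
def trap(x, a, b, c, d):
    """Integer trapezoidal membership on an integer input x."""
    if x <= a or x >= d:
        return 0
    if b <= x <= c:
        return 255
    return (255 * (x - a)) // (b - a) if x < b else (255 * (d - x)) // (d - c)

def cdisi_rules(amplitude, ringdown):
    # antecedent memberships (breakpoints are illustrative)
    amp_high  = trap(amplitude, 120, 180, 255, 256)
    amp_low   = trap(amplitude, -1, 0, 60, 120)
    ring_long = trap(ringdown, 10, 20, 255, 256)
    # rule strength = min of antecedents; output = max over rules (min-max)
    crack  = min(amp_high, ring_long)         # R1: high amp AND long ring-down
    impact = min(amp_high, 255 - ring_long)   # R2: high amp AND short ring-down
    noise  = amp_low                          # R3: low amp
    return max((crack, "crack"), (impact, "impact"), (noise, "noise"))

print(cdisi_rules(amplitude=200, ringdown=25))   # (strength, label) -> crack
```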

  2. Upgrading ideas about the concept of Soft Computing

    Directory of Open Access Journals (Sweden)

    Javier Montero

    2010-06-01

    This short note introduces the discussion carried out along this special issue on the concept of Soft Computing by key researchers in the field. We stress some aspects of the conception and origins of Soft Computing, supported by the scientific relevance of its participants. The contributors show their own view on a single question, What is Soft Computing?, covering answers from a general historical approach to the role of specific tools within their expertise. This discussion represents an extremely interesting view of the concept of Soft Computing, its meaning, its related techniques and its relationship with close fields.

  3. Ambient temperature modelling with soft computing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bertini, Ilaria; Ceravolo, Francesco; Citterio, Marco; Di Pietra, Biagio; Margiotta, Francesca; Pizzuti, Stefano; Puglisi, Giovanni [Energy, New Technology and Environment Agency (ENEA), Via Anguillarese 301, 00123 Rome (Italy)]; De Felice, Matteo [Energy, New Technology and Environment Agency (ENEA), Via Anguillarese 301, 00123 Rome (Italy); University of Rome 'Roma 3', Dipartimento di Informatica e Automazione (DIA), Via della Vasca Navale 79, 00146 Rome (Italy)]

    2010-07-15

    This paper proposes a hybrid approach based on soft computing techniques in order to estimate monthly and daily ambient temperature. Indeed, we combine the back-propagation (BP) algorithm and the simple Genetic Algorithm (GA) in order to effectively train artificial neural networks (ANN) in such a way that the BP algorithm initialises a few individuals of the GA's population. Experiments concerned monthly temperature estimation of unknown places and daily temperature estimation for thermal load computation. Results have shown remarkable improvements in accuracy compared to traditional methods. (author)
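
    The hybrid scheme described above (BP-trained networks seeding part of a GA population) can be sketched compactly. The tiny network, the finite-difference stand-in for backpropagation, the GA operators and the toy data are all illustrative assumptions, not ENEA's temperature model:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (64, 2))        # toy inputs (stand-in for weather data)
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]    # toy target temperature signal

def mse(w):
    """One hidden layer, 4 tanh units; flat weight vector of length 17."""
    W1, b1 = w[:8].reshape(4, 2), w[8:12]
    W2, b2 = w[12:16], w[16]
    pred = np.tanh(X @ W1.T + b1) @ W2 + b2
    return ((pred - y) ** 2).mean()

def bp_train(w, lr=0.05, steps=200, eps=1e-4):
    """Crude gradient descent via finite differences (stands in for backprop)."""
    for _ in range(steps):
        grad = np.array([(mse(w + eps * e) - mse(w - eps * e)) / (2 * eps)
                         for e in np.eye(len(w))])
        w = w - lr * grad
    return w

# GA population: a few BP-trained individuals seed it, the rest are random
dim, pop_size = 17, 12
pop = [bp_train(rng.normal(0, 0.5, dim)) for _ in range(3)]
pop += [rng.normal(0, 0.5, dim) for _ in range(pop_size - 3)]

for _ in range(30):                    # simple elitist GA over the weights
    pop.sort(key=mse)
    parents = pop[:pop_size // 2]
    children = [0.5 * (a + b) + rng.normal(0, 0.05, dim)   # blend + mutation
                for a, b in zip(parents, parents[1:] + parents[:1])]
    pop = parents + children

print("best MSE:", mse(min(pop, key=mse)))
```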

  4. Fault diagnosis and fault-tolerant control strategies for non-linear systems analytical and soft computing approaches

    CERN Document Server

    Witczak, Marcin

    2014-01-01

    This book presents selected fault diagnosis and fault-tolerant control strategies for non-linear systems in a unified framework. In particular, starting from advanced state estimation strategies up to modern soft computing, the discrete-time description of the system is employed. Part I of the book presents original research results regarding state estimation and neural networks for robust fault diagnosis. Part II is devoted to the presentation of integrated fault diagnosis and fault-tolerant systems. It starts with a general fault-tolerant control framework, which is then extended by introducing robustness with respect to various uncertainties. Finally, it is shown how to implement the proposed framework for fuzzy systems described by the well-known Takagi–Sugeno models. This research monograph is intended for researchers, engineers, and advanced postgraduate students in control and electrical engineering, computer science, as well as mechanical and chemical engineering.

  5. Soft computing techniques in engineering applications

    CERN Document Server

    Zhong, Baojiang

    2014-01-01

    Soft Computing techniques, which are based on the information processing of biological systems, are now massively used in the areas of pattern recognition, prediction and planning, as well as acting on the environment. Ideally speaking, soft computing is not a subject of homogeneous concepts and techniques; rather, it is an amalgamation of distinct methods that conform to its guiding principle. At present, the main aim of soft computing is to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. The principal constituents of soft computing techniques are probabilistic reasoning, fuzzy logic, neuro-computing, genetic algorithms, belief networks, chaotic systems, as well as learning theory. This book covers contributions from various authors to demonstrate the use of soft computing techniques in various applications of engineering.

  6. 4th World Conference on Soft Computing

    CERN Document Server

    Abbasov, Ali; Yager, Ronald; Shahbazova, Shahnaz; Reformat, Marek

    2016-01-01

    This book reports on advanced theories and cutting-edge applications in the field of soft computing. The individual chapters, written by leading researchers, are based on contributions presented during the 4th World Conference on Soft Computing, held May 25-27, 2014, in Berkeley. The book covers a wealth of key topics in soft computing, focusing on both fundamental aspects and applications. The former include fuzzy mathematics, type-2 fuzzy sets, evolutionary-based optimization, aggregation and neural networks, while the latter include soft computing in data analysis, image processing, decision-making, classification, series prediction, economics, control, and modeling. By providing readers with a timely, authoritative view on the field, and by discussing thought-provoking developments and challenges, the book will foster new research directions in the diverse areas of soft computing. .

  7. New Concepts and Applications in Soft Computing

    CERN Document Server

    Fodor, János; Várkonyi-Kóczy, Annamária

    2013-01-01

    The book provides a sample of research on the innovative theory and applications of soft computing paradigms. The idea of Soft Computing was initiated in 1981, when Professor Zadeh published his first paper on soft data analysis, and has constantly evolved ever since. Professor Zadeh defined Soft Computing as the fusion of the fields of fuzzy logic (FL), neural network theory (NN) and probabilistic reasoning (PR), with the latter subsuming belief networks, evolutionary computing including DNA computing, chaos theory and parts of learning theory, into one multidisciplinary system. As Zadeh said, the essence of soft computing is that, unlike traditional, hard computing, soft computing is aimed at an accommodation with the pervasive imprecision of the real world. Thus, the guiding principle of soft computing is to exploit the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness, low solution cost and better rapport with reality. ...

  8. Practical applications of soft computing in engineering

    CERN Document Server

    2001-01-01

    Soft computing has been presented not only with theoretical developments but also with a large variety of realistic applications to consumer products and industrial systems. Application of soft computing has provided the opportunity to integrate human-like vagueness and real-life uncertainty into an otherwise hard computer program. This book highlights some of the recent developments in practical applications of soft computing in engineering problems. All the chapters have been carefully designed and revised by international experts to achieve wide but in-depth coverage. Contents: Au...

  9. Fuzzy logic, neural networks, and soft computing

    Science.gov (United States)

    Zadeh, Lotfi A.

    1994-01-01

    The past few years have witnessed a rapid growth of interest in a cluster of modes of modeling and computation which may be described collectively as soft computing. The distinguishing characteristic of soft computing is that its primary aims are to achieve tractability, robustness, low cost, and high MIQ (machine intelligence quotient) through an exploitation of the tolerance for imprecision and uncertainty. Thus, in soft computing what is usually sought is an approximate solution to a precisely formulated problem or, more typically, an approximate solution to an imprecisely formulated problem. A simple case in point is the problem of parking a car. Generally, humans can park a car rather easily because the final position of the car is not specified exactly. If it were specified to within, say, a few millimeters and a fraction of a degree, it would take hours or days of maneuvering and precise measurements of distance and angular position to solve the problem. What this simple example points to is the fact that, in general, high precision carries a high cost. The challenge, then, is to exploit the tolerance for imprecision by devising methods of computation which lead to an acceptable solution at low cost. By its nature, soft computing is much closer to human reasoning than the traditional modes of computation. At this juncture, the major components of soft computing are fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning techniques (PR), including genetic algorithms, chaos theory, and part of learning theory. Increasingly, these techniques are used in combination to achieve significant improvement in performance and adaptability. Among the important application areas for soft computing are control systems, expert systems, data compression techniques, image processing, and decision support systems. It may be argued that it is soft computing, rather than the traditional hard computing, that should be viewed as the foundation for artificial...

  10. International Conference on Soft Computing Systems

    CERN Document Server

    Panigrahi, Bijaya

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented in International Conference on Soft Computing Systems (ICSCS 2015) held at Noorul Islam Centre for Higher Education, Chennai, India. These research papers provide the latest developments in the emerging areas of Soft Computing in Engineering and Technology. The book is organized in two volumes and discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. It presents invited papers from the inventors/originators of new applications and advanced technologies.

  11. 6th International Workshop Soft Computing Applications

    CERN Document Server

    Jain, Lakhmi; Kovačević, Branko

    2016-01-01

    These volumes constitute the Proceedings of the 6th International Workshop on Soft Computing Applications, or SOFA 2014, held on 24-26 July 2014 in Timisoara, Romania. This edition was organized by the University of Belgrade, Serbia in conjunction with the Romanian Society of Control Engineering and Technical Informatics (SRAIT) - Arad Section, The General Association of Engineers in Romania - Arad Section, the Institute of Computer Science, Iasi Branch of the Romanian Academy, and the IEEE Romanian Section. The Soft Computing concept was introduced by Lotfi Zadeh in 1991 and serves to highlight the emergence of computing methodologies in which the accent is on exploiting the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. Soft computing facilitates the use of fuzzy logic, neurocomputing, evolutionary computing and probabilistic computing in combination, leading to the concept of hybrid intelligent systems. The combination of ...

  12. Phoneme-based speech segmentation using hybrid soft computing framework

    CERN Document Server

    Sarma, Mousmita

    2014-01-01

    The book discusses intelligent system design using soft computing and similar systems and their interdisciplinary applications. It also focuses on the recent trends to use soft computing as a versatile tool for designing a host of decision support systems.

  13. Soft and hard computing approaches for real-time prediction of currents in a tide-dominated coastal area

    Digital Repository Service at National Institute of Oceanography (India)

    Charhate, S.B.; Deo, M.C.; SanilKumar, V.

    ...reference [30]. As the use of GP in the problem domain is new, more details of the basic concepts are given below. In GP, a random population of individuals (equations or computer programs) is created, the fitness of individuals is evaluated against an error criterion such as the root mean square (r.m.s.) error, and then the parents are selected. Step 3. Select individuals or parents, normally probabilistically, through a tournament consisting of a comparison of two parents at a time, thereafter short-listing the winner for further competition...
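
    To make the tournament step concrete, here is a minimal sketch that draws two candidate expressions at a time, keeps the one with the lower r.m.s. error, and short-lists it for further competition, as the fragment above describes. The toy data and the fixed set of candidate functions are illustrative stand-ins; real GP creates and recombines random expression trees:

```python
import math
import random

# toy dataset: current speed as an unknown function of tide level (made up)
DATA = [(0.1 * i, 1.5 * (0.1 * i) + 0.3) for i in range(20)]

def rms_error(f):
    """Fitness: root mean square error of a candidate expression on the data."""
    return math.sqrt(sum((f(x) - y) ** 2 for x, y in DATA) / len(DATA))

# a small initial population of candidate 'programs'
population = [
    lambda x: x,
    lambda x: 2 * x,
    lambda x: 1.5 * x + 0.3,     # the 'right' individual
    lambda x: x * x,
    lambda x: math.sin(x),
]

def tournament(pop):
    """Compare two randomly drawn parents; the winner (lower r.m.s. error)
    is short-listed for further competition / breeding."""
    a, b = random.sample(pop, 2)
    return a if rms_error(a) < rms_error(b) else b

parents = [tournament(population) for _ in range(len(population))]
best = min(parents, key=rms_error)
print("best r.m.s. error:", rms_error(best))   # ~0 for the right individual
```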

  14. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  15. 22nd International Conference on Soft Computing

    CERN Document Server

    2017-01-01

    This proceedings book contains a collection of selected accepted papers from the Mendel conference held in Brno, Czech Republic, in June 2016. The book contains three chapters, which present recent advances in soft computing, including intelligent image processing. The Mendel conference was established in 1995 and is named after the scientist and Augustinian priest Gregor J. Mendel, who discovered the famous Laws of Heredity. The main aim of the conference is to create a regular opportunity for students, academics and researchers to exchange ideas and novel research methods on a yearly basis.

  16. Soft Computing Applications : Proceedings of the 5th International Workshop Soft Computing Applications

    CERN Document Server

    Fodor, János; Várkonyi-Kóczy, Annamária; Dombi, Joszef; Jain, Lakhmi

    2013-01-01

    This volume contains the Proceedings of the 5th International Workshop on Soft Computing Applications (SOFA 2012). The book covers a broad spectrum of soft computing techniques, with theoretical and practical applications employing knowledge and intelligence to find solutions for industrial, economic and medical problems. The combination of such intelligent systems tools and a large number of applications introduces a need for a synergy of scientific and technological disciplines in order to show the great potential of Soft Computing in all domains. The conference papers included in these proceedings, published post-conference, were grouped into the following areas of research: Soft Computing and Fusion Algorithms in Biometrics; Fuzzy Theory, Control and Applications; Modelling and Control Applications; Steps towa...

  17. Soft Computing Models in Industrial and Environmental Applications

    CERN Document Server

    Abraham, Ajith; Corchado, Emilio; 7th International Conference, SOCO’12

    2013-01-01

    This volume of Advances in Intelligent and Soft Computing contains accepted papers presented at SOCO 2012, held in the beautiful and historic city of Ostrava (Czech Republic) in September 2012. Soft computing represents a collection or set of computational techniques in machine learning, computer science and some engineering disciplines, which investigate, simulate, and analyze very complex issues and phenomena. After a thorough peer-review process, the SOCO 2012 International Program Committee selected 75 papers for publication in these conference proceedings, representing an acceptance rate of 38%. In this edition a special emphasis was put on the organization of special sessions. Three special sessions were organized on relevant topics: Soft computing models for Control Theory & Applications in Electrical Engineering, Soft computing models for biomedical signals and data processing, and Advanced Soft Computing Methods in Computer Vision and Data Processing. The selecti...

  18. Soft matter approaches to food structuring

    NARCIS (Netherlands)

    Sman, van der R.G.M.

    2012-01-01

    We give an overview of the many opportunities that arise from approaching food structuring from the perspective of soft matter physics. This branch of physics employs concepts that build upon the seminal work of van der Waals, such as free volume, the mean field, and effective temperatures. All ...

  19. 2nd International Conference on Soft Computing and Data Mining

    CERN Document Server

    Ghazali, Rozaida; Nawi, Nazri; Deris, Mustafa

    2017-01-01

    This book provides a comprehensive introduction and practical look at the concepts and techniques readers need to get the most out of their data in real-world, large-scale data mining projects. It also guides readers through the data-analytic thinking necessary for extracting useful knowledge and business value from the data. The book is based on the Soft Computing and Data Mining (SCDM-16) conference, which was held in Bandung, Indonesia on August 18th–20th 2016 to discuss the state of the art in soft computing techniques, and offer participants sufficient knowledge to tackle a wide range of complex systems. The scope of the conference is reflected in the book, which presents a balance of soft computing techniques and data mining approaches. The two constituents are introduced to the reader systematically and brought together using different combinations of applications and practices. It offers engineers, data analysts, practitioners, scientists and managers the insights into the concepts, tools and techni...

  20. 6th International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Bansal, Jagdish; Das, Kedar; Lal, Arvind; Garg, Harish; Nagar, Atulya; Pant, Millie

    2017-01-01

    This two-volume book gathers the proceedings of the Sixth International Conference on Soft Computing for Problem Solving (SocProS 2016), offering a collection of research papers presented during the conference at Thapar University, Patiala, India. Providing a veritable treasure trove for scientists and researchers working in the field of soft computing, it highlights the latest developments in the broad area of “Computational Intelligence” and explores both theoretical and practical aspects using fuzzy logic, artificial neural networks, evolutionary algorithms, swarm intelligence, soft computing, computational intelligence, etc.

  1. Advances in soft computing, intelligent robotics and control

    CERN Document Server

    Fullér, Robert

    2014-01-01

    Soft computing, intelligent robotics and control are at the core of contemporary engineering interest. Essential characteristics of soft computing methods are the ability to handle vague information, to apply human-like reasoning, their learning capability, and ease of application. Soft computing techniques are widely applied in the control of dynamic systems, including mobile robots. The present volume is a collection of 20 chapters written by respected experts in the field, addressing various theoretical and practical aspects of soft computing, intelligent robotics and control. The first part of the book concerns issues of intelligent robotics, including robust fixed point transformation design, experimental verification of the input-output feedback linearization of a differentially driven mobile robot, and applying kinematic synthesis to micro-electro-mechanical systems design. The second part of the book is devoted to fundamental aspects of soft computing. This includes practical aspects of fuzzy rule ...

  2. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  3. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book is the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016), held 14-17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on Quantitative Logic and Soft Computing. This conference was a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities worldwide in fields such as: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  4. Soft Computing Techniques in Vision Science

    CERN Document Server

    Yang, Yeon-Mo

    2012-01-01

    This special edited volume is a unique approach towards computational solutions for the emerging field of study called Vision Science. From a scientific standpoint, Optics, Ophthalmology, and Optical Science have gone through an odyssey of optimizing configurations of optical systems, surveillance cameras and other nano-optical devices with the metaphor of Nano Science and Technology. Still, these systems fall short on the computational side of achieving the pinnacle of the human vision system. In this edited volume, much attention has been given to addressing the coupling issues between Computational Science and Vision Studies. It is a comprehensive collection of research works addressing various related areas of Vision Science, like visual perception and the visual system, cognitive psychology, neuroscience, psychophysics and ophthalmology, linguistic relativity, color vision, etc. This issue carries some of the latest developments in the form of research articles and presentations. The volume is rich in content, with technical tools ...

  5. Understanding soft condensed matter via modeling and computation

    CERN Document Server

    Shi, An-Chang

    2011-01-01

    All living organisms consist of soft matter. For this reason alone, it is important to be able to understand and predict the structural and dynamical properties of soft materials such as polymers, surfactants, colloids, granular matter and liquids crystals. To achieve a better understanding of soft matter, three different approaches have to be integrated: experiment, theory and simulation. This book focuses on the third approach - but always in the context of the other two.

  6. Developing a multimodal biometric authentication system using soft computing methods.

    Science.gov (United States)

    Malcangi, Mario

    2015-01-01

    Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometrics offers several advantages, mainly in embedded system applications. Hard and soft multi-biometrics, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize its applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision.
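
    The final fusion-and-decision stage of such a system can be pictured as a couple of fuzzy rules over the two matcher scores. The thresholds and rule weights below are invented for illustration; the chapter's actual FLE rule base is not reproduced here:

```python
def ramp(x, lo, hi):
    """Fuzzy 'high score' membership: 0 below lo, 1 above hi, linear between."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def authenticate(voice_score, finger_score, accept_level=0.6):
    v = ramp(voice_score, 0.4, 0.8)    # membership of 'voiceprint matches'
    f = ramp(finger_score, 0.5, 0.9)   # membership of 'fingerprint matches'
    # R1: both match strongly -> accept; R2: either is strong -> weak accept
    both, either = min(v, f), max(v, f)
    confidence = max(both, 0.5 * either)
    return confidence >= accept_level, confidence

print(authenticate(0.75, 0.85))        # likely (True, ~0.88)
```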

  7. Detecting Soft Errors in Stencil based Computations

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, V. [Univ. of Utah, Salt Lake City, UT (United States); Gopalkrishnan, G. [Univ. of Utah, Salt Lake City, UT (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
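
    The regression-based detection idea can be sketched as: fit a linear model that predicts each updated cell from its stencil neighborhood on a fault-free run, then flag cells whose observed update strays from the prediction by more than a learned tolerance. The 5-point Jacobi stencil and the threshold rule below are a generic illustration of that idea (the Jacobi update is exactly linear, so the fit here is perfect), not SORREL's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
grid = rng.random((64, 64))

def neighborhoods(g):
    """Features: the 5-point stencil (center, N, S, W, E) for interior cells."""
    c = g[1:-1, 1:-1]
    feats = np.stack([c, g[:-2, 1:-1], g[2:, 1:-1], g[1:-1, :-2], g[1:-1, 2:]], -1)
    return feats.reshape(-1, 5)

def jacobi_step(g):
    out = g.copy()
    out[1:-1, 1:-1] = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]) / 4
    return out

# train: linear regression from neighborhood -> updated value (fault-free run)
X = np.c_[neighborhoods(grid), np.ones(62 * 62)]
y = jacobi_step(grid)[1:-1, 1:-1].ravel()
w, *_ = np.linalg.lstsq(X, y, rcond=None)
tol = 3 * np.abs(X @ w - y).std()      # learned residual tolerance

# detect: inject a bit-flip-like corruption and check residuals
faulty = jacobi_step(grid)
faulty[10, 10] += 0.7
resid = np.abs(X @ w - faulty[1:-1, 1:-1].ravel())
print("flagged cells:", np.count_nonzero(resid > max(tol, 1e-12)))   # 1
```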

  8. Fuzzy systems and soft computing in nuclear engineering

    International Nuclear Information System (INIS)

    Ruan, D.

    2000-01-01

    This book is an organized, edited collection of twenty-one contributed chapters covering nuclear engineering applications of fuzzy systems, neural networks, genetic algorithms and other soft computing techniques. All chapters are either updated reviews or original contributions by leading researchers, written exclusively for this volume. The volume highlights the advantages of applying fuzzy systems and soft computing in nuclear engineering, which can be viewed as complementary to traditional methods. As a result, fuzzy sets and soft computing provide a powerful tool for solving intricate problems arising in nuclear engineering. Each chapter of the book is self-contained and also indicates future research directions on this topic of applications of fuzzy systems and soft computing in nuclear engineering. (orig.)

  9. Soft computing trends in nuclear energy system

    International Nuclear Information System (INIS)

    Paramasivan, B.

    2012-01-01

    In spite of the many advancements in the power and energy sector over the last two decades, its ability to deliver quality power with due consideration for planning, coordination, marketing, safety, stability, optimality and reliability is still believed to remain critical. Though it appears simple from the outside, the internal structure of large-scale power systems is so complex that event management and decision making require formidable preliminary preparation, which is further complicated in the presence of uncertainties and contingencies. These aspects have attracted several researchers to carry out continued research in this field, and their valued contributions have significantly helped newcomers understand the evolutionary growth in this sector, from phenomena, tools and methodologies to strategies that ensure smooth, stable, safe, reliable and economic operation. The usage of soft computing would accelerate interaction between the energy and technology research community with an aim to foster unified development in the next generation. Monitoring the mechanical impact of a loose (detached or drifting) part in the reactor coolant system of a nuclear power plant is one of the essential functions for operation and maintenance of the plant. Large data tables are generated during this monitoring process. This data can be 'mined' to reveal latent patterns of interest to operation and maintenance. Rough set theory has been applied successfully to data mining. It can be used in the nuclear power industry and elsewhere to identify classes in datasets, find dependencies in relations and discover rules which are hidden in databases. An important role may be played by nuclear energy, provided that the major safety, waste and proliferation issues affecting current nuclear reactors are satisfactorily addressed. In this respect, a large effort has been under way for some years towards the development of advanced nuclear systems that would use ...

  10. Use of Soft Computing Technologies For Rocket Engine Control

    Science.gov (United States)

    Trevino, Luis C.; Olcmen, Semih; Polites, Michael

    2003-01-01

    The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to further improve overall engine system reliability and performance. Specifically, this will be presented by enhancing rocket engine control and engine health management (EHM) using SCT coupled with conventional control technologies and the sound software engineering practices used in Marshall's Flight Software Group. The principal goals are to improve software management, software development time and maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control and EHM methodologies, but to provide alternative design choices for control, EHM, implementation, performance, and sustaining engineering. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion, software engineering for embedded systems, and soft computing technologies (i.e., neural networks, fuzzy logic, and Bayesian belief networks), much of which is presented in this paper. The first targeted demonstration rocket engine platform is the MC-1 (formerly the FASTRAC engine), which is simulated with hardware and software in the Marshall Avionics & Software Testbed laboratory that ...

  11. Experimental and Computational Techniques in Soft Condensed Matter Physics

    Science.gov (United States)

    Olafsen, Jeffrey

    2010-09-01

    1. Microscopy of soft materials Eric R. Weeks; 2. Computational methods to study jammed Systems Carl F. Schrek and Corey S. O'Hern; 3. Soft random solids: particulate gels, compressed emulsions and hybrid materials Anthony D. Dinsmore; 4. Langmuir monolayers Michael Dennin; 5. Computer modeling of granular rheology Leonardo E. Silbert; 6. Rheological and microrheological measurements of soft condensed matter John R. de Bruyn and Felix K. Oppong; 7. Particle-based measurement techniques for soft matter Nicholas T. Ouellette; 8. Cellular automata models of granular flow G. William Baxter; 9. Photoelastic materials Brian Utter; 10. Image acquisition and analysis in soft condensed matter Jeffrey S. Olafsen; 11. Structure and patterns in bacterial colonies Nicholas C. Darnton.

  12. 4th International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Deep, Kusum; Pant, Millie; Bansal, Jagdish; Nagar, Atulya

    2015-01-01

    This two volume book is based on the research papers presented at the 4th International Conference on Soft Computing for Problem Solving (SocProS 2014) and covers a variety of topics, including mathematical modelling, image processing, optimization methods, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and healthcare, data mining, etc. Mainly the emphasis is on Soft Computing and its applications in diverse areas. The prime objective of this book is to familiarize the reader with the latest scientific developments in various fields of Science, Engineering and Technology and is directed to the researchers and scientists engaged in various real-world applications of ‘Soft Computing’.

  13. Soft computing integrating evolutionary, neural, and fuzzy systems

    CERN Document Server

    Tettamanzi, Andrea

    2001-01-01

    Soft computing encompasses various computational methodologies, which, unlike conventional algorithms, are tolerant of imprecision, uncertainty, and partial truth. Soft computing technologies offer adaptability as a characteristic feature and thus permit the tracking of a problem through a changing environment. Besides some recent developments in areas like rough sets and probabilistic networks, fuzzy logic, evolutionary algorithms, and artificial neural networks are core ingredients of soft computing, which are all bio-inspired and can easily be combined synergetically. This book presents a well-balanced integration of fuzzy logic, evolutionary computing, and neural information processing. The three constituents are introduced to the reader systematically and brought together in differentiated combinations step by step. The text was developed from courses given by the authors and offers numerous illustrations as ...

  14. Computer simulation studies of the rheology of soft condensed matter

    International Nuclear Information System (INIS)

    Daivis, P.J.; Snook, I.K.; Matin, M.L.; Kairn, T.; McPhie, M.

    2004-01-01

    Full text: The rheology of soft condensed matter systems, such as polymer melts, polymer solutions and colloidal dispersions, is a subject of enduring interest - not only because of its importance in materials processing technology, but also because of the fascinating theoretical challenges it presents. Many of the rheological features possessed by these systems, such as normal stress differences, non-Newtonian viscosity and elasticity, are spectacularly evident on the macroscopic scale, but these properties are also crucial to emerging modern technologies such as micro- and nano-fluidics. Over the last seven years, we have studied many different aspects of the rheology of soft condensed matter systems using non-equilibrium molecular dynamics computer simulation techniques. Of particular importance has been our development of a new algorithm for studying elongational flow, a comparison of the planar elongational and shear flow rheology of molecular fluids, our examination of the approach to the Brownian limit in colloidal fluids, and our detailed investigation of the concentration dependence of the viscosity and normal stress differences in short-chain polymer solutions. In this paper, we review the results of these investigations, discuss the current capabilities and limitations of non-equilibrium molecular dynamics simulations, and discuss our current work and future directions.

  15. Optical character recognition systems for different languages with soft computing

    CERN Document Server

    Chaudhuri, Arindam; Badelia, Pratixa; K Ghosh, Soumya

    2017-01-01

    The book offers a comprehensive survey of soft-computing models for optical character recognition systems. The various techniques, including fuzzy and rough sets, artificial neural networks and genetic algorithms, are tested using real texts written in different languages, such as English, French, German, Latin, Hindi and Gujarati, which have been extracted from publicly available datasets. The simulation studies, which are reported in detail here, show that soft-computing-based modeling of OCR systems performs consistently better than traditional models. Mainly intended as a state-of-the-art survey for postgraduates and researchers in pattern recognition, optical character recognition and soft computing, this book will also be useful for professionals in computer vision and image processing dealing with different issues related to optical character recognition.

  16. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    An electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis. PMID:24490979

  17. Soft Tissue Biomechanical Modeling for Computer Assisted Surgery

    CERN Document Server

    2012-01-01

    This volume focuses on the biomechanical modeling of biological tissues in the context of Computer Assisted Surgery (CAS). More specifically, deformable soft tissues are addressed since they are the subject of the most recent developments in this field. The pioneering works on this CAS topic date from the 1980s, with applications in orthopaedics and biomechanical models of bones. More recently, however, biomechanical models of soft tissues have been proposed since most of the human body is made of soft organs that can be deformed by the surgical gesture. Such models are much more complicated to handle since the tissues can be subject to large deformations (non-linear geometrical framework) as well as complex stress/strain relationships (non-linear mechanical framework). Part 1 of the volume presents biomechanical models that have been developed in a CAS context and used during surgery. This is particularly new since most of the soft tissue models already proposed concern Computer Assisted Planning, with ...

  18. Intelligent systems and soft computing for nuclear science and industry

    International Nuclear Information System (INIS)

    Ruan, D.; D'hondt, P.; Govaerts, P.; Kerre, E.E.

    1996-01-01

    The second international workshop on Fuzzy Logic and Intelligent Technologies in Nuclear Science (FLINS) addresses topics related to intelligent systems and soft computing for nuclear science and industry. The proceedings contain 52 papers in different fields such as radiation protection, nuclear safety (human factors and reliability), safeguards, nuclear reactor control, production processes in the fuel cycle, dismantling, waste and disposal, and decision making. A clear link is made between theory and applications of fuzzy logic such as neural networks, expert systems, robotics, man-machine interfaces, and decision-support techniques by using modern and advanced technologies and tools. The papers are grouped in three sections. The first section (Soft computing techniques) deals with basic tools to treat fuzzy logic, neural networks, genetic algorithms, decision-making, and software used for general soft-computing aspects. The second section (Intelligent engineering systems) includes contributions on engineering problems such as knowledge-based engineering, expert systems, process control integration, diagnosis, measurements, and interpretation by soft computing. The third section (Nuclear applications) focuses on the application of soft computing and intelligent systems in nuclear science and industry.

  19. Soft computing for fault diagnosis in power plants

    International Nuclear Information System (INIS)

    Ciftcioglu, O.; Turkcan, E.

    1998-01-01

    Considering the advancements in AI technology, a new concept known as soft computing has arisen. It can be defined as the processing of uncertain information with AI methods, referring explicitly to methods that use neural networks, fuzzy logic and evolutionary algorithms. In this respect, soft computing is a new dimension in information processing technology where linguistic information can also be processed, in contrast with the classical stochastic and deterministic treatments of data. On one hand it can process uncertain/incomplete information, and on the other hand it can deal with the non-linearity of large-scale systems, where uncertainty is particularly relevant with respect to linguistic information and incompleteness is related to fault tolerance in fault diagnosis. In this perspective, the potential role of soft computing in power plant operation is presented. (author)

  20. The First International Conference on Soft Computing and Data Mining

    CERN Document Server

    Ghazali, Rozaida; Deris, Mustafa

    2014-01-01

    This book constitutes the refereed proceedings of the First International Conference on Soft Computing and Data Mining, SCDM 2014, held at Universiti Tun Hussein Onn Malaysia, June 16th-18th, 2014. The 65 revised full papers presented in this book were carefully reviewed and selected from 145 submissions, and organized into two main topical sections: Data Mining and Soft Computing. The goal of this book is to provide both theoretical concepts and, especially, practical techniques in these exciting fields of soft computing and data mining, ready to be applied in real-world applications. The exchange of views pertaining to future research directions to be taken in this field and the resultant dissemination of the latest research findings make this work of immense value to all those having an interest in the topics covered.

  1. Soft Computing in Construction Information Technology

    NARCIS (Netherlands)

    Ciftcioglu, O.; Durmisevic, S.; Sariyildiz, S.

    2001-01-01

    Over the last decade, civil engineering has exercised a rapidly growing interest in the application of neurally inspired computing techniques. The motive for this interest was the promise of certain information processing characteristics, which are similar, to some extent, to those of the human brain. The ...

  2. 5th International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Deep, Kusum; Bansal, Jagdish; Nagar, Atulya; Das, Kedar

    2016-01-01

    This two-volume book is based on the research papers presented at the 5th International Conference on Soft Computing for Problem Solving (SocProS 2015) and covers a variety of topics, including mathematical modelling, image processing, optimization methods, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and health care, data mining, etc. The emphasis is mainly on Soft Computing and its applications in diverse areas. The prime objective of this book is to familiarize the reader with the latest scientific developments in various fields of Science, Engineering and Technology, and it is directed to researchers and scientists engaged in various real-world applications of ‘Soft Computing’.

  3. Water demand forecasting: review of soft computing methods.

    Science.gov (United States)

    Ghalehkhondabi, Iman; Ardjmand, Ehsan; Young, William A; Weckman, Gary R

    2017-07-01

    Demand forecasting plays a vital role in resource management for governments and private companies. Considering the scarcity of water and its inherent constraints, demand management and forecasting in this domain are critically important. Several soft computing techniques have been developed over the last few decades for water demand forecasting. This study focuses on soft computing methods of water consumption forecasting published between 2005 and 2015. These methods include artificial neural networks (ANNs), fuzzy and neuro-fuzzy models, support vector machines, metaheuristics, and system dynamics. Furthermore, while ANNs have been superior in many short-term forecasting cases, it is still very difficult to pick a single method as the overall best. According to the literature, various methods and their hybrids are applied to water demand forecasting. However, it seems soft computing has a lot more to contribute to water demand forecasting. These contribution areas include, but are not limited to, various ANN architectures, unsupervised methods, deep learning, various metaheuristics, and ensemble methods. Moreover, it is found that soft computing methods are mainly used for short-term demand forecasting.
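
    To make the ANN approach concrete, here is a minimal sketch of short-term demand forecasting with lagged daily consumption as inputs; the data, lag structure, and hyperparameters are invented for illustration and are not from any reviewed study.

```python
# Minimal sketch: ANN-based short-term water demand forecasting (hypothetical data).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Synthetic daily demand with a weekly cycle plus noise.
demand = 100 + 10 * np.sin(np.arange(400) * 2 * np.pi / 7) + rng.normal(0, 2, 400)

LAGS = 7  # previous week of demand as input features
X = np.array([demand[t - LAGS:t] for t in range(LAGS, len(demand))])
y = demand[LAGS:]

split = int(0.8 * len(X))  # chronological train/test split
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
print("test MAE:", np.abs(model.predict(X[split:]) - y[split:]).mean())
```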

  4. A new paradigm of knowledge engineering by soft computing

    CERN Document Server

    Ding, Liya

    2001-01-01

    Soft computing (SC) consists of several computing paradigms, including neural networks, fuzzy set theory, approximate reasoning, and derivative-free optimization methods such as genetic algorithms. The integration of those constituent methodologies forms the core of SC. In addition, the synergy allows SC to incorporate human knowledge effectively, deal with imprecision and uncertainty, and learn to adapt to unknown or changing environments for better performance. Together with other modern technologies, SC and its applications exert unprecedented influence on intelligent systems that mimic human ...

  5. Recent advances in the computational chemistry of soft porous crystals.

    Science.gov (United States)

    Fraux, Guillaume; Coudert, François-Xavier

    2017-06-29

    Here we highlight recent progress in the field of computational chemistry of nanoporous materials, focusing on methods and studies that address the extraordinary dynamic nature of these systems: the high flexibility of their frameworks, the large-scale structural changes upon external physical or chemical stimulation, and the presence of defects and disorder. The wide variety of behavior demonstrated in soft porous crystals, including the topical class of metal-organic frameworks, opens new challenges for computational chemistry methods at all scales.

  6. Soft Computing Applications in Optimization, Control, and Recognition

    CERN Document Server

    Castillo, Oscar

    2013-01-01

    Soft computing includes several intelligent computing paradigms, like fuzzy logic, neural networks, and bio-inspired optimization algorithms. This book describes the application of soft computing techniques to intelligent control, pattern recognition, and optimization problems. The book is organized in four main parts. The first part deals with nature-inspired optimization methods and their applications. Papers included in this part propose new models for achieving intelligent optimization in different application areas. The second part discusses hybrid intelligent systems for achieving control. Papers included in this part make use of nature-inspired techniques, like evolutionary algorithms, fuzzy logic and neural networks, for the optimal design of intelligent controllers for different kind of applications. Papers in the third part focus on intelligent techniques for pattern recognition and propose new methods to solve complex pattern recognition problems. The fourth part discusses new theoretical concepts ...

  7. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability. These reliability models are restricted to particular types of methodologies and a restricted number of parameters. There are a number of techniques and methodologies that may be used for reliability prediction, and there is a need to focus on parameter selection when estimating reliability. The reliability of a system may increase or decrease depending on the selection of the parameters used; thus there is a need to identify the factors that heavily affect the reliability of the system. Nowadays, reusability is widely used in various areas of research. Reusability is the basis of Component-Based Systems (CBS). Cost, time and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities are available for applying soft computing techniques to medicine-related problems: the clinical science of medicine uses fuzzy logic and neural network methodologies significantly, while the basic science of medicine uses neural networks with genetic algorithms most frequently and preferably. Medical scientists have shown considerable interest in using various soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software when making new products, providing quality with a saving of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents the working of soft computing ...
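
    As an illustration of one technique from this list, the following minimal sketch fits a two-parameter reliability growth curve with a genetic algorithm; the curve form, data, and GA settings are hypothetical and are not taken from the paper.

```python
# Minimal sketch: a genetic algorithm fitting R(t) = a * (1 - exp(-b*t)) to
# hypothetical reliability growth data.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1, 21)
observed = 0.9 * (1 - np.exp(-0.3 * t)) + rng.normal(0, 0.01, t.size)

def fitness(ind):                      # negative squared error (to maximize)
    a, b = ind
    return -np.sum((a * (1 - np.exp(-b * t)) - observed) ** 2)

pop = rng.uniform([0, 0], [1, 1], size=(40, 2))   # 40 candidate (a, b) pairs
for gen in range(100):
    scores = np.array([fitness(p) for p in pop])
    # Tournament selection: keep the better of two random candidates.
    idx = rng.integers(0, len(pop), (len(pop), 2))
    winners = np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])
    parents = pop[winners]
    # Blend crossover with a shuffled partner, then Gaussian mutation.
    partners = parents[rng.permutation(len(parents))]
    w = rng.random((len(parents), 1))
    pop = np.clip(w * parents + (1 - w) * partners
                  + rng.normal(0, 0.02, parents.shape), 1e-6, 1.0)

best = pop[np.argmax([fitness(p) for p in pop])]
print("estimated (a, b):", best)      # should approach (0.9, 0.3)
```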

  8. Third International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Deep, Kusum; Nagar, Atulya; Bansal, Jagdish

    2014-01-01

    The present book is based on the research papers presented in the 3rd International Conference on Soft Computing for Problem Solving (SocProS 2013), held as a part of the golden jubilee celebrations of the Saharanpur Campus of IIT Roorkee, at the Noida Campus of Indian Institute of Technology Roorkee, India. This book is divided into two volumes and covers a variety of topics including mathematical modelling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and health care, data mining etc. Particular emphasis is laid on soft computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems, which are otherwise difficult to solve by the usual and traditional methods. The book is directed ...

  9. Second International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Nagar, Atulya; Deep, Kusum; Pant, Millie; Bansal, Jagdish; Ray, Kanad; Gupta, Umesh

    2014-01-01

    The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2012), held at JK Lakshmipat University, Jaipur, India. This book provides the latest developments in the area of soft computing and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining, etc. The objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.

  10. Renormalization group approach to soft gluon resummation

    International Nuclear Information System (INIS)

    Forte, Stefano; Ridolfi, Giovanni

    2003-01-01

    We present a simple proof of the all-order exponentiation of soft logarithmic corrections to hard processes in perturbative QCD. Our argument is based on proving that all large logs in the soft limit can be expressed in terms of a single dimensionful variable, and then using the renormalization group to resum them. Beyond the next-to-leading log level, our result is somewhat less predictive than previous all-order resummation formulae, but it does not rely on non-standard factorization, and it is thus possibly more general. We use our result to settle issues of convergence of the resummed series, we discuss scheme dependence at the resummed level, and we provide explicit resummed expressions in various factorization schemes

  11. International Conference on Soft Computing Techniques and Engineering Application

    CERN Document Server

    Li, Xiaolong

    2014-01-01

    The main objective of ICSCTEA 2013 is to provide a platform for researchers, engineers and academicians from all over the world to present their research results and development activities in soft computing techniques and engineering application. This conference provides opportunities for them to exchange new ideas and application experiences face to face, to establish business or research relations and to find global partners for future collaboration.

  12. Optimal reliability allocation for large software projects through soft computing techniques

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Popentiu-Vladicescu, Florin

    2012-01-01

    Software architecture is considered as a critical design methodology for the development of complex software. As an important step in software quality assurance, the optimal reliability allocation for software projects can be obtained by minimizing the total cost of achieving the target reliability or maximizing the system reliability subject to budget constraints. These kinds of optimization problems were considered both in deterministic and stochastic frameworks in the literature. Recently, the intuitionistic-fuzzy optimization approach was considered as a successful soft computing modelling approach. Firstly, a review on existing soft computing approaches to optimization is given. The main section extends the results considering self-organizing migrating algorithms for solving intuitionistic-fuzzy optimization problems attached to complex fault-tolerant software architectures which proved ...
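
    For readers unfamiliar with the optimizer named above, this is a minimal sketch of the self-organizing migrating algorithm (all-to-one variant) on a plain test function; the paper's intuitionistic-fuzzy objective is replaced here by a simple stand-in.

```python
# Minimal sketch: SOMA (all-to-one) minimizing a stand-in sphere function.
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sum(x ** 2)            # stand-in objective to minimize

DIM, POP, PRT, STEP, PATH = 5, 20, 0.3, 0.11, 3.0
pop = rng.uniform(-5, 5, (POP, DIM))

for migration in range(50):
    leader = pop[np.argmin([f(x) for x in pop])].copy()
    for i in range(POP):
        best, best_val = pop[i].copy(), f(pop[i])
        # Each individual migrates toward the leader in discrete jumps.
        for t in np.arange(STEP, PATH + STEP, STEP):
            prt_vec = (rng.random(DIM) < PRT).astype(float)  # perturbation mask
            cand = pop[i] + (leader - pop[i]) * t * prt_vec
            if f(cand) < best_val:
                best, best_val = cand.copy(), f(cand)
        pop[i] = best

print("best found:", min(f(x) for x in pop))
```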

  14. Soft Computing: A New Informatics Paradigm, or a Fashionable Slogan?

    Czech Academy of Sciences Publication Activity Database

    Hájek, Petr

    2000-01-01

    Vol. 79, No. 12 (2000), pp. 683-685 ISSN 0042-4544 Institutional research plan: AV0Z1030915 Keywords: soft computing * fuzzy computing * neural computing * genetic computing Subject RIV: BA - General Mathematics

  15. Soft Computing in Information Communication Technology Volume 2

    CERN Document Server

    2012-01-01

    This book is a collection of the accepted papers concerning soft computing in information communication technology. The resultant dissemination of the latest research results, and the exchanges of views concerning the future research directions to be taken in this field makes the work of immense value to all those having an interest in the topics covered. The present book represents a cooperative effort to seek out the best strategies for effecting improvements in the quality and the reliability of Fuzzy Logic, Machine Learning, Cryptography, Pattern Recognition, Bioinformatics, Biomedical Engineering, Advancements in ICT.

  16. International Conference on Soft Computing in Information Communication Technology

    CERN Document Server

    Soft Computing in Information Communication Technology

    2012-01-01

    This is a collection of the accepted papers concerning soft computing in information communication technology. All accepted papers were subjected to strict peer-reviewing by 2 expert referees. The resultant dissemination of the latest research results, and the exchanges of views concerning the future research directions to be taken in this field, makes the work of immense value to all those having an interest in the topics covered. The present book represents a cooperative effort to seek out the best strategies for effecting improvements in the quality and the reliability of Neural Networks, Swarm Intelligence, Evolutionary Computing, Image Processing, Internet Security, Data Security, Data Mining, Network Security and Protection of data and Cyber laws. Our sincere appreciation and thanks go to these authors for their contributions to this conference. I hope you can gain lots of useful information from the book.

  17. A Systematic Approach for Soft Sensor Development

    DEFF Research Database (Denmark)

    Lin, Bao; Recke, Bodil; Renaudat, Philippe

    2007-01-01

    ... by a multivariate principal component analysis (PCA) approach, is used to detect outlying observations. Then, robust regression techniques are employed to derive an inferential model. A dynamic partial least squares (DPLS) model is implemented to address the issue of auto-correlation in process data and thus ...
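
    A minimal sketch of the PCA-based outlier screening step described above, using a squared-prediction-error (Q) statistic with a simple heuristic control limit; the data and the limit are invented for illustration.

```python
# Minimal sketch: PCA-based outlier detection for soft sensor preprocessing.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated process data
X[10] += 15                                              # one planted outlier

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:2].T                          # retain 2 principal components
residual = Xc - Xc @ P @ P.T
Q = np.sum(residual ** 2, axis=1)     # squared prediction error (Q statistic)

threshold = Q.mean() + 3 * Q.std()    # heuristic control limit
print("flagged observations:", np.where(Q > threshold)[0])  # should include 10
```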

  18. Pattern recognition algorithms for data mining scalability, knowledge discovery and soft granular computing

    CERN Document Server

    Pal, Sankar K

    2004-01-01

    Pattern Recognition Algorithms for Data Mining addresses different pattern recognition (PR) tasks in a unified framework with both theoretical and experimental results. Tasks covered include data condensation, feature selection, case generation, clustering/classification, and rule generation and evaluation. This volume presents various theories, methodologies, and algorithms, using both classical approaches and hybrid paradigms. The authors emphasize large datasets with overlapping, intractable, or nonlinear boundary classes, and datasets that demonstrate granular computing in soft frameworks.Organized into eight chapters, the book begins with an introduction to PR, data mining, and knowledge discovery concepts. The authors analyze the tasks of multi-scale data condensation and dimensionality reduction, then explore the problem of learning with support vector machine (SVM). They conclude by highlighting the significance of granular computing for different mining tasks in a soft paradigm.

  19. Hard and soft computing models of composite curing process looking toward monitoring and control

    Science.gov (United States)

    Rubino, F.; Carlone, P.; Aleksendrić, D.; Ćirović, V.; Sorrentino, L.; Bellini, C.

    2016-10-01

    The curing process of thermosetting resins plays a key role in the final quality of composite material components. Soft computing techniques have proved to be an efficient means to control and optimize the curing process, replacing the conventional experimental and numerical approaches. In this paper, an artificial neural network (ANN) and fuzzy logic control (FLC) were implemented together to predict and control the temperature and degree-of-cure profile during the autoclave curing process. The obtained outcomes demonstrated the capability of ANNs and FLC relative to hard computing methods.

  20. Wearable Intrinsically Soft, Stretchable, Flexible Devices for Memories and Computing.

    Science.gov (United States)

    Rajan, Krishna; Garofalo, Erik; Chiolerio, Alessandro

    2018-01-27

    A recent trend in the development of mass-consumption electronic devices is towards electronic textiles (e-textiles), smart wearable devices, smart clothes, and flexible or printable electronics. Intrinsically soft, stretchable, flexible Wearable Memories and Computing devices (WMCs) bring us closer to sci-fi scenarios, where future electronic systems are totally integrated in our everyday outfits and help us achieve a higher comfort level, interacting for us with other digital devices such as smartphones and domotics, or with analog devices, such as our brain/peripheral nervous system. WMCs will enable each of us to contribute to open and big data systems as individual nodes, providing real-time information about physical and environmental parameters (including air pollution monitoring, sound and light pollution, chemical or radioactive fallout alerts, network availability, and so on). Furthermore, WMCs could be directly connected to the human brain and enable extremely fast operation and unprecedented interface complexity, directly mapping the continuous states available to biological systems. This review focuses on recent advances in nanotechnology and materials science and pays particular attention to any result and promising technology that enables intrinsically soft, stretchable, flexible WMCs.

  1. Computational approaches to vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  2. Computer vision and soft computing for automatic skull-face overlay in craniofacial superimposition.

    Science.gov (United States)

    Campomanes-Álvarez, B Rosario; Ibáñez, O; Navarro, F; Alemán, I; Botella, M; Damas, S; Cordón, O

    2014-12-01

    Craniofacial superimposition can provide evidence to support that some human skeletal remains belong or do not belong to a missing person. It involves the process of overlaying a skull with a number of ante mortem images of an individual and the analysis of their morphological correspondence. Within the craniofacial superimposition process, the skull-face overlay stage focuses on achieving the best possible overlay of the skull and a single ante mortem image of the suspect. Although craniofacial superimposition has been in use for over a century, skull-face overlay is still applied by means of a trial-and-error approach without an automatic method. Practitioners finish the process once they consider that a good enough overlay has been attained. Hence, skull-face overlay is a very challenging, subjective, error-prone, and time-consuming part of the whole process. Though the numerical assessment of the method's quality has not been achieved yet, computer vision and soft computing arise as powerful tools to automate it, dramatically reducing the time taken by the expert and obtaining an unbiased overlay result. In this manuscript, we justify and analyze the use of these techniques to properly model the skull-face overlay problem. We also present the automatic technical procedure we have developed using these computational methods and show the four overlays obtained in two craniofacial superimposition cases. This automatic procedure can thus be considered as a tool to aid forensic anthropologists in performing the skull-face overlay, automating and avoiding the subjectivity of the most tedious task within craniofacial superimposition.

  3. Soft Real-Time PID Control on a VME Computer

    Science.gov (United States)

    Karayan, Vahag; Sander, Stanley; Cageao, Richard

    2007-01-01

    microPID (uPID) is a computer program for real-time proportional + integral + derivative (PID) control of a translation stage in a Fourier-transform ultraviolet spectrometer. microPID implements a PID control loop over a position profile at a sampling rate of 8 kHz (sampling period 125 microseconds). The software runs in a stripped-down Linux operating system on a VersaModule Eurocard (VME) computer operating with a real-time priority queue, using an embedded controller, a 16-bit digital-to-analog converter (D/A) board, and a laser-positioning board (LPB). microPID consists of three main parts: (1) VME device-driver routines, (2) software that administers a custom protocol for serial communication with a control computer, and (3) a loop section that obtains the current position from an LPB-driver routine, calculates the ideal position from the profile, and calculates a new voltage command by use of an embedded PID routine, all within each sampling period. The voltage command is sent to the D/A board to control the stage. microPID uses special kernel headers to obtain microsecond timing resolution. Inasmuch as microPID implements a single-threaded process and all other processes are disabled, the Linux operating system acts as a soft real-time system.
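
    The control loop itself is a standard discrete PID update; the sketch below shows the per-sample logic (read position, look up the profile setpoint, compute a voltage command). The gains, profile, and toy plant are hypothetical; the real microPID runs on VME hardware as described above.

```python
# Minimal sketch: per-sample PID update at the 8 kHz rate from the abstract.
DT = 125e-6  # sampling period (8 kHz)

def make_pid(kp, ki, kd):
    """Return a stateful discrete PID step function."""
    state = {"integral": 0.0, "prev_err": 0.0}
    def step(setpoint, measured):
        err = setpoint - measured
        state["integral"] += err * DT
        deriv = (err - state["prev_err"]) / DT
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

pid = make_pid(kp=2.0, ki=50.0, kd=0.001)   # hypothetical gains
profile = lambda t: 1.0                      # step position profile (toy)
position = 0.0
for n in range(8000):                        # one second of control at 8 kHz
    voltage = pid(profile(n * DT), position)
    position += 0.05 * voltage * DT * 1000   # toy first-order stage response
print("final position:", position)
```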

  4. Modeling Soft Tissue Damage and Failure Using a Combined Particle/Continuum Approach

    Science.gov (United States)

    Rausch, M. K.; Karniadakis, G. E.; Humphrey, J. D.

    2016-01-01

    Biological soft tissues experience damage and failure as a result of injury, disease, or simply age; examples include torn ligaments and arterial dissections. Given the complexity of tissue geometry and material behavior, computational models are often essential for studying both damage and failure. Yet, because of the need to account for discontinuous phenomena such as crazing, tearing, and rupturing, continuum methods are limited. Therefore, we model soft tissue damage and failure using a particle/continuum approach. Specifically, we combine continuum damage theory with Smoothed Particle Hydrodynamics (SPH). Because SPH is a meshless particle method, and particle connectivity is determined solely through a neighbor list, discontinuities can be readily modeled by modifying this list. We show, for the first time, that an anisotropic hyperelastic constitutive model commonly employed for modeling soft tissue can be conveniently implemented within a SPH framework and that SPH results show excellent agreement with analytical solutions for uniaxial and biaxial extension as well as finite element solutions for clamped uniaxial extension in 2D and 3D. We further develop a simple algorithm that automatically detects damaged particles and disconnects the spatial domain along rupture lines in 2D and rupture surfaces in 3D. We demonstrate the utility of this approach by simulating damage and failure under clamped uniaxial extension and in a peeling experiment of virtual soft tissue samples. In conclusion, SPH in combination with continuum damage theory may provide an accurate and efficient framework for modeling damage and failure in soft tissues. PMID:27538848

  5. State-of-the-art soft computing techniques in image steganography domain

    Science.gov (United States)

    Hussain, Hanizan Shaker; Din, Roshidi; Samad, Hafiza Abdul; Yaacub, Mohd Hanafizah; Murad, Roslinda; Rukhiyah, A.; Sabdri, Noor Maizatulshima

    2016-08-01

    This paper reviews major works on soft computing (SC) techniques in image steganography and watermarking in the last ten years, focusing on three main SC techniques: neural networks, genetic algorithms, and fuzzy logic. The findings suggest that all these works applied SC techniques during the pre-processing, embedding or extracting stages, or in more than one of these stages. Therefore, the presence of SC techniques, with their diverse approaches and strengths, can help researchers in future work to attain excellent quality of image information hiding that comprises both imperceptibility and robustness.

  6. Applications of the soft computing in the automated history matching

    Energy Technology Data Exchange (ETDEWEB)

    Silva, P.C.; Maschio, C.; Schiozer, D.J. [Unicamp (Brazil)

    2006-07-01

    Reservoir management is a research field in petroleum engineering that optimizes reservoir performance based on environmental, political, economic and technological criteria. Reservoir simulation is based on geological models that simulate fluid flow. Models must be constantly corrected to yield the observed production behaviour. The process of history matching is controlled by the comparison of production data, well test data and measured data from simulations. Parametrization, objective function analysis, sensitivity analysis and uncertainty analysis are important steps in history matching. One of the main challenges facing automated history matching is to develop algorithms that find the optimal solution in multidimensional search spaces. Optimization algorithms can be either global optimizers, which can work with noisy multi-modal functions, or local optimizers, which cannot. The problem with global optimizers is the very large number of function calls, which is an inconvenience due to the long reservoir simulation time. For that reason, techniques such as least squares, thin plate splines, kriging and artificial neural networks (ANN) have been used as substitutes for reservoir simulators. This paper describes the use of optimization algorithms to find the optimal solution in automated history matching. Several ANNs were used, including the generalized regression neural network, a fuzzy system with subtractive clustering, and a radial basis network. The UNIPAR soft computing method was used along with a modified Hooke-Jeeves optimization method. Two case studies with synthetic and real reservoirs are examined. It was concluded that the combination of global and local optimization has the potential to improve the history matching process and that the use of substitute models can reduce computational efforts. 15 refs., 11 figs.
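
    For reference, here is a minimal sketch of the Hooke-Jeeves pattern search named above, with a cheap stand-in misfit function in place of a reservoir simulation run; the function and tolerances are invented for illustration.

```python
# Minimal sketch: Hooke-Jeeves pattern search on a stand-in misfit function.
import numpy as np

def misfit(x):  # stand-in for simulated-vs-observed production mismatch
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

def explore(base, step):
    """Coordinate-wise exploratory moves around a base point."""
    best = base.copy()
    for i in range(len(best)):
        for delta in (step, -step):
            trial = best.copy()
            trial[i] += delta
            if misfit(trial) < misfit(best):
                best = trial
                break
    return best

x, step = np.zeros(2), 1.0
while step > 1e-6:
    candidate = explore(x, step)
    if misfit(candidate) < misfit(x):
        # Pattern move: jump along the improving direction, then re-explore.
        pattern = explore(candidate + (candidate - x), step)
        x = pattern if misfit(pattern) < misfit(candidate) else candidate
    else:
        step /= 2.0  # no improvement: refine the step size
print("minimum near:", x)  # converges toward (1, -2)
```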

  7. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  8. A Wireless Sensor Network with Soft Computing Localization Techniques for Track Cycling Applications

    Science.gov (United States)

    Gharghan, Sadik Kamel; Nordin, Rosdiadee; Ismail, Mahamod

    2016-01-01

    In this paper, we propose two soft computing localization techniques for wireless sensor networks (WSNs). The two techniques, Neural Fuzzy Inference System (ANFIS) and Artificial Neural Network (ANN), focus on a range-based localization method which relies on the measurement of the received signal strength indicator (RSSI) from the three ZigBee anchor nodes distributed throughout the track cycling field. The soft computing techniques aim to estimate the distance between bicycles moving on the cycle track for outdoor and indoor velodromes. In the first approach the ANFIS was considered, whereas in the second approach the ANN was hybridized individually with three optimization algorithms, namely Particle Swarm Optimization (PSO), Gravitational Search Algorithm (GSA), and Backtracking Search Algorithm (BSA). The results revealed that the hybrid GSA-ANN outperforms the other methods adopted in this paper in terms of localization and distance estimation accuracy. The hybrid GSA-ANN achieves a mean absolute distance estimation error of 0.02 m and 0.2 m for outdoor and indoor velodromes, respectively. PMID:27509495

  9. A Wireless Sensor Network with Soft Computing Localization Techniques for Track Cycling Applications

    Directory of Open Access Journals (Sweden)

    Sadik Kamel Gharghan

    2016-08-01

    Full Text Available In this paper, we propose two soft computing localization techniques for wireless sensor networks (WSNs). The two techniques, Neural Fuzzy Inference System (ANFIS) and Artificial Neural Network (ANN), focus on a range-based localization method which relies on the measurement of the received signal strength indicator (RSSI) from the three ZigBee anchor nodes distributed throughout the track cycling field. The soft computing techniques aim to estimate the distance between bicycles moving on the cycle track for outdoor and indoor velodromes. In the first approach the ANFIS was considered, whereas in the second approach the ANN was hybridized individually with three optimization algorithms, namely Particle Swarm Optimization (PSO), Gravitational Search Algorithm (GSA), and Backtracking Search Algorithm (BSA). The results revealed that the hybrid GSA-ANN outperforms the other methods adopted in this paper in terms of localization and distance estimation accuracy. The hybrid GSA-ANN achieves a mean absolute distance estimation error of 0.02 m and 0.2 m for outdoor and indoor velodromes, respectively.
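
    To illustrate the range-based pipeline, this sketch converts RSSI to distance with a log-distance path-loss model and then solves a linearized least-squares fix from three anchors. The constants and readings are hypothetical; the paper itself learns the RSSI-distance mapping with ANFIS and hybrid ANN models instead of a hand-tuned formula.

```python
# Minimal sketch: RSSI -> distance -> least-squares position from 3 anchors.
import numpy as np

RSSI0, N_EXP = -40.0, 2.0             # RSSI at 1 m and path-loss exponent (toy)

def rssi_to_distance(rssi):
    """Log-distance path-loss model inverted for range (meters)."""
    return 10 ** ((RSSI0 - rssi) / (10 * N_EXP))

anchors = np.array([[0.0, 0.0], [50.0, 0.0], [25.0, 40.0]])  # anchor positions
rssi = np.array([-72.0, -66.0, -60.0])                        # measured values
d = rssi_to_distance(rssi)

# Linearize ||p - a_i||^2 = d_i^2 against the last anchor and solve A p = b.
A = 2 * (anchors[-1] - anchors[:-1])
b = (d[:-1] ** 2 - d[-1] ** 2
     - np.sum(anchors[:-1] ** 2, axis=1) + np.sum(anchors[-1] ** 2))
p, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", p)
```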

  10. 17th Online World Conference on Soft Computing in Industrial Applications

    CERN Document Server

    Krömer, Pavel; Köppen, Mario; Schaefer, Gerald

    2014-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at WSC17, the 17th Online World Conference on Soft Computing in Industrial Applications, held from December 2012 to January 2013 on the Internet. WSC17 continues a successful series of scientific events started over a decade ago by the World Federation of Soft Computing. It brought together researchers from all over the world interested in the ever-advancing state of the art in the field. Continuous technological improvements make this online forum a viable gathering format for a world-class conference. The aim of WSC17 was to disseminate excellent research results and contribute to building a global network of scientists interested in both theoretical foundations and practical applications of soft computing. The 2012 edition of the Online World Conference on Soft Computing in Industrial Applications consisted of a general track and a special session on Continuous Features Discretization for Anomaly Intrusion Detectors...

  11. Distributed Computer-Controlled Systems: the DEAR-COTS Approach

    OpenAIRE

    P. Veríssimo; A. Casimiro; L. M. Pinho; F. Vasques; L. Rodrigues; E. Tovar

    2000-01-01

    This paper proposes a new architecture targeting real-time and reliable Distributed Computer-Controlled Systems (DCCS). This architecture provides a structured approach for the integration of soft and/or hard real-time applications with Commercial Off-The-Shelf (COTS) components. The Timely Computing Base model is used as the reference model to deal with the heterogeneity of system components with respect to guaranteeing the timeliness of applications. The reliability and availability ...

  12. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  13. Computational approaches to energy materials

    CERN Document Server

    Catlow, Richard; Walsh, Aron

    2013-01-01

    The development of materials for clean and efficient energy generation and storage is one of the most rapidly developing, multi-disciplinary areas of contemporary science, driven primarily by concerns over global warming, diminishing fossil-fuel reserves, the need for energy security, and increasing consumer demand for portable electronics. Computational methods are now an integral and indispensable part of the materials characterisation and development process. Computational Approaches to Energy Materials presents a detailed survey of current computational techniques for the ...

  14. A Bayesian approach for characterization of soft tissue viscoelasticity in acoustic radiation force imaging.

    Science.gov (United States)

    Zhao, Xiaodong; Pelegri, Assimina A

    2016-04-01

    Biomechanical imaging techniques based on acoustic radiation force (ARF) have been developed to characterize the viscoelasticity of soft tissue by non-invasively measuring the motion excited by ARF. The unknown stress distribution in the region of excitation limits an accurate inverse characterization of soft tissue viscoelasticity, and single degree-of-freedom simplified models have been applied to solve the inverse problem approximately. In this study, ARF-induced creep imaging is employed to estimate the time constant of a Voigt viscoelastic tissue model, and an inverse finite element (FE) characterization procedure based on a Bayesian formulation is presented. The Bayesian approach aims to provide a reasonable quantification of the probability distributions of soft tissue mechanical properties in the presence of measurement noise and model parameter uncertainty. Gaussian process metamodeling is applied to provide a fast statistical approximation based on a small number of computationally expensive FE model runs. Numerical simulation results demonstrate that the Bayesian approach provides an efficient and practical estimation of the probability distribution of the time constant in ARF-induced creep imaging. In a comparison study with the single degree-of-freedom models, the Bayesian approach with FE models improves the estimation results even in the presence of large uncertainty levels of the model parameters.
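
    A minimal sketch of the Bayesian idea on synthetic data: random-walk Metropolis sampling of a Voigt time constant from a normalized creep curve, standing in for the paper's finite element model plus Gaussian process metamodel. All values below are invented.

```python
# Minimal sketch: Bayesian estimation of a Voigt time constant from creep data.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.05, 2.0, 40)
TRUE_TAU, NOISE = 0.4, 0.02
creep = lambda tau: 1.0 - np.exp(-t / tau)        # normalized Voigt creep
data = creep(TRUE_TAU) + rng.normal(0, NOISE, t.size)

def log_post(tau):                                # flat prior on tau > 0
    if tau <= 0:
        return -np.inf
    return -0.5 * np.sum((data - creep(tau)) ** 2) / NOISE ** 2

tau, samples = 1.0, []
for _ in range(20000):                            # random-walk Metropolis
    prop = tau + rng.normal(0, 0.05)
    if np.log(rng.random()) < log_post(prop) - log_post(tau):
        tau = prop
    samples.append(tau)

post = np.array(samples[5000:])                   # discard burn-in
print(f"posterior mean {post.mean():.3f}, std {post.std():.3f}")
```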

  16. A Spectral Finite Element Approach to Modeling Soft Solids Excited with High-Frequency Harmonic Loads

    Science.gov (United States)

    Brigham, John C.; Aquino, Wilkins; Aguilo, Miguel A.; Diamessis, Peter J.

    2010-01-01

    An approach for efficient and accurate finite element analysis of harmonically excited soft solids using high-order spectral finite elements is presented and evaluated. The Helmholtz-type equations used to model such systems suffer from additional numerical error known as pollution when excitation frequency becomes high relative to stiffness (i.e. high wave number), which is the case, for example, for soft tissues subject to ultrasound excitations. The use of high-order polynomial elements allows for a reduction in this pollution error, but requires additional consideration to counteract Runge's phenomenon and/or poor linear system conditioning, which has led to the use of spectral element approaches. This work examines in detail the computational benefits and practical applicability of high-order spectral elements for such problems. The spectral elements examined are tensor product elements (i.e. quad or brick elements) of high-order Lagrangian polynomials with non-uniformly distributed Gauss-Lobatto-Legendre nodal points. A shear plane wave example is presented to show the dependence of the accuracy and computational expense of high-order elements on wave number. Then, a convergence study for a viscoelastic acoustic-structure interaction finite element model of an actual ultrasound driven vibroacoustic experiment is shown. The number of degrees of freedom required for a given accuracy level was found to consistently decrease with increasing element order. However, the computationally optimal element order was found to strongly depend on the wave number. PMID:21461402
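
    As a small companion to the element construction described above, this sketch computes the non-uniformly distributed Gauss-Lobatto-Legendre (GLL) nodal points: the interior nodes are the roots of the derivative of the degree-n Legendre polynomial, plus the endpoints. This node clustering near the element edges is what counteracts Runge's phenomenon at high polynomial order.

```python
# Minimal sketch: Gauss-Lobatto-Legendre nodes for a spectral element.
import numpy as np
from numpy.polynomial.legendre import Legendre

def gll_nodes(n):
    """Return the n+1 GLL points on [-1, 1] for polynomial order n."""
    interior = np.sort(Legendre.basis(n).deriv().roots())
    return np.concatenate(([-1.0], interior, [1.0]))

print(gll_nodes(4))   # order-4 element: 5 non-uniform, edge-clustered nodes
```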

  17. 10th International Conference on Soft Computing Models in Industrial and Environmental Applications

    CERN Document Server

    Sedano, Javier; Baruque, Bruno; Quintián, Héctor; Corchado, Emilio

    2015-01-01

    This volume of Advances in Intelligent and Soft Computing contains accepted papers presented at the 10th International Conference on Soft Computing Models in Industrial and Environmental Applications (SOCO 2015), held in the beautiful and historic city of Burgos (Spain), in June 2015. Soft computing represents a collection or set of computational techniques in machine learning, computer science and some engineering disciplines, which investigate, simulate, and analyze very complex issues and phenomena. This conference is mainly focused on its industrial and environmental applications. After a thorough peer-review process, the SOCO 2015 International Program Committee selected 41 papers, written by authors from 15 different countries. These papers are published in the present conference proceedings, achieving an acceptance rate of 40%. The selection of papers was extremely rigorous in order to maintain the high quality of the conference and we would like to thank the members of the International Program Committees ...

  18. Use of Soft Computing Technologies for a Qualitative and Reliable Engine Control System for Propulsion Systems

    Science.gov (United States)

    Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)

    2001-01-01

    The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance by development of a qualitative and reliable engine control system (QRECS). Specifically, this will be addressed by enhancing rocket engine control using SCT, innovative data mining tools, and sound software engineering practices used in Marshall's Flight Software Group (FSG). The principal goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks); some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC), which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for designing, developing, implementing, and testing a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. The overall intent is to demonstrate that by ...

  19. Soft matter approaches to structured foods: from "cook-and-look" to rational food design?

    Science.gov (United States)

    Ubbink, Job

    2012-01-01

    Developments in soft matter physics are discussed within the context of food structuring. An overview is given of soft matter-based approaches used in food, and a relation is established between soft matter approaches and food technology, food creation, product development and nutrition. Advances in food complexity and food sustainability are discussed from a physical perspective, and the potential for future developments is highlighted.

  20. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    Science.gov (United States)

    Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv

    2018-02-01

    New concepts and techniques are replacing traditional methods of water quality parameter measurement systems. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrated with five different water quality parameter sensor nodes and a soft computing framework for computational modelling. The soft computing framework utilizes Python applications for the user interface and fuzzy sciences for decision making. The introduction of multiple sensors in a water distribution network generates a huge number of data matrices, which are sometimes highly complex, difficult to understand and convoluted for effective decision making. Therefore, the proposed system framework also intends to simplify the complexity of the obtained sensor data matrices and to support decision making for water engineers through a soft computing framework. The target of this proposed research is to provide a simple and efficient method to identify and detect the presence of contamination in a water distribution network using applications of CPS.
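
    A minimal sketch of the fuzzy decision-making stage: two hypothetical sensor readings pass through triangular memberships and two invented rules to produce a contamination alert level. The rule base and parameter ranges below are illustrative only and do not reproduce the paper's actual system.

```python
# Minimal sketch: fuzzy contamination alert from two sensor readings.
def tri(x, a, b, c):
    """Triangular membership with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def alert_level(turbidity_ntu, chlorine_mg_l):
    turb_high = tri(turbidity_ntu, 1.0, 5.0, 9.0)      # hypothetical ranges
    chlor_low = tri(chlorine_mg_l, -0.1, 0.0, 0.4)
    # Rule 1: high turbidity AND low chlorine -> strong alert (min = AND)
    strong = min(turb_high, chlor_low)
    # Rule 2: high turbidity OR low chlorine -> mild alert (max = OR)
    mild = max(turb_high, chlor_low)
    # Weighted-average defuzzification over the two rule outputs.
    if strong + mild == 0:
        return 0.0
    return (1.0 * strong + 0.5 * mild) / (strong + mild)

print(alert_level(6.0, 0.1))   # turbid water with little residual chlorine
```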

  1. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    Directory of Open Access Journals (Sweden)

    J. Bhardwaj

    2018-02-01

    Full Text Available New concepts and techniques are replacing traditional methods of water quality parameter measurement systems. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrated with five different water quality parameter sensor nodes and a soft computing framework for computational modelling. The soft computing framework utilizes Python applications for the user interface and fuzzy sciences for decision making. The introduction of multiple sensors in a water distribution network generates a huge number of data matrices, which are sometimes highly complex, difficult to understand and convoluted for effective decision making. Therefore, the proposed system framework also intends to simplify the complexity of the obtained sensor data matrices and to support decision making for water engineers through a soft computing framework. The target of this proposed research is to provide a simple and efficient method to identify and detect the presence of contamination in a water distribution network using applications of CPS.

  2. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...
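
    As a flavour of the fuzzy time series models the book builds on, the sketch below implements a first-order FTS forecast in the spirit of Chen's method. The interval count and the toy series are illustrative assumptions; the book's hybrid ANN and PSO extensions are not reproduced here.

      # Minimal first-order fuzzy time series forecast (Chen-style sketch).
      import numpy as np

      def fts_forecast(series, n_intervals=5):
          lo, hi = min(series), max(series)
          edges = np.linspace(lo, hi, n_intervals + 1)
          mids = (edges[:-1] + edges[1:]) / 2

          def fuzzify(x):
              # Index of the interval (fuzzy set) whose support contains x.
              return min(int(np.searchsorted(edges, x, side="right")) - 1, n_intervals - 1)

          states = [fuzzify(x) for x in series]
          # First-order fuzzy logical relationship groups: A_i -> {A_j, ...}
          flrg = {}
          for a, b in zip(states, states[1:]):
              flrg.setdefault(a, set()).add(b)

          successors = flrg.get(states[-1], {states[-1]})
          # Defuzzify as the mean of the successor interval midpoints.
          return float(np.mean([mids[j] for j in successors]))

      data = [31, 33, 40, 42, 39, 44, 47, 45, 50, 52]  # toy series
      print(f"next value ~ {fts_forecast(data):.1f}")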

  3. Computational Approaches to Vestibular Research

    Science.gov (United States)

    Ross, Muriel D.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    The Biocomputation Center at NASA Ames Research Center is dedicated to a union between computational, experimental and theoretical approaches to the study of neuroscience and of life sciences in general. The current emphasis is on computer reconstruction and visualization of vestibular macular architecture in three dimensions (3-D), and on mathematical modeling and computer simulation of neural activity in the functioning system. Our methods are being used to interpret the influence of spaceflight on mammalian vestibular maculas in a model system, that of the adult Sprague-Dawley rat. More than twenty 3-D reconstructions of type I and type II hair cells and their afferents have been completed by digitization of contours traced from serial sections photographed in a transmission electron microscope. This labor-intensive method has now been replaced by a semiautomated method developed in the Biocomputation Center in which conventional photography is eliminated. All viewing, storage and manipulation of original data is done using Silicon Graphics workstations. Recent improvements to the software include a new mesh generation method for connecting contours. This method will permit the investigator to describe any surface, regardless of complexity, including highly branched structures such as are routinely found in neurons. This same mesh can be used for 3-D, finite volume simulation of synapse activation and voltage spread on neuronal surfaces visualized via the reconstruction process. These simulations help the investigator interpret the relationship between neuroarchitecture and physiology, and are of assistance in determining which experiments will best test theoretical interpretations. Data are also used to develop abstract, 3-D models that dynamically display neuronal activity ongoing in the system. Finally, the same data can be used to visualize the neural tissue in a virtual environment. Our exhibit will depict capabilities of our computational approaches and

  4. Computational model of soft tissues in the human upper airway.

    Science.gov (United States)

    Pelteret, J-P V; Reddy, B D

    2012-01-01

    This paper presents a three-dimensional finite element model of the tongue and surrounding soft tissues with potential application to the study of sleep apnoea and of linguistics and speech therapy. The anatomical data was obtained from the Visible Human Project, and the underlying histological data was also extracted and incorporated into the model. Hyperelastic constitutive models were used to describe the material behaviour, and material incompressibility was accounted for. An active Hill three-element muscle model was used to represent the muscular tissue of the tongue. The neural stimulus for each muscle group was determined through the use of a genetic algorithm-based neural control model. The fundamental behaviour of the tongue under gravitational and breathing-induced loading is investigated. It is demonstrated that, when a time-dependent loading is applied to the tongue, the neural model is able to control the position of the tongue and produce a physiologically realistic response for the genioglossus.
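
    For readers unfamiliar with the muscle model named here, a standard statement of the Hill three-element arrangement is given below; the notation is generic and assumed for illustration, not taken from the paper itself.

      % Contractile element (CE) with activation a(t), force-length relation f_L
      % and force-velocity relation f_V, in parallel with a passive elastic
      % element (PE); the pair acts through a series elastic element (SE).
      \begin{align}
        F_{\mathrm{CE}} &= a(t)\, F_{\max}\, f_L\!\left(l_{\mathrm{CE}}\right) f_V\!\left(\dot{l}_{\mathrm{CE}}\right), \\
        F_{\mathrm{SE}} &= F_{\mathrm{CE}} + F_{\mathrm{PE}}\!\left(l_{\mathrm{CE}}\right),
        \qquad a(t) \in [0, 1].
      \end{align}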

  5. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big pictur

  6. Interior spatial layout with soft objectives using evolutionary computation

    NARCIS (Netherlands)

    Chatzikonstantinou, I.; Bengisu, E.

    2016-01-01

    This paper presents the design problem of furniture arrangement in a residential interior living space, and addresses it by means of evolutionary computation. Interior arrangement is an important and interesting problem that occurs commonly when designing living spaces. It entails determining the
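
    The core idea, searching furniture placements with an evolutionary loop, can be sketched as follows. The room size, item sizes, penalty terms and the simple (1+1) mutation scheme are assumptions for illustration, not the authors' formulation.

      # Toy (1+1) evolutionary search for a furniture-layout task: place two
      # rectangles in a room, penalising overlap and out-of-bounds area.
      import random

      ROOM_W, ROOM_H = 5.0, 4.0
      ITEMS = [(2.0, 1.0), (1.5, 1.5)]  # (width, height) of two items, hypothetical

      def cost(layout):
          penalty, boxes = 0.0, []
          for (x, y), (w, h) in zip(layout, ITEMS):
              # Out-of-bounds penalty.
              penalty += max(0.0, x + w - ROOM_W) + max(0.0, y + h - ROOM_H)
              penalty += max(0.0, -x) + max(0.0, -y)
              boxes.append((x, y, x + w, y + h))
          # Pairwise overlap penalty (overlap area of the two boxes).
          (ax0, ay0, ax1, ay1), (bx0, by0, bx1, by1) = boxes
          ox = max(0.0, min(ax1, bx1) - max(ax0, bx0))
          oy = max(0.0, min(ay1, by1) - max(ay0, by0))
          return penalty + ox * oy

      random.seed(1)
      layout = [(random.uniform(0, 3), random.uniform(0, 3)) for _ in ITEMS]
      for _ in range(2000):
          cand = [(x + random.gauss(0, 0.2), y + random.gauss(0, 0.2)) for x, y in layout]
          if cost(cand) <= cost(layout):
              layout = cand  # keep the mutant if it is no worse
      print(layout, cost(layout))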

  7. Fuzzy multiple linear regression: A computational approach

    Science.gov (United States)

    Juang, C. H.; Huang, X. H.; Fleming, J. W.

    1992-01-01

    This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.
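
    One simple computational treatment in this spirit, though not necessarily the authors' exact scheme, represents each fuzzy input as a symmetric triangular number (center, spread), fits ordinary least squares on the centers, and propagates spreads through the fitted slope:

      # Fuzzy observations x_i = (center, spread), crisp responses y_i (toy data).
      import numpy as np

      X = np.array([[1.0, 0.1], [2.0, 0.2], [3.0, 0.1], [4.0, 0.3]])
      y = np.array([2.1, 3.9, 6.2, 7.8])

      centers, spreads = X[:, 0], X[:, 1]
      A = np.column_stack([np.ones_like(centers), centers])
      (b0, b1), *_ = np.linalg.lstsq(A, y, rcond=None)  # OLS on the centers

      x_new = (5.0, 0.2)  # fuzzy input: center 5.0, spread 0.2
      y_center = b0 + b1 * x_new[0]
      y_spread = abs(b1) * x_new[1]  # spread scales by |slope|
      print(f"y ~ {y_center:.2f} +/- {y_spread:.2f}")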

  8. Computational biomechanics for medicine new approaches and new applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2015-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises twelve of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, France, Spain and Switzerland. Some of the interesting topics discussed are: real-time simulations; growth and remodelling of soft tissues; inverse and meshless solutions; medical image analysis; and patient-specific solid mechanics simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  9. Family planning and the labor sector: soft-sell approach.

    Science.gov (United States)

    Teston, R C

    1981-01-01

    Dr. Cesar T. San Pedro, the director of the company clinic at the Dole Philippines plantation in South Cotabato in Region 11, has been pressing the management to initiate a comprehensive family planning program for their 10,000 workers. San Pedro wants the Ministry of Labor and Employment (MOLE) to enforce its population program. The situation at Dole is one that requires an arbiter. Since 1977, there has not been a Population/Family Planning Officer (PFPO) for the area, and it is not possible to monitor closely whether the qualified firms are following the labor code and providing family planning services to their employees. Susan B. Dedel, executive director of the PFPO, has reported that the office has sought to endear its program to the private sector by showing that family planning is also profitable for the firm. This "soft-sell" approach has been the hallmark of the MOLE-PFPO since it began in 1975 as a joint project of the Commission on Population (POPCOM), the United Nations Fund for Population Activities (UNFPA), and the International Labor Organization (ILO). Some critics have argued that this liberal style of implementation is short-selling the program. They point out that the Labor Code of 1973 requires all establishments with at least 200 employees to provide a free in-plant family planning program which includes clinic care, paid motivators, and volunteer population workers. The critics seem, at first glance, to have the statistics on their side. In its 5 years of operation, the PFPO has convinced only 137,000 workers to accept family planning. This is quite low, since of the 1.2 million employed by the covered firms, 800,000 are eligible for the MOLE program. Much of the weakness of the implementation is said to be due to the slow activation of the Labor-Management Coordinating Committees (LMCC). The critics maintain that because of the liberal enforcement of Department Order No. 9, the recalcitrant firms see no reason to comply. Dedel claims that the program is on the

  10. Computational approach to Riemann surfaces

    CERN Document Server

    Klein, Christian

    2011-01-01

    This volume offers a well-structured overview of existent computational approaches to Riemann surfaces and those currently in development. The authors of the contributions represent the groups providing publicly available numerical codes in this field. Thus this volume illustrates which software tools are available and how they can be used in practice. In addition examples for solutions to partial differential equations and in surface theory are presented. The intended audience of this book is twofold. It can be used as a textbook for a graduate course in numerics of Riemann surfaces, in which case the standard undergraduate background, i.e., calculus and linear algebra, is required. In particular, no knowledge of the theory of Riemann surfaces is expected; the necessary background in this theory is contained in the Introduction chapter. At the same time, this book is also intended for specialists in geometry and mathematical physics applying the theory of Riemann surfaces in their research. It is the first...

  11. Efficient Buffer Capacity and Scheduler Setting Computation for Soft Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Bekooij, Marco; Bekooij, Marco Jan Gerrit; Wiggers, M.H.; van Meerbergen, Jef; Falk, H.; Marwedel, P.

    2007-01-01

    Soft real-time applications that process data streams can often be intuitively described as dataflow process networks. In this paper we present a novel analysis technique to compute conservative estimates of the required buffer capacities in such process networks. With the same analysis technique

  12. A Case for Soft Error Detection and Correction in Computational Chemistry.

    Science.gov (United States)

    van Dam, Hubertus J J; Vishnu, Abhinav; de Jong, Wibe A

    2013-09-10

    High performance computing platforms are expected to deliver 10^18 floating-point operations per second by the year 2022 through the deployment of millions of cores. Even if every core is highly reliable, the sheer number of them will mean that the mean time between failures becomes so short that most application runs will suffer at least one fault. In particular, soft errors caused by intermittent incorrect behavior of the hardware are a concern, as they lead to silent data corruption. In this paper we investigate the impact of soft errors on optimization algorithms using Hartree-Fock as a particular example. Optimization algorithms iteratively reduce the error in the initial guess to reach the intended solution. Therefore they may intuitively appear to be resilient to soft errors. Our results show that this is true for soft errors of small magnitudes but not for large errors. We suggest error detection and correction mechanisms for different classes of data structures. The results obtained with these mechanisms indicate that we can correct more than 95% of the soft errors at moderate increases in the computational cost.
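
    A hedged sketch of one detection-and-correction idea of this kind: guard a critical array with a checksum plus a redundant copy, and restore on mismatch. This is a generic illustration, not the actual mechanism used in the Hartree-Fock study.

      # Checksum-guarded array: detect silent corruption, repair from a replica.
      import numpy as np

      def make_guarded(arr):
          return {"data": arr.copy(), "backup": arr.copy(), "checksum": float(arr.sum())}

      def verify_and_repair(g):
          if not np.isclose(g["data"].sum(), g["checksum"]):
              g["data"][:] = g["backup"]        # correct: restore from the replica
              return False                      # a soft error was detected
          return True

      fock = make_guarded(np.arange(8, dtype=float))
      fock["data"][3] += 1e6                    # simulate a silent bit-flip corruption
      print("clean:", verify_and_repair(fock))  # -> False (error caught and repaired)
      print("clean:", verify_and_repair(fock))  # -> True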

  13. Measurement of facial soft tissues thickness using 3D computed tomographic images

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Ho Gul; Kim, Kee Deog; Shin, Dong Won; Hu, Kyung Seok; Lee, Jae Bum; Park, Hyok; Park, Chang Seo [Yonsei Univ. Hospital, Seoul (Korea, Republic of); Han, Seung Ho [Catholic Univ. of Korea, Seoul (Korea, Republic of)

    2006-03-15

    To evaluate the accuracy and reliability of a program for measuring facial soft tissue thickness on 3D computed tomographic images, by comparison with direct measurement. One cadaver was scanned with a helical CT at 3 mm slice thickness and 3 mm/sec table speed. The acquired data were reconstructed at a 1.5 mm reconstruction interval and the images were transferred to a personal computer. The facial soft tissue thicknesses were measured on the 3D images using a newly developed program. For direct measurement, the cadaver was cut with a bone cutter and a ruler was placed above the cut side. The procedure was followed by taking pictures of the facial soft tissues with a high-resolution digital camera. The measurements were then made on the photographic images and repeated ten times. A repeated measures analysis of variance was adopted to compare and analyze the measurements resulting from the two different methods. Comparisons according to area were analyzed by the Mann-Whitney test. There were no statistically significant differences between the direct measurements and those using the 3D images (p>0.05). There were statistical differences in the measurements at 17 points, but all except 2 points showed a mean difference of 0.5 mm or less. The newly developed software program for measuring facial soft tissue thickness on 3D images was accurate enough to allow facial soft tissue thickness to be measured more easily in forensic science and anthropology.

  14. Measurement of facial soft tissues thickness using 3D computed tomographic images

    International Nuclear Information System (INIS)

    Jeong, Ho Gul; Kim, Kee Deog; Shin, Dong Won; Hu, Kyung Seok; Lee, Jae Bum; Park, Hyok; Park, Chang Seo; Han, Seung Ho

    2006-01-01

    To evaluate the accuracy and reliability of a program for measuring facial soft tissue thickness on 3D computed tomographic images, by comparison with direct measurement. One cadaver was scanned with a helical CT at 3 mm slice thickness and 3 mm/sec table speed. The acquired data were reconstructed at a 1.5 mm reconstruction interval and the images were transferred to a personal computer. The facial soft tissue thicknesses were measured on the 3D images using a newly developed program. For direct measurement, the cadaver was cut with a bone cutter and a ruler was placed above the cut side. The procedure was followed by taking pictures of the facial soft tissues with a high-resolution digital camera. The measurements were then made on the photographic images and repeated ten times. A repeated measures analysis of variance was adopted to compare and analyze the measurements resulting from the two different methods. Comparisons according to area were analyzed by the Mann-Whitney test. There were no statistically significant differences between the direct measurements and those using the 3D images (p>0.05). There were statistical differences in the measurements at 17 points, but all except 2 points showed a mean difference of 0.5 mm or less. The newly developed software program for measuring facial soft tissue thickness on 3D images was accurate enough to allow facial soft tissue thickness to be measured more easily in forensic science and anthropology.

  15. An Integrated Computational and Data Environment to Support Multiscale Modeling of Soft Materials for the Materials Genome Initiative

    Science.gov (United States)

    Phelan, Frederick, Jr.; Rosch, Thomas; Jeong, Cheol; Moroz, Brian; Youssef, Sharief

    In this presentation, we describe the development of a computational "workbench" whose goal is to provide an integrated computational and data environment to support multiscale modeling of soft materials for the Materials Genome Initiative (MGI). The design has three essential elements: a modular program structure that supports the addition of new functionality through Python scripting and run-time plugins; a hierarchical data structure which enables unified representation of materials at different levels of granularity; and finally, integration of the NIST Materials Data Curation System (MDCS) into the environment to support ontology-based materials descriptions. The feature of the workbench which we emphasize in this presentation is coarse-graining. Coarse-graining techniques are an essential requirement for the design of soft materials, and are an active area of research across the soft matter community. We illustrate how the approach allows the integration of multiple coarse-graining techniques in a common environment to greater enable development, evaluation and comparison of new algorithms. Moreover, the environment meets the goals of the MGI by enabling automated curation of both upstream and downstream data in materials reference libraries which can be pushed or shared by various means.

  16. The Soft Constraints Hypothesis: A Rational Analysis Approach to Resource Allocation for Interactive Behavior

    National Research Council Canada - National Science Library

    Gray, Wayne D; Sims, Chris R; Schoelles, Michael J; Fu, Wai-Tat

    2006-01-01

    The soft constraints hypothesis (SCH) is a rational analysis approach that holds that the mixture of perceptual-motor and cognitive resources allocated for interactive behavior is adjusted based on temporal cost-benefit tradeoff...

  17. Critical Data Analysis Precedes Soft Computing Of Medical Data

    DEFF Research Database (Denmark)

    Keyserlingk, Diedrich Graf von; Jantzen, Jan; Berks, G.

    2000-01-01

    ... variables by few independent factors. The number of factors which can be extracted from a correlation matrix is a reliable criterion for the inherent independent information in that matrix. Several data sets were analyzed, which were gained from the Aphasia Database, such as different groups of patients, groups ... the deficits in communication. Sets of symptoms corresponding to the traditional symptoms in Broca and Wernicke aphasia may be represented in the factors, but the factor itself does not represent a syndrome. It is assumed that this kind of data analysis shows a new approach to the understanding of language ...

  18. A Database Approach to Computer Integrated Manufacturing

    Science.gov (United States)

    1988-06-01

    CIM is a modular approach to integration. WHAT IS COMPUTER INTEGRATED MANUFACTURING? There is a wide diversity of definitions of CIM in the

  19. Development of Semi-Automatic Lathe by using Intelligent Soft Computing Technique

    Science.gov (United States)

    Sakthi, S.; Niresh, J.; Vignesh, K.; Anand Raj, G.

    2018-03-01

    This paper discusses the enhancement of a conventional lathe machine to a semi-automated lathe machine by implementing a soft computing method. In the present scenario, the lathe machine plays a vital role in the engineering division of the manufacturing industry. While manual lathe machines are economical, their accuracy and efficiency are not up to the mark. On the other hand, CNC machines provide the desired accuracy and efficiency, but require huge capital. In order to overcome this situation, a semi-automated approach to the conventional lathe machine is developed by fitting stepper motors to the horizontal and vertical drives, controlled by an Arduino UNO microcontroller. Based on the input parameters of the lathe operation, the Arduino code is generated and transferred to the UNO board. Thus, upgrading from manual to semi-automatic lathe machines can significantly increase accuracy and efficiency while, at the same time, keeping a check on investment cost, and consequently provide a much-needed boost to the manufacturing industry.

  20. A novel algorithm for prediction of crude oil price variation based on soft computing

    International Nuclear Information System (INIS)

    Ghaffari, Ali; Zare, Samaneh

    2009-01-01

    In this paper a method based on soft computing approaches is developed to predict the daily variation of the crude oil price of the West Texas Intermediate (WTI). The predicted daily oil price variation is compared with the actual daily variation, and the difference is used to activate the learning algorithms. In order to reduce the effect of unpredictable short-term disturbances, a data filtering algorithm is used. In this paper, the prediction is called "true" if the predicted variation of the oil price has the same sign as the actual variation; otherwise the prediction is "false". It is shown that for several randomly selected durations, the true-prediction rate is considerably higher than that of the most recently published prediction algorithms. To ensure the accuracy and reliability of the algorithm, several on-line predictions are executed during one complete month. The on-line results indicate that the true-prediction rate remains consistent over periods of one month. (author)
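
    The paper's "true prediction" criterion is easy to state in code: a forecast counts as true when the predicted daily variation has the same sign as the actual one. The toy numbers below are illustrative only.

      # Directional ("true prediction") accuracy on daily price variations.
      import numpy as np

      def true_prediction_rate(predicted_delta, actual_delta):
          predicted_delta = np.asarray(predicted_delta)
          actual_delta = np.asarray(actual_delta)
          hits = np.sign(predicted_delta) == np.sign(actual_delta)
          return hits.mean()

      pred = [0.4, -0.2, 0.1, -0.5, 0.3]   # toy predicted WTI variations
      act  = [0.6, -0.1, -0.2, -0.4, 0.2]  # toy actual variations
      print(f"true predictions: {100 * true_prediction_rate(pred, act):.0f}%")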

  1. Proceedings of the International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Nagar, Atulya; Pant, Millie; Bansal, Jagdish

    2012-01-01

    The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2011), held at Roorkee, India. This book is divided into two volumes and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining etc. Particular emphasis is laid on Soft Computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.

  2. Proceedings of the International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Nagar, Atulya; Pant, Millie; Bansal, Jagdish

    2012-01-01

    The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2011), held at Roorkee, India. This book is divided into two volumes and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining etc. Particular emphasis is laid on Soft Computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.

  3. Controller Design of DFIG Based Wind Turbine by Using Evolutionary Soft Computational Techniques

    Directory of Open Access Journals (Sweden)

    O. P. Bharti

    2017-06-01

    Full Text Available This manuscript illustrates the controller design for a doubly fed induction generator (DFIG) based variable speed wind turbine using a bioinspired scheme. The methodology is based on exploiting two proficient swarm-intelligence-based evolutionary soft computational procedures: the particle swarm optimization (PSO) and bacterial foraging optimization (BFO) techniques, which are employed to design the controller intended for the small damping plant of the DFIG. A wind energy overview and the DFIG operating principle, along with the equivalent circuit model, are discussed in this paper. The controller designs for the DFIG-based WECS using PSO and BFO are described comparatively in detail. The responses of the DFIG system regarding terminal voltage, current, active-reactive power, and DC-link voltage improve slightly with the evolutionary soft computational procedure. Lastly, the obtained output is compared with a standard technique for performance improvement of the DFIG-based wind energy conversion system.
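
    As a flavour of the swarm-based tuning used here, the sketch below runs a minimal PSO over two controller gains. The quadratic surrogate cost stands in for the DFIG damping-plant objective, which is not reproduced; the inertia and acceleration coefficients are common textbook defaults, not values from the paper.

      # Minimal PSO over two controller gains (Kp, Ki) with a surrogate cost.
      import numpy as np

      rng = np.random.default_rng(0)

      def cost(gains):  # hypothetical surrogate: minimum at Kp=2.0, Ki=0.5
          kp, ki = gains
          return (kp - 2.0) ** 2 + 10 * (ki - 0.5) ** 2

      n, dim, w, c1, c2 = 20, 2, 0.7, 1.5, 1.5
      x = rng.uniform(0, 5, (n, dim))          # particle positions
      v = np.zeros_like(x)                     # particle velocities
      pbest = x.copy()
      pbest_f = np.array([cost(p) for p in x])
      gbest = pbest[pbest_f.argmin()].copy()

      for _ in range(100):
          r1, r2 = rng.random((2, n, dim))
          v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
          x = x + v
          f = np.array([cost(p) for p in x])
          improved = f < pbest_f
          pbest[improved], pbest_f[improved] = x[improved], f[improved]
          gbest = pbest[pbest_f.argmin()].copy()

      print("tuned gains (Kp, Ki):", np.round(gbest, 3))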

  4. New trends on soft computing models in industrial and environmental applications

    OpenAIRE

    Corchado Rodríguez, Emilio; Abraham, Ajith P.; Snášel, Václav

    2017-01-01

    The twelve papers included in this special issue represent a selection of extended contributions presented at the Sixth International Conference on Soft Computing Models in Industrial and Environmental Applications, held in Salamanca, Spain, 6–8th April, 2011. Papers were selected on the basis of fundamental ideas and concepts rather than the direct usage of well-established techniques. This special issue is then aimed at practitioners, researchers and post-graduate students, who are engaged ...

  5. LONG TERM WIND SPEED PREDICTION USING WAVELET COEFFICIENTS AND SOFT COMPUTING

    Directory of Open Access Journals (Sweden)

    Manju Khanna

    2016-10-01

    Full Text Available In past research, scholars have carried out short-term prediction of wind speed. The present work deals with long-term wind speed prediction, required for hybrid power generation design and contract planning. As the total database is quite large for long-term prediction, feature extraction via lifting wavelet coefficients is exploited, along with soft computing techniques for the time series data, which are stochastic in nature.
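
    A hedged sketch of the general recipe: compress a long record into wavelet coefficients, then fit a simple model on the compressed representation. The wavelet family, level, toy series and one-step linear model are illustrative choices made here, not the paper's lifting-scheme setup.

      # Wavelet-coefficient features for a toy wind-speed record (PyWavelets).
      import numpy as np
      import pywt

      rng = np.random.default_rng(3)
      t = np.arange(512)
      wind = 8 + 2 * np.sin(2 * np.pi * t / 64) + rng.normal(0, 0.5, t.size)

      # Feature extraction: level-4 approximation coefficients summarise the record.
      approx = pywt.wavedec(wind, "db4", level=4)[0]

      # One-step-ahead linear model on the compressed representation.
      X = np.column_stack([np.ones(approx.size - 1), approx[:-1]])
      y = approx[1:]
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)
      pred_next = coef[0] + coef[1] * approx[-1]
      print(f"next coarse coefficient ~ {pred_next:.2f}")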

  6. Claudio Moraga a passion for multi-valued logic and soft computing

    CERN Document Server

    Allende-Cid, Héctor

    2017-01-01

    The book is an authoritative collection of contributions by leading experts on the topics of fuzzy logic, multi-valued logic and neural networks. Originally written as an homage to Claudio Moraga, seen by his colleagues as an example of concentration, discipline and passion for science, the book also represents a timely reference guide for advanced students and researchers in the field of soft computing and multiple-valued logic.

  7. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    OpenAIRE

    Lukas Falat; Dusan Marcek; Maria Durisova

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. Authors test the sug...

  8. Soft computing analysis of the possible correlation between temporal and energy release patterns in seismic activity

    Science.gov (United States)

    Konstantaras, Anthony; Katsifarakis, Emmanouil; Artzouxaltzis, Xristos; Makris, John; Vallianatos, Filippos; Varley, Martin

    2010-05-01

    This paper is a preliminary investigation of the possible correlation of temporal and energy release patterns of seismic activity involving the preparation processes of consecutive sizeable seismic events [1,2]. The background idea is that during periods of low-level seismic activity, stress processes in the crust accumulate energy at the seismogenic area, whilst larger seismic events act as a decongesting mechanism releasing considerable energy [3,4]. A dynamic algorithm is being developed aiming to identify and cluster pre- and post-seismic events with the main earthquake, following on research carried out by Zubkov [5] and Dobrovolsky [6,7]. This clustering technique, along with energy release equations dependent on the Richter scale [8,9], allows an estimate to be drawn regarding the amount of energy being released by the seismic sequence. The above approach is being implemented as a monitoring tool to investigate the behaviour of the underlying energy management system by introducing this information to various neural [10,11] and soft computing models [1,12,13,14]. The incorporation of intelligent systems aims towards the detection and simulation of the possible relationship between energy release patterns and time-intervals among consecutive sizeable earthquakes [1,15]. Anticipated successful training of the imported intelligent systems may result in a real-time, on-line processing methodology [1,16] capable of dynamically approximating the time-interval between the latest and the next forthcoming sizeable seismic event by monitoring the energy release process in a specific seismogenic area. Indexing terms: pattern recognition, long-term earthquake precursors, neural networks, soft computing, earthquake occurrence intervals References [1] Konstantaras A., Vallianatos F., Varley M.R. and Makris J. P.: 'Soft computing modelling of seismicity in the southern Hellenic arc', IEEE Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008 [2] Eneva M. and

  9. A Novel Approach to Determine the Prevalence of Type of Soft Palate Using Digital Intraoral Impression

    Science.gov (United States)

    Khaled Addas, Mohamed; Al Humaidi, Abdullah Saad Ali; Al Qahtani, Abdulrazaq Mohammed; Al Qahtani, Mubarak Daghash

    2017-01-01

    Aim To determine the prevalence of types of soft palate in the targeted population. Materials and Methods The study was conducted using computer technology in dentistry: an intraoral digital scanner and a 3D analysis software tool. 100 patients selected from the outpatient clinics were divided into two groups, aged 20–40 years and 41–60 years, with an equal ratio of males and females. Each selected patient's maxillary arch was scanned with the intraoral scanner; the images so obtained were sectioned in anteroposterior cross section, and the angulation between the hard and soft palate was determined with the 3D analysis software. Results The prevalence of type II soft palate (angulation between hard and soft palate between 10 and 45 degrees) was highest: 60% in group 1 and 44% in group 2. The difference between genders was statistically significant (p < 0.05) in both groups, although females had higher angulation than males in all classes of both groups. Conclusions In the targeted population of Aseer Province, Saudi Arabia, type II soft palate was the most common, with higher soft palate angulation among females. Advanced age had no effect on the type of soft palate in the region. PMID:28951740

  10. A soft double regularization approach to parametric blind image deconvolution.

    Science.gov (United States)

    Chen, Li; Yap, Kim-Hui

    2005-05-01

    This paper proposes a blind image deconvolution scheme based on soft integration of parametric blur structures. Conventional blind image deconvolution methods encounter a difficult dilemma of either imposing stringent and inflexible preconditions on the problem formulation or experiencing poor restoration results due to lack of information. This paper attempts to address this issue by assessing the relevance of parametric blur information, and incorporating the knowledge into the parametric double regularization (PDR) scheme. The PDR method assumes that the actual blur satisfies up to a certain degree of parametric structure, as there are many well-known parametric blurs in practical applications. Further, it can be tailored flexibly to include other blur types if some prior parametric knowledge of the blur is available. A manifold soft parametric modeling technique is proposed to generate the blur manifolds, and estimate the fuzzy blur structure. The PDR scheme involves the development of the meaningful cost function, the estimation of blur support and structure, and the optimization of the cost function. Experimental results show that it is effective in restoring degraded images under different environments.

  11. MANAGEMENT APPROACH BETWEEN BUSINESS CLUSTER SUCCESS AND SOFT LEADER CHARACTERISTICS

    Directory of Open Access Journals (Sweden)

    Robert Lippert

    2014-05-01

    Full Text Available One potential avenue of economic growth lies in furthering the development of business clusters. By linking the complementary competencies of profit-oriented enterprises, NGOs, universities, research institutes and local authorities, innovation potential and productivity are significantly increased. The present study investigates a specific and challenging managerial activity: the role of the cluster manager. The aim of the research is to reveal the intrinsic motivation of cluster operations and to demonstrate the importance of the manager in efficient and sustainable operation. An empirical study has been conducted involving cluster managers and member organisations through an extensive questionnaire survey in Hungary. First, determinant factors of cluster success were identified. Using these factors, covering the operational activity of the cluster as well as the satisfaction of the members in the fields of innovation and productivity, a new continuous three-dimensional maturity model has been introduced to evaluate cluster success. Mapping the soft factors, organisational culture and leadership roles were assessed by applying the Competing Values Framework method. The results of the research show the correlation found between soft leader characteristics and cluster success.

  12. Synopsis of Soft Computing Techniques used in Quadrotor UAV Modelling and Control

    Directory of Open Access Journals (Sweden)

    Attila Nemes

    2015-01-01

    Full Text Available The aim of this article is to give an introduction to quadrotor systems with an overview of soft computing techniques used in quadrotor unmanned aerial vehicle (UAV) control, modelling, object following and collision avoidance. The quadrotor system basics, its structure and dynamic model definitions are recapitulated. Further on, a synopsis is given of previously proposed methods, with results evaluated and conclusions drawn by the authors of the referenced publications. The result of this article is a summary of multiple papers on fuzzy logic techniques used in position and altitude control systems for UAVs. An overview of fuzzy-system-based visual servoing for object tracking and collision avoidance is also given, together with a brief review of studies on the efficiency of quadrotor UAV control techniques. The conclusion is that, though soft computing methods are widely used with good results, there is still room for much research on finding more efficient soft computing tools for simple modelling, robust dynamic control and fast collision avoidance in quadrotor UAV control.

  13. Application of Soft Computing Techniques and Multiple Regression Models for CBR prediction of Soils

    Directory of Open Access Journals (Sweden)

    Fatimah Khaleel Ibrahim

    2017-08-01

    Full Text Available Soft computing techniques such as Artificial Neural Networks (ANN) have improved predictive capability and have found application in geotechnical engineering. The aim of this research is to utilize soft computing techniques and Multiple Regression Models (MLR) for forecasting the California Bearing Ratio (CBR) of soil from its index properties. The CBR of soil can be predicted from various soil-characterizing parameters with the assistance of MLR and ANN methods. The database was collected in the laboratory by conducting tests on 86 soil samples gathered from different projects in Basrah districts. Data gained from the experimental results were used in the regression models and in the soft computing technique using artificial neural networks. The liquid limit, plasticity index, modified compaction test and the CBR test were determined. In this work, different ANN and MLR models were formulated with different collections of inputs in order to recognize their significance in the prediction of CBR. The strengths of the developed models were examined in terms of regression coefficient (R2), relative error (RE%) and mean square error (MSE) values. From the results of this paper, it was noticed that all the proposed ANN models perform better than the MLR model. In particular, the ANN model with all input parameters reveals better outcomes than the other ANN models.
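
    An illustrative comparison of an MLR and an ANN model of the kind described, on synthetic stand-ins for the index properties; the real study used 86 laboratory samples from Basrah districts, and the feature names and data-generating formula below are assumptions, not the study's data. Feature scaling is omitted for brevity.

      # MLR vs. a small ANN on synthetic index-property data (scikit-learn).
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.neural_network import MLPRegressor
      from sklearn.metrics import r2_score, mean_squared_error

      rng = np.random.default_rng(42)
      n = 86
      X = np.column_stack([
          rng.uniform(25, 60, n),    # liquid limit (%), hypothetical range
          rng.uniform(5, 30, n),     # plasticity index (%), hypothetical range
          rng.uniform(1.6, 2.1, n),  # max dry density (g/cm^3), hypothetical range
      ])
      cbr = 2 + 0.1 * X[:, 0] - 0.2 * X[:, 1] + 8 * X[:, 2] + rng.normal(0, 1, n)

      mlr = LinearRegression().fit(X, cbr)
      ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                         random_state=0).fit(X, cbr)

      for name, model in [("MLR", mlr), ("ANN", ann)]:
          pred = model.predict(X)
          print(name, "R2:", round(r2_score(cbr, pred), 3),
                "MSE:", round(mean_squared_error(cbr, pred), 3))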

  14. Modification of Hazen's equation in coarse grained soils by soft computing techniques

    Science.gov (United States)

    Kaynar, Oguz; Yilmaz, Isik; Marschalko, Marian; Bednarik, Martin; Fojtova, Lucie

    2013-04-01

    A relationship between the coefficient of permeability (k) and the effective grain size (d10) was first proposed by Hazen and later extended by other researchers. However, although many attempts have been made to estimate k, the correlation coefficients (R2) of the models were generally lower than ~0.80, and whole grain-size distribution curves were not included in the assessments. Soft computing techniques such as artificial neural networks, fuzzy inference systems, genetic algorithms, etc. and their hybrids are now being successfully used as an alternative tool. In this study, the use of some soft computing techniques such as Artificial Neural Networks (ANNs) (MLP, RBF, etc.) and an Adaptive Neuro-Fuzzy Inference System (ANFIS) for prediction of the permeability of coarse-grained soils is described, and Hazen's equation is then modified. It was found that the soft computing models exhibited high performance in predicting the permeability coefficient. Although four different kinds of ANN algorithms showed similar prediction performance, the results of the MLP were found to be relatively more accurate than those of the RBF models. The most reliable prediction was obtained from the ANFIS model.
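
    For reference, Hazen's original empirical relation, the starting point that the soft computing models modify, is commonly stated as below; the constant varies between sources, and the modified forms learned by the ANN and ANFIS models are not reproduced here.

      k \;\approx\; C \, d_{10}^{\,2}, \qquad
      k\ \text{in cm/s}, \quad d_{10}\ \text{in mm}, \quad C \approx 100.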

  15. In vivo X-Ray Computed Tomographic Imaging of Soft Tissue with Native, Intravenous, or Oral Contrast

    Science.gov (United States)

    Wathen, Connor A.; Foje, Nathan; van Avermaete, Tony; Miramontes, Bernadette; Chapaman, Sarah E.; Sasser, Todd A.; Kannan, Raghuraman; Gerstler, Steven; Leevy, W. Matthew

    2013-01-01

    X-ray Computed Tomography (CT) is one of the most commonly utilized anatomical imaging modalities for both research and clinical purposes. CT combines high-resolution, three-dimensional data with relatively fast acquisition to provide a solid platform for non-invasive human or specimen imaging. The primary limitation of CT is its inability to distinguish many soft tissues based on native contrast. While bone has high contrast within a CT image due to its material density from calcium phosphate, soft tissue is less dense and many are homogenous in density. This presents a challenge in distinguishing one type of soft tissue from another. A couple of exceptions include the lungs as well as fat, both of which have unique densities owing to the presence of air or bulk hydrocarbons, respectively. In order to facilitate X-ray CT imaging of other structures, a range of contrast agents have been developed to selectively identify and visualize the anatomical properties of individual tissues. Most agents incorporate atoms like iodine, gold, or barium because of their ability to absorb X-rays, and thus impart contrast to a given organ system. Here we review the strategies available to visualize lung, fat, brain, kidney, liver, spleen, vasculature, gastrointestinal tract, and liver tissues of living mice using either innate contrast, or commercial injectable or ingestible agents with selective perfusion. Further, we demonstrate how each of these approaches will facilitate the non-invasive, longitudinal, in vivo imaging of pre-clinical disease models at each anatomical site. PMID:23711461

  16. Digital dissection - using contrast-enhanced computed tomography scanning to elucidate hard- and soft-tissue anatomy in the Common Buzzard Buteo buteo.

    Science.gov (United States)

    Lautenschlager, Stephan; Bright, Jen A; Rayfield, Emily J

    2014-04-01

    Gross dissection has a long history as a tool for the study of human or animal soft- and hard-tissue anatomy. However, apart from being a time-consuming and invasive method, dissection is often unsuitable for very small specimens and often cannot capture spatial relationships of the individual soft-tissue structures. The handful of comprehensive studies on avian anatomy using traditional dissection techniques focus nearly exclusively on domestic birds, whereas raptorial birds, and in particular their cranial soft tissues, are essentially absent from the literature. Here, we digitally dissect, identify, and document the soft-tissue anatomy of the Common Buzzard (Buteo buteo) in detail, using the new approach of contrast-enhanced computed tomography using Lugol's iodine. The architecture of different muscle systems (adductor, depressor, ocular, hyoid, neck musculature), neurovascular, and other soft-tissue structures is three-dimensionally visualised and described in unprecedented detail. The three-dimensional model is further presented as an interactive PDF to facilitate the dissemination and accessibility of anatomical data. Due to the digital nature of the data derived from the computed tomography scanning and segmentation processes, these methods hold the potential for further computational analyses beyond descriptive and illustrative purposes. © 2013 The Authors. Journal of Anatomy published by John Wiley & Sons Ltd on behalf of Anatomical Society.

  17. Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling

    Science.gov (United States)

    Ormsbee, L.; Tufail, M.

    2005-12-01

    The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems. These range from the development of data-driven models to optimal control strategies that assist in a more informed and intelligent decision-making process. Availability of data is critical to such applications, and scarce data may lead to models that do not represent the response function over the entire domain. At the same time, too much data has a tendency to over-constrain the problem. This paper describes the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict the water quality of key constituents (such as dissolved oxygen) and their use in an optimization framework for optimal load reductions. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through the use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.

  18. Antenna arrays a computational approach

    CERN Document Server

    Haupt, Randy L

    2010-01-01

    This book covers a wide range of antenna array topics that are becoming increasingly important in wireless applications, particularly in design and computer modeling. Signal processing and numerical modeling algorithms are explored, and MATLAB computer codes are provided for many of the design examples. Pictures of antenna arrays and components provided by industry and government sources are presented with explanations of how they work. Antenna Arrays is a valuable reference for practicing engineers and scientists in wireless communications, radar, and remote sensing, and an excellent textbook for advanced antenna courses.

  19. Computation of stress on the surface of a soft homogeneous arbitrarily shaped particle

    Science.gov (United States)

    Yang, Minglin; Ren, Kuan Fang; Wu, Yueqian; Sheng, Xinqing

    2014-04-01

    Prediction of the stress on the surface of an arbitrarily shaped particle of soft material is essential in the study of the elastic properties of particles under optical force. It is also necessary in the manipulation and sorting of small particles with optical tweezers, since a regular-shaped particle, such as a sphere, may be deformed under the nonuniform optical stress on its surface. The stress profile on a spherical or small spheroidal soft particle trapped by shaped beams has been studied; however, little work on computing the surface stress of an irregularly shaped particle has been reported. In this paper we apply the surface integral equation with a multilevel fast multipole algorithm to compute the surface stress on soft, homogeneous, arbitrarily shaped particles. The comparison of the computed stress profile with that predicted by the generalized Lorenz-Mie theory for a water droplet of diameter equal to 51 wavelengths in a focused Gaussian beam shows that the precision of our method is very good. Stress profiles on spheroids with different aspect ratios are then computed. The particles are illuminated by a Gaussian beam of different waist radii at different incidences. A physical analysis of the mechanism of optical stress is given with the help of our recently developed vectorial complex ray model. It is found that the maximum of the stress profile on the surface of prolate spheroids is determined not only by the reflected and refracted rays (orders p = 0, 1) but also by rays undergoing one or two internal reflections where they focus. A computational study of the stress on the surface of a biconcave cell-like particle, a typical application in life science, is also undertaken.

  20. MRT letter: Contrast-enhanced computed tomographic imaging of soft callus formation in fracture healing.

    Science.gov (United States)

    Hayward, Lauren Nicole Miller; de Bakker, Chantal Marie-Jeanne; Lusic, Hrvoje; Gerstenfeld, Louis Charles; Grinstaff, Mark W; Morgan, Elise Feng-I

    2012-01-01

    Formation of a cartilaginous soft callus at the site of a bone fracture is a pivotal stage in the healing process. Noninvasive, or even nondestructive, imaging of soft callus formation can be an important tool in experimental and pre-clinical studies of fracture repair. However, the low X-ray attenuation of cartilage renders the soft callus nearly invisible in radiographs. This study utilized a recently developed, cationic, iodinated contrast agent in conjunction with micro-computed tomography to identify cartilage in fracture calluses in the femora of C57BL/6J and C3H/HeJ mice. Fracture calluses were scanned before and after incubation in the contrast agent. The set of pre-incubation images was registered against and then subtracted from the set of post-incubation images, resulting in a three-dimensional map of the locations of cartilage in the callus, as labeled by the contrast agent. This map was then compared to histology from a previous study. The results showed that the locations where the contrast agent collected in relatively high concentrations were similar to those of the cartilage. The contrast agent also identified a significant difference between the two strains of mice in the percentage of the callus occupied by cartilage, indicating that this method of contrast-enhanced computed tomography may be an effective technique for nondestructive, early evaluation of fracture healing. Copyright © 2011 Wiley Periodicals, Inc.

  1. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training.

    Science.gov (United States)

    Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

    2017-04-21

    We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant ("skin-like") electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification captures the ultra-elastic mechanical characteristics of an open mesh microstructured sensor, conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

  2. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training

    Science.gov (United States)

    Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

    2017-04-01

    We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant (“skin-like”) electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification captures the ultra-elastic mechanical characteristics of an open mesh microstructured sensor, conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

  3. Soft computing approaches to uncertainty propagation in environmental risk mangement

    OpenAIRE

    Kumar, Vikas

    2008-01-01

    Real-world problems, especially those involving natural systems, are complex and composed of many indeterminate components, which in many cases exhibit nonlinear relationships. Conventional models based on analytical techniques, currently used to understand and predict the behaviour of such systems, can be very complicated and inflexible when dealing with the imprecision and complexity of real-world systems. The treatment...

  4. A Soft Systems Approach Case Study

    African Journals Online (AJOL)

    Kamuzora

    information and communications technologies (ICTs) to enable community processes such as local economic development.

  5. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2007-01-01

    The era of seemingly unlimited growth in processor performance is over: single chip architectures can no longer overcome the performance limitations imposed by the power they consume and the heat they generate. Today, Intel and other semiconductor firms are abandoning the single fast processor model in favor of multi-core microprocessors--chips that combine two or more processors in a single package. In the fourth edition of Computer Architecture, the authors focus on this historic shift, increasing their coverage of multiprocessors and exploring the most effective ways of achieving parallelis

  6. Learning and geometry computational approaches

    CERN Document Server

    Smith, Carl

    1996-01-01

    The field of computational learning theory arose out of the desire to formally understand the process of learning. As potential applications to artificial intelligence became apparent, the new field grew rapidly. The learning of geometric objects became a natural area of study. The possibility of using learning techniques to compensate for unsolvability provided an attraction for individuals with an immediate need to solve such difficult problems. Researchers at the Center for Night Vision were interested in solving the problem of interpreting data produced by a variety of sensors. Current vision techniques, which have a strong geometric component, can be used to extract features. However, these techniques fall short of useful recognition of the sensed objects. One potential solution is to incorporate learning techniques into the geometric manipulation of sensor data. As a first step toward realizing such a solution, the Systems Research Center at the University of Maryland, in conjunction with the C...

  7. A program to compute the soft Robinson-Foulds distance between phylogenetic networks.

    Science.gov (United States)

    Lu, Bingxin; Zhang, Louxin; Leong, Hon Wai

    2017-03-14

    Over the past two decades, phylogenetic networks have been studied to model reticulate evolutionary events. The relationships among phylogenetic networks, phylogenetic trees and clusters serve as the basis for the reconstruction and comparison of phylogenetic networks. To understand these relationships, two problems are raised: the tree containment problem, which asks whether a phylogenetic tree is displayed in a phylogenetic network, and the cluster containment problem, which asks whether a cluster is represented at a node in a phylogenetic network. Both problems are NP-complete. A fast exponential-time algorithm for the cluster containment problem on arbitrary networks is developed and implemented in C. The resulting program is further extended into a computer program for fast computation of the Soft Robinson-Foulds distance between phylogenetic networks. Two computer programs are developed to facilitate the reconstruction and validation of phylogenetic network models in evolutionary and comparative genomics. Our simulation tests indicated that they are fast enough for use in practice. Additionally, our simulation data show that the distribution of the Soft Robinson-Foulds distance between phylogenetic networks is unlikely to be normal.
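
    As a baseline for the distance being generalised, the sketch below computes the classic (hard) Robinson-Foulds distance on rooted trees as the symmetric difference of their cluster sets; the paper's program extends this idea to the Soft RF distance on networks, where cluster containment is NP-complete. The nested-tuple tree encoding is an assumption for illustration.

      # Hard Robinson-Foulds distance on rooted trees via cluster sets.
      def clusters(tree):
          """Collect the leaf set of every internal node of a nested-tuple tree."""
          out = set()
          def leaves(node):
              if isinstance(node, tuple):
                  s = frozenset().union(*(leaves(c) for c in node))
                  out.add(s)
                  return s
              return frozenset([node])
          leaves(tree)
          return out

      def rf_distance(t1, t2):
          c1, c2 = clusters(t1), clusters(t2)
          return len(c1 ^ c2)  # symmetric difference of the cluster sets

      t1 = ((("a", "b"), "c"), ("d", "e"))
      t2 = ((("a", "c"), "b"), ("d", "e"))
      print("RF distance:", rf_distance(t1, t2))  # -> 2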

  8. Image Analysis via Soft Computing: Prototype Applications at NASA KSC and Product Commercialization

    Science.gov (United States)

    Dominguez, Jesus A.; Klinko, Steve

    2011-01-01

    This slide presentation reviews the use of "soft computing", which differs from "hard computing" in that it is more tolerant of imprecision, partial truth, uncertainty, and approximation, and its use in image analysis. Soft computing provides flexible information processing to handle real-life ambiguous situations and achieve tractability, robustness, low solution cost, and a closer resemblance to human decision making. Several systems are or have been developed: Fuzzy Reasoning Edge Detection (FRED), Fuzzy Reasoning Adaptive Thresholding (FRAT), image enhancement techniques, and visual/pattern recognition. These systems are compared with examples that show the effectiveness of each. The NASA applications reviewed are: Real-Time (RT) Anomaly Detection, Real-Time (RT) Moving Debris Detection, and the Columbia investigation. The RT anomaly detection covered the case of a damaged cable for the emergency egress system. The use of these techniques is further illustrated in the Columbia investigation with the location and detection of foam debris. There are several applications in commercial use: image enhancement, human screening and privacy protection, visual inspection, 3D heart visualization, tumor detection and X-ray image enhancement.

  9. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an...

  10. Appraisal of soft computing techniques in prediction of total bed material load in tropical rivers

    Science.gov (United States)

    Chang, C. K.; Azamathulla, H. Md; Zakaria, N. A.; Ghani, A. Ab

    2012-02-01

    This paper evaluates the performance of three soft computing techniques, namely Gene-Expression Programming (GEP) (Zakaria et al 2010), Feed Forward Neural Networks (FFNN) (Ab Ghani et al 2011), and the Adaptive Neuro-Fuzzy Inference System (ANFIS), in the prediction of total bed material load for three Malaysian rivers, namely the Kurau, Langat and Muda. The results of the present study are very promising: FFNN (R2 = 0.958, RMSE = 0.0698), ANFIS (R2 = 0.648, RMSE = 6.654), and GEP (R2 = 0.97, RMSE = 0.057), which support the use of these intelligent techniques in the prediction of sediment loads in tropical rivers.
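
    For readers unfamiliar with the two reported criteria, the following sketch shows how R2 and RMSE are conventionally computed for such model comparisons; the sediment-load numbers are made up for illustration and are not the study's data.

```python
import numpy as np

def rmse(y_obs, y_pred):
    """Root mean square error between observed and predicted loads."""
    y_obs, y_pred = np.asarray(y_obs), np.asarray(y_pred)
    return np.sqrt(np.mean((y_obs - y_pred) ** 2))

def r_squared(y_obs, y_pred):
    """Coefficient of determination, 1 - SS_res / SS_tot."""
    y_obs, y_pred = np.asarray(y_obs), np.asarray(y_pred)
    ss_res = np.sum((y_obs - y_pred) ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# toy sediment-load values (made up for illustration)
obs  = [0.12, 0.34, 0.50, 0.27]
pred = [0.10, 0.36, 0.47, 0.30]
print(rmse(obs, pred), r_squared(obs, pred))
```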

  11. Live theater on a virtual stage: incorporating soft skills and teamwork in computer graphics education.

    Science.gov (United States)

    Schweppe, M; Geigel, J

    2011-01-01

    Industry has increasingly emphasized the need for "soft" or interpersonal skills development and team-building experience in the college curriculum. Here, we discuss our experiences with providing such opportunities via a collaborative project called the Virtual Theater. In this joint project between the Rochester Institute of Technology's School of Design and Department of Computer Science, the goal is to enable live performance in a virtual space with participants in different physical locales. Students work in teams, collaborating with other students in and out of their disciplines.

  12. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    Science.gov (United States)

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  13. Cognitive Approaches for Medicine in Cloud Computing.

    Science.gov (United States)

    Ogiela, Urszula; Takizawa, Makoto; Ogiela, Lidia

    2018-03-03

    This paper will present the application potential of the cognitive approach to data interpretation, with special reference to medical areas. The possibilities of using the meaning approach to data description and analysis will be proposed for data analysis tasks in Cloud Computing. The methods of cognitive data management in Cloud Computing are aimed to support the processes of protecting data against unauthorised takeover and they serve to enhance the data management processes. The accomplishment of the proposed tasks will be the definition of algorithms for the execution of meaning data interpretation processes in safe Cloud Computing. • We proposed cognitive methods for data description. • We proposed techniques for securing data in Cloud Computing. • The application of cognitive approaches to medicine was described.

  14. Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact

    Science.gov (United States)

    Abadjiev, Valentin; Kawasaki, Haruhisa

    2014-09-01

    Computer-aided design has fostered different types of software for scientific research in the field of gearing theory, as well as providing adequate scientific support for gear drive manufacture. The computer programs described here are based on mathematical models resulting from that research. Modern gear transmissions require the construction of new mathematical approaches to their geometric, technological and strength analysis. The process of optimization, synthesis and design is based on adequate iteration procedures to find an optimal solution by varying definite parameters. The study is dedicated to the accepted methodology in the creation of software for the synthesis of a class of high-reduction hyperboloid gears - Spiroid and Helicon ones (Spiroid and Helicon are trademarks registered by the Illinois Tool Works, Chicago, Ill). The developed basic computer products are based on two original mathematical models for the synthesis: "upon a pitch contact point" and "upon a mesh region". Computer programs are worked out on the basis of the described mathematical models, and the relations between them are shown. The application of the presented approaches to the synthesis of the discussed gear drives is illustrated.

  15. Toward exascale computing through neuromorphic approaches.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.

    2010-09-01

    While individual neurons function at relatively low firing rates, naturally-occurring nervous systems not only surpass manmade systems in computing power, but accomplish this feat using relatively little energy. It is asserted that the next major breakthrough in computing power will be achieved through application of neuromorphic approaches that mimic the mechanisms by which neural systems integrate and store massive quantities of data for real-time decision making. The proposed LDRD provides a conceptual foundation for SNL to make unique advances toward exascale computing. First, a team consisting of experts from the HPC, MESA, cognitive and biological sciences and nanotechnology domains will be coordinated to conduct an exercise with the outcome being a concept for applying neuromorphic computing to achieve exascale computing. It is anticipated that this concept will involve innovative extension and integration of SNL capabilities in MicroFab, material sciences, high-performance computing, and modeling and simulation of neural processes/systems.

  16. In vivo X-Ray Computed Tomographic Imaging of Soft Tissue with Native, Intravenous, or Oral Contrast

    Directory of Open Access Journals (Sweden)

    W. Matthew Leevy

    2013-05-01

    Full Text Available X-ray Computed Tomography (CT) is one of the most commonly utilized anatomical imaging modalities for both research and clinical purposes. CT combines high-resolution, three-dimensional data with relatively fast acquisition to provide a solid platform for non-invasive human or specimen imaging. The primary limitation of CT is its inability to distinguish many soft tissues based on native contrast. While bone has high contrast within a CT image due to the material density of calcium phosphate, soft tissue is less dense, and many soft tissues are homogeneous in density. This presents a challenge in distinguishing one type of soft tissue from another. A couple of exceptions include the lungs as well as fat, both of which have unique densities owing to the presence of air or bulk hydrocarbons, respectively. In order to facilitate X-ray CT imaging of other structures, a range of contrast agents have been developed to selectively identify and visualize the anatomical properties of individual tissues. Most agents incorporate atoms like iodine, gold, or barium because of their ability to absorb X-rays, and thus impart contrast to a given organ system. Here we review the strategies available to visualize lung, fat, brain, kidney, liver, spleen, vasculature, and gastrointestinal tract tissues of living mice using either innate contrast, or commercial injectable or ingestible agents with selective perfusion. Further, we demonstrate how each of these approaches will facilitate the non-invasive, longitudinal, in vivo imaging of pre-clinical disease models at each anatomical site.

  17. Multi-GPU Jacobian accelerated computing for soft-field tomography.

    Science.gov (United States)

    Borsic, A; Attardo, E A; Halter, R J

    2012-10-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15-20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical application of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20
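
    The cost structure described above, with Jacobian assembly dominating, is visible even in a naive implementation: each Jacobian column requires touching the full forward model. A minimal finite-difference sketch with a stand-in forward model follows; production EIT codes build the Jacobian from adjoint fields on FEM matrices rather than this way.

```python
import numpy as np

def forward(sigma):
    """Stand-in forward model: maps conductivity parameters to
    simulated boundary measurements (a real EIT solver would
    assemble and solve a FEM system here)."""
    A = np.outer(np.arange(1, 4), np.ones_like(sigma))
    return np.tanh(A @ sigma)

def jacobian_fd(f, x, eps=1e-6):
    """Dense Jacobian by forward differences: one extra forward
    solve per parameter, which is why assembly dominates runtime."""
    y0 = f(x)
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):            # one column per parameter
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (f(xp) - y0) / eps
    return J

sigma = np.ones(5)
print(jacobian_fd(forward, sigma).shape)   # (3, 5)
```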

  18. Multi-GPU Jacobian accelerated computing for soft-field tomography

    International Nuclear Information System (INIS)

    Borsic, A; Attardo, E A; Halter, R J

    2012-01-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15–20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical application of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20 times

  19. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    ...able to recognize the strong correlation between the displacement mechanism and the reservoir characteristics, as they effectively forecast hydrocarbon production for different types of reservoirs undergoing diverse recovery processes. The artificial neural networks are able to capture the similarities between different displacement mechanisms, as the same network architecture is successfully applied in both CO2 and N2 injection. The neuro-simulation application tool is built within a graphical user interface to facilitate the display of the results. The developed soft-computing tool offers an innovative approach to designing a variety of efficient and feasible IOR processes by using artificial intelligence. The tool provides appropriate guidelines to the reservoir engineer, facilitates the appraisal of diverse field development strategies for oil reservoirs, and helps to reduce the number of scenarios evaluated with conventional reservoir simulation.

  20. Finding-specific display presets for computed radiography soft-copy reading.

    Science.gov (United States)

    Andriole, K P; Gould, R G; Webb, W R

    1999-05-01

    Much work has been done to optimize the display of cross-sectional modality imaging examinations for soft-copy reading (i.e., window/level tissue presets, and format presentations such as tile and stack modes, four-on-one, nine-on-one, etc). Less attention has been paid to the display of digital forms of the conventional projection x-ray. The purpose of this study is to assess the utility of providing presets for computed radiography (CR) soft-copy display, based not on the window/level settings, but on processing applied to the image optimized for visualization of specific findings, pathologies, etc (i.e., pneumothorax, tumor, tube location). It is felt that digital display of CR images based on finding-specific processing presets has the potential to: speed reading of digital projection x-ray examinations on soft copy; improve diagnostic efficacy; standardize display across examination type, clinical scenario, important key findings, and significant negatives; facilitate image comparison; and improve confidence in and acceptance of soft-copy reading. Clinical chest images are acquired using an Agfa-Gevaert (Mortsel, Belgium) ADC 70 CR scanner and Fuji (Stamford, CT) 9000 and AC2 CR scanners. Those demonstrating pertinent findings are transferred over the clinical picture archiving and communications system (PACS) network to a research image processing station (Agfa PS5000), where the optimal image-processing settings per finding, pathologic category, etc, are developed in conjunction with a thoracic radiologist, by manipulating the multiscale image contrast amplification (Agfa MUSICA) algorithm parameters. Soft-copy display of images processed with finding-specific settings are compared with the standard default image presentation for 50 cases of each category. Comparison is scored using a 5-point scale with the positive scale denoting the standard presentation is preferred over the finding-specific processing, the negative scale denoting the finding

  1. Russian Approach to Soft Power Promotion: Conceptual Approaches in Foreign Policy

    Directory of Open Access Journals (Sweden)

    Yulia Nikitina

    2014-01-01

    Full Text Available Foreign policy is one of the instruments of promoting the soft power of a state. According to Joseph Nye, civil society is the main source of a state's international attractiveness. The article analyses how Russian official foreign policy documents present interaction between the state and civil society in order to promote Russian soft power. At the present stage, Russian civil society is perceived by state structures as an instrument and not a source of soft power. The article also analyses political values and models of development as elements of soft power as they are presented in official documents. Russia has a coherent normative model of regional development for the post-Soviet space. For the global level, Russia formulates rules of behavior that it would like to see in the international arena, but it does not formulate how Russian or regional post-Soviet models of development can contribute to world development.

  2. Computational fluid dynamics a practical approach

    CERN Document Server

    Tu, Jiyuan; Liu, Chaoqun

    2018-01-01

    Computational Fluid Dynamics: A Practical Approach, Third Edition, is an introduction to CFD fundamentals and commercial CFD software to solve engineering problems. The book is designed for a wide variety of engineering students new to CFD, and for practicing engineers learning CFD for the first time. Combining an appropriate level of mathematical background, worked examples, computer screen shots, and step-by-step processes, this book walks the reader through modeling and computing, as well as interpreting CFD results. This new edition has been updated throughout, with new content and improved figures, examples and problems.

  3. Comprehensive and Global Approach of Soft-Tissue Deformities in Craniofacial Neurofibromatosis Type 1.

    Science.gov (United States)

    Denadai, Rafael; Buzzo, Celso Luiz; Takata, Joao Paulo Issamu; Raposo-Amaral, Cesar Augusto; Raposo-Amaral, Cassio Eduardo

    2016-08-01

    To present a single-institution experience in the comprehensive and global soft-tissue surgical approach of patients with craniofacial neurofibromatosis type 1 (NF-1). A retrospective analysis of patients with craniofacial NF-1 (n = 20) who underwent craniofacial soft-tissue reconstruction between 1993 and 2014 was conducted. Surgical treatment was individualized according to age, functional and/or aesthetic impairment, neurofibroma types, anatomical location, size, and patient/family and surgical team preferences, regardless of previously published compartmental grading systems. The surgical results were classified based on 2 previously published outcome rating scales (craniofacial symmetry improvement and need for additional surgery). All patients underwent en bloc translesional surgical excisions, 12 facial suspension, 3 eyebrow suspension, 2 ear suspension, 9 lateral canthopexy, 5 horizontal shortening of the tarsus of lower eyelid, and 1 horizontal shortening of the tarsus of upper eyelid. The degree of craniofacial symmetry improvement was considered "mostly satisfactory" (75%), and the overall rate of surgical results ranked according to the need for additional surgery was 2.4 ± 0.50, with variations according to the spectrum of soft-tissue involvement. According to the experience and surgical outcomes presented in this study, the soft-tissue surgical approach of the craniofacial NF-1 should be global, comprehensive, and individualized.

  4. MR imaging of soft tissue alterations after total hip arthroplasty: comparison of classic surgical approaches

    Energy Technology Data Exchange (ETDEWEB)

    Agten, Christoph A.; Sutter, Reto; Pfirrmann, Christian W.A. [Balgrist University Hospital, Radiology, Zurich (Switzerland); University of Zurich, Faculty of Medicine, Zurich (Switzerland); Dora, Claudio [Balgrist University Hospital, Orthopedic Surgery, Zurich (Switzerland); University of Zurich, Faculty of Medicine, Zurich (Switzerland)

    2017-03-15

    To compare soft-tissue changes after total hip arthroplasty with posterior, direct-lateral, anterolateral, or anterior surgical approaches. MRI scans of 120 patients after primary total hip arthroplasty (30 per approach) were included. Each MRI was assessed by two readers regarding identification of surgical access, fatty muscle atrophy (Goutallier classification), tendon quality (0 = normal, 1 = tendinopathy, 2 = partial tear, 3 = avulsion), and fluid collections. Readers were blinded to the surgical approach. Surgical access was correctly identified in all cases. The direct-lateral approach showed the highest Goutallier grades and tendon damage for the gluteus minimus muscle (2.07-2.67 and 2.00-2.77; p = 0.017 and p = 0.001 for readers 1 and 2, respectively) and tendon (2.30/1.67; p < 0.0005 for readers 1/2), and the lateral portion of the gluteus medius tendon (2.77/2.20; p < 0.0005 for readers 1/2). The posterior approach showed the highest Goutallier grades and tendon damage for the external rotator muscles (1.97-2.67 and 1.57-2.40; p < 0.0005-0.006 for readers 1/2) and tendons (1.41-2.45 and 1.93-2.76; p < 0.0005 for readers 1/2). The anterolateral and anterior approaches showed less soft-tissue damage. Fluid collections showed no differences between the approaches. MRI is well suited to identify surgical approaches after THA. The anterior and anterolateral approaches showed less soft-tissue damage compared to the posterior and direct-lateral approaches. (orig.)

  5. Role of Soft-Tissue Heterogeneity in Computational Models of Deep Brain Stimulation.

    Science.gov (United States)

    Howell, Bryan; McIntyre, Cameron C

    Bioelectric field models of deep brain stimulation (DBS) are commonly utilized in research and industrial applications. However, the wide range of different representations used for the human head in these models may be responsible for substantial variance in the stimulation predictions. Determine the relative error of ignoring cerebral vasculature and soft-tissue heterogeneity outside of the brain in computational models of DBS. We used a detailed atlas of the human head, coupled to magnetic resonance imaging data, to construct a range of subthalamic DBS volume conductor models. We incrementally simplified the most detailed base model and quantified changes in the stimulation thresholds for direct activation of corticofugal axons. Ignoring cerebral vasculature altered predictions of stimulation thresholds, and ignoring soft-tissue heterogeneity outside of the brain altered predictions between -44% and 174%. Heterogeneity in the soft tissues of the head, if unaccounted for, introduces a degree of uncertainty in predicting electrical stimulation of neural elements that is not negligible and thereby warrants consideration in future modeling studies. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Computational approaches to homogeneous gold catalysis.

    Science.gov (United States)

    Faza, Olalla Nieto; López, Carlos Silva

    2015-01-01

    Homogeneous gold catalysis has been expanding at an outstanding pace over the last decade. The best-described reactivity of Au(I) and Au(III) species is based on gold's properties as a soft Lewis acid, but new reactivity patterns have recently emerged which further expand the range of transformations achievable using gold catalysis, with examples of dual gold activation, hydrogenation reactions, or Au(I)/Au(III) catalytic cycles. In this scenario, to fully develop all these new possibilities, the use of computational tools to understand, at an atomistic level of detail, the complete role of gold as a catalyst is unavoidable. In this work we aim to provide a comprehensive review of the available benchmark works on methodological options for studying homogeneous gold catalysis, in the hope that this effort can help guide the choice of method in future mechanistic studies involving gold complexes. This is relevant because a representative number of current mechanistic studies still use methods which have been reported as inappropriate and dangerously inaccurate for this chemistry. Together with this, we describe a number of recent mechanistic studies where computational chemistry has provided relevant insights into non-conventional reaction paths, unexpected selectivities or novel reactivity, which illustrate the complexity behind gold-mediated organic chemistry.

  7. Granular, soft and fuzzy approaches for intelligent systems dedicated to professor Ronald R. Yager

    CERN Document Server

    Filev, Dimitar; Beliakov, Gleb

    2017-01-01

    This book offers a comprehensive report on the state of the art in the broadly intended field of "intelligent systems". After introducing key theoretical issues, it describes a number of promising models for data and system analysis, decision making, and control. It discusses important theories, including possibility theory, the Dempster-Shafer theory, the theory of approximate reasoning, as well as computing with words, together with novel applications in various areas, such as information aggregation and fusion, linguistic data summarization, participatory learning, systems modeling, and many others. By presenting the methods in their application contexts, the book shows how granular computing, soft computing and fuzzy logic techniques can provide novel, efficient solutions to real-world problems. It is dedicated to Professor Ronald R. Yager for his great scientific and scholarly achievements, and for his long-lasting service to the fuzzy logic and the artificial and computational intelligence communities.

  8. Design approach of soft control system for implementation of advanced MMI in KNGR

    International Nuclear Information System (INIS)

    Kim, J. K.; Choi, M. J.; Choe, I. N.

    1999-01-01

    To overcome the inherent inflexibility of spatially dedicated man-machine interfaces (MMI) in conventional control rooms, computer-based MMI technologies, along with the compact workstation concept, are adopted in the KNGR control room target design. In order to achieve the compact workstation design, a large number of spatially dedicated control switches and manual/auto stations in a traditional control room have to be replaced by a few common multi-function devices. These control devices, the so-called Soft Control System, consist of a personal-computer-based Flat Panel Display (FPD) device with a touch-sensitive screen which provides the control MMI for the component selected among a number of plant components. The Soft Control System is an MMI device that allows control of continuous and discrete control devices from a single panel device. It allows a standard interface device to assume the role of numerous control switches and analog control devices via software configuration. This has the advantage of allowing operator access to all plant controls from a single compact workstation. (author)

  9. Soft Computing Methods for Microwave and Millimeter-Wave Design Problems

    CERN Document Server

    Chauhan, Narendra; Mittal, Ankush

    2012-01-01

    The growing commercial market of the microwave/millimeter-wave industry over the past decade has led to an explosion of interest and opportunities for the design and development of microwave components. The design of most microwave components requires the use of commercially available electromagnetic (EM) simulation tools for their analysis. In the design process, the simulations are carried out by varying the design parameters until the desired response is obtained. The optimization of design parameters by manual searching is a cumbersome and time-consuming process. Soft computing methods such as the Genetic Algorithm (GA), Artificial Neural Network (ANN) and Fuzzy Logic (FL) have been widely used by EM researchers for microwave design over the last decade. The aim of these methods is to tolerate imprecision, uncertainty, and approximation to achieve a robust and low-cost solution in a small time frame. Modeling and optimization are essential parts of and powerful tools for microwave/millimeter-wave design. This book...

  10. Soft computing and its applications v.1 a unified engineering concept

    CERN Document Server

    Ray, Kumar S

    2014-01-01

    Notion of Soft Computing: 1.1. Introduction; 1.2. Scope for future work. Fuzzy Sets, Fuzzy Operators and Fuzzy Relations: 2.1. Introduction; 2.2. Fuzzy set; 2.3. Metrics for fuzzy numbers; 2.4. Difference in fuzzy set; 2.5. Distance in fuzzy set; 2.6. Cartesian product of fuzzy set; 2.7. Operators on fuzzy set; 2.8. Other operations in fuzzy set; 2.9. Geometric interpretation of fuzzy sets; 2.10. T-operators; 2.11. Aggregation operators; 2.12. Probability versus Possibility; 2.13. Fuzzy event; 2.14. Uncertainty; 2.15. Measure of fuzziness; 2.16. Type-2 fuzzy sets; 2.17. Relation. Fuzzy Logic: 3.1. Introduction; 3.2. Preliminaries of log...

  11. Computer networking a top-down approach

    CERN Document Server

    Kurose, James

    2017-01-01

    Unique among computer networking texts, the Seventh Edition of the popular Computer Networking: A Top Down Approach builds on the author’s long tradition of teaching this complex subject through a layered approach in a “top-down manner.” The text works its way from the application layer down toward the physical layer, motivating readers by exposing them to important concepts early in their study of networking. Focusing on the Internet and the fundamentally important issues of networking, this text provides an excellent foundation for readers interested in computer science and electrical engineering, without requiring extensive knowledge of programming or mathematics. The Seventh Edition has been updated to reflect the most important and exciting recent advances in networking.

  12. Autologous Fat Grafting as a Novel Approach to Parastomal Soft-tissue Volume Deficiencies

    Directory of Open Access Journals (Sweden)

    Robert C. Wu, MD

    2014-03-01

    Full Text Available Summary: The aim of this study is to describe a novel approach to revise maladaptive soft-tissue contour around an ileostomy. A patient with permanent ileostomy suffered from significant defects in soft-tissue contour due to scarring and wound contraction. He underwent autologous fat grafting to achieve sealing of his stoma appliance and improve cosmesis. Due to numerous surgeries, the stoma appliance would not seal and required daily appliance changes. The patient received autologous fat grafting to augment the contour around stoma. A complete fitting of stoma was achieved. The patient is satisfied with stoma sealing and is changing his stoma appliance every 5–7 days without skin excoriation. Autologous fat transfer is an effective approach to treat a subset of stoma patients with complex subcutaneous defects.

  13. A novel approach for assessing macromolecular complexes combining soft-docking calculations with NMR data

    Science.gov (United States)

    Morelli, Xavier J.; Palma, P. Nuno; Guerlesquin, Françoise; Rigby, Alan C.

    2001-01-01

    We present a novel and efficient approach for assessing protein–protein complex formation, which combines ab initio docking calculations performed with the protein docking algorithm BiGGER and chemical shift perturbation data collected with heteronuclear single quantum coherence (HSQC) or TROSY nuclear magnetic resonance (NMR) spectroscopy. This method, termed "restrained soft-docking," is validated for several known protein complexes. These data demonstrate that restrained soft-docking extends the size limitations of NMR spectroscopy and provides an alternative method for investigating macromolecular protein complexes that requires less experimental time, effort, and resources. The potential utility of this novel NMR and simulated docking approach in current structural genomic initiatives is discussed. PMID:11567104

  14. CT evaluation of soft tissue and muscle infection and inflammation: A systematic compartmental approach

    International Nuclear Information System (INIS)

    Beauchamp, N.J. Jr.; Scott, W.W. Jr.; Gottlieb, L.M.; Fishman, E.K.

    1995-01-01

    This essay presents a systematic approach to the evaluation of soft tissue and muscle infection by defining the various pathologic processes and then illustrating them through a series of CT studies with corresponding schematic diagrams. The specific processes discussed are cellulitis, lymphangitis/lymphedema, necrotizing fasciitis, myositis/myonecrosis, and abscess. Key points in the differential diagnosis of these entities are discussed and illustrated. The clinical management of the specific pathologic processes is also discussed. (orig./MG)

  15. Soft computing and metaheuristics: using knowledge and reasoning to control search and vice-versa

    Science.gov (United States)

    Bonissone, Piero P.

    2004-01-01

    Meta-heuristics are heuristic procedures used to tune, control, guide, allocate computational resources or reason about object-level problem solvers in order to improve their quality, performance, or efficiency. Offline meta-heuristics define the best structural and/or parametric configurations for the object-level model, while on-line heuristics generate run-time corrections for the behavior of the same object-level solvers. Soft Computing is a framework in which we encode domain knowledge to develop such meta-heuristics. We explore the use of meta-heuristics in three application areas: a) control; b) optimization; and c) classification. In the context of control problems, we describe the use of evolutionary algorithms to perform offline parametric tuning of fuzzy controllers, and the use of fuzzy supervisory controllers to perform on-line mode-selection and output interpolation. In the area of optimization, we illustrate the application of fuzzy controllers to manage the transition from exploration to exploitation of evolutionary algorithms that solve the optimization problem. In the context of discrete classification problems, we have leveraged evolutionary algorithms to tune knowledge-based classifiers and maximize their coverage and accuracy.
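
    As a toy illustration of the optimization use case, where a fuzzy-style supervisor steers an evolutionary algorithm from exploration toward exploitation, the sketch below schedules a mutation rate from two linguistic memberships ("early" and "late" in the run); it is our simplification, not the controller described in the record.

```python
import random

def mutation_rate(progress):
    """Fuzzy-style schedule: 'early' favors exploration (high mutation),
    'late' favors exploitation (low mutation); linear memberships."""
    early, late = 1.0 - progress, progress          # memberships in [0, 1]
    return (early * 0.30 + late * 0.02) / (early + late)

def evolve(fitness, dim=8, pop_size=20, gens=100):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for g in range(gens):
        rate = mutation_rate(g / (gens - 1))
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]               # truncation selection
        children = []
        for p in parents:
            child = [x + random.gauss(0, 1) if random.random() < rate else x
                     for x in p]
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve(lambda x: sum(v * v for v in x))      # minimize a sphere function
print(sum(v * v for v in best))
```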

  16. Soft sensors with white- and black-box approaches for a wastewater treatment process

    Directory of Open Access Journals (Sweden)

    D. Zyngier

    2000-12-01

    Full Text Available The increasing degradation of water resources makes it necessary to monitor and control process variables that may disturb the environment, but which may be very difficult to measure directly, either because there are no physical sensors available, or because these are too expensive. In this work, two soft sensors are proposed for monitoring concentrations of nitrate (NO3-) and ammonium (NH4+) ions, and of carbonaceous matter (CM) during nitrification of wastewater. One of them is based on reintegration of a process model to estimate NO3- and NH4+ and on a feedforward neural network to estimate CM. The other estimator is based on Stacked Neural Networks (SNN), an approach that provides the predictor with robustness. After simulation, both soft sensors were implemented in an experimental unit using FIX MMI (Intellution, Inc.) automation software as an interface between the process and MATLAB 5.1 (The MathWorks, Inc.) software.
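
    The stacking idea behind the second estimator can be sketched generically: member networks are combined with weights chosen from their validation errors. The inverse-MSE weighting below is a common simple choice and is our illustration, not the authors' SNN code.

```python
import numpy as np

def stack_predictions(preds, y_val):
    """Combine member predictions on a validation set using
    inverse-MSE weighting, a simple choice for stacked soft sensors."""
    preds = np.asarray(preds)                    # shape: (members, samples)
    mse = ((preds - y_val) ** 2).mean(axis=1)
    w = (1.0 / mse) / (1.0 / mse).sum()          # normalized weights
    return w

# toy example: three 'networks' predicting ammonium concentration
y_val = np.array([1.0, 2.0, 3.0])
preds = [[1.1, 2.1, 2.9], [0.8, 2.4, 3.3], [1.0, 1.9, 3.1]]
w = stack_predictions(preds, y_val)
y_hat = w @ np.asarray(preds)                    # stacked estimate
print(w, y_hat)
```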

  17. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  18. Soft systems methodology as a systemic approach to nuclear safety management

    International Nuclear Information System (INIS)

    Vieira Neto, Antonio S.; Guilhen, Sabine N.; Rubin, Gerson A.; Caldeira Filho, Jose S.; Camargo, Iara M.C.

    2017-01-01

    The safety approach currently adopted by nuclear installations is built almost exclusively upon analytical methodologies based, mainly, on the belief that the properties of a system, such as its safety, are given by its constituent parts. This approach, however, does not properly address the complex dynamic interactions between technical, human and organizational factors occurring within and outside the organization. After the accident at the Fukushima Daiichi nuclear power plant in March 2011, experts of the International Atomic Energy Agency (IAEA) recommended a systemic approach as a complementary perspective on nuclear safety. The aim of this paper is to present an overview of the systems thinking approach and its potential use for structuring socio-technical problems involved in the safety of nuclear installations, highlighting the methodologies related to soft systems thinking, in particular the Soft Systems Methodology (SSM). The implementation of a systemic approach may thus result in a more holistic picture of the system, by accounting for the complex dynamic interactions between technical, human and organizational factors. (author)

  19. Soft systems methodology as a systemic approach to nuclear safety management

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Neto, Antonio S.; Guilhen, Sabine N.; Rubin, Gerson A.; Caldeira Filho, Jose S.; Camargo, Iara M.C., E-mail: asvneto@ipen.br, E-mail: snguilhen@ipen.br, E-mail: garubin@ipen.br, E-mail: jscaldeira@ipen.br, E-mail: icamargo@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNE-SP), Sao Paulo, SP (Brazil)

    2017-07-01

    The safety approach currently adopted by nuclear installations is built almost exclusively upon analytical methodologies based, mainly, on the belief that the properties of a system, such as its safety, are given by its constituent parts. This approach, however, does not properly address the complex dynamic interactions between technical, human and organizational factors occurring within and outside the organization. After the accident at the Fukushima Daiichi nuclear power plant in March 2011, experts of the International Atomic Energy Agency (IAEA) recommended a systemic approach as a complementary perspective on nuclear safety. The aim of this paper is to present an overview of the systems thinking approach and its potential use for structuring socio-technical problems involved in the safety of nuclear installations, highlighting the methodologies related to soft systems thinking, in particular the Soft Systems Methodology (SSM). The implementation of a systemic approach may thus result in a more holistic picture of the system, by accounting for the complex dynamic interactions between technical, human and organizational factors. (author)

  20. A multidisciplinary approach to giant soft tissue sarcoma of the chest wall: A case report.

    Science.gov (United States)

    Davis, Catherine H; Yammine, Halim; Khaitan, Puja G; Chan, Edward Y; Kim, Min P

    2016-01-01

    Soft tissue sarcomas of the chest wall are exceptionally rare entities that present as painless, slow-growing masses. Resection is often precarious due to involvement of vital structures, and patients are left with large chest wall defects postoperatively requiring extensive reconstruction. We present a case report of a 29-year-old man who presented with a giant soft tissue sarcoma of the chest that had been growing slowly for one year prior to presentation. The patient had a biopsy that was positive for sarcoma, and PET-CT demonstrated a large lobulated mass in the left chest wall with an SUV of 6.7. He received 50 Gy of radiation therapy; however, the mass continued to grow in size. He subsequently underwent an en-bloc resection of the mass with latissimus and serratus muscle primary reconstruction. Final pathology showed a 27 cm high-grade fibrosarcoma with a prominent myxoid component. To our knowledge, this is the largest soft tissue sarcoma of the chest wall reported in the literature. Postoperatively, the patient received 6 cycles of adjuvant chemotherapy. Surgery is the mainstay of treatment, and chemotherapy and radiation are used in specific circumstances. Risk of recurrence depends on many factors, including histologic subtype, grade, and size of tumor. Long-term surveillance with physical exam and imaging is recommended. We feel that the multidisciplinary approach is crucial for optimal management of large soft tissue sarcomas. We recommend this approach to all patients with chest wall sarcomas. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Computational approach to compact Riemann surfaces

    Science.gov (United States)

    Frauendiener, Jörg; Klein, Christian

    2017-01-01

    A purely numerical approach to compact Riemann surfaces starting from plane algebraic curves is presented. The critical points of the algebraic curve are computed via a two-dimensional Newton iteration. The starting values for this iteration are obtained from the resultants with respect to both coordinates of the algebraic curve and a suitable pairing of their zeros. A set of generators of the fundamental group for the complement of these critical points in the complex plane is constructed from circles around these points and connecting lines obtained from a minimal spanning tree. The monodromies are computed by solving the defining equation of the algebraic curve on collocation points along these contours and by analytically continuing the roots. The collocation points are chosen to correspond to Chebyshev collocation points for an ensuing Clenshaw-Curtis integration of the holomorphic differentials which gives the periods of the Riemann surface with spectral accuracy. At the singularities of the algebraic curve, Puiseux expansions computed by contour integration on the circles around the singularities are used to identify the holomorphic differentials. The Abel map is also computed with the Clenshaw-Curtis algorithm and contour integrals. As an application of the code, solutions to the Kadomtsev-Petviashvili equation are computed on non-hyperelliptic Riemann surfaces.
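
    Since the record leans on Clenshaw-Curtis integration at Chebyshev points, the standard node-and-weight construction may be a useful reference. The following is a numpy port of the well-known "clencurt" recipe (after Trefethen), offered as a sketch rather than the authors' code:

```python
import numpy as np

def clencurt(N):
    """Clenshaw-Curtis nodes and weights on [-1, 1] for N+1 points
    (port of the classic 'clencurt' recipe)."""
    theta = np.pi * np.arange(N + 1) / N
    x = np.cos(theta)
    w = np.zeros(N + 1)
    v = np.ones(N - 1)
    t = theta[1:-1]                        # interior angles
    if N % 2 == 0:
        w[0] = w[N] = 1.0 / (N ** 2 - 1)
        for k in range(1, N // 2):
            v -= 2.0 * np.cos(2 * k * t) / (4 * k ** 2 - 1)
        v -= np.cos(N * t) / (N ** 2 - 1)
    else:
        w[0] = w[N] = 1.0 / N ** 2
        for k in range(1, (N - 1) // 2 + 1):
            v -= 2.0 * np.cos(2 * k * t) / (4 * k ** 2 - 1)
    w[1:-1] = 2.0 * v / N
    return x, w

x, w = clencurt(16)
print(w @ np.exp(x))            # ~ e - 1/e = 2.3504..., spectrally accurate
```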

  2. Computational Approaches to Nucleic Acid Origami.

    Science.gov (United States)

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. With all these advantages, the development of RNA origami lags behind DNA origami by a large gap. Furthermore, although computational methods have proven to be effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms.

  3. Using soft-X-ray energy spectrum to measure electronic temperature Te and primary research with computer data processing

    International Nuclear Information System (INIS)

    Wang Jingyao; Zhang Guangyang

    1993-01-01

    The authors report the application of the SCORPIO-2000 computer detection system on a nuclear fusion device to measure the soft X-ray energy spectrum, from which the plasma electron temperature was calculated. The data in the 1-4 keV soft X-ray energy range were processed systematically. The program was written mostly in FORTRAN, with a single subroutine (SUBSB) in assembly language. The program worked reliably, with straightforward operation and easy correction of the data. The results obtained from the calculation, and the corresponding diagrams, agree with what was expected.
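
    The physics behind such measurements is that a thermal bremsstrahlung continuum falls off roughly as exp(-E/Te), so Te can be read from the slope of log counts versus photon energy. A hedged numpy sketch with synthetic data (not the reported FORTRAN program):

```python
import numpy as np

# hypothetical pulse-height spectrum in the 1-4 keV window
E = np.linspace(1.0, 4.0, 13)                 # photon energy [keV]
Te_true = 0.8                                 # assumed electron temperature [keV]
counts = 5e4 * np.exp(-E / Te_true)

# continuum ~ exp(-E/Te): slope of ln(counts) vs E gives -1/Te
slope, intercept = np.polyfit(E, np.log(counts), 1)
print(-1.0 / slope)                           # recovers ~0.8 keV
```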

  4. An Evolutionary Approach to the Soft Error Mitigation Technique for Cell-Based Design

    Directory of Open Access Journals (Sweden)

    PARK, J. K.

    2015-02-01

    Full Text Available In this paper, we present a soft error mitigation algorithm that searches for the proper gate sizes within constrained gate-level designs. The individual gate sizing has an impact on the former optimization results and degrades the quality of the solution. In order to address this inefficiency, we utilize a modified topological sort that preserves the preceding local optima. Using a new local searcher, a hybrid genetic optimization technique for soft error mitigation is proposed. This evolutionary search algorithm has general genetic operators: the initialization of the population, crossover, mutation and selection operators. The local searcher consists of two subsequent heuristics. These search algorithms make the individual chromosome move to better search regions in a short time and then, the population acquires various candidates for the global optimum with the help of other genetic operators. The experiments show that the proposed genetic algorithm achieves an approximately 90% reduction in the number of soft errors when compared to the conventional greedy approach with at most 30% overhead for the area and critical path delay.
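
    The hybrid structure described, genetic operators wrapped around a local searcher, is often called a memetic algorithm. The skeleton below illustrates it on a toy bit-vector encoding with an invented cost; the paper's gate-level soft-error model and its two-stage heuristic are not reproduced.

```python
import random

def local_search(ind, cost):
    """Greedy one-bit improvement pass (stand-in for the paper's
    two subsequent heuristics)."""
    for i in range(len(ind)):
        flipped = ind[:i] + [1 - ind[i]] + ind[i + 1:]
        if cost(flipped) < cost(ind):
            ind = flipped
    return ind

def memetic_ga(cost, n_bits=16, pop=20, gens=40):
    P = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=cost)
        elite = P[:pop // 2]
        kids = []
        for _ in range(pop - len(elite)):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n_bits)          # one-point crossover
            kid = a[:cut] + b[cut:]
            if random.random() < 0.2:                  # mutation
                j = random.randrange(n_bits)
                kid[j] ^= 1
            kids.append(local_search(kid, cost))       # hybrid local step
        P = elite + kids
    return min(P, key=cost)

# invented cost: pretend upsized gates (1s) cut soft errors but add area
toy_cost = lambda g: 10 * g.count(0) + 1 * g.count(1)
print(memetic_ga(toy_cost))
```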

  5. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  6. Computational approaches to analogical reasoning current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it was studied early in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...
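
    A classic numeric instance of an analogical proportion a : b :: c : d can be solved arithmetically (d = c + b - a) or geometrically (d = c * b / a); the toy solver below is our illustration of the idea, not any framework from the volume.

```python
def solve_arithmetic(a, b, c):
    """Solve a : b :: c : d under the arithmetic reading (b - a = d - c)."""
    return c + (b - a)

def solve_geometric(a, b, c):
    """Solve a : b :: c : d under the geometric reading (b / a = d / c)."""
    return c * b / a

print(solve_arithmetic(2, 4, 7))   # 9  (2:4 :: 7:9)
print(solve_geometric(2, 4, 7))    # 14 (2:4 :: 7:14)
```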

  7. COOBBO: A Novel Opposition-Based Soft Computing Algorithm for TSP Problems

    Directory of Open Access Journals (Sweden)

    Qingzheng Xu

    2014-12-01

    Full Text Available In this paper, we propose a novel definition of the opposite path. Its core feature is that the sequence of candidate paths and the distances between adjacent nodes in the tour are considered simultaneously. In a sense, the candidate path and its corresponding opposite path have the same (or at least similar) distance to the optimal path in the current population. Based on an accepted framework for employing opposition-based learning, Oppositional Biogeography-Based Optimization using the Current Optimum, called the COOBBO algorithm, is introduced to solve traveling salesman problems. We demonstrate its performance on eight benchmark problems and compare it with other optimization algorithms. Simulation results illustrate that the excellent performance of our proposed algorithm is attributed to the distinct definition of the opposite path. In addition, its great strength lies in exploitation for enhancing the solution accuracy, not exploration for improving the population diversity. Finally, by comparing different versions of COOBBO, another conclusion is that each successful opposition-based soft computing algorithm needs to strike and maintain a good balance between the backward adjacent node and the forward adjacent node.
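
    The "opposite" construction generalizes the standard opposition-based-learning definition, in which the opposite of a point x in [a, b] is a + b - x. A minimal sketch of that standard definition (the paper's opposite path for TSP is more elaborate):

```python
def opposite_point(x, lo, hi):
    """Standard OBL opposite: reflect each coordinate within its bounds."""
    return [l + h - xi for xi, l, h in zip(x, lo, hi)]

x  = [0.2, 3.0, -1.5]
lo = [0.0, 0.0, -2.0]
hi = [1.0, 5.0,  2.0]
print(opposite_point(x, lo, hi))   # [0.8, 2.0, 1.5]
```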

  8. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is indeed essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation is appealing owing to the broad availability of measured air temperatures. In this study, the potentials of soft computing techniques are evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of Tmax, Tmin, and Tavg are served as inputs to develop ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. Across all techniques, the higher accuracies are achieved by models (5) using Tmax - Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The survey results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
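
    A hedged sketch of the SVR-rbf configuration described, using scikit-learn's SVR with an RBF kernel on the Tmax - Tmin and Tmax inputs of model (5); the data points and hyperparameters below are placeholders, not the study's.

```python
import numpy as np
from sklearn.svm import SVR

# placeholder daily data: [Tmax - Tmin, Tmax] -> global radiation [MJ/m^2]
X = np.array([[12.0, 31.0], [9.5, 24.0], [14.0, 35.0], [7.0, 18.0]])
y = np.array([24.1, 17.3, 27.8, 12.5])

model = SVR(kernel="rbf", C=10.0, epsilon=0.1)   # hyperparameters illustrative
model.fit(X, y)
print(model.predict([[11.0, 29.0]]))
```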

  9. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network.

    Science.gov (United States)

    Falat, Lukas; Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can be helpful in reducing the risk of making a bad decision in the decision-making process.
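
    One plausible reading of the moving-average enhancement, correcting each forecast with a smoothed average of the preceding residuals, can be sketched as follows; this is our generic interpretation, not the authors' exact scheme.

```python
import numpy as np

def ma_corrected(y_pred, y_true, window=5):
    """Adjust each forecast by the moving average of the preceding
    residuals (errors of the base network)."""
    y_pred, y_true = np.asarray(y_pred, float), np.asarray(y_true, float)
    resid = y_true - y_pred
    adjusted = y_pred.copy()
    for t in range(1, len(y_pred)):
        lo = max(0, t - window)
        adjusted[t] += resid[lo:t].mean()     # uses past errors only
    return adjusted

rate_pred = [1.30, 1.31, 1.29, 1.32, 1.33]    # base-network USD/CAD forecasts
rate_true = [1.31, 1.32, 1.30, 1.31, 1.34]
print(ma_corrected(rate_pred, rate_true))
```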

  10. Prediction of Ultimate Strain and Strength of FRP-Confined Concrete Cylinders Using Soft Computing Methods

    Directory of Open Access Journals (Sweden)

    Iman Mansouri

    2017-07-01

    Full Text Available This paper investigates the effectiveness of four different soft computing methods, namely radial basis neural network (RBNN), adaptive neuro-fuzzy inference system (ANFIS) with subtractive clustering (ANFIS-SC), ANFIS with fuzzy c-means clustering (ANFIS-FCM) and the M5 model tree (M5Tree), for predicting the ultimate strength and strain of concrete cylinders confined with fiber-reinforced polymer (FRP) sheets. The models were compared according to the root mean square error (RMSE), mean absolute relative error (MARE) and determination coefficient (R2) criteria. Similar accuracy was obtained by RBNN and ANFIS-FCM, and they provided better estimates in modeling the ultimate strength of confined concrete. The ANFIS-SC, however, performed slightly better than the RBNN and ANFIS-FCM in estimating the ultimate strain of confined concrete, and M5Tree provided the worst strength and strain estimates. Finally, the effects of the strain ratio and the confinement stiffness ratio on strength and strain were investigated, and the confinement stiffness ratio was shown to be more effective.

  11. Application of Soft Computing Methods for the Estimation of Roadheader Performance from Schmidt Hammer Rebound Values

    Directory of Open Access Journals (Sweden)

    Hadi Fattahi

    2017-01-01

    Full Text Available Estimation of roadheader performance is one of the main topics in determining the economics of underground excavation projects. Poor performance estimation of roadheaders can lead to costly contractual claims. In this paper, the application of soft computing methods for data analysis, namely the adaptive neuro-fuzzy inference system with subtractive clustering method (ANFIS-SCM) and an artificial neural network (ANN) optimized by hybrid particle swarm optimization and genetic algorithm (HPSOGA), to estimate roadheader performance is demonstrated. The data used to show the applicability of these methods were collected from tunnels for Istanbul's sewage system, Turkey. Two estimation models based on ANFIS-SCM and ANN-HPSOGA were developed. In these models, Schmidt hammer rebound values and rock quality designation (RQD) were utilized as the input parameters, and net cutting rates constituted the output parameter. Various statistical performance indices were used to compare the performance of these estimation models. The results indicated that the ANFIS-SCM model has strong potential to estimate roadheader performance with high degrees of accuracy and robustness.

  12. Monitoring the Microgravity Environment Quality On-Board the International Space Station Using Soft Computing Techniques

    Science.gov (United States)

    Jules, Kenol; Lin, Paul P.

    2001-01-01

    This paper presents an artificial intelligence monitoring system developed by the NASA Glenn Principal Investigator Microgravity Services project to help the principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment in time, on-board the International Space Station, which might impact the microgravity environment their experiments are exposed to. From the Principal Investigator Microgravity Services' web site, the principal investigator teams can monitor via a graphical display, in near real time, which event(s) is/are on, such as crew activities, pumps, fans, centrifuges, compressor, crew exercise, platform structural modes, etc., and decide whether or not to run their experiments based on the acceleration environment associated with a specific event. This monitoring system is focused primarily on detecting the vibratory disturbance sources, but could be used as well to detect some of the transient disturbance sources, depending on the events duration. The system has built-in capability to detect both known and unknown vibratory disturbance sources. Several soft computing techniques such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic were used to design the system.
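
    Of the techniques listed, Learning Vector Quantization has the simplest core: prototype vectors are pulled toward samples of their own class and pushed away from others. A generic LVQ1 update sketch with invented event labels (not NASA's monitoring code):

```python
import numpy as np

def lvq1_step(protos, labels, x, y, lr=0.05):
    """One LVQ1 update: move the nearest prototype toward x if it has
    the right class (e.g. 'crew exercise'), away from x otherwise."""
    i = np.argmin(((protos - x) ** 2).sum(axis=1))   # best matching unit
    sign = 1.0 if labels[i] == y else -1.0
    protos[i] += sign * lr * (x - protos[i])
    return protos

protos = np.array([[0.1, 0.2], [0.8, 0.9]])          # one prototype per event class
labels = ["pump", "crew exercise"]
protos = lvq1_step(protos, labels, np.array([0.75, 0.85]), "crew exercise")
print(protos)
```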

  13. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    Directory of Open Access Journals (Sweden)

    Lukas Falat

    2016-01-01

    Full Text Available This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is intended to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine its ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model against autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network produces more accurate forecasts than the standard models and can help reduce the risk of bad decisions in the decision-making process.
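
    A minimal sketch of the core idea, assuming k-means-placed Gaussian centers and a least-squares output layer (the paper's genetic-algorithm training is omitted): an RBF network whose prediction is corrected by a moving average of its own residuals. All data and settings are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
series = np.cumsum(rng.normal(size=400))      # stand-in for a USD/CAD series
lags = np.stack([series[i:i - 5] for i in range(5)], axis=1)  # 5 lagged inputs
X = (lags - lags.mean(0)) / lags.std(0)       # standardize inputs
y = series[5:]

# RBF layer: Gaussian units centered on k-means prototypes of the inputs
centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
def rbf(X):
    d2 = ((X[:, None, :] - centers) ** 2).sum(axis=2)
    return np.exp(-0.5 * d2)

w, *_ = np.linalg.lstsq(rbf(X), y, rcond=None)   # linear output weights
pred = rbf(X) @ w

# Moving-average correction of the network's own residuals (in-sample,
# centered window, purely for illustration; forecasting would use a causal one)
resid = y - pred
ma = np.convolve(resid, np.ones(5) / 5, mode="same")
corrected = pred + ma
print("RMSE before:", np.sqrt(np.mean(resid ** 2)))
print("RMSE after :", np.sqrt(np.mean((y - corrected) ** 2)))
```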

  14. Risk assessment through drinking water pathway via uncertainty modeling of contaminant transport using soft computing

    International Nuclear Information System (INIS)

    Datta, D.; Ranade, A.K.; Pandey, M.; Sathyabama, N.; Kumar, Brij

    2012-01-01

    The basic objective of an environmental impact assessment (EIA) is to build guidelines to reduce the associated risk or mitigate the consequences of a reactor accident at its source: to prevent deterministic health effects, and to reduce the risk of stochastic health effects (e.g. cancer and severe hereditary effects) as far as reasonably achievable by implementing protective actions in accordance with IAEA guidance (IAEA Safety Series No. 115, 1996). Since the measure of exposure is the basic tool for any decision related to risk reduction, EIA is traditionally expressed in terms of radiation exposure to members of the public. However, the models used to estimate that exposure are governed by parameters, some of which are deterministic with relative uncertainty and some of which are stochastic as well as imprecise (insufficient knowledge). In a mixed environment of this type, it is essential to assess the uncertainty of a model in order to estimate the bounds of the exposure to the public and to support decisions during a nuclear or radiological emergency. With this in view, a soft computing technique, evidence theory-based assessment of model parameters, is applied to compute the risk or exposure to members of the public. The exposure pathway considered here in the aquatic food stream is the drinking of water. Accordingly, this paper presents the uncertainty analysis of exposure via uncertainty analysis of the contaminated water. Evidence theory expresses the uncertainty in terms of a lower bound, the belief measure, and an upper bound of exposure, the plausibility measure. In this work EIA is presented using evidence theory, and a data fusion technique is used to aggregate the knowledge on the uncertain information. Uncertainty of concentration and exposure is expressed as an interval bounded by the belief and plausibility measures.
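
    The belief/plausibility interval at the heart of this approach is easy to state concretely. The sketch below computes both measures for a hypothetical mass assignment over contamination levels; the frame and masses are invented for illustration.

```python
# Evidence-theory lower/upper bounds: belief sums mass of subsets contained
# in the event; plausibility sums mass of subsets consistent with it.
frame = {"low", "medium", "high"}
mass = {
    frozenset({"low"}): 0.5,
    frozenset({"low", "medium"}): 0.3,
    frozenset(frame): 0.2,              # mass on the whole frame = ignorance
}

def belief(event):
    return sum(m for A, m in mass.items() if A <= event)

def plausibility(event):
    return sum(m for A, m in mass.items() if A & event)

event = frozenset({"low", "medium"})    # "exposure is not high"
print(belief(event), plausibility(event))   # lower and upper probability bounds
```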

  15. Tissue-specific endothelial cells: a promising approach for augmentation of soft tissue repair in orthopedics.

    Science.gov (United States)

    Lebaschi, Amir; Nakagawa, Yusuke; Wada, Susumu; Cong, Guang-Ting; Rodeo, Scott A

    2017-12-01

    Biologics are playing an increasingly significant role in the practice of modern medicine and surgery in general and orthopedics in particular. Cell-based approaches are among the most important and widely used modalities in orthopedic biologics, with mesenchymal stem cells and other multi/pluripotent cells undergoing evaluation in numerous preclinical and clinical studies. On the other hand, fully differentiated endothelial cells (ECs) have been found to perform critical roles in homeostasis of visceral tissues through production of an adaptive panel of so-called "angiocrine factors." This newly discovered function of ECs renders them excellent candidates for novel approaches in cell-based biologics. Here, we present a review of the role of ECs and angiocrine factors in some visceral tissues, followed by an overview of current cell-based approaches and a discussion of the potential applications of ECs in soft tissue repair. © 2017 New York Academy of Sciences.

  16. Computer aided synthesis: a game theoretic approach

    OpenAIRE

    Bruyère, Véronique

    2017-01-01

    In this invited contribution, we propose a comprehensive introduction to game theory applied in computer aided synthesis. In this context, we give some classical results on two-player zero-sum games and then on multi-player non zero-sum games. The simple case of one-player games is strongly related to automata theory on infinite words. All along the article, we focus on general approaches to solve the studied problems, and we provide several illustrative examples as well as intuitions on the ...

  17. Observing Distributed Computation. A Dynamic-Epistemic Approach

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian

    2007-01-01

    R. Mardare. Observing Distributed Computation. A Dynamic-Epistemic Approach. In Proc. of the second Conference on Algebra and Coalgebra in Computer Science (CALCO2007), Lecture Notes in Computer Science 4624:379-393, Springer, 2007.

  18. A Dynamic Approach to the Analysis of Soft Power in International Relations

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2013-12-01

    Full Text Available This article discusses soft power in international relations and the soft power of China's foreign policy in recent years. After presenting a critique of the soft power theory developed by Joseph S. Nye, the paper provides an alternative interpretation of soft power. The author proposes a dynamic analysis of soft power in international relations, and argues that whether a power resource is soft or hard depends on the perceptions and feelings of various actors in specific situations. Due to the varying degrees of acceptance, power can be divided into hard power, soft power and bargaining power. An analysis should look at the soft or hard effectiveness of a power resource from three perspectives: horizontally, vertically and relatively. Recently, the soft power of China's foreign policy and international behavior has mainly been manifested in multilateralism, economic diplomacy and a good-neighborly policy.

  19. A staged approach of implant placement in immediate extraction sockets for preservation of peri-implant soft and hard tissue

    OpenAIRE

    Vinnakota, Dileep Nag; Akula, Sreenivasa Rao; Krishna Reddy, V. Vamsi; Sankar, V. Vijay

    2014-01-01

    Esthetic zone restoration is a challenging aspect of implant dentistry because of two critical factors: the level of bone support and the soft tissue dimensions. Preservation of healthy peri-implant tissues is of primary importance for ensuring better esthetics over an extended period. The aim of the present case series was to evaluate a new staged approach of implant placement in immediate extraction sockets for preservation of peri-implant soft and hard tissues. Four subjects scheduled for e...

  20. Impact of Computed Tomography Image Quality on Image-Guided Radiation Therapy Based on Soft Tissue Registration

    International Nuclear Information System (INIS)

    Morrow, Natalya V.; Lawton, Colleen A.; Qi, X. Sharon; Li, X. Allen

    2012-01-01

    Purpose: In image-guided radiation therapy (IGRT), different computed tomography (CT) modalities with varying image quality are being used to correct for interfractional variations in patient set-up and anatomy changes, thereby reducing clinical target volume to planning target volume (CTV-to-PTV) margins. We explore how CT image quality affects patient repositioning and CTV-to-PTV margins in soft tissue registration-based IGRT for prostate cancer patients. Methods and Materials: Four CT-based IGRT modalities used for prostate RT were considered in this study: MV fan beam CT (MVFBCT) (Tomotherapy), MV cone beam CT (MVCBCT) (MVision; Siemens), kV fan beam CT (kVFBCT) (CTVision, Siemens), and kV cone beam CT (kVCBCT) (Synergy; Elekta). Daily shifts were determined by manual registration to achieve the best soft tissue agreement. The effect of image quality on patient repositioning was determined by statistical analysis of daily shifts for 136 patients (34 per modality). Inter- and intraobserver variability of soft tissue registration was evaluated based on the registration of a representative scan for each CT modality with its corresponding planning scan. Results: Superior image quality with the kVFBCT resulted in reduced uncertainty in soft tissue registration during IGRT compared with the other imaging modalities. The largest interobserver variations of soft tissue registration were 1.1 mm, 2.5 mm, 2.6 mm, and 3.2 mm for kVFBCT, kVCBCT, MVFBCT, and MVCBCT, respectively. Conclusions: Image quality adversely affects the reproducibility of soft tissue-based registration for IGRT and necessitates careful consideration of residual uncertainties in determining different CTV-to-PTV margins for IGRT using different imaging modalities.

  1. Modeling of Groundwater Resources Heavy Metals Concentration Using Soft Computing Methods: Application of Different Types of Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Meysam Alizamir

    2017-09-01

    Full Text Available Nowadays, groundwater resources play a vital role as a source of drinking water in arid and semiarid regions, and forecasting of pollutant content in these resources is very important. Therefore, this study compared two soft computing methods for modeling Cd, Pb and Zn concentration in groundwater resources of Asadabad Plain, Western Iran. The relative accuracy of two soft computing models, namely the multi-layer perceptron (MLP) and the radial basis function (RBF) network, for forecasting heavy metals concentration was investigated. In addition, the Levenberg-Marquardt, gradient descent and conjugate gradient training algorithms were utilized for the MLP models, which were developed using the MATLAB R2014 software program. Based on the data collected from the plain, MLP and RBF models were developed for each heavy metal. The MLP performed better than the RBF model, and the simulation results revealed that it was able to model heavy metals concentration in groundwater resources favorably; of the three training algorithms, Levenberg-Marquardt performed best. These soft computing techniques can be utilized effectively for the prediction and estimation of heavy metals concentration in groundwater resources of Asadabad Plain, and in environmental and water quality estimation more generally.
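
    A hedged sketch of the modeling workflow in scikit-learn rather than MATLAB: since MLPRegressor does not provide Levenberg-Marquardt, the quasi-Newton 'lbfgs' solver stands in, and the inputs and Cd target are synthetic placeholders for the Asadabad Plain measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.uniform(size=(120, 4))                 # hypothetical hydro-chemical inputs
cd = 0.4 * X[:, 0] - 0.2 * X[:, 1] + 0.05 * rng.normal(size=120)  # toy Cd target

X_tr, X_te, y_tr, y_te = train_test_split(X, cd, test_size=0.2, random_state=0)
mlp = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs", max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)
print("test R2:", mlp.score(X_te, y_te))
```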

  2. Computation within the auxiliary field approach

    CERN Document Server

    Baeurle, S A

    2003-01-01

    Recently, the classical auxiliary field methodology has been developed as a new simulation technique for performing calculations within the framework of classical statistical mechanics. Since the approach suffers from a sign problem, a judicious choice of the sampling algorithm, allowing fast statistical convergence and efficient generation of field configurations, is of fundamental importance for a successful simulation. In this paper we focus on the computational aspects of this simulation methodology. We introduce two different types of algorithms: the single-move auxiliary field Metropolis Monte Carlo algorithm, and two new classes of force-based algorithms which enable multiple-move propagation. In addition, to further optimize the sampling, we describe a preconditioning scheme which permits treating each field degree of freedom individually with regard to its evolution through the auxiliary field configuration space. Finally, we demonstrate the validity and assess the competitiveness of these algo...
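
    For orientation, the single-move Metropolis pattern the abstract refers to looks as follows. The quadratic (Gaussian) action is a toy stand-in for the auxiliary-field action, and the whole block is a generic sketch, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)
field = np.zeros(64)                      # one auxiliary field value per site

def action(phi):
    return 0.5 * np.sum(phi ** 2)         # toy Gaussian action

for sweep in range(1000):
    for i in range(field.size):           # single-move: update one site at a time
        trial = field.copy()
        trial[i] += rng.normal(scale=0.5)
        dS = action(trial) - action(field)
        if dS < 0 or rng.uniform() < np.exp(-dS):   # Metropolis acceptance test
            field = trial

print("sample variance:", field.var())    # should approach 1 for this toy action
```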

  3. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should...

  4. SoftAR: visually manipulating haptic softness perception in spatial augmented reality.

    Science.gov (United States)

    Punpongsanon, Parinya; Iwai, Daisuke; Sato, Kosuke

    2015-11-01

    We present SoftAR, a novel spatial augmented reality (AR) technique based on a pseudo-haptics mechanism that visually manipulates the sense of softness perceived by a user pushing a soft physical object. Considering the limitations of projection-based approaches that change only the surface appearance of a physical object, we propose two projection visual effects, i.e., surface deformation effect (SDE) and body appearance effect (BAE), on the basis of the observations of humans pushing physical objects. The SDE visualizes a two-dimensional deformation of the object surface with a controlled softness parameter, and BAE changes the color of the pushing hand. Through psychophysical experiments, we confirm that the SDE can manipulate softness perception such that the participant perceives significantly greater softness than the actual softness. Furthermore, fBAE, in which BAE is applied only for the finger area, significantly enhances manipulation of the perception of softness. We create a computational model that estimates perceived softness when SDE+fBAE is applied. We construct a prototype SoftAR system in which two application frameworks are implemented. The softness adjustment allows a user to adjust the softness parameter of a physical object, and the softness transfer allows the user to replace the softness with that of another object.

  5. Use of computed tomographic densitometry to quantify contrast enhancement of compressive soft tissues in the canine lumbosacral vertebral canal.

    Science.gov (United States)

    Jones, Jeryl C; Shires, Peter K; Inzana, Karen D; Mosby, Adina D; Sponenberg, D Philip; Lanz, Otto I

    2002-05-01

    To evaluate computed tomography (CT) densitometry as a technique for quantifying contrast enhancement of compressive soft tissues in the canine lumbosacral vertebral canal and to determine whether the degree of contrast enhancement can be used to help predict tissue type or histopathologic characteristics. 29 large breed dogs with lumbosacral stenosis. Contrast-enhanced CT of L5-S3 was performed by use of a previously described protocol. At each disk level, CT densities of a water-filled syringe, epaxial muscles, and 4 vertebral canal locations were measured. Mean tissue enhancement was calculated by vertebral canal location, using water-filled syringe enhancement as a correction factor. Corrected CT enhancement was compared with tissue type, degree of tissue inflammation, and degree of tissue activity. Intravenous administration of contrast medium significantly increased CT densities of water-filled syringes and epaxial muscles. Corrected CT enhancement of vertebral canal soft tissues at stenotic sites was greater than at nonstenotic sites. There was no association between enhancement and tissue type for any vertebral canal location. There was no correlation between enhancement and degree of tissue inflammation. There was a correlation between enhancement and tissue activity in the dorsal vertebral canal only. A water-filled syringe is a useful calibration tool for CT density measurements. The degree of tissue contrast enhancement, measured by CT densitometry, can be helpful for predicting the location of compressive soft tissues in dogs with lumbosacral stenosis. However, it is of limited value for predicting compressive soft-tissue types or histopathologic characteristics.

  6. Computed tomography dose assessment - a practical approach

    International Nuclear Information System (INIS)

    Leitz, W.; Szendro, G.; Axelsson, B.

    1995-01-01

    A survey of the pattern and frequency of computed tomography (CT) examinations in Sweden has been conducted, covering 89 of the 90 existing CT scanners (in 1991). The radiation output and the absorbed doses in phantoms were measured for all types of CT scanners. For the assessment of the effective dose to the patient, a new, practical approach has been developed. The average absorbed doses measured in cylindrical PMMA phantoms were assumed to be valid also for patients: those in the 320 mm diameter phantom for the trunk region, and those in the 160 mm diameter phantom for the neck and head. With guidance from the ICRP 60 concept of tissue weighting factors, average weighting factors were adopted for the trunk, the neck, and the head region. Effective patient doses were calculated using these three factors for doses measured in phantoms, with the settings of exposure parameters as recorded in the survey. The results were compared with dose evaluations based on Monte Carlo calculations, and the agreement was found to be satisfactory. It is suggested that this new practical approach be adopted as a standard method for the assessment of effective dose in CT practice, thus enabling direct access to dose evaluations in daily clinical practice - a prerequisite for the implementation of radiation protection concepts in the radiological community. (Author)
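
    A minimal sketch of the described estimation scheme, assuming invented per-region phantom doses and weighting factors (the study's calibrated values are not reproduced here): effective dose obtained as a region-specific weighting factor applied to the phantom-measured absorbed dose.

```python
# Placeholder tables; real values would come from the phantom measurements
# and the ICRP-60-guided regional weighting described above.
PHANTOM_DOSE_PER_SLICE_mGy = {"trunk": 12.0, "neck": 8.0, "head": 40.0}
WEIGHT_mSv_PER_mGy = {"trunk": 0.017, "neck": 0.008, "head": 0.002}

def effective_dose(region: str, n_slices: int, mAs_scale: float = 1.0) -> float:
    """Rough effective dose (mSv) for a CT scan of `region` with `n_slices` slices."""
    return (PHANTOM_DOSE_PER_SLICE_mGy[region] * mAs_scale
            * n_slices * WEIGHT_mSv_PER_mGy[region])

print(effective_dose("trunk", n_slices=20))
```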

  7. Blueprinting Approach in Support of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Willem-Jan van den Heuvel

    2012-03-01

    Full Text Available Current cloud service offerings, i.e., Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) offerings, are often provided as monolithic, one-size-fits-all solutions and give little or no room for customization. This limits the ability of Service-based Application (SBA) developers to configure and syndicate offerings from multiple SaaS, PaaS, and IaaS providers to address their application requirements. Furthermore, combining different independent cloud services necessitates a uniform description format that facilitates design, customization, and composition. Cloud Blueprinting is a novel approach that allows SBA developers to easily design, configure and deploy virtual SBA payloads on virtual machines and resource pools on the cloud. We propose the Blueprint concept as a uniform abstract description for cloud service offerings that may cross different cloud computing layers, i.e., SaaS, PaaS and IaaS. To support developers with SBA design and development in the cloud, this paper introduces a formal Blueprint Template for unambiguously describing a blueprint, as well as a Blueprint Lifecycle that guides developers through the manipulation, composition and deployment of different blueprints for an SBA. Finally, the empirical evaluation of the blueprinting approach within an EC FP7 project is reported and an associated blueprint prototype implementation is presented.

  8. Description of EMX computer code. System for measuring soft X rays

    International Nuclear Information System (INIS)

    Marty, D.A.; Smeulders, P.; Launois, D.

    1978-07-01

    After a brief description of the system for measuring soft X rays installed on TFR 600, the objectives and principles of the EMX calculation programme are presented. The model is divided into two distinct parts. The aim of EMX 1, the first part, is to build the soft X-ray image of a plasma with varied characteristics, as seen through a given collimation system (in this case a slit). The aim of EMX 2, the second part, is to filter the previously built soft X-ray image through the system of absorbers belonging to the measuring system, and to calculate the currents generated by each detector aimed at a plasma chord. The first calculation results are commented on and discussed. [fr

  9. Comparing a soft and a hard multi-methodology approach: Location of an IT company in the Øresund Region

    DEFF Research Database (Denmark)

    Jeppesen, Sara Lise; Barfod, Michael Bruhn; Leleur, Steen

    2007-01-01

    system applied in the STMØ project (2005-2007). The soft approach explores litmus test, CSH, SSM and SWOT analysis, while the hard approach combines preference analysis, CBA, AHP, SMARTER, MCA and COSIMA. Both approaches are supported by simple soft methods such as stakeholder analysis...

  10. A compositional approach to building applications in a computational environment

    International Nuclear Information System (INIS)

    Roslovtsev, V V; Shumsky, L D; Wolfengagen, V E

    2014-01-01

    The paper presents an approach to creating an applicative computational environment featuring computational processes and data decomposition, and a compositional approach to application building. The approach in question is based on the notion of the combinator, both in systems with variable binding (such as the λ-calculi) and in those allowing programming without variables (combinatory logic style). We present a computation decomposition technique based on objects' structural decomposition, with the focus on decomposing the computation itself. The computational environment's architecture is based on a network whose nodes may play several roles simultaneously.
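
    To make the combinator notion concrete, here is a generic combinatory-logic sketch in Python (not the paper's environment): the S and K combinators, from which identity and function composition can be built without variable binding.

```python
# Classic combinators expressed as curried Python lambdas.
K = lambda x: lambda y: x                      # K x y = x
S = lambda f: lambda g: lambda x: f(x)(g(x))   # S f g x = f x (g x)

# The identity combinator I = S K K:
I = S(K)(K)
assert I(42) == 42

# Function composition B = S (K S) K, so that B f g x = f (g x):
B = S(K(S))(K)
inc = lambda n: n + 1
dbl = lambda n: n * 2
assert B(inc)(dbl)(10) == 21                   # inc(dbl(10))
```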

  11. Percutaneous computed tomography-guided core needle biopsy of soft tissue tumors: results and correlation with surgical specimen analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chojniak, Rubens; Grigio, Henrique Ramos; Bitencourt, Almir Galvao Vieira; Pinto, Paula Nicole Vieira; Tyng, Chiang J.; Cunha, Isabela Werneck da; Aguiar Junior, Samuel; Lopes, Ademar, E-mail: chojniak@uol.com.br [Hospital A.C. Camargo, Sao Paulo, SP (Brazil)

    2012-09-15

    Objective: To evaluate the efficacy of percutaneous computed tomography (CT)-guided core needle biopsy of soft tissue tumors in obtaining appropriate samples for histological analysis, and to compare its diagnosis with the results of the surgical pathology where available. Materials and Methods: The authors reviewed medical records, imaging and histological reports of 262 patients with soft-tissue tumors submitted to CT-guided core needle biopsy in an oncologic reference center between 2003 and 2009. Results: Appropriate samples were obtained in 215 (82.1%) of the 262 patients. The most prevalent tumors were sarcomas (38.6%), metastatic carcinomas (28.8%), benign mesenchymal tumors (20.5%) and lymphomas (9.3%). Histological grading was feasible in 92.8% of sarcoma patients, with the majority of them (77.9%) being classified as high grade tumors. Of the total sample, 116 patients (44.3%) underwent surgical excision and diagnosis confirmation. Core biopsy demonstrated 94.6% accuracy in the identification of sarcomas, with 96.4% sensitivity and 89.5% specificity. A significant intermethod agreement on histological grading was observed between core biopsy and surgical resection (p < 0.001; kappa = 0.75). Conclusion: CT-guided core needle biopsy demonstrated high diagnostic accuracy in the evaluation of soft tissue tumors as well as in the histological grading of sarcomas, allowing appropriate therapeutic planning. (author)

  12. Soft brain-machine interfaces for assistive robotics: A novel control approach.

    Science.gov (United States)

    Schiatti, Lucia; Tessadori, Jacopo; Barresi, Giacinto; Mattos, Leonardo S; Ajoudani, Arash

    2017-07-01

    Robotic systems offer the possibility of improving the life quality of people with severe motor disabilities, enhancing the individual's degree of independence and interaction with the external environment. In this direction, the operator's residual functions must be exploited for the control of the robot movements and the underlying dynamic interaction through intuitive and effective human-robot interfaces. Towards this end, this work aims at exploring the potential of a novel Soft Brain-Machine Interface (BMI), suitable for dynamic execution of remote manipulation tasks for a wide range of patients. The interface is composed of an eye-tracking system, for an intuitive and reliable control of a robotic arm system's trajectories, and a Brain-Computer Interface (BCI) unit, for the control of the robot Cartesian stiffness, which determines the interaction forces between the robot and environment. The latter control is achieved by estimating in real-time a unidimensional index from user's electroencephalographic (EEG) signals, which provides the probability of a neutral or active state. This estimated state is then translated into a stiffness value for the robotic arm, allowing a reliable modulation of the robot's impedance. A preliminary evaluation of this hybrid interface concept provided evidence on the effective execution of tasks with dynamic uncertainties, demonstrating the great potential of this control method in BMI applications for self-service and clinical care.
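
    A hedged sketch of the stiffness-control mapping described above: a probability-of-active-state index in [0, 1] is smoothed and mapped linearly onto a stiffness range. The bounds, smoothing constant and fake BCI estimates are illustrative assumptions, not the study's values.

```python
K_MIN, K_MAX = 200.0, 1200.0      # hypothetical Cartesian stiffness bounds (N/m)
ALPHA = 0.1                       # exponential smoothing factor

def stiffness_controller():
    """Yield a stiffness command for each incoming BCI 'active-state' probability."""
    k, smoothed = K_MIN, 0.0
    while True:
        p_active = yield k                          # EEG-derived index in [0, 1]
        smoothed += ALPHA * (p_active - smoothed)   # suppress estimate jitter
        k = K_MIN + smoothed * (K_MAX - K_MIN)      # linear map onto stiffness

ctrl = stiffness_controller()
next(ctrl)                                          # prime the generator
for p in (0.1, 0.8, 0.9, 0.2):                      # fake per-cycle BCI outputs
    print(round(ctrl.send(p), 1))
```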

  13. Evaluation of the Respimat Soft Mist Inhaler using a concurrent CFD and in vitro approach.

    Science.gov (United States)

    Worth Longest, P; Hindle, Michael

    2009-06-01

    The Respimat Soft Mist Inhaler is reported to generate an aerosol with low spray momentum and a small droplet size. However, the transport characteristics of the Respimat aerosol are not well understood. The objective of this study was to characterize the transport and deposition of an aerosol emitted from the Respimat inhaler using a combination of computational fluid dynamics (CFD) modeling and in vitro experiments. Deposition of the Respimat aerosol was assessed in the inhaler mouthpiece (MP), a standard induction port (IP), and a more realistic mouth-throat (MT) geometry at an inhalation flow rate of 30 L/min. Aerosols were generated using an albuterol sulfate (0.6%) solution, and the drug deposition was quantified using both in vitro experiments and a CFD model of the Respimat inhaler. Laser diffraction experiments were used to determine the initial polydisperse aerosol size distribution. It was found that the aerosol generated from the highly complex process of jet collision and breakup could be approximated in the model using effective spray conditions. Computational predictions of deposition fractions agreed well with in vitro results for both the IP (within 20% error) and MT (within 10% error) geometries. The experimental results indicated that the deposition fraction of drug in the MP ranged from 27 to 29% and accounted for a majority of total drug loss. Based on the CFD solution, high MP deposition was due to a recirculating flow pattern that surrounded the aerosol spray and entrained a significant number of small droplets. In contrast, deposition of the Respimat aerosol in both the IP (4.2%) and MT (7.4%) geometries was relatively low. Results of this study indicate that modifications to the current Respimat MP and control of specific patient variables may significantly reduce deposition in the device and may decrease high oropharyngeal drug loss observed in vivo.

  14. Mobile computing: an approach towards paperless office ...

    African Journals Online (AJOL)

    Prospective users are now aware of the range of critical solutions offered by the mobile computing paradigm, with applications executing on the mobile device and synchronizing with a central repository (database) situated within the organization's stationary boundaries. Keywords: Mobile computing, Paperless office, Paper, ...

  15. An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach

    Science.gov (United States)

    2012-08-01

    Human-generated soft data, such as HUMINT (HUMan INTelligence), OSINT (Open Source INTelligence) and COMINT (COMmunications INTelligence), are fundamentally... The sources correspond to selected intelligence disciplines described in [9]. HUMINT and OSINT sources provide mostly soft...
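
    The core operator of a Dempster-Shafer framework is the rule of combination; the sketch below fuses two hypothetical source reports (labeled HUMINT and OSINT purely for illustration) over a common frame of discernment.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: multiply masses on intersecting subsets, renormalize."""
    fused, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            fused[C] = fused.get(C, 0.0) + a * b
        else:
            conflict += a * b                  # mass lost to contradiction
    return {A: v / (1 - conflict) for A, v in fused.items()}

humint = {frozenset({"hostile"}): 0.6, frozenset({"hostile", "neutral"}): 0.4}
osint  = {frozenset({"neutral"}): 0.3, frozenset({"hostile", "neutral"}): 0.7}
print(combine(humint, osint))
```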

  16. ANFIS, SVM and ANN soft-computing techniques to estimate daily global solar radiation in a warm sub-humid environment

    Science.gov (United States)

    Quej, Victor H.; Almorox, Javier; Arnaldo, Javier A.; Saito, Laurel

    2017-03-01

    Daily solar radiation is an important variable in many models. In this paper, the accuracy and performance of three soft computing techniques, i.e., adaptive neuro-fuzzy inference system (ANFIS), artificial neural network (ANN) and support vector machine (SVM), were assessed for predicting daily horizontal global solar radiation from measured meteorological variables in the Yucatán Peninsula, México. Model performance was assessed with statistical indicators such as root mean squared error (RMSE), mean absolute error (MAE) and coefficient of determination (R2). The performance assessment indicates that the SVM technique, with requirements of daily maximum and minimum air temperature, extraterrestrial solar radiation and rainfall, performs better than the other techniques and may be a promising alternative to the usual approaches for predicting solar radiation.
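
    A minimal scikit-learn sketch of the best-performing configuration reported above: support vector regression on daily maximum/minimum temperature, extraterrestrial radiation and rainfall. The data are random placeholders with a toy Hargreaves-like target, not the Yucatán measurements.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 365
X = np.column_stack([
    rng.uniform(25, 40, n),     # Tmax (degC)
    rng.uniform(15, 25, n),     # Tmin (degC)
    rng.uniform(25, 40, n),     # extraterrestrial radiation (MJ/m2/day)
    rng.exponential(2.0, n),    # rainfall (mm)
])
y = 0.16 * X[:, 2] * np.sqrt(X[:, 0] - X[:, 1]) - 0.05 * X[:, 3]  # toy target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("RMSE:", np.sqrt(np.mean((y_te - pred) ** 2)))
```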

  17. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design: not just the specifications comprising today's systems, but how key technologies and p

  18. COGNITIVE COMPUTER GRAPHICS AS A MEANS OF "SOFT" MODELING IN PROBLEMS OF RESTORATION OF FUNCTIONS OF TWO VARIABLES

    Directory of Open Access Journals (Sweden)

    A.N. Khomchenko

    2016-08-01

    Full Text Available The paper considers the problem of bicubic interpolation on a finite element of the serendipity family. Using cognitive-graphical analysis, the rigid model of Ergatoudis, Irons and Zenkevich (1968) is compared with alternative models obtained by three methods: direct geometric design, weighted averaging of the basis polynomials, and systematic generation of bases (an advanced Taylor procedure). The emphasis is placed on the phenomenon of "gravitational repulsion" (the Zenkevich paradox). The causes of physically inadequate spectra of nodal loads on serendipity elements of higher orders are investigated. Soft modeling allows one to build many serendipity elements of bicubic interpolation, without even needing to know the exact form of the rigid model. Different interpretations of the integral characteristics of the basis polynomials are offered: geometrical, physical and probabilistic. In the theory of interpolation of functions of two variables, a soft model means a model amenable to change through the choice of basis. Such changes are excluded in the family of Lagrangian finite elements of higher orders (hard modeling), and the standard models of the serendipity family (Zenkevich) were also rigid. It was found that the "responsibility" for the rigidity of a serendipity model rests on the ruled surfaces (conoids, of zero Gaussian curvature) that predominate in the standard basis set. Cognitive portraits of the zero lines of standard serendipity surfaces suggest that, in order to "soften" a serendipity model, the conoids should be replaced by surfaces of alternating Gaussian curvature. The article presents such alternative (soft) bases of serendipity models. The work is devoted to solving scientific and technological problems aimed at the creation, dissemination and use of cognitive computer graphics in teaching and learning. The results are of interest to students of the specialties "Computer Science and Information Technologies", "System Analysis" and "Software Engineering", as well as

  19. Multifractal Analysis of Seismically Induced Soft-Sediment Deformation Structures Imaged by X-Ray Computed Tomography

    Science.gov (United States)

    Nakashima, Yoshito; Komatsubara, Junko

    Unconsolidated soft sediments deform and mix complexly by seismically induced fluidization. Such geological soft-sediment deformation structures (SSDSs) recorded in boring cores were imaged by X-ray computed tomography (CT), which enables visualization of the inhomogeneous spatial distribution of iron-bearing mineral grains as strong X-ray absorbers in the deformed strata. Multifractal analysis was applied to the two-dimensional (2D) CT images with various degrees of deformation and mixing. The results show that the distribution of the iron-bearing mineral grains is multifractal for less deformed/mixed strata and almost monofractal for fully mixed (i.e. almost homogenized) strata. Computer simulations of deformation of real and synthetic digital images were performed using the egg-beater flow model. The simulations successfully reproduced the transformation from the multifractal spectra into almost monofractal spectra (i.e. almost convergence on a single point) with an increase in deformation/mixing intensity. The present study demonstrates that multifractal analysis coupled with X-ray CT and the mixing flow model is useful to quantify the complexity of seismically induced SSDSs, standing as a novel method for the evaluation of cores for seismic risk assessment.
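
    A compact sketch of the box-counting side of a multifractal analysis, assuming a random stand-in image: generalized dimensions D_q estimated from box-mass moments at several box sizes. For a homogeneous (nearly monofractal) image, D_q stays close to 2 across q; multifractal images show a spread of values.

```python
import numpy as np

rng = np.random.default_rng(6)
img = rng.random((256, 256))                  # stand-in for a CT slice
img /= img.sum()                              # treat intensities as a measure

def Dq(img, q, sizes=(2, 4, 8, 16, 32)):
    logs, logm = [], []
    for s in sizes:
        # box masses: sum the measure inside each s-by-s box
        p = img.reshape(img.shape[0] // s, s, img.shape[1] // s, s).sum((1, 3))
        p = p[p > 0]
        logs.append(np.log(s)); logm.append(np.log(np.sum(p ** q)))
    tau = np.polyfit(logs, logm, 1)[0]        # scaling exponent tau(q)
    return tau / (q - 1)                      # D_q = tau(q) / (q - 1)

for q in (-2.0, 0.5, 2.0, 4.0):               # q != 1 to avoid the singularity
    print(q, round(Dq(img, q), 3))
```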

  20. Computer science approach to quantum control

    International Nuclear Information System (INIS)

    Janzing, D.

    2006-01-01

    Whereas it is obvious that every computation process is a physical process, it has hardly been recognized that many complex physical processes bear similarities to computation processes. This is in particular true for the control of physical systems on the nanoscopic level: usually the system can only be accessed via a rather limited set of elementary control operations, and for many purposes only a concatenation of a large number of these basic operations will implement the desired process. This concatenation is in many cases quite similar to building complex programs from elementary steps, and principles for designing algorithms may thus be a paradigm for designing control processes. For instance, one can decrease the temperature of one part of a molecule by transferring its heat to the remaining part, where it is then dissipated to the environment; but the implementation of such a process involves a complex sequence of electromagnetic pulses. This work considers several hypothetical control processes on the nanoscopic level and shows their analogy to computation processes. We show that measuring certain types of quantum observables is such a complex task that every instrument able to perform it would necessarily be an extremely powerful computer. Likewise, the implementation of a heat engine on the nanoscale requires processing the heat in a way that is similar to information processing, and it can be shown that heat engines with maximal efficiency would be powerful computers, too. In the same way as problems in computer science can be classified by complexity classes, we can also classify control problems according to their complexity, and we directly relate these complexity classes for control problems to the classes in computer science. Unifying notions of complexity in computer science and physics therefore has two aspects: on the one hand, computer science methods help to analyze the complexity of physical processes. On the other hand, reasonable

  1. Tear film evaluation and management in soft contact lens wear: a systematic approach.

    Science.gov (United States)

    Downie, Laura E; Craig, Jennifer P

    2017-09-01

    The human tear film is a highly ordered structure consisting of a thin layer of lipid on the surface and a thicker aqueous-mucin phase, which increases in mucin concentration toward the corneal epithelial cell layer. The health of the tear film and ocular surface influences the likelihood of being able to achieve successful contact lens wear. Contact lens discomfort and dryness are the most frequent reasons why contact lens wearers experience reduced wearing times, which can eventually lead to contact lens discontinuation. Comprehensive clinical assessment of tear film integrity and ocular surface health is therefore essential prior to commencing contact lens wear, to enable the ocular surface environment to be optimised to support lens wear. These parameters should also be evaluated over the course of contact lens wear, in order to identify any aspects requiring clinical management and ensure maintenance of optimal lens-wearing conditions. This review summarises current knowledge relating to the effects of soft contact lens wear on the tear film and ocular surface. It also provides a systematic approach to evaluating tear film and ocular surface integrity, in order to guide the clinical management of tear film anomalies with respect to contact lens wear. © 2017 Optometry Australia.

  2. Soft-error tolerance and energy consumption evaluation of embedded computer with magnetic random access memory in practical systems using computer simulations

    Science.gov (United States)

    Nebashi, Ryusuke; Sakimura, Noboru; Sugibayashi, Tadahiko

    2017-08-01

    We evaluated the soft-error tolerance and energy consumption of an embedded computer with magnetic random access memory (MRAM) using two computer simulators. One is a central processing unit (CPU) simulator of a typical embedded computer system. We simulated the radiation-induced single-event-upset (SEU) probability in a spin-transfer-torque MRAM cell and also the failure rate of a typical embedded computer due to its main memory SEU error. The other is a delay tolerant network (DTN) system simulator. It simulates the power dissipation of wireless sensor network nodes of the system using a revised CPU simulator and a network simulator. We demonstrated that the SEU effect on the embedded computer with 1 Gbit MRAM-based working memory is less than 1 failure in time (FIT). We also demonstrated that the energy consumption of the DTN sensor node with MRAM-based working memory can be reduced to 1/11. These results indicate that MRAM-based working memory enhances the disaster tolerance of embedded computers.

  3. Computer-aided design of nanostructures from self- and directed-assembly of soft matter building blocks

    Science.gov (United States)

    Nguyen, Trung Dac

    2011-12-01

    Functional materials that are active at nanometer scales and adaptive to environment have been highly desirable for a huge array of novel applications ranging from photonics, sensing, fuel cells, smart materials to drug delivery and miniature robots. These bio-inspired features imply that the underlying structure of this type of materials should possess a well-defined ordering as well as the ability to reconfigure in response to a given external stimulus such as temperature, electric field, pH or light. In this thesis, we employ computer simulation as a design tool, demonstrating that various ordered and reconfigurable structures can be obtained from the self- and directed-assembly of soft matter nano-building blocks such as nanoparticles, polymer-tethered nanoparticles and colloidal particles. We show that, besides thermodynamic parameters, the self-assembly of these building blocks is governed by nanoparticle geometry, the number and attachment location of tethers, solvent selectivity, balance between attractive and repulsive forces, nanoparticle size polydispersity, and field strength. We demonstrate that higher-order nanostructures, i.e. those for which the correlation length is much greater than the length scale of individual assembling building blocks, can be hierarchically assembled. For instance, bilayer sheets formed by laterally tethered rods fold into spiral scrolls and helical structures, which are able to adopt different morphologies depending on the environmental condition. We find that a square grid structure formed by laterally tethered nanorods can be transformed into a bilayer sheet structure, and vice versa, upon shortening, or lengthening, the rod segments, respectively. From these inspiring results, we propose a general scheme by which shape-shifting particles are employed to induce the reconfiguration of pre-assembled structures. Finally, we investigate the role of an external field in assisting the formation of assembled structures that would

  4. Imaging of musculoskeletal soft tissue infections

    International Nuclear Information System (INIS)

    Turecki, Marcin B.; Taljanovic, Mihra S.; Holden, Dean A.; Hunter, Tim B.; Rogers, Lee F.; Stubbs, Alana Y.; Graham, Anna R.

    2010-01-01

    Prompt and appropriate imaging work-up of the various musculoskeletal soft tissue infections aids early diagnosis and treatment and decreases the risk of complications resulting from misdiagnosis or delayed diagnosis. The signs and symptoms of musculoskeletal soft tissue infections can be nonspecific, making it clinically difficult to distinguish between disease processes and the extent of disease. Magnetic resonance imaging (MRI) is the imaging modality of choice in the evaluation of soft tissue infections. Computed tomography (CT), ultrasound, radiography and nuclear medicine studies are considered ancillary. This manuscript illustrates representative images of superficial and deep soft tissue infections such as infectious cellulitis, superficial and deep fasciitis, including the necrotizing fasciitis, pyomyositis/soft tissue abscess, septic bursitis and tenosynovitis on different imaging modalities, with emphasis on MRI. Typical histopathologic findings of soft tissue infections are also presented. The imaging approach described in the manuscript is based on relevant literature and authors' personal experience and everyday practice. (orig.)

  5. Imaging of musculoskeletal soft tissue infections

    Energy Technology Data Exchange (ETDEWEB)

    Turecki, Marcin B.; Taljanovic, Mihra S.; Holden, Dean A.; Hunter, Tim B.; Rogers, Lee F. [University of Arizona HSC, Department of Radiology, Tucson, AZ (United States); Stubbs, Alana Y. [Southern Arizona VA Health Care System, Department of Radiology, Tucson, AZ (United States); Graham, Anna R. [University of Arizona HSC, Department of Pathology, Tucson, AZ (United States)

    2010-10-15

    Prompt and appropriate imaging work-up of the various musculoskeletal soft tissue infections aids early diagnosis and treatment and decreases the risk of complications resulting from misdiagnosis or delayed diagnosis. The signs and symptoms of musculoskeletal soft tissue infections can be nonspecific, making it clinically difficult to distinguish between disease processes and the extent of disease. Magnetic resonance imaging (MRI) is the imaging modality of choice in the evaluation of soft tissue infections. Computed tomography (CT), ultrasound, radiography and nuclear medicine studies are considered ancillary. This manuscript illustrates representative images of superficial and deep soft tissue infections such as infectious cellulitis, superficial and deep fasciitis, including the necrotizing fasciitis, pyomyositis/soft tissue abscess, septic bursitis and tenosynovitis on different imaging modalities, with emphasis on MRI. Typical histopathologic findings of soft tissue infections are also presented. The imaging approach described in the manuscript is based on relevant literature and authors' personal experience and everyday practice. (orig.)

  6. Human brain mapping: Experimental and computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wood, C.C.; George, J.S.; Schmidt, D.M.; Aine, C.J. [Los Alamos National Lab., NM (US); Sanders, J. [Albuquerque VA Medical Center, NM (US); Belliveau, J. [Massachusetts General Hospital, Boston, MA (US)

    1998-11-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project combined Los Alamos' and collaborators' strengths in noninvasive brain imaging and high performance computing to develop potential contributions to the multi-agency Human Brain Project led by the National Institute of Mental Health. The experimental component of the project emphasized the optimization of the spatial and temporal resolution of functional brain imaging by combining: (a) structural MRI measurements of brain anatomy; (b) functional MRI measurements of blood flow and oxygenation; and (c) MEG measurements of time-resolved neuronal population currents. The computational component of the project emphasized development of a high-resolution 3-D volumetric model of the brain based on anatomical MRI, in which structural and functional information from multiple imaging modalities can be integrated into a single computational framework for modeling, visualization, and database representation.

  7. Computational Approaches to Chemical Hazard Assessment

    Science.gov (United States)

    Luechtefeld, Thomas; Hartung, Thomas

    2018-01-01

    Summary Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables integration of chemical structure, toxicogenomics, simulated and physical data in the prediction of chemical health hazards, and other toxicological information. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration Evaluation Authorisation and Restriction of Chemicals). These publications dove into potential use cases for regulatory data and some models for exploiting this data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses different kinds of targets modeled in computational toxicology, and ends with a high-level perspective of the algorithms used to create computational toxicology models. PMID:29101769

  8. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows researchers to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: modeling establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  9. Soft SUSY breaking terms in stringy scenarios computation and phenomenological viability

    CERN Document Server

    De Carlos, B; Muñoz, C

    1993-01-01

    We calculate the soft SUSY breaking terms arising from a large class of string scenarios, namely symmetric orbifold constructions, and study their phenomenological viability. They exhibit a certain lack of universality, unlike the usual assumptions of the minimal supersymmetric standard model. Assuming gaugino condensation in the hidden sector as the source of SUSY breaking, it turns out that squark and slepton masses tend to be much larger than gaugino masses. Furthermore, we show that these soft breaking terms can be perfectly consistent with both experimental and naturalness constraints (the latter comes from the absence of fine tuning in the $SU(2)\times U(1)_Y\rightarrow U(1)_{em}$ breaking process). This is certainly non-trivial and in fact imposes interesting constraints on measurable quantities. More precisely, we find that the gluino mass ($M_3$) and the chargino mass ($M_{\chi^{\pm}}$) cannot be much higher than their present experimental lower bounds ($M_3 \lesssim 1$ TeV). This can be c...

  10. Computational and mathematical approaches to societal transitions

    NARCIS (Netherlands)

    J.S. Timmermans (Jos); F. Squazzoni (Flaminio); J. de Haan (Hans)

    2008-01-01

    After an introduction of the theoretical framework and concepts of transition studies, this article gives an overview of how structural change in social systems has been studied from various disciplinary perspectives. This overview first leads to the conclusion that computational and

  11. Heterogeneous Computing in Economics: A Simplified Approach

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.; Grassi, Stefano

    This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of the C++ Accelerated Massive Parallelism recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (2011) we document a ...

  12. General approaches in ensemble quantum computing

    Indian Academy of Sciences (India)

    We have developed methodology for NMR quantum computing focusing on enhancing the efficiency of initialization, of logic gate implementation and of readout. Our general strategy involves the application of rotating frame pulse sequences to prepare pseudopure states and to perform logic operations.

  13. General approaches in ensemble quantum computing

    Indian Academy of Sciences (India)

    We have developed methodology for NMR quantum computing focusing on enhancing the efficiency of initialization, of logic gate implementation and of readout. Our general strategy involves the application of rotating frame pulse sequences to prepare pseudopure states and to perform logic operations. We demonstrate ...

  14. Using soft computing techniques to predict corrected air permeability using Thomeer parameters, air porosity and grain density

    Science.gov (United States)

    Nooruddin, Hasan A.; Anifowose, Fatai; Abdulraheem, Abdulazeez

    2014-03-01

    Soft computing techniques are recently becoming very popular in the oil industry. A number of computational intelligence-based predictive methods have been widely applied in the industry with high prediction capabilities. Some of the popular methods include feed-forward neural networks, radial basis function networks, generalized regression neural networks, functional networks, support vector regression and adaptive network fuzzy inference systems. A comparative study among the most popular soft computing techniques is presented, using a large dataset published in the literature that describes multimodal pore systems in the Arab D formation. The inputs to the models are air porosity, grain density, and Thomeer parameters obtained from mercury injection capillary pressure profiles; corrected air permeability is the target variable. Applying the developed permeability models in a recent reservoir characterization workflow ensures consistency between micro- and macro-scale information, represented mainly by the Thomeer parameters and the absolute permeability. The dataset was divided into two parts, with 80% of the data used for training and 20% for testing. The target permeability variable was transformed to the logarithmic scale as a pre-processing step, as this shows better correlations with the input variables. Statistical and graphical analyses of the results, including permeability cross-plots and detailed error measures, were carried out. In general, the comparative study showed very close results among the developed models. The feed-forward neural network permeability model showed the lowest average relative error, average absolute relative error, standard deviation of error and root mean square error, making it the best model for such problems. The adaptive network fuzzy inference system also showed very good results.
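
    A sketch of the protocol described above, under illustrative assumptions: an 80/20 split, a log-transformed permeability target, and a small feed-forward network (the study's best performer). The features mimic porosity, grain density and Thomeer parameters, but the values and the target relation are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 200
X = np.column_stack([
    rng.uniform(0.05, 0.30, n),   # air porosity (fraction)
    rng.uniform(2.6, 2.9, n),     # grain density (g/cc)
    rng.uniform(0.1, 3.0, n),     # Thomeer G-like pore geometry factor
    rng.uniform(50, 2000, n),     # Thomeer Pd-like entry pressure (psi)
])
log_k = 14 * X[:, 0] - 0.001 * X[:, 3] + rng.normal(0, 0.3, n)  # toy log10 perm

X_tr, X_te, y_tr, y_te = train_test_split(X, log_k, test_size=0.2, random_state=0)
net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                 random_state=0))
net.fit(X_tr, y_tr)
k_pred_md = 10 ** net.predict(X_te)           # back-transform to millidarcies
print("log-space R2:", net.score(X_te, y_te))
```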

  15. A rigorous computational approach to linear response

    Science.gov (United States)

    Bahsoun, Wael; Galatolo, Stefano; Nisoli, Isaia; Niu, Xiaolong

    2018-03-01

    We present a general setting in which the formula describing the linear response of the physical measure of a perturbed system can be obtained. In this general setting we obtain an algorithm to rigorously compute the linear response. We apply our results to expanding circle maps. In particular, we present examples where we compute, up to a pre-specified error in the L∞-norm, the response of expanding circle maps under stochastic and deterministic perturbations. Moreover, we present an example where we compute, up to a pre-specified error in the L1-norm, the response of the intermittent family at the boundary, i.e. when the unperturbed system is the doubling map.

  16. Material parameter identification and inverse problems in soft tissue biomechanics

    CERN Document Server

    Evans, Sam

    2017-01-01

    The articles in this book review hybrid experimental-computational methods applied to soft tissues, developed by worldwide specialists in the field. Readers developing computational models of soft tissues and organs will find solutions for calibrating the material parameters of their models; those performing tests on soft tissues will learn what to extract from the data and how to use these data in their models; and those concerned about the complexity of the biomechanical behavior of soft tissues will find relevant approaches to address this complexity.

  17. A computationally efficient approach for template matching-based ...

    Indian Academy of Sciences (India)

    Image registration using template matching is an important step in image processing. In this paper, a simple, robust and computationally efficient approach is presented. The proposed approach is based on the properties of a normalized covariance matrix. The main advantage of the proposed approach is that the image ...
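
    A hedged illustration of correlation-based template matching (a close cousin of the normalized-covariance formulation named above, not necessarily the authors' exact method): a brute-force normalized cross-correlation matcher in NumPy:

    ```python
    # Toy matcher: slide the template over the image and score each window by
    # its Pearson correlation with the template (normalized cross-correlation).
    import numpy as np

    def match_template(image, tmpl):
        th, tw = tmpl.shape
        t = (tmpl - tmpl.mean()) / (tmpl.std() + 1e-12)
        best, best_pos = -np.inf, None
        for i in range(image.shape[0] - th + 1):
            for j in range(image.shape[1] - tw + 1):
                w = image[i:i + th, j:j + tw]
                ws = w.std()
                if ws == 0:
                    continue  # flat window, correlation undefined
                score = np.mean((w - w.mean()) / ws * t)
                if score > best:
                    best, best_pos = score, (i, j)
        return best_pos, best

    img = np.random.default_rng(0).random((64, 64))
    template = img[20:28, 30:38]
    print(match_template(img, template))  # recovers (20, 30) with score ~1.0
    ```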

  18. Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories

    Science.gov (United States)

    Ng, Hok Kwan; Sridhar, Banavar

    2016-01-01

    This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers and (c) implementing the same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); each is compared with a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization using various numbers of CPUs, ranging from 80 to 10,240 units, are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, to assess the potential computational enhancement through parallel processing on computer clusters. This study also re-implements the trajectory optimization algorithm to further reduce computational time through algorithm modifications, and integrates it with FACET so that the new features, which calculate time-optimal routes between worldwide airport pairs in a wind field, can be used with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations compare their computational efficiencies and consider the potential applications of the optimized trajectories. The paper shows that, in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.
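
    The per-airport-pair route computations are independent, which is what makes clusters and multi-core machines attractive here. A toy sketch of that structure with Python's multiprocessing, using a placeholder cost function rather than an actual wind-optimal solver:

    ```python
    # Toy parallel structure: each airport pair is an independent task, so a
    # worker pool can map over pairs. optimize_route is a placeholder, not a
    # real wind-optimal trajectory solver.
    from multiprocessing import Pool
    import math

    def optimize_route(pair):
        origin, dest = pair
        # Stand-in "cost": a real solver would search routes through a wind field.
        return (origin, dest, math.dist(origin, dest))

    if __name__ == "__main__":
        pairs = [((0.0, 0.0), (float(i), float(i + 1))) for i in range(1000)]
        with Pool() as pool:
            routes = pool.map(optimize_route, pairs)
        print(len(routes), "routes computed")
    ```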

  19. Machine learning and computer vision approaches for phenotypic profiling.

    Science.gov (United States)

    Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J

    2017-01-02

    With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. © 2017 Grys et al.

  20. Panel discussion: Innovative approaches to high performance computing

    International Nuclear Information System (INIS)

    Bhanot, G.; Gottlieb, S.; Gupta, R.; Okawa, M.; Rapuano, F.; Mawhinney, R.

    2001-01-01

    A large part of research in lattice field theory is carried out via computer simulations. Some research groups use computer clusters readily assembled using off-the-shelf components, while others have been developing dedicated closely coupled massively parallel supercomputers. Pros and cons of these approaches, in particular the affordability and performance of these computers, were discussed. All the options being explored have different specific uses, and it is a good sign for the future that the computer industry is now taking active interest in building special purpose high performance computers.

  1. An Integrated Computational Approach to Binding Theory

    OpenAIRE

    Bonato, Roberto

    2006-01-01

    Thesis under Franco-Italian joint supervision; regional grant from the Veneto region. Pronouns play a decisive role in every natural language as the linguistic elements that enable the semantic cohesion of a text. Anaphora resolution (that is, the task of automatically recovering the semantic content of pronouns) is therefore both an important theoretical issue and a major technological challenge for any computer application that aims at a finer-grained semantic analysis of natural language texts. Binding...

  2. A Biogeotechnical approach to Stabilize Soft Marine Soil with a Microbial Organic Material called Biopolymer

    Science.gov (United States)

    Chang, I.; Cho, G. C.; Kwon, Y. M.; Im, J.

    2017-12-01

    The importance of and demand for offshore and coastal area development are increasing due to the shortage of usable land and the need to access valuable marine resources. However, most coastal soils are soft sediments, mainly composed of fines (silt and clay) with high water and organic contents, which induce complicated mechanical and geochemical behaviors and can be insufficient for geotechnical engineering purposes. Soil stabilization procedures are therefore required for these soft sediments, regardless of the intended use of the site. One of the most common soft soil stabilization methods is the use of ordinary cement as a soil strengthening binder. However, the use of cement in marine environments is reported to raise environmental concerns such as pH increase and the accompanying disturbance of marine ecosystems. Therefore, a new environmentally friendly treatment material for coastal and offshore soils is needed. In this study, a biopolymer material produced by microbes is introduced to enhance the physical behavior of a soft tidal flat sediment, by considering the biopolymer rheology, soil mineralogy, and chemical properties of marine water. The biopolymer used in this study forms inter-particle bonds, promoted through cation bridges in which the cations are supplied by marine water. Moreover, biopolymer treatment renders a unique stress-strain relationship in soft soils. The mechanical stiffness (M) increases instantly in the presence of biopolymer, while the time-dependent settlement behavior (consolidation) shows a large delay due to the viscous biopolymer hydrogels in the pore spaces.

  3. A complex network approach to cloud computing

    International Nuclear Information System (INIS)

    Travieso, Gonzalo; Ruggiero, Carlos Antônio; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2016-01-01

    Cloud computing has become an important means to speed up computing. One problem that heavily influences the performance of such systems is the choice of nodes as servers responsible for executing the clients' tasks. In this article we report how complex networks can be used to model such a problem. More specifically, we investigate the processing performance of cloud systems underlain by Erdős–Rényi (ER) and Barabási–Albert (BA) topologies containing two servers. Cloud networks involving two communities, not necessarily of the same size, are also considered in our analysis. The performance of each configuration is quantified in terms of the cost of communication between the client and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter, the ER topology provides better performance than the BA for smaller average degrees, and the opposite behaviour for larger average degrees. With respect to cost, smaller values are found in the BA topology irrespective of the average degree. In addition, we also verified that it is easier to find good servers in ER than in BA networks. Surprisingly, balance and cost are not much affected by the presence of communities. However, for a well-defined community network, we found that it is important to assign each server to a different community so as to achieve better performance.
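
    A reproduction in spirit (not the authors' exact experiment): build ER and BA graphs with networkx, place two servers at random, and compare the mean client-to-nearest-server cost:

    ```python
    # Compare mean client-to-nearest-server distance on ER vs BA graphs with
    # two randomly placed servers (a simplified echo of the study's setup).
    import random
    import networkx as nx

    def mean_cost(G, servers):
        costs = [min(nx.shortest_path_length(G, v, s) for s in servers)
                 for v in G if v not in servers]
        return sum(costs) / len(costs)

    random.seed(1)
    n = 200
    graphs = {"ER": nx.erdos_renyi_graph(n, 0.03, seed=1),   # mean degree ~6
              "BA": nx.barabasi_albert_graph(n, 3, seed=1)}  # mean degree ~6
    for name, G in graphs.items():
        G = G.subgraph(max(nx.connected_components(G), key=len)).copy()
        servers = random.sample(list(G.nodes), 2)
        print(name, "mean cost to nearest server: %.2f" % mean_cost(G, servers))
    ```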

  4. Soft, curved electrode systems capable of integration on the auricle as a persistent brain-computer interface.

    Science.gov (United States)

    Norton, James J S; Lee, Dong Sup; Lee, Jung Woo; Lee, Woosik; Kwon, Ohjin; Won, Phillip; Jung, Sung-Young; Cheng, Huanyu; Jeong, Jae-Woong; Akce, Abdullah; Umunna, Stephen; Na, Ilyoun; Kwon, Yong Ho; Wang, Xiao-Qi; Liu, ZhuangJian; Paik, Ungyu; Huang, Yonggang; Bretl, Timothy; Yeo, Woon-Hong; Rogers, John A

    2015-03-31

    Recent advances in electrodes for noninvasive recording of electroencephalograms expand opportunities for collecting such data for the diagnosis of neurological disorders and for brain-computer interfaces. Existing technologies, however, cannot be used effectively in continuous, uninterrupted modes for more than a few days due to irritation and irreversible degradation in the electrical and mechanical properties of the skin interface. Here we introduce a soft, foldable collection of electrodes in open, fractal mesh geometries that can mount directly and chronically on the complex surface topology of the auricle and the mastoid, to provide high-fidelity and long-term capture of electroencephalograms in ways that avoid any significant thermal, electrical, or mechanical loading of the skin. Experimental and computational studies establish the fundamental aspects of the bending and stretching mechanics that enable this type of intimate integration on the highly irregular and textured surfaces of the auricle. Cell-level tests and thermal imaging studies establish the biocompatibility and wearability of such systems, with examples of high-quality measurements over periods of 2 wk with devices that remain mounted throughout daily activities including vigorous exercise, swimming, sleeping, and bathing. Demonstrations include a text speller with a steady-state visually evoked potential-based brain-computer interface and elicitation of an event-related potential (P300 wave).

  5. The Soft Stowage® catalog: A new approach to procuring space qualified hardware

    Science.gov (United States)

    Smith, David A.

    2000-01-01

    The patented Soft Stowage® Human Space Logistics System has already proven itself within the Shuttle system of reusable carriers, where it has been used extensively to transport cargo both up to and down from the Russian Mir Space Station. For the International Space Station (ISS), however, Boeing wanted to offer a seamless product line providing launch/landing and orbital stowage hardware, as well as associated integration services that reduce the time, documentation, and cost of transporting goods between earth and earth orbit. To meet that objective, Boeing developed a comprehensive Soft Stowage® commercial catalog that offers both fixed pricing and delivery of standard items six weeks from order. The ability to obtain modular stowage accommodation elements through a standardized catalog promises to significantly reduce the cost and time to get payload to orbit. To date, Boeing's Soft Stowage® Catalog has supported delivery of over 600 elements to Spacelab, SPACEHAB, ISS and other payload customers.

  6. Novel Computational Approaches to Drug Discovery

    Science.gov (United States)

    Skolnick, Jeffrey; Brylinski, Michal

    2010-01-01

    New approaches to protein functional inference based on protein structure and evolution are described. First, FINDSITE, a threading based approach to protein function prediction, is summarized. Then, the results of large scale benchmarking of ligand binding site prediction, ligand screening, including applications to HIV protease, and GO molecular functional inference are presented. A key advantage of FINDSITE is its ability to use low resolution, predicted structures as well as high resolution experimental structures. Then, an extension of FINDSITE to ligand screening in GPCRs using predicted GPCR structures, FINDSITE/QDOCKX, is presented. This is a particularly difficult case as there are few experimentally solved GPCR structures. Thus, we first train on a subset of known binding ligands for a set of GPCRs; this is then followed by benchmarking against a large ligand library. For the virtual ligand screening of a number of Dopamine receptors, encouraging results are seen, with significant enrichment in identified ligands over those found in the training set. Thus, FINDSITE and its extensions represent a powerful approach to the successful prediction of a variety of molecular functions.

  7. BASES OF COMPETENCE APPROACH IN COMPUTER SCIENCE TEACHER TRAINING

    Directory of Open Access Journals (Sweden)

    Kateryna R. Kovalska

    2010-08-01

    Full Text Available The article analyses the methodological and theoretical bases of the competence approach. The structure and classification of computer science teacher competences are given.

  8. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to modeling tree crown development: experimental (i.e. regression), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The common assumption is that a tree can be regarded as a fractal object, that is, a collection of self-similar objects that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and of light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model on experimental data. In the paper, the different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling and a variant of the computer model are presented. (author). 9 refs, 4 figs
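
    A toy self-similar branching model in the spirit of the fractal-crown idea; the branching angles and the 0.7 length ratio are illustrative assumptions, not values calibrated to the spruce data:

    ```python
    # Recursive self-similar crown: each branch spawns two daughters scaled by
    # a fixed ratio, so the structure repeats itself at every level.
    import math

    def grow(x, y, angle, length, depth, segments):
        if depth == 0:
            return
        x2 = x + length * math.cos(angle)
        y2 = y + length * math.sin(angle)
        segments.append(((x, y), (x2, y2)))
        for da in (-0.5, 0.5):                    # two daughter branches
            grow(x2, y2, angle + da, 0.7 * length, depth - 1, segments)

    segments = []
    grow(0.0, 0.0, math.pi / 2, 1.0, depth=7, segments=segments)
    print(len(segments), "branch segments")       # 2**7 - 1 = 127
    ```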

  9. Q-P Wave traveltime computation by an iterative approach

    KAUST Repository

    Ma, Xuxin

    2013-01-01

    In this work, we present a new approach to computing anisotropic traveltimes based on successively solving elliptically isotropic traveltime problems. The method shows good accuracy and is very simple to implement.

  10. The fundamentals of computational intelligence system approach

    CERN Document Server

    Zgurovsky, Mikhail Z

    2017-01-01

    This monograph is dedicated to the systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays particular attention to important novel CI technologies: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNN, including a new class of FNN, cascade neo-fuzzy neural networks, are considered, and their training algorithms are described and analyzed. The applications of FNN to forecasting in macroeconomics and stock markets are examined. The book presents the problem of portfolio optimization under uncertainty, the novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, as well as an application for portfolio optimization at Ukrainian, Russian and American stock exchanges. The book also presents the problem of forecasting corporate bankruptcy risk under incomplete and fuzzy information, as well as new methods based on fuzzy set theory and fuzzy neural networks, and results of their application for bankruptcy ris...

  11. Some Properties of Fuzzy Soft Proximity Spaces

    Science.gov (United States)

    Demir, İzzettin; Özbakır, Oya Bedre

    2015-01-01

    We study the fuzzy soft proximity spaces in Katsaras's sense. First, we show how a fuzzy soft topology is derived from a fuzzy soft proximity. Also, we define the notion of fuzzy soft δ-neighborhood in the fuzzy soft proximity space which offers an alternative approach to the study of fuzzy soft proximity spaces. Later, we obtain the initial fuzzy soft proximity determined by a family of fuzzy soft proximities. Finally, we investigate relationship between fuzzy soft proximities and proximities. PMID:25793224

  12. Evaluation of Biological Activity and Computer-Aided Design of New Soft Glucocorticoids.

    Science.gov (United States)

    Dobričić, Vladimir; Jaćević, Vesna; Vučićević, Jelica; Nikolic, Katarina; Vladimirov, Sote; Čudina, Olivera

    2017-05-01

    Soft glucocorticoids are compounds that are biotransformed to inactive and non-toxic metabolites and have fewer side effects than traditional glucocorticoids. A new class of 17β-carboxamide steroids has recently been introduced by our group. In this study, the local anti-inflammatory activity of these derivatives was evaluated using the croton oil-induced ear edema test. Glucocorticoids with the highest maximal edema inhibition (MEI) were singled out, and the systemic side effects of those with the lowest EC50 values were significantly lower in comparison to dexamethasone. A 3D-QSAR model was created and employed for the design of 27 compounds. By use of a sequential combination of ligand-based and structure-based virtual screening, three compounds were selected from the ChEMBL library and used as a starting point for the design of 15 derivatives. Molecular docking analysis of the designed derivatives with the highest predicted MEI and relative glucocorticoid receptor binding affinity (20, 22, 24-1, 25-1, 27, VS7, VS13, and VS14) confirmed the presence of interactions with the glucocorticoid receptor that are important for the activity. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Biologically motivated computationally intensive approaches to image pattern recognition

    NARCIS (Netherlands)

    Petkov, Nikolay

    This paper presents some of the research activities of the group on vision as a grand challenge problem, whose solution is estimated to require the power of Tflop/s computers and for which computational methods have yet to be developed. The approaches concerned are biologically motivated, in

  14. Starting Computer Science Using C++ with Objects: A Workable Approach.

    Science.gov (United States)

    Connolly, Mary V.

    Saint Mary's College (Indiana) offers a minor program in computer science. The program's introductory computer science class traditionally taught Pascal. The decision to change the introductory programming language to C++ with an object-oriented approach was made when it became clear that there were good texts available for beginning students.…

  15. Computational fluid dynamics in ventilation: Practical approach

    Science.gov (United States)

    Fontaine, J. R.

    The potential of computational fluid dynamics (CFD) for designing ventilation systems is shown through the simulation of five practical cases. The following examples are considered: capture of pollutants on a surface-treating tank equipped with a unilateral suction slot in the presence of a disturbing air draft opposed to the suction; dispersion of solid aerosols inside fume cupboards; performance comparison of two general ventilation systems in a silkscreen printing workshop; ventilation of a large open painting area; and oil fog removal inside a mechanical engineering workshop. Whereas the first two problems are analyzed through two-dimensional numerical simulations, the three other cases require three-dimensional modeling. For the surface-treating tank case, numerical results are compared to laboratory experiment data. All simulations are carried out using EOL, a CFD software package specially devised to deal with air quality problems in industrial ventilated premises. It contains many analysis tools for interpreting the results in terms familiar to the industrial hygienist. Much experimental work has been undertaken to validate the predictions of EOL for ventilation flows.

  16. "No zone" approach in penetrating neck trauma reduces unnecessary computed tomography angiography and negative explorations.

    Science.gov (United States)

    Ibraheem, Kareem; Khan, Muhammad; Rhee, Peter; Azim, Asad; O'Keeffe, Terence; Tang, Andrew; Kulvatunyou, Narong; Joseph, Bellal

    2018-01-01

    The most recent management guidelines advocate computed tomography angiography (CTA) for any suspected vascular or aero-digestive injuries in all zones and give zone II injuries special consideration. We hypothesized that physical examination can safely guide CTA use in a "no zone" approach. An 8-year retrospective analysis of all adult trauma patients with penetrating neck trauma (PNT) was performed. We included all patients in whom the platysma was violated. Patients were classified into three groups as follows: hard signs, soft signs, and asymptomatic. CTA use, positive CTA (contrast extravasation, dissection, or intimal flap) and operative details were reported. Primary outcomes were positive CTA and therapeutic neck exploration (TNE) (defined by repair of major vascular or aero-digestive injuries). A total of 337 patients with PNT met the inclusion criteria. Eighty-two patients had hard signs and all of them went to the operating room, of which 59 (72%) had TNE. One hundred fifty-six patients had soft signs, of which CTA was performed in 121 (78%), with positive findings in 12 (10%) patients. The remaining 35 (22%) underwent initial neck exploration, of which 14 (40%) were therapeutic, yielding a high rate of negative exploration. Ninety-nine patients were asymptomatic, of which CTA was performed in 79 (80%), with positive findings in 3 (4%); however, none of these patients required TNE. On subanalysis based on symptoms, there was no difference in the rate of TNE between the neck zones in patients with hard signs (P = 0.23) or soft signs (P = 0.51). Regardless of the zone of injury, asymptomatic patients did not require TNE. Physical examination, regardless of the zone of injury, should be the primary guide to CTA or TNE in patients with PNT. Following traditional zone-based guidelines can result in unnecessary negative explorations in patients with soft signs and may need rethinking. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. STANDARDISED CLINICAL EXAMINATION OF SOFT-TISSUE PAIN IN PATIENTS WITH HIP DYSPLASIA USING THE CLINICAL ENTITIES APPROACH

    DEFF Research Database (Denmark)

    Jacobsen, Julie Sandell; Hölmich, Per; Thorborg, Kristian

    2016-01-01

    The aim of this study was to investigate five clinical entities in 100 patients with hip dysplasia using the clinical entities approach, identifying the anatomic location of soft-tissue pain with a ... and reliable protocol. The first 50 patients are presented in this paper. Material and Methods: Fifty patients (10 males, 40 females), with a median age of 26 (15-49) years, were included (Table 1). The standardised examination protocol included evaluation of "known" pain in the muscles, tendons and at their insertion points, provoked by palpation, contraction or stretching (Figure 1-5). Clinical entities were predefined ... Results: ... and 12% (n=6) in the hip adductors. Hamstrings and rectus abdominis entities were less common, with a prevalence of 4% (n=2) and 0% (n=0), respectively. The clinical entities are reported in Table 3. Conclusion: Clinical entities suggestive of soft-tissue pathology in the hip region are common, with a high ...

  18. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis and in evaluating clusters in education and finance, and more recently in the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and of correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis provides an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.
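
    A simplified sketch of correlation-matrix-based anomaly detection of the kind mentioned above; the four traffic features and the synthetic "attack" are hypothetical:

    ```python
    # Compare each traffic window's correlation matrix to a baseline matrix;
    # a large shift in any pairwise correlation flags an anomaly. The four
    # "features" (e.g. packets, bytes, flows, ports) are stand-ins.
    import numpy as np

    rng = np.random.default_rng(0)
    baseline = rng.normal(size=(500, 4))
    R0 = np.corrcoef(baseline, rowvar=False)

    def anomaly_score(window):
        R = np.corrcoef(window, rowvar=False)
        return np.abs(R - R0).max()

    normal = rng.normal(size=(100, 4))
    attack = rng.normal(size=(100, 4))
    attack[:, 1] = attack[:, 0] * 5               # scan-like burst couples two features
    print("normal: %.2f  attack: %.2f" % (anomaly_score(normal), anomaly_score(attack)))
    ```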

  19. Aluminium in Biological Environments: A Computational Approach

    Science.gov (United States)

    Mujika, Jon I; Rezabal, Elixabete; Mercero, Jose M; Ruipérez, Fernando; Costa, Dominique; Ugalde, Jesus M; Lopez, Xabier

    2014-01-01

    The increased availability of aluminium in biological environments, due to human intervention in the last century, raises concerns about the effects that this so far "excluded from biology" metal might have on living organisms. Consequently, the bioinorganic chemistry of aluminium has emerged as a very active field of research. This review will focus on our contributions to this field, based on computational studies that can yield an understanding of aluminium biochemistry at the molecular level. Aluminium can interact with, and be stabilized in, biological environments by complexing with both low-molecular-mass chelants and high-molecular-mass peptides. The speciation of the metal is, nonetheless, dictated by the hydrolytic species dominant in each case, which vary according to the pH of the medium. In blood, citrate and serum transferrin are identified as the main low-molecular-mass and high-molecular-mass molecules interacting with aluminium. The complexation of aluminium by citrate and the subsequent changes exerted on the deprotonation pathways of its titratable groups will be discussed, along with the mechanisms for the intake and release of aluminium in serum transferrin at two pH conditions, physiological neutral and endosomal acidic. Aluminium can substitute for other metals, in particular magnesium, in buried protein sites and trigger conformational disorder and alteration of the protonation states of the protein's side chains. A detailed account of the interaction of aluminium with protein side chains will be given. Finally, it will be described how aluminium can exert oxidative stress by stabilizing superoxide radicals, either as mononuclear aluminium or clustered in boehmite. The possibility of promotion of the Fenton reaction, and production of hydroxyl radicals, will also be discussed. PMID:24757505

  20. A New Soft Computing Method for K-Harmonic Means Clustering.

    Science.gov (United States)

    Yeh, Wei-Chang; Jiang, Yunzhi; Chen, Yee-Fen; Chen, Zhe

    2016-01-01

    The K-harmonic means clustering algorithm (KHM) is a new clustering method used to group data such that the sum of the harmonic averages of the distances between each entity and all cluster centroids is minimized. Because it is less sensitive to initialization than K-means (KM), many researchers have recently been attracted to studying KHM. In this study, the proposed iSSO-KHM is based on an improved simplified swarm optimization (iSSO) and integrates a variable neighborhood search (VNS) for KHM clustering. As evidence of the utility of the proposed iSSO-KHM, we present extensive computational results on eight benchmark problems. From the computational results, the comparison appears to support the superiority of the proposed iSSO-KHM over previously developed algorithms for all experiments in the literature.
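
    For reference, a plain K-harmonic means iteration (the baseline that iSSO-KHM improves on; the iSSO and VNS components are not reproduced here), using the standard KHM membership and weight updates on synthetic 2-D data:

    ```python
    # Plain K-harmonic means: minimize the sum over points of the harmonic
    # average of distances to the k centroids, via the standard KHM updates.
    import numpy as np

    def khm(X, k, p=3.5, iters=50, seed=0):
        rng = np.random.default_rng(seed)
        C = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            D = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2) + 1e-8
            inv = D**(-p - 2)
            m = inv / inv.sum(axis=1, keepdims=True)          # memberships
            w = inv.sum(axis=1) / (D**(-p)).sum(axis=1)**2    # point weights
            mw = m * w[:, None]
            C = (mw[:, :, None] * X[:, None, :]).sum(axis=0) / mw.sum(axis=0)[:, None]
        D = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2) + 1e-8
        objective = (k / (1.0 / D**p).sum(axis=1)).sum()      # KHM performance function
        return C, objective

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(loc, 0.5, size=(60, 2)) for loc in (0.0, 4.0, 8.0)])
    C, obj = khm(X, k=3)
    print(np.sort(C[:, 0]).round(1), round(obj, 1))           # centroids near 0, 4, 8
    ```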

  1. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  2. What is Intrinsic Motivation? A Typology of Computational Approaches.

    Science.gov (United States)

    Oudeyer, Pierre-Yves; Kaplan, Frederic

    2007-01-01

    Intrinsic motivation, centrally involved in spontaneous exploration and curiosity, is a crucial concept in developmental psychology. It has been argued to be a key mechanism for open-ended cognitive development in humans, and as such has gathered growing interest from developmental roboticists in recent years. The goal of this paper is threefold. First, it provides a synthesis of the different approaches to intrinsic motivation in psychology. Second, by interpreting these approaches within a computational reinforcement learning framework, we argue that they are not operational and are even sometimes inconsistent. Third, we set the ground for a systematic operational study of intrinsic motivation by presenting a formal typology of possible computational approaches. This typology is partly based on existing computational models, but also presents new ways of conceptualizing intrinsic motivation. We argue that this kind of computational typology might be useful for opening new avenues for research both in psychology and in developmental robotics.

  3. Propagation of computer virus both across the Internet and external computers: A complex-network approach

    Science.gov (United States)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi; Jin, Jian; He, Li

    2014-08-01

    Based on the assumption that external computers (particularly infected external computers) are connected to the Internet, and by considering the influence of the Internet topology on computer virus spreading, this paper establishes a novel computer virus propagation model with a complex-network approach. This model possesses a unique (viral) equilibrium which is globally attractive. Some numerical simulations are also given to illustrate this result. Further study shows that computers with higher node degrees are more susceptible to infection than those with lower node degrees. In this regard, some appropriate protective measures are suggested.
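
    A toy discrete-time susceptible-infected-susceptible simulation on a scale-free graph, illustrating the qualitative finding that high-degree nodes are more exposed; this is a simplification, not the paper's differential-equation model:

    ```python
    # Discrete-time SIS-style toy on a Barabasi-Albert graph: hubs should spend
    # more time infected, echoing the degree dependence reported above.
    import random
    import networkx as nx

    random.seed(0)
    G = nx.barabasi_albert_graph(500, 3, seed=0)
    beta, delta = 0.05, 0.1                       # infection / cure probabilities
    infected = set(random.sample(list(G.nodes), 10))
    time_infected = {v: 0 for v in G}

    for _ in range(500):
        nxt = set(infected)
        for v in G:
            if v in infected:
                if random.random() < delta:
                    nxt.discard(v)
            elif any(u in infected for u in G[v]) and random.random() < beta:
                nxt.add(v)
        infected = nxt
        for v in infected:
            time_infected[v] += 1

    hubs = sorted(G, key=G.degree, reverse=True)[:25]
    leaves = sorted(G, key=G.degree)[:25]
    avg = lambda vs: sum(time_infected[v] for v in vs) / len(vs)
    print("hubs: %.0f steps infected, leaves: %.0f" % (avg(hubs), avg(leaves)))
    ```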

  4. A methodological approach to assessing alveolar ridge preservation procedures in humans: soft tissue profile.

    Science.gov (United States)

    Vanhoutte, Vanessa; Rompen, Eric; Lecloux, Geoffrey; Rues, Stefan; Schmitter, Marc; Lambert, France

    2014-03-01

    The aesthetic results of implant restoration in the anterior maxilla are particularly related to the soft tissue profile. Although socket preservation techniques appear to reduce bone remodelling after tooth extraction, there are still few investigations assessing the external soft tissue profile after such procedures. The goal of this study was to describe an accurate technique to evaluate soft tissue contour changes after socket preservation procedures. The secondary objective was to apply the newly developed measuring method to a specific socket preservation using a "saddled" connective tissue graft combined with the insertion of slowly resorbable biomaterials into the socket. A total of 14 patients needing tooth replacement in the aesthetic region were included to receive a socket preservation procedure using a connective tissue graft. Impressions were taken before the tooth extraction (baseline) and at 2, 4, and 12 weeks after the procedure. The corresponding plaster casts were scanned, and the evolution of the soft tissue profile relative to the baseline situation was assessed using imaging software. The measuring technique allowed accurate assessment of the soft tissue profiles at different levels of the alveolar process. The insertion of a saddled connective tissue graft appeared to compensate for the horizontal and vertical bone remodelling after a socket preservation procedure in most regions of the alveolar crest. After 12 weeks, the only significant change was located in the more cervical and central region of the alveolar process and reached a median drop of 0.62 mm from baseline. Within the limitations of this study, we found that a saddled connective tissue graft combined with a socket preservation procedure could almost completely counteract the bone remodelling in terms of the external soft tissue profile. The minor changes found in the cervical region might disappear with the emergence profile of the prosthodontic components. The described

  5. Outcomes of percutaneous endoscopic lumbar discectomy via a translaminar approach, especially for soft, highly down-migrated lumbar disc herniation.

    Science.gov (United States)

    Du, Jianwei; Tang, Xiangyu; Jing, Xin; Li, Ningdao; Wang, Yan; Zhang, Xifeng

    2016-06-01

    This study reports a new approach for percutaneous endoscopic lumbar discectomy (PELD), especially for soft, highly down-migrated lumbar disc herniation. Seven patients with soft, highly down-migrated lumbar disc herniation who underwent PELD via a translaminar approach under local anaesthesia from January 2013 to June 2015, including five patients who underwent failed PELD in other hospitals, were retrospectively analyzed. Clinical outcomes were evaluated according to pre-operative and post-operative visual analogue scale (VAS) scores, Oswestry disability index (ODI) scores and post-operative magnetic resonance imaging (MRI). The highly down-migrated lumbar disc herniation was completely removed by PELD via a translaminar approach in seven patients, as confirmed by post-operative MRI. Leg pain was eased after removal of the disc migrations. The mean follow-up duration was 9.8 (range, 6-14) months. The mean pre-operative VAS was 7.6 ± 0.8 (range, 6-9), which decreased to 3.1 ± 1.5 (range, 2-5) at one week post-operatively and to 1.3 ± 0.8 (range, 0-3) by the last follow-up visit. The mean pre-operative ODI was 61.6 (range, 46-84), which decreased to 16.3 (range, 10-28) at the one month post-operative follow-up and to 8.4 (range, 0-14) by the last follow-up visit. No recurrence was observed in any of the seven patients during the follow-up period. PELD via a translaminar approach could be a good alternative option for the treatment of soft, highly down-migrated lumbar disc herniation.

  6. A Cryogenic 1 GSa/s, Soft-Core FPGA ADC for Quantum Computing Applications

    NARCIS (Netherlands)

    Homulle, H.A.R.; Charbon, E.E.E.

    2016-01-01

    We propose an analog-to-digital converter (ADC) architecture, implemented in an FPGA, that is fully reconfigurable and easy to calibrate. This approach allows the design to be altered, according to the system requirements, with simple modifications in the firmware. Therefore it can be used in a wide

  7. An Integrated Computer-Aided Approach for Environmental Studies

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Chen, Fei; Jaksland, Cecilia

    1997-01-01

    A general framework for an integrated computer-aided approach to solving process design, control, and environmental problems simultaneously is presented. Physicochemical properties and their relationships to molecular structure play an important role in the proposed integrated approach. The scope and applicability of the integrated approach are highlighted through examples involving the estimation of properties and environmental pollution prevention. The importance of mixture effects on some environmentally important properties is also demonstrated.

  8. Soft-information flipping approach in multi-head multi-track BPMR systems

    Science.gov (United States)

    Warisarn, C.; Busyatras, W.; Myint, L. M. M.

    2018-05-01

    Inter-track interference is one of the most severe impairments in bit-patterned media recording systems. This impairment can be effectively handled by a modulation code and a multi-head array jointly processing multiple tracks; however, such a modulation constraint has never been utilized to improve the soft-information. Therefore, this paper proposes the utilization of modulation codes with an encoded constraint, defined by criteria for soft-information flipping, during a three-track data detection process. Moreover, we also investigate the optimal offset position of the read heads to provide the greatest improvement in system performance. The simulation results indicate that the proposed system, both with and without position jitter, is significantly superior to the uncoded system.

  9. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thinking. We present two main theses on which the subject is based, and we present the included knowledge areas and didactical design principles. Finally, we summarize the status of and future plans for the subject and related development projects.

  10. A staged approach of implant placement in immediate extraction sockets for preservation of peri-implant soft and hard tissue.

    Science.gov (United States)

    Vinnakota, Dileep Nag; Akula, Sreenivasa Rao; Krishna Reddy, V Vamsi; Sankar, V Vijay

    2014-03-01

    Esthetic zone restoration is a challenging aspect of implant dentistry because of two critical factors: the level of bone support and the soft tissue dimensions. Preservation of healthy peri-implant tissues is of primary importance for ensuring good esthetics over an extended period. The aim of the present case series was to evaluate a new staged approach of implant placement in immediate extraction sockets for the preservation of peri-implant soft and hard tissues. Four subjects scheduled for extraction of teeth in the esthetic zone, with neither periapical nor periodontal infection and with a thick tissue biotype, were included. For all subjects, a sandblasted, large-grit, acid-etched, platform-switched implant with a diameter 2 mm less than that of the extraction socket and a conical abutment-implant connection (Morse taper) was placed 2 mm below the crest of the socket, leaving roughly a 2 mm gap between the labial plate and the implant, with the shoulder placed palatally/lingually. The implants were loaded after a 2-month healing period and followed for a period of 1-2 years. In all four patients, both hard and soft tissues around the implant were preserved, with a good esthetic outcome at all follow-up visits. By integrating immediate placement with a stable implant-abutment connection, the platform-switching concept and careful case selection, a very good esthetic outcome can be achieved.

  11. A Hybrid Soft-computing Method for Image Analysis of Digital Plantar Scanners

    Science.gov (United States)

    Razjouyan, Javad; Khayat, Omid; Siahi, Mehdi; Mansouri, Ali Alizadeh

    2013-01-01

    Digital foot scanners have been developed in recent years to provide anthropometrists with digital images of the insole, together with pressure distribution and anthropometric information. In this paper, a hybrid algorithm combining the gray-level spatial correlation (GLSC) histogram and Shanbhag entropy is presented for the analysis of scanned foot images. An evolutionary algorithm is also employed to find the optimum parameters of the GLSC and the transform function of the membership values. The resulting binary (thresholded) images then undergo anthropometric measurements, taking into account the scale factor from pixel size to metric scale. The proposed method is finally applied to plantar images obtained by scanning the feet of randomly selected subjects with the foot scanner system used as our experimental setup, described in the paper. The running computation time and the effects of the GLSC parameters are investigated in the simulation results. PMID:24083133
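
    As a stand-in for the GLSC/Shanbhag pipeline (whose exact formulation is not reproduced here), a basic maximum-entropy (Kapur-style) threshold on a grey-level histogram shows the kind of entropy-based binarization involved:

    ```python
    # Maximum-entropy threshold: pick the grey level that maximizes the summed
    # entropies of the foreground and background histogram classes.
    import numpy as np

    def max_entropy_threshold(values, levels=256):
        hist, _ = np.histogram(values, bins=levels, range=(0, levels))
        p = hist / hist.sum()
        best_t, best_h = 0, -np.inf
        for t in range(1, levels - 1):
            w0, w1 = p[:t].sum(), p[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            p0, p1 = p[:t] / w0, p[t:] / w1
            h = -(p0[p0 > 0] * np.log(p0[p0 > 0])).sum() \
                - (p1[p1 > 0] * np.log(p1[p1 > 0])).sum()
            if h > best_h:
                best_t, best_h = t, h
        return best_t

    rng = np.random.default_rng(0)
    pixels = np.clip(np.concatenate([rng.normal(60, 10, 2000),
                                     rng.normal(180, 15, 2000)]), 0, 255)
    print("threshold:", max_entropy_threshold(pixels))   # falls between the modes
    ```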

  12. Soft-tissue perineurioma of the retroperitoneum in a 63-year-old man, computed tomography and magnetic resonance imaging findings: a case report

    Directory of Open Access Journals (Sweden)

    Yasumoto Mayumi

    2010-08-01

    Full Text Available Abstract. Introduction: Soft-tissue perineuriomas are rare benign peripheral nerve sheath tumors arising in the subcutis of the extremities and trunk of young patients. To our knowledge, this is the first presentation of the computed tomography and magnetic resonance imaging findings of a soft-tissue perineurioma in the retroperitoneum with pathologic correlation. Case presentation: A 63-year-old Japanese man was referred for assessment of high blood pressure. Abdominal computed tomography and magnetic resonance imaging showed a well-defined, gradually enhancing tumor without focal degeneration or hemorrhage adjacent to the pancreatic body. Tumor excision with distal pancreatectomy and splenectomy was performed, as a malignant tumor of pancreatic origin could not be ruled out. No recurrence has been noted in the 16 months since the operation. Pathologic examination of the tumor revealed a soft-tissue perineurioma of the retroperitoneum. Conclusion: Although the definitive diagnosis of soft-tissue perineurioma requires biopsy and evaluation of immunohistochemical reactivity, the computed tomography and magnetic resonance imaging findings described in this report suggest inclusion of this rare tumor in the differential diagnosis when such findings occur in the retroperitoneum.

  13. Soft x-ray continuum radiation transmitted through metallic filters: An analytical approach to fast electron temperature measurements

    International Nuclear Information System (INIS)

    Delgado-Aparicio, L.; Hill, K.; Bitter, M.; Tritz, K.; Kramer, T.; Stutman, D.; Finkenthal, M.

    2010-01-01

    A new set of analytic formulas describes the transmission of soft x-ray continuum radiation through a metallic foil for application to fast electron temperature measurements in fusion plasmas. This novel approach shows good agreement with numerical calculations over a wide range of plasma temperatures, in contrast with the solutions obtained when using a transmission approximated by a single Heaviside function [S. von Goeler et al., Rev. Sci. Instrum. 70, 599 (1999)]. The new analytic formulas can improve the interpretation of the experimental results and thus contribute to obtaining fast temperature measurements in between intermittent Thomson scattering data.

  14. The Formal Approach to Computer Game Rule Development Automation

    OpenAIRE

    Elena, A.

    2009-01-01

    Computer game rules development is one of the weakly automated tasks in game development. This paper gives an overview of an ongoing research project which deals with the automation of rules development for turn-based strategy computer games. Rules are the basic elements of these games. This paper proposes a new approach to automation, including visual formal rules model creation, model verification and model-based code generation.

  15. The process group approach to reliable distributed computing

    Science.gov (United States)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  16. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

    Science.gov (United States)

    Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed

    2017-05-01

    Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely in engineering and science problems. Water resource variable modeling and prediction are among the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach used to tackle this problem by way of viable and efficient models. Numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed, and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, coupling procedures, model evaluation, and performance comparison of hybrid models with conventional ANN models are assessed, as well as the taxonomy and structures of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated and new hybrid approaches are proposed.

  17. An approach to computing direction relations between separated object groups

    Science.gov (United States)

    Yan, H.; Wang, Z.; Li, J.

    2013-09-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups, and then constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. Psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and that the results are correct from the point of view of spatial cognition.
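
    A naive stand-in, not the authors' Voronoi-based model: bin the bearings from every point of one group to every point of the other into eight cones and report the dominant qualitative direction, in the multi-direction spirit described above:

    ```python
    # Vote the bearing from each point of group A to each point of group B into
    # one of eight 45-degree cones; the cone with the most votes wins.
    import math

    DIRS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

    def direction_relation(A, B):
        votes = [0] * 8
        for ax, ay in A:
            for bx, by in B:
                ang = math.atan2(by - ay, bx - ax) % (2 * math.pi)
                votes[int(((ang + math.pi / 8) % (2 * math.pi)) // (math.pi / 4))] += 1
        return DIRS[max(range(8), key=votes.__getitem__)]

    A = [(0, 0), (1, 0), (0, 1)]
    B = [(5, 5), (6, 5), (5, 6)]
    print(direction_relation(A, B))   # NE
    ```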

  18. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of the secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of a teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur, including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  19. Soft computing for modeling punching shear of reinforced concrete flat slabs

    Directory of Open Access Journals (Sweden)

    Iyad Alkroosh

    2015-06-01

    Full Text Available This paper presents the application of the gene expression programming (GEP) approach to predicting the punching shear strength of normal- and high-strength reinforced concrete flat slabs. The GEP model was developed and verified using 58 case histories involving measured punching shear strength. The modeling was carried out by dividing the data into two sets: a training set for model calibration and a validation set for verifying the generalization capability of the model. It is shown that the model is able to learn, with high accuracy, the complex relationship between the punching shear strength and the factors affecting it, and reproduces this knowledge in the form of a function. The results demonstrate that the GEP model performs very well, with coefficient of determination, mean, standard deviation and probability density at 50% equal to 0.98, 0.99, 0.10 and 0.99, respectively. Moreover, the GEP predicts punching shear strength more accurately than the traditional methods.
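
    A short sketch of the evaluation statistics quoted above (coefficient of determination and the mean and standard deviation of the predicted-to-measured ratio), applied to placeholder values rather than the paper's data:

    ```python
    # Evaluation statistics of the kind reported above, on placeholder
    # punching-shear values (kN); not the study's dataset.
    import numpy as np

    def evaluate(measured, predicted):
        ratio = predicted / measured
        ss_res = ((measured - predicted) ** 2).sum()
        ss_tot = ((measured - measured.mean()) ** 2).sum()
        return {"R2": 1 - ss_res / ss_tot,
                "mean_ratio": ratio.mean(),
                "std_ratio": ratio.std()}

    measured = np.array([420.0, 515.0, 610.0, 380.0, 700.0])
    predicted = measured * np.random.default_rng(0).normal(1.0, 0.05, 5)
    for name, value in evaluate(measured, predicted).items():
        print("%s: %.3f" % (name, value))
    ```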

  20. Estimation of Rivers Dissolved Solids TDS by Soft Computing (Case Study: Upstream of Boukan Dam

    Directory of Open Access Journals (Sweden)

    S. Zaman Zad Ghavidel

    2017-01-01

    layer with five inputs, one hidden and output layer with three and two neurons for the Anyan and Safakhaneh hydrometric stations, respectively. Similar to the ANN, the ANFIS-SC5 model had the best performance. It is clear that the ANFIS with radii values of 0.4 and 0.7 has the highest R and the lowest RMSE for the Anyan and Safakhaneh hydrometric stations, respectively. Various GEP models were developed using input combinations similar to those of the ANN and ANFIS models. Comparing the GEP5 estimates with the measured data for the test stage demonstrates a high generalization capacity of the model, with relatively low error and high correlation. From the scatter plots it is clearly seen that the GEP5 predictions are closer to the corresponding measured TDS than those of the other models, and that in the best straight-line equations (taking the equation as y=ax) the coefficient a for GEP5 is closer to 1 than for the other models. In addition, gene expression programming offered mathematical relationships for the Anyan and Safakhaneh stations with correlation coefficients of 0.962 and 0.971 and root-mean-square errors of 12.82 and 29.08, respectively, for predicting total dissolved solids (TDS) in the rivers located upstream of the dam. The obtained results showed the efficiency of the applied models in simulating the nonlinear behavior of TDS variations in terms of performance indices. Overall, the GEP model outperformed the other models. For all of the applied models, the best result was obtained with input combination (5), including HCO3, Ca, Na, Q and Mg. The results were also tested using the t test to verify the robustness of the models at the 95% significance level. Comparison results indicated that the poorest model in TDS simulation was the ANN, especially in the test period. The observed relationship between residuals and model-computed TDS values shows complete independence and random distribution. It is further supported by the respective

  1. Strength development in concrete with wood ash blended cement and use of soft computing models to predict strength parameters

    Directory of Open Access Journals (Sweden)

    S. Chowdhury

    2015-11-01

    Full Text Available In this study, Wood Ash (WA) prepared from the uncontrolled burning of saw dust is evaluated for its suitability as a partial cement replacement in conventional concrete. The saw dust was acquired from a wood polishing unit. The physical, chemical and mineralogical characteristics of WA are presented and analyzed. The strength parameters (compressive strength, split tensile strength and flexural strength) of concrete with blended WA cement are evaluated and studied. Two different water-to-binder ratios (0.40 and 0.45) and five different replacement percentages of WA (5%, 10%, 15%, 18% and 20%), including control specimens for both ratios, are considered. The results for compressive strength, split tensile strength and flexural strength showed that the strength properties of the concrete mixtures decreased marginally with increasing wood ash content, but increased at later ages. The XRD test results and chemical analysis of WA showed that it contains amorphous silica and thus can be used as a cement-replacing material. Through the analysis of the results obtained in this study, it was concluded that WA can be blended with cement without adversely affecting the strength properties of concrete. Also, using the statistical learning theory of the Support Vector Machine (SVM), the strength parameters were predicted by developing a suitable model; as a result, the application of soft computing in structural engineering is successfully presented in this research paper.
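
    A hedged sketch of an SVM regression model of the kind described, with hypothetical inputs (WA replacement percentage, water-to-binder ratio, curing age) and synthetic strengths standing in for the experimental data:

    ```python
    # SVR strength model on synthetic mix data; coefficients in the synthetic
    # strength formula are illustrative, not fitted to the study's results.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = np.column_stack([rng.uniform(0, 20, 200),        # WA replacement, %
                         rng.choice([0.40, 0.45], 200),  # water-to-binder ratio
                         rng.choice([7, 28, 90], 200)])  # curing age, days
    y = 45 - 0.4 * X[:, 0] - 30 * (X[:, 1] - 0.40) + 0.1 * X[:, 2] \
        + rng.normal(0, 1, 200)                          # synthetic strength, MPa

    model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5))
    model.fit(X, y)
    print("predicted 28-day strength at 10%% WA, w/b 0.40: %.1f MPa"
          % model.predict([[10.0, 0.40, 28.0]])[0])
    ```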

  2. Time series analysis of reference crop evapotranspiration using soft computing techniques for Ganjam District, Odisha, India

    Science.gov (United States)

    Patra, S. R.

    2017-12-01

    minimization principle. The reliability of these computational models was analysed in light of the simulation results, and it was found that the SVM model produces the best results among the three. Future research should aim to extend the validation data set and to check the validity of these results in different areas with hybrid intelligence techniques.

  3. Comparative Analysis of Soft Computing Models in Prediction of Bending Rigidity of Cotton Woven Fabrics

    Science.gov (United States)

    Guruprasad, R.; Behera, B. K.

    2015-10-01

    Quantitative prediction of fabric mechanical properties is an essential requirement for the design engineering of textile and apparel products. In this work, the possibility of predicting the bending rigidity of cotton woven fabrics has been explored with the application of an Artificial Neural Network (ANN) and two hybrid methodologies, namely neuro-genetic modeling and Adaptive Neuro-Fuzzy Inference System (ANFIS) modeling. For this purpose, a set of cotton woven grey fabrics was desized, scoured and relaxed. The fabrics were then conditioned and tested for bending properties. With the database thus created, a neural network model was first developed using back-propagation as the learning algorithm. The second model was developed by applying a hybrid learning strategy, in which a genetic algorithm was first used to optimize the number of neurons and the connection weights of the neural network; the GA-optimized network structure was then trained further using the back-propagation algorithm. In the third model, an ANFIS modeling approach was used to map the input-output data. The prediction performances of the models were compared and a sensitivity analysis is reported. The results show that the predictions of the neuro-genetic and ANFIS models were better than those of the back-propagation neural network model.
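
    A toy "neuro-genetic" step, an illustration rather than the authors' code: an evolutionary-style search over hidden-layer sizes in which each candidate network is then trained by back-propagation; the data are synthetic stand-ins for fabric parameters:

    ```python
    # Evolutionary-style search over hidden-layer sizes; each candidate network
    # is trained by back-propagation and scored by cross-validated R^2.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 4))        # stand-ins for fabric construction parameters
    y = X @ np.array([0.5, -0.2, 0.9, 0.1]) + rng.normal(0.0, 0.1, 120)

    def fitness(hidden):
        net = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=2000, random_state=0)
        return cross_val_score(net, X, y, cv=3).mean()

    population = [int(h) for h in rng.integers(2, 20, size=6)]
    for _ in range(3):                   # tiny generational loop
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[:3]
        children = [max(2, h + int(rng.integers(-2, 3))) for h in parents]  # mutation
        population = parents + children
    print("best hidden-layer size:", max(population, key=fitness))
    ```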

  4. Soft tissue sarcomas: From a morphological to a molecular biological approach.

    Science.gov (United States)

    Oda, Yoshinao; Yamamoto, Hidetaka; Kohashi, Kenichi; Yamada, Yuichi; Iura, Kunio; Ishii, Takeaki; Maekawa, Akira; Bekki, Hirofumi

    2017-09-01

    Recently developed molecular genetic techniques have led to the elucidation of tumor-specific genomic alterations and thereby the reclassification of tumor entities of soft tissue sarcoma. A solitary fibrous tumor-mimicking tumor harboring the AHRR-NCOA2 fusion gene has been isolated as angiofibroma of soft tissue. As for small round cell sarcomas, novel fusion genes such as CIC-DUX4 and BCOR-CCNB3 have been identified in these tumor groups. SMARCB1/INI1-deficient tumors with round cell morphology are also expected to be reclassified into three types, based on the combination of their morphology and genotype. The identification of MDM2 gene amplification in pleomorphic sarcomas has extended the entity of dedifferentiated liposarcoma (DDLS). Our recent molecular investigations have elucidated candidates for novel therapeutic strategies. Activation of the Akt-mTOR pathway was correlated with poor prognosis or tumor grade in spindle cell sarcomas, including malignant peripheral nerve sheath tumor. In vitro and in vivo studies of the transcription factor Forkhead Box M1 (FOXM1) have so far demonstrated a close correlation between aggressive biological behavior or chemosensitivity and FOXM1 expression in synovial sarcoma. Finally, with regard to the investigation of cancer-testis antigens, myxoid/round cell liposarcoma and synovial sarcoma showed frequent and high expression of PRAME and NY-ESO-1. © 2017 Japanese Society of Pathology and John Wiley & Sons Australia, Ltd.

  5. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer-based system for data acquisition, analysis and graphic displays relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low-cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  6. Developing students' worksheets applying soft skill-based scientific approach for improving building engineering students' competencies in vocational high schools

    Science.gov (United States)

    Suparno, Sudomo, Rahardjo, Boedi

    2017-09-01

    Experts and practitioners agree that the quality of vocational high schools needs to be greatly improved. Many construction services have voiced their dissatisfaction with today's low-quality vocational high school graduates. The low quality of graduates is closely related to the quality of the teaching and learning process, particularly teaching materials. In their efforts to improve the quality of vocational high school education, the government has implemented Curriculum 2013 (K13) and supplied teaching materials. However, according to the monitoring and evaluation carried out by the Directorate of Vocational High School, Directorate General of Secondary Education (2014), the provision of tasks for students in the teaching materials was totally inadequate. Therefore, to enhance the quality and results of the instructional process, students' worksheets should be provided that can stimulate and improve students' problem-solving skills and soft skills. In order to develop worksheets that meet the academic requirements, the development needs to be in accordance with an innovative learning approach, namely the soft skill-based scientific approach.

  7. Computer and Internet Addiction: Analysis and Classification of Approaches

    Directory of Open Access Journals (Sweden)

    Zaretskaya O.V.

    2017-08-01

    A theoretical analysis of modern research on the problem of computer and Internet addiction is carried out, and the main features of the different approaches are outlined. An attempt is made to systematize the research conducted and to classify the scientific approaches to the problem of Internet addiction. The author distinguishes nosological, cognitive-behavioral, socio-psychological and dialectical approaches, and justifies the need for an approach that corresponds to the essence, goals and tasks of social psychology when researching problems such as Internet addiction and dependent behavior in general. In the author's view, this dialectical approach integrates the experience of research within the socio-psychological framework and focuses on the observed inconsistencies in the phenomenon of Internet addiction, such as the compensatory nature of Internet activity when people who immerse themselves in the Internet are in a dysfunctional life situation.

  8. New Theoretical Approaches for Human-Computer Interaction.

    Science.gov (United States)

    Rogers, Yvonne

    2004-01-01

    Presents a critique of recent theoretical developments in the field of human-computer interaction (HCI) together with an overview of HCI practice. This chapter discusses why theoretically based approaches have had little impact on the practice of interaction design and suggests mechanisms to enable designers and researchers to better articulate…

  9. Pedagogical Approaches to Teaching with Computer Simulations in Science Education

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael

    2013-01-01

    For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is

  10. Conformational dynamics of proanthocyanidins: physical and computational approaches

    Science.gov (United States)

    Fred L. Tobiason; Richard W. Hemingway; T. Hatano

    1998-01-01

    The interaction of plant polyphenols with proteins accounts for a good part of their commercial (e.g., leather manufacture) and biological (e.g., antimicrobial activity) significance. The interplay between observations of physical data such as crystal structure, NMR analyses, and time-resolved fluorescence with results of computational chemistry approaches has been...

  11. A computational approach to mechanistic and predictive toxicology of pesticides

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Vinggaard, Anne Marie; Hadrup, Niels

    2014-01-01

    Emerging challenges of managing and interpreting large amounts of complex biological data have given rise to the growing field of computational biology. We investigated the applicability of an integrated systems toxicology approach on five selected pesticides to get an overview of their modes of ...

  12. Protocol Processing for 100 Gbit/s and Beyond - A Soft Real-Time Approach in Hardware and Software

    Science.gov (United States)

    Büchner, Steffen; Lopacinski, Lukasz; Kraemer, Rolf; Nolte, Jörg

    2017-09-01

    Protocol processing for 100 Gbit/s wireless communication stresses all parts of a communication system to the utmost. The efficient use of upcoming transmission technology at 100 Gbit/s and beyond requires rethinking the way protocols are processed by the communication endpoints. This paper summarizes the achievements of the project End2End100. We present a comprehensive soft real-time stream processing approach that allows the protocol designer to develop, analyze, and plan scalable protocols for ultra-high data rates of 100 Gbit/s and beyond. Furthermore, we present an ultra-low-power, adaptable, and massively parallelized FEC (Forward Error Correction) scheme that detects and corrects bit errors at line rate with an energy consumption between 1 pJ/bit and 13 pJ/bit. The evaluation results discussed in this publication show that our comprehensive approach allows end-to-end communication with a very low protocol processing overhead.

  13. Cloud Computing - A Unified Approach for Surveillance Issues

    Science.gov (United States)

    Rachana, C. R.; Banu, Reshma, Dr.; Ahammed, G. F. Ali, Dr.; Parameshachari, B. D., Dr.

    2017-08-01

    Cloud computing describes highly scalable resources provided as an external service via the Internet on a pay-per-use basis. From the economic point of view, the main attractiveness of cloud computing is that users only use what they need, and only pay for what they actually use. Resources are available for access from the cloud at any time, and from any location through networks. Cloud computing is gradually replacing the traditional information technology infrastructure. Securing data is one of the leading concerns and biggest issues for cloud computing. Privacy of information is always a crucial point, especially when an individual's personal or sensitive information is being stored in the organization. It is indeed true that today's cloud authorization systems are not robust enough. This paper presents a unified approach for analyzing the various security issues and techniques to overcome the challenges in the cloud environment.

  14. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    Directory of Open Access Journals (Sweden)

    Grover Kearns

    2010-06-01

    Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants in the use of forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting students, however, may not view information technology as vital to their career paths and need motivation to acquire forensic knowledge and skills. This paper presents a curriculum design methodology for teaching graduate accounting students computer forensics. The methodology is tested using the students' perceptions of the success of the methodology and their acquisition of forensics knowledge and skills. An important component of the pedagogical approach is the use of an annotated list of over 50 forensic web-based tools.

  15. Soft-tissue injuries of the fingertip: methods of evaluation and treatment. An algorithmic approach.

    Science.gov (United States)

    Lemmon, Joshua A; Janis, Jeffrey E; Rohrich, Rod J

    2008-09-01

    After studying this article, the participant should be able to: 1. Understand the anatomy of the fingertip. 2. Describe the methods of evaluating fingertip injuries. 3. Discuss reconstructive options for various tip injuries. The fingertip is the most commonly injured part of the hand, and therefore fingertip injuries are among the most frequent injuries that plastic surgeons are asked to treat. Although microsurgical techniques have enabled replantation of even very distal tip amputations, it is relatively uncommon that a distal tip injury will be appropriate for replantation. In the event that replantation is not pursued, options for distal tip soft-tissue reconstruction must be considered. This review presents a straightforward method for evaluating fingertip injuries and provides an algorithm for fingertip reconstruction.

  16. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn challenges computer scientists to offer matching hardware and software infrastructure, while managing the varying degrees of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, the integration of cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be well placed to manage drug discovery and clinical development data generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  17. Computational intelligence approaches for pattern discovery in biological systems.

    Science.gov (United States)

    Fogel, Gary B

    2008-07-01

    Biology, chemistry and medicine are faced by tremendous challenges caused by an overwhelming amount of data and the need for rapid interpretation. Computational intelligence (CI) approaches such as artificial neural networks, fuzzy systems and evolutionary computation are being used with increasing frequency to contend with this problem, in light of noise, non-linearity and temporal dynamics in the data. Such methods can be used to develop robust models of processes either on their own or in combination with standard statistical approaches. This is especially true for database mining, where modeling is a key component of scientific understanding. This review provides an introduction to current CI methods, their application to biological problems, and concludes with a commentary about the anticipated impact of these approaches in bioinformatics.

  18. A comparison between ten advanced and soft computing models for groundwater qanat potential assessment in Iran using R and GIS

    Science.gov (United States)

    Naghibi, Seyed Amir; Pourghasemi, Hamid Reza; Abbaspour, Karim

    2018-02-01

    Considering the unstable condition of water resources in Iran and many other countries in arid and semi-arid regions, groundwater studies are very important. The aim of this study is therefore to model groundwater potential, using qanat locations as indicators and ten advanced and soft computing models, in the Beheshtabad Watershed, Iran. A qanat is a man-made underground construction which gathers groundwater from higher altitudes and transmits it to lowland areas, where it can be used for different purposes. For this purpose, the locations of the qanats were first detected using extensive field surveys. These qanats were classified into two datasets: training (70%) and validation (30%). Then, 14 influence factors depicting the region's physical, morphological, lithological, and hydrological features were identified to model groundwater potential. Linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), flexible discriminant analysis (FDA), penalized discriminant analysis (PDA), boosted regression tree (BRT), random forest (RF), artificial neural network (ANN), K-nearest neighbor (KNN), multivariate adaptive regression splines (MARS), and support vector machine (SVM) models were applied in R scripts to produce groundwater potential maps. To evaluate the performance of the developed models, the ROC curve and kappa index were used. According to the results, RF had the best performance, followed by the SVM and BRT models. Our results showed that qanat locations could be used as a good indicator of groundwater potential. Furthermore, altitude, slope, plan curvature, and profile curvature were found to be the most important influence factors, whereas lithology, land use, and slope aspect were the least significant. The methodology in the current study could be used by land use and terrestrial planners and water resource managers to reduce the costs of groundwater resource discovery.
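
    The following Python sketch illustrates the kind of model comparison reported above, using scikit-learn classifiers in place of the authors' R scripts and synthetic data standing in for the 14 influence factors; the model choices and settings are illustrative assumptions.

    ```python
    # Illustrative sketch (not the authors' R scripts): comparing several
    # classifiers for a binary "potential / no potential" target using ROC AUC
    # and Cohen's kappa, as in the study. Data here are synthetic stand-ins.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import roc_auc_score, cohen_kappa_score

    X, y = make_classification(n_samples=400, n_features=14, random_state=1)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              random_state=1)  # 70/30 split

    models = {
        "RF": RandomForestClassifier(random_state=1),
        "SVM": SVC(probability=True, random_state=1),
        "KNN": KNeighborsClassifier(),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        proba = model.predict_proba(X_te)[:, 1]
        print(name,
              "AUC=%.3f" % roc_auc_score(y_te, proba),
              "kappa=%.3f" % cohen_kappa_score(y_te, model.predict(X_te)))
    ```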

  19. Bottom-up approach to moduli dynamics in heavy gravitino scenario: Superpotential, soft terms, and sparticle mass spectrum

    International Nuclear Information System (INIS)

    Endo, Motoi; Yamaguchi, Masahiro; Yoshioka, Koichi

    2005-01-01

    The physics of moduli fields is examined in the scenario where the gravitino is relatively heavy with mass of order 10 TeV, which is favored in view of the severe gravitino problem. The form of the moduli superpotential is shown to be determined, if one imposes a phenomenological requirement that no physical CP phase arise in gaugino masses from conformal anomaly mediation. This bottom-up approach allows only two types of superpotential, each of which can have its origins in a fundamental underlying theory such as superstring. One superpotential is the sum of an exponential and a constant, which is identical to that obtained by Kachru et al. (KKLT), and the other is the racetrack superpotential with two exponentials. The general form of soft supersymmetry-breaking masses is derived, and the pattern of the superparticle mass spectrum in the minimal supersymmetric standard model is discussed with the KKLT-type superpotential. It is shown that the moduli mediation and the anomaly mediation make comparable contributions to the soft masses. At the weak scale, the gaugino masses are rather degenerate compared to the minimal supergravity, which bring characteristic features on the superparticle masses. In particular, the lightest neutralino, which often constitutes the lightest superparticle and thus a dark matter candidate, is a considerable admixture of gauginos and Higgsinos. We also find a small mass hierarchy among the moduli, gravitino, and superpartners of the standard-model fields. Cosmological implications of the scenario are briefly described

  20. Computer vision approaches to medical image analysis. Revised papers

    International Nuclear Information System (INIS)

    Beichel, R.R.; Sonka, M.

    2006-01-01

    This book constitutes the thoroughly refereed post-proceedings of the international workshop Computer Vision Approaches to Medical Image Analysis, CVAMIA 2006, held in Graz, Austria in May 2006 as a satellite event of the 9th European Conference on Computer Vision, ECCV 2006. The 10 revised full papers and 11 revised poster papers, presented together with 1 invited talk, were carefully reviewed and selected from 38 submissions. The papers are organized in topical sections on clinical applications, image registration, image segmentation and analysis, and the poster session. (orig.)

  1. Computational Approach for Quantifying Structural Disorder in Biomolecular Lattices

    Science.gov (United States)

    Bratton, Clayton; Reiser, Karen; Knoesen, Andre; Yankelevich, Diego; Wang, Mingshi; Rocha-Mendoza, Israel

    2009-11-01

    We have developed a novel computational approach for quantifying structural disorder in biomolecular lattices with nonlinear susceptibility, based on analysis of the polarization-modulated second harmonic signal. Transient, regional disorder at the level of molecular organization is identified using novel signal-processing algorithms sufficiently compact for near real-time analysis on a desktop computer. Global and regional disorder within the biostructure are assessed and scored using multiple methodologies. Experimental results suggest our signal-processing method represents a robust, scalable tool that allows us to detect both regional and global alterations in the signal characteristics of biostructures with a high degree of discrimination.

  2. Biomechanical Model for Computing Deformations for Whole-Body Image Registration: A Meshless Approach

    Science.gov (United States)

    Li, Mao; Miller, Karol; Joldes, Grand Roman; Kikinis, Ron; Wittek, Adam

    2016-01-01

    Patient-specific biomechanical models have been advocated as a tool for predicting deformations of soft body organs/tissue for medical image registration (aligning two sets of images) when differences between the images are large. However, complex and irregular geometry of the body organs makes generation of patient-specific biomechanical models very time consuming. Meshless discretisation has been proposed to solve this challenge. However, applications so far have been limited to 2-D models and computing single organ deformations. In this study, 3-D comprehensive patient-specific non-linear biomechanical models implemented using Meshless Total Lagrangian Explicit Dynamics (MTLED) algorithms are applied to predict a 3-D deformation field for whole-body image registration. Unlike a conventional approach which requires dividing (segmenting) the image into non-overlapping constituents representing different organs/tissues, the mechanical properties are assigned using the Fuzzy C-Means (FCM) algorithm without the image segmentation. Verification indicates that the deformations predicted using the proposed meshless approach are for practical purposes the same as those obtained using the previously validated finite element models. To quantitatively evaluate the accuracy of the predicted deformations, we determined the spatial misalignment between the registered (i.e. source images warped using the predicted deformations) and target images by computing the edge-based Hausdorff distance. The Hausdorff distance-based evaluation determines that our meshless models led to successful registration of the vast majority of the image features. PMID:26791945
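
    The edge-based Hausdorff evaluation mentioned above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: it assumes the image edges have already been extracted as 2-D point sets and uses SciPy's directed Hausdorff distance.

    ```python
    # Minimal sketch of the evaluation step only: the (symmetric) Hausdorff
    # distance between edge points of registered and target images.
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    registered_edges = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.0]])
    target_edges = np.array([[0.0, 0.2], [1.1, 0.0], [2.0, 0.3]])

    d_forward = directed_hausdorff(registered_edges, target_edges)[0]
    d_backward = directed_hausdorff(target_edges, registered_edges)[0]
    hausdorff = max(d_forward, d_backward)  # large values flag misaligned features
    print("Hausdorff distance:", hausdorff)
    ```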

  3. Approach to skin and soft tissue infections in non-HIV immunocompromised hosts.

    Science.gov (United States)

    Burke, Victoria E; Lopez, Fred A

    2017-08-01

    Skin and soft tissue infections are frequent contributors to morbidity and mortality in the immunocompromised host. This article reviews the changing epidemiology and clinical manifestations of the most common cutaneous pathogens in non-HIV immunocompromised hosts, including patients with solid organ transplants, stem cell transplants, solid tumors, hematologic malignancies, and receiving chronic immunosuppressive therapy for inflammatory disorders. Defects in the innate or adaptive immune response can predispose the immunocompromised host to certain cutaneous infections in a predictive fashion. Cutaneous lesions in patients with neutrophil defects are commonly due to bacteria, Candida, or invasive molds. Skin lesions in patients with cellular or humoral immunodeficiencies can be due to encapsulated bacteria, Nocardia, mycobacteria, endemic fungal infections, herpesviruses, or parasites. Skin lesions may reflect primary inoculation or, more commonly, disseminated infection. Tissue samples for microscopy, culture, and histopathology are critical to making an accurate diagnosis given the nonspecific and heterogeneous appearance of these skin lesions due to a blunted immune response. As the population of non-HIV immunosuppressed hosts expands with advances in medical therapies, the frequency and variety of cutaneous diseases in these hosts will increase.

  4. Nanoscale Viscoelasticity of Extracellular Matrix Proteins in Soft Tissues: a Multiscale Approach

    Science.gov (United States)

    Miri, Amir K.; Heris, Hossein K.; Mongeau, Luc; Javid, Farhad

    2013-01-01

    We propose that the bulk viscoelasticity of soft tissues results from two length-scale-dependent mechanisms: the time-dependent response of extracellular matrix proteins (ECM) at the nanometer scale and the biophysical interactions between the ECM solid structure and interstitial fluid at the micrometer scale. The latter was modeled using the poroelasticity theory with an assumption of free motion of the interstitial fluid within the porous ECM structure. Following a recent study (Heris, H.K., Miri, A.K., Tripathy, U., Barthelat, F., Mongeau, L., 2013. Journal of the Mechanical Behavior of Biomedical Materials), atomic force microscopy was used to perform creep loading and 50-nm sinusoidal oscillations on porcine vocal folds. The proposed model was calibrated by a finite element model to accurately predict the nanoscale viscoelastic moduli of ECM. A linear correlation was observed between the in-depth distribution of the viscoelastic moduli and that of hyaluronic acids in the vocal fold tissue. We conclude that hyaluronic acids may regulate the vocal fold viscoelasticity at nanoscale. The proposed methodology offers a characterization tool for biomaterials used in vocal fold augmentations. PMID:24317493

  5. CGC/saturation approach for soft interactions at high energy: Inclusive production

    Directory of Open Access Journals (Sweden)

    E. Gotsman

    2015-06-01

    In this letter we demonstrate that our dipole model is successful in describing inclusive production within the same framework as diffractive physics. We believe that this achievement stems from the fact that our approach incorporates the positive features of both the Reggeon approach and the CGC/saturation effective theory for high energy QCD.

  6. A computational approach to chemical etiologies of diabetes

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Brunak, Søren; Grandjean, Philippe

    2013-01-01

    Computational meta-analysis can link environmental chemicals to genes and proteins involved in human diseases, thereby elucidating possible etiologies and pathogeneses of non-communicable diseases. We used an integrated computational systems biology approach to examine possible pathogenetic linkages in type 2 diabetes (T2D) through genome-wide associations, disease similarities, and published empirical evidence. Ten environmental chemicals were found to be potentially linked to T2D; the highest scores were observed for arsenic, 2,3,7,8-tetrachlorodibenzo-p-dioxin, hexachlorobenzene, and perfluorooctanoic acid. For these substances we integrated disease and pathway annotations on top of protein interactions to reveal possible pathogenetic pathways that deserve empirical testing. The approach is general and can address other public health concerns in addition to identifying diabetogenic chemicals.

  7. A Computational Drug Repositioning Approach for Targeting Oncogenic Transcription Factors

    OpenAIRE

    Gayvert, Kaitlyn; Dardenne, Etienne; Cheung, Cynthia; Boland, Mary Regina; Lorberbaum, Tal; Wanjala, Jackline; Chen, Yu; Rubin, Mark; Tatonetti, Nicholas P.; Rickman, David; Elemento, Olivier

    2016-01-01

    Mutations in transcription factors (TFs) genes are frequently observed in tumors, often leading to aberrant transcriptional activity. Unfortunately, TFs are often considered undruggable due to the absence of targetable enzymatic activity. To address this problem, we developed CRAFTT, a Computational drug-Repositioning Approach For Targeting Transcription factor activity. CRAFTT combines ChIP-seq with drug-induced expression profiling to identify small molecules that can specifically perturb T...

  8. A computational approach for the health care market.

    Science.gov (United States)

    Montefiori, Marcello; Resta, Marina

    2009-12-01

    In this work we analyze the market for health care through a computational approach that relies on Kohonen's Self-Organizing Maps, and we observe the competition dynamics of health care providers versus those of patients. As a result, we offer a new tool for modelling hospital behaviour and demand mechanisms, one that combines a robust theoretical implementation with an instrument of strong graphical impact.
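
    A minimal Kohonen self-organizing map of the kind used in this study can be written in a few lines of NumPy. The sketch below is purely illustrative: the data are random placeholders for provider/patient features, and the 1-D map size and decay schedules are arbitrary choices.

    ```python
    # A minimal 1-D Kohonen self-organizing map in NumPy; inputs here are
    # random placeholders for the study's provider/patient features.
    import numpy as np

    rng = np.random.default_rng(42)
    data = rng.random((200, 3))            # e.g. provider attributes (placeholder)
    n_units = 10
    weights = rng.random((n_units, 3))     # codebook vectors

    for t in range(1000):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
        lr = 0.5 * np.exp(-t / 500)                           # decaying learning rate
        sigma = 3.0 * np.exp(-t / 500)                        # decaying neighborhood
        dist = np.abs(np.arange(n_units) - bmu)
        h = np.exp(-(dist ** 2) / (2 * sigma ** 2))           # neighborhood function
        weights += lr * h[:, None] * (x - weights)

    print(weights.round(2))                # ordered prototypes of the input space
    ```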

  9. Computer Mechatronics: A Radical Approach to Mechatronics Education

    OpenAIRE

    Nilsson, Martin

    2005-01-01

    This paper describes some distinguishing features of a course on mechatronics based on computer science. We propose a teaching approach called Controlled Problem-Based Learning (CPBL). We have applied this method to three generations (2003-2005) of mainly fourth-year undergraduate students at Lund University (LTH). Although students found the course difficult, there were no dropouts, and all students attended the examination in 2005.

  10. Oral soft tissue infections: causes, therapeutic approaches and microbiological spectrum with focus on antibiotic treatment.

    Science.gov (United States)

    Götz, Carolin; Reinhart, Edeltraud; Wolff, Klaus-Dietrich; Kolk, Andreas

    2015-11-01

    Intraoral soft tissue infections (OSTI) are a common problem in dentistry and oral surgery. These abscesses are mostly exacerbated dental infections (OIDC), and some emerge as postoperative infections (POI) after tooth extraction (OITR) or apicoectomy (OIRR). The main aim of this study was to compare OIDC with POI, especially with regard to the bacteria involved. An additional question was, therefore, whether different antibiotic treatments should be used for OSTI of differing aetiologies. The impact of third molars on OSTI was evaluated, and the rates of POI after removal of third molars were also specified. Patient data were collected from the patients' medical records and the results were statistically evaluated with SPSS (version 21.0; SPSS, IBM; Chicago, IL, USA). The inclusion criterion was the outpatient treatment of a patient with an exacerbated oral infection; the exclusion criteria were an early stage of infiltration without abscess formation and a need for inpatient treatment. Exacerbated periapical infections, especially in the molar region, were the commonest cause of OIDC. In the OITR group, mandibular tooth removal was the commonest factor (p=0.016). Remarkably, retained lower wisdom teeth led to a significant number of cases in the OITR group (p=0.022). In our study we could not define differences between the causal bacteria found in patients with OIDC and POI. Given the resistance rates, we conclude that amoxicillin combined with clavulanic acid seems to be the antibiotic standard for exacerbated intraoral infections, independent of their aetiology. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  11. Cloud computing and public administration: approaches in several European countries

    Directory of Open Access Journals (Sweden)

    Zaharia-Rădulescu Adrian-Mihai

    2017-07-01

    The Digital Agenda for Europe 2020 has several objectives, from increasing the participation of citizens and consumers in the digital society to creating a fully interconnected society. These objectives can be supported through a high degree of digitization at the public administration level and increased performance in delivering public services. Cloud computing is one of the ICT areas with fast growth in recent years and holds great promise for achieving the objectives of the Digital Agenda for Europe 2020. This paper aims to present what cloud computing is and how it can help public administration increase its performance. From Infrastructure as a Service, continuing with Platform as a Service and moving up to Software as a Service, each level of cloud computing is presented here with its strong and weak points and its suitability for given use cases in public administration. The challenges and risks of moving to the cloud and the differences between public, private and hybrid clouds are also presented in the paper. The research done by the author is both theoretical and a literature review, and combines knowledge from different areas. An analysis and examples of cloud computing approaches and implementations in several European Union countries are presented in this paper to facilitate the understanding of the subject. Cloud computing can help public administration decrease costs, standardize services in different locations, integrate public resources and provide higher transparency in government acts.

  12. NON-LINEAR MODEL PREDICTIVE CONTROL STRATEGIES FOR PROCESS PLANTS USING SOFT COMPUTING APPROACHES

    OpenAIRE

    Owa, Kayode Olayemi

    2014-01-01

    The development of advanced non-linear control strategies has attracted considerable research interest over the past decades, especially in process control. Rather than relying absolutely on mathematical models of process plants, which often bring discrepancies owing to design errors and equipment degradation, non-linear models are required because they provide improved prediction capabilities, but they are very difficult to derive. In addition, the derivation of the g...

  13. Extraction, analysis and desaturation of gmelina seed oil using different soft computing approaches

    Directory of Open Access Journals (Sweden)

    F. Chigozie Uzoh

    2016-12-01

    An Artificial Neural Network (ANN)-Genetic Algorithm (GA) interface and Response Surface Methodology (RSM) have been compared as tools for simulation and optimization of the gmelina seed oil extraction process. A multi-layer feed-forward Levenberg-Marquardt back-propagation algorithm was incorporated for developing a predictive model, which was optimized using GA. Design Expert simulation and optimization tools were also incorporated for a detailed simulation and optimization of the same process using RSM. It was found that oil yield increased with rise in temperature, time and volume of solvent, but decreased with increase in seed particle size. The maximum oil yields obtained using the numerical optimization techniques were 49.2%, predicted by RSM at the optimum conditions of 60 °C temperature, 60 min extraction time, 150 μm seed particle size and 150 ml solvent volume, and 49.8%, predicted by ANN-GA at 40 °C extraction temperature, 40 min extraction time, 200 μm seed particle size and 100 ml solvent volume, respectively. The prediction accuracy of both models was more than 95%. Model validation experiments indicate that the predicted and actual values were in close agreement. The extract was analyzed to examine its physico-chemical properties (acid value, iodine value, peroxide value, viscosity, saponification value, moisture and ash content, refractive index, smoke, flash and fire points, and specific gravity) and for structural elucidation by standard methods and instrumental techniques. Results revealed that the oil is non-drying and edible. Desaturation of the oil further reveals its potential in alkyd resin synthesis.
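
    To make the RSM side of this comparison concrete, the sketch below fits a second-order response surface for yield as a function of two of the four factors by ordinary least squares. The design points and yields are invented for illustration and are not the paper's data.

    ```python
    # Hedged sketch of the RSM idea: fitting a quadratic response surface
    # for oil yield vs. temperature and time, with made-up measurements.
    import numpy as np

    # Design points: (temperature °C, time min) -> yield %
    T = np.array([40, 40, 50, 50, 60, 60, 50], dtype=float)
    t = np.array([40, 60, 40, 60, 40, 60, 50], dtype=float)
    yield_pct = np.array([41.0, 43.5, 44.0, 46.5, 47.2, 49.1, 45.8])

    # Second-order model: y = b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t
    A = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t])
    coef, *_ = np.linalg.lstsq(A, yield_pct, rcond=None)

    def predict(T_, t_):
        return coef @ np.array([1.0, T_, t_, T_**2, t_**2, T_ * t_])

    print("predicted yield at 60 °C, 60 min: %.1f%%" % predict(60, 60))
    ```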

  14. Prediction of BP Reactivity to Talking Using Hybrid Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Gurmanik Kaur

    2014-01-01

    High blood pressure (BP) is associated with an increased risk of cardiovascular diseases. Therefore, optimal precision in the measurement of BP is appropriate in clinical and research studies. In this work, anthropometric characteristics including age, height, weight, body mass index (BMI), and arm circumference (AC) were used as independent predictor variables for the prediction of BP reactivity to talking. Principal component analysis (PCA) was fused with an artificial neural network (ANN), an adaptive neuro-fuzzy inference system (ANFIS), and a least square-support vector machine (LS-SVM) model to remove the multicollinearity effect among the anthropometric predictor variables. Statistical tests in terms of the coefficient of determination (R2), root mean square error (RMSE), and mean absolute percentage error (MAPE) revealed that the PCA-based LS-SVM (PCA-LS-SVM) model produced a more efficient prediction of BP reactivity than the other models. This assessment demonstrates the importance and advantages of PCA-fused prediction models for the prediction of biological variables.
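
    The PCA-fusion strategy can be sketched as a simple pipeline. The example below substitutes scikit-learn's SVR (a closely related kernel regressor) for LS-SVM and uses simulated anthropometric data, so all numbers and settings are assumptions.

    ```python
    # Sketch of the PCA-fused idea: decorrelate collinear predictors with PCA
    # before a kernel regressor (SVR standing in for LS-SVM); data simulated.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(7)
    X = rng.normal(size=(120, 5))            # age, height, weight, BMI, AC (simulated)
    X[:, 3] = 0.8 * X[:, 1] + 0.6 * X[:, 2]  # make "BMI" collinear with height/weight
    bp_reactivity = X @ np.array([0.4, 0.1, 0.3, 0.5, 0.2]) \
        + 0.1 * rng.normal(size=120)

    model = make_pipeline(StandardScaler(), PCA(n_components=3), SVR(kernel="rbf"))
    print("CV R^2:", cross_val_score(model, X, bp_reactivity, cv=5).mean().round(3))
    ```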

  15. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
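
    The core Bayesian step can be illustrated with a plain Metropolis sampler (the paper uses the more sophisticated DRAM variant). In the sketch below, the surrogate model, noise level, and one-parameter damage description are all simplifying assumptions.

    ```python
    # Toy Metropolis sketch of the Bayesian step (standard MH, not DRAM):
    # infer a scalar "damage location" from noisy sensor data via a cheap
    # surrogate forward model.
    import numpy as np

    rng = np.random.default_rng(3)

    def surrogate(loc):                      # cheap stand-in for the FE model
        sensors = np.linspace(0, 1, 8)
        return np.exp(-((sensors - loc) ** 2) / 0.02)

    true_loc, sigma = 0.6, 0.05
    data = surrogate(true_loc) + sigma * rng.standard_normal(8)

    def log_post(loc):                       # flat prior on [0, 1]
        if not 0 <= loc <= 1:
            return -np.inf
        return -0.5 * np.sum((data - surrogate(loc)) ** 2) / sigma ** 2

    samples, loc = [], 0.5
    lp = log_post(loc)
    for _ in range(5000):
        prop = loc + 0.05 * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject step
            loc, lp = prop, lp_prop
        samples.append(loc)

    post = np.array(samples[1000:])          # discard burn-in
    print("posterior mean %.3f +/- %.3f" % (post.mean(), post.std()))
    ```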

  16. Computational neuroscience approach to biomarkers and treatments for mental disorders.

    Science.gov (United States)

    Yahata, Noriaki; Kasai, Kiyoto; Kawato, Mitsuo

    2017-04-01

    Psychiatry research has long experienced a stagnation stemming from a lack of understanding of the neurobiological underpinnings of phenomenologically defined mental disorders. Recently, the application of computational neuroscience to psychiatry research has shown great promise in establishing a link between phenomenological and pathophysiological aspects of mental disorders, thereby recasting current nosology in more biologically meaningful dimensions. In this review, we highlight recent investigations into computational neuroscience that have undertaken either theory- or data-driven approaches to quantitatively delineate the mechanisms of mental disorders. The theory-driven approach, including reinforcement learning models, plays an integrative role in this process by enabling correspondence between behavior and disorder-specific alterations at multiple levels of brain organization, ranging from molecules to cells to circuits. Previous studies have explicated a plethora of defining symptoms of mental disorders, including anhedonia, inattention, and poor executive function. The data-driven approach, on the other hand, is an emerging field in computational neuroscience seeking to identify disorder-specific features among high-dimensional big data. Remarkably, various machine-learning techniques have been applied to neuroimaging data, and the extracted disorder-specific features have been used for automatic case-control classification. For many disorders, the reported accuracies have reached 90% or more. However, we note that rigorous tests on independent cohorts are critically required to translate this research into clinical applications. Finally, we discuss the utility of the disorder-specific features found by the data-driven approach to psychiatric therapies, including neurofeedback. Such developments will allow simultaneous diagnosis and treatment of mental disorders using neuroimaging, thereby establishing 'theranostics' for the first time in clinical
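
    As a minimal example of the theory-driven approach, the sketch below implements a Rescorla-Wagner/Q-learning value update whose learning rate could, in principle, be fit per subject and compared across diagnostic groups; the reward schedule and parameter values are arbitrary.

    ```python
    # Minimal reinforcement-learning model of the kind used in theory-driven
    # computational psychiatry: a prediction-error value update (illustrative).
    import numpy as np

    rng = np.random.default_rng(0)
    alpha = 0.3                    # learning rate (candidate computational marker)
    value = 0.0
    rewards = rng.binomial(1, 0.7, size=100)   # probabilistic reward schedule
    for r in rewards:
        value += alpha * (r - value)           # prediction-error update
    print("learned value (approx. reward probability):", round(value, 2))
    ```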

  17. Understanding Plant Nitrogen Metabolism through Metabolomics and Computational Approaches

    Directory of Open Access Journals (Sweden)

    Perrin H. Beatty

    2016-10-01

    A comprehensive understanding of plant metabolism could provide a direct mechanism for improving nitrogen use efficiency (NUE) in crops. One of the major barriers to achieving this outcome is our poor understanding of the complex metabolic networks, physiological factors, and signaling mechanisms that affect NUE in agricultural settings. However, an exciting collection of computational and experimental approaches has begun to elucidate whole-plant nitrogen usage and provides an avenue for connecting nitrogen-related phenotypes to genes. Herein, we describe how metabolomics, computational models of metabolism, and flux balance analysis have been harnessed to advance our understanding of plant nitrogen metabolism. We introduce a model describing the complex flow of nitrogen through crops in a real-world agricultural setting and describe how experimental metabolomics data, such as isotope labeling rates and analyses of nutrient uptake, can be used to refine these models. In summary, the metabolomics/computational approach offers an exciting mechanism for understanding NUE that may ultimately lead to more effective crop management and engineered plants with higher yields.
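
    Flux balance analysis itself reduces to a linear program, as the toy sketch below shows for a two-metabolite network standing in for whole-plant nitrogen flow; the stoichiometry and bounds are invented for illustration.

    ```python
    # Toy flux balance analysis (FBA) sketch with SciPy: maximize "biomass"
    # flux subject to steady-state stoichiometry S.v = 0 (purely illustrative).
    import numpy as np
    from scipy.optimize import linprog

    # Reactions: v1 uptake -> A, v2 A -> B, v3 B -> biomass
    S = np.array([[1, -1, 0],    # metabolite A balance
                  [0, 1, -1]])   # metabolite B balance
    bounds = [(0, 10), (0, 1000), (0, 1000)]   # nitrogen uptake capped at 10
    c = [0, 0, -1]                             # maximize v3 (linprog minimizes)

    res = linprog(c, A_eq=S, b_eq=[0, 0], bounds=bounds)
    print("optimal biomass flux:", res.x[2])   # limited by the uptake bound
    ```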

  18. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Research in scientific programming enables us to realize more and more complex applications, while, on the other hand, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches are becoming more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (an iterative method) and a frontal solver (a direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is also presented. It automatically generates C code from a problem specification expressed in the Lagrange formalism using Maple.
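
    For reference, the conjugate gradient method at the heart of the iterative solver can be written compactly in sequential form (the SPINET implementation is parallel and in C); the test system below is a small symmetric positive-definite matrix chosen for illustration.

    ```python
    # A bare-bones conjugate gradient solver for a symmetric positive-definite
    # system, sketching the iterative method parallelized in SPINET.
    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
        x = np.zeros_like(b)
        r = b - A @ x                  # residual
        p = r.copy()                   # search direction
        rs = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs / (p @ Ap)      # optimal step along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p  # conjugate direction update
            rs = rs_new
        return x

    A = np.array([[4.0, 1.0], [1.0, 3.0]])   # SPD test matrix
    b = np.array([1.0, 2.0])
    print(conjugate_gradient(A, b))           # ~[0.0909, 0.6364]
    ```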

  19. A New Approach to Studying Biological and Soft Materials Using Focused Ion Beam Scanning Electron Microscopy (FIB SEM)

    International Nuclear Information System (INIS)

    Stokes, D J; Morrissey, F; Lich, B H

    2006-01-01

    Over the last decade techniques such as confocal light microscopy, in combination with fluorescent labelling, have helped biologists and life scientists to study biological architectures at tissue and cell level in great detail. Meanwhile, obtaining information at very small length scales is possible with the combination of sample preparation techniques and transmission electron microscopy (TEM) or scanning transmission electron microscopy (STEM). Scanning electron microscopy (SEM) is well known for the determination of surface characteristics and morphology. However, the desire to understand the three-dimensional relationships of meso-scale hierarchies has led to the development of advanced microscopy techniques that give a further complementary approach. A focused ion beam (FIB) can be used as a nano-scalpel and hence allows us to reveal internal microstructure in a site-specific manner. Whilst FIB instruments have been used to study and verify the three-dimensional architecture of man-made materials, SEM and FIB technologies have now been brought together in a single instrument, representing a powerful combination for the study of biological specimens and soft materials. We demonstrate the use of FIB SEM to study three-dimensional relationships over a range of length scales and materials, from small-scale cellular structures to the larger-scale interactions between biomedical materials and tissues. FIB cutting of heterogeneous mixtures of hard and soft materials, resulting in a uniform cross-section, has proved to be of particular value, since classical preparation methods tend to introduce artefacts. Furthermore, by appropriate selection, we can sequentially cross-section to create a series of 'slices' at specific intervals. 3D reconstruction software can then be used to volume-render information from the 2D slices, enabling us to immediately see the spatial relationships between microstructural components.

  20. A fully digital approach to replicate peri-implant soft tissue contours and emergence profile in the esthetic zone.

    Science.gov (United States)

    Monaco, Carlo; Evangelisti, Edoardo; Scotti, Roberto; Mignani, Giuseppe; Zucchelli, Giovanni

    2016-12-01

    This short communication reports on a novel digital technique, designated the "Fully Digital Technique" (FDT), for taking an impression of the peri-implant soft tissue and emergence profile with an intraoral scanner, digitally capturing both the three-dimensional position of the implant platform and the coronal and gingival parts of the provisional retained restoration. A first intraoral digital impression, which generated a standard triangulation language file (STL1), was taken using a standardized implant scanbody to detect the position of the implant. A second digital impression (STL2), with the provisional retained restoration in situ, was performed in two steps: the first part of the scan captured all details of the vestibular and palatal sides of the provisional retained restoration and the adjacent teeth. The provisional retained restoration was then unscrewed, and the subgingival part of the restoration was scanned directly out of the mouth to determine its subgingival shape. STL1 and STL2 were imported into imaging software and superimposed using a "best fit" algorithm to achieve a new merged file (STL3) containing the 3D implant position, the peri-implant mucosa, and the emergence profile. The merged file was used to design the CAD/CAM customized abutment and to produce a stereolithographic model by 3D printing. The STL superimposition of digital impressions of the implant position and the provisional retained restoration constitutes a novel technique for obtaining a single STL file with the implant position and its peri-implant mucosal architecture. FDT is a rapid digital approach for capturing all information on the peri-implant soft tissue and emergence profile directly from the provisional retained restoration. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Fast reactor safety and computational thermo-fluid dynamics approaches

    International Nuclear Information System (INIS)

    Ninokata, Hisashi; Shimizu, Takeshi

    1993-01-01

    This article provides a brief description of the safety principles on which liquid metal cooled fast breeder reactors (LMFBRs) are based and the roles of computations in safety practices. A number of thermohydraulics models have been developed to date that successfully describe several of the important types of fluid and material motion encountered in the analysis of postulated accidents in LMFBRs. Most of these models use a mixture of implicit and explicit numerical solution techniques in solving a set of conservation equations formulated in Eulerian coordinates, with special techniques included for specific situations. Typical computational thermo-fluid dynamics approaches are discussed, in particular for analyses of the physical phenomena relevant to fuel subassembly thermohydraulics design and those that involve describing the large-scale motion of molten materials in the core. (orig.)

  2. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    Science.gov (United States)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those worked during the Apollo program in the sixties. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis-of-variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.

  3. COMPUTATIONAL APPROACHES FOR RATIONAL DESIGN OF PROTEINS WITH NOVEL FUNCTIONALITIES

    Directory of Open Access Journals (Sweden)

    Manish Kumar Tiwari

    2012-09-01

    Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein design has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  4. Benchmarking of computer codes and approaches for modeling exposure scenarios

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rittmann, P.D.; Wood, M.I.; Cook, J.R.

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided
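
    The spreadsheet-level comparison mentioned above amounts to multiplying a handful of factors, as in the sketch below for an annual ingestion dose from a unit water concentration; the dose coefficient used is a placeholder, not a value from the report.

    ```python
    # Spreadsheet-level sketch of the kind of "most fundamental" comparison the
    # subteam performed: an annual ingestion dose from a unit water concentration.
    # The dose coefficient below is a placeholder, not a value from the report.
    def ingestion_dose_sv_per_yr(conc_bq_per_l, intake_l_per_day, dcf_sv_per_bq):
        """Annual dose = concentration x daily intake x 365 d x dose coefficient."""
        return conc_bq_per_l * intake_l_per_day * 365.0 * dcf_sv_per_bq

    # Unit concentration (1 Bq/L), 2 L/day drinking water, hypothetical DCF.
    print(ingestion_dose_sv_per_yr(1.0, 2.0, 2.8e-10), "Sv/yr")
    ```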

  5. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria; these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having different genomic signatures than the rest of the host genome and containing mobility genes so that they can be integrated into the host genome. In this review, we discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.

  6. Analytical and computational approaches to define the Aspergillus niger secretome

    Energy Technology Data Exchange (ETDEWEB)

    Tsang, Adrian; Butler, Gregory D.; Powlowski, Justin; Panisko, Ellen A.; Baker, Scott E.

    2009-03-01

    We used computational and mass spectrometric approaches to characterize the Aspergillus niger secretome. The 11,200 gene models predicted in the genome of A. niger strain ATCC 1015 were the data source for the analysis. Depending on the computational methods used, 691 to 881 proteins were predicted to be secreted proteins. We cultured A. niger in six different media and analyzed the extracellular proteins produced using mass spectrometry. A total of 222 proteins were identified, with 39 proteins expressed under all six conditions and 74 proteins expressed under only one condition. The secreted proteins identified by mass spectrometry were used to guide the correction of about 20 gene models. Additional analysis focused on extracellular enzymes of interest for biomass processing. Of the 63 glycoside hydrolases predicted to be capable of hydrolyzing cellulose, hemicellulose or pectin, 94% of the exo-acting enzymes and only 18% of the endo-acting enzymes were experimentally detected.

  7. Soft engineering vs. a dynamic approach in coastal dune management: a case study on the North Sea barrier island of Ameland, the Netherlands

    NARCIS (Netherlands)

    Jong, de B.; Keijsers, J.G.S.; Riksen, M.J.P.M.; Krol, J.; Slim, P.A.

    2014-01-01

    Dunes act as flood defences in coastal zones, protecting low-lying interior lands from flooding. To ensure coastal safety, insight is needed on how dunes develop under different types of management. The current study focuses on two types of coastal dune management: (1) a “soft engineering” approach,

  8. Applications of Soft Union Sets in the Ring Theory

    Directory of Open Access Journals (Sweden)

    Yongwei Yang

    2013-01-01

    Through a discussion of quotient soft subsets, an approach for constructing quotient soft union rings is given. Finally, isomorphism theorems of (λ, μ)-soft union rings related to invariant soft sets are discussed.

  9. Approaching multiphase flows from the perspective of computational fluid dynamics

    International Nuclear Information System (INIS)

    Banas, A.O.

    1992-01-01

    Thermalhydraulic simulation methodologies based on subchannel and porous-medium concepts are briefly reviewed and contrasted with the general approach of Computational Fluid Dynamics (CFD). An outline of the advanced CFD methods for single-phase turbulent flows is followed by a short discussion of the unified formulation of averaged equations for turbulent and multiphase flows. Some of the recent applications of CFD at Chalk River Laboratories are discussed, and the complementary role of CFD with regard to the established thermalhydraulic methods of analysis is indicated. (author). 8 refs

  10. A pencil beam approach to proton computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Rescigno, Regina, E-mail: regina.rescigno@iphc.cnrs.fr; Bopp, Cécile; Rousseau, Marc; Brasse, David [Université de Strasbourg, IPHC, 23 rue du Loess, Strasbourg 67037, France and CNRS, UMR7178, Strasbourg 67037 (France)

    2015-11-15

    Purpose: A new approach to proton computed tomography (pCT) is presented. In this approach, protons are not tracked one-by-one but a beam of particles is considered instead. The elements of the pCT reconstruction problem (residual energy and path) are redefined on the basis of this new approach. An analytical image reconstruction algorithm applicable to this scenario is also proposed. Methods: The pencil beam (PB) and its propagation in matter were modeled by making use of the generalization of the Fermi–Eyges theory to account for multiple Coulomb scattering (MCS). This model was integrated into the pCT reconstruction problem, allowing the definition of the mean beam path concept similar to the most likely path (MLP) used in the single-particle approach. A numerical validation of the model was performed. The algorithm of filtered backprojection along MLPs was adapted to the beam-by-beam approach. The acquisition of a perfect proton scan was simulated and the data were used to reconstruct images of the relative stopping power of the phantom with the single-proton and beam-by-beam approaches. The resulting images were compared in a qualitative way. Results: The parameters of the modeled PB (mean and spread) were compared to Monte Carlo results in order to validate the model. For a water target, good agreement was found for the mean value of the distributions. As far as the spread is concerned, depth-dependent discrepancies as large as 2%–3% were found. For a heterogeneous phantom, discrepancies in the distribution spread ranged from 6% to 8%. The image reconstructed with the beam-by-beam approach showed a high level of noise compared to the one reconstructed with the classical approach. Conclusions: The PB approach to proton imaging may allow technical challenges imposed by the current proton-by-proton method to be overcome. In this framework, an analytical algorithm is proposed. Further work will involve a detailed study of the performances and limitations of

  11. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to obtain multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). Existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and the water/bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models that account for beam polychromaticity show great potential for giving accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. The approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Following Bayesian inference, the decomposition fractions and observation variance are estimated using joint Maximum A Posteriori (MAP) estimation. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is simplified into a single estimation problem, which transforms the joint MAP estimation into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also
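
    The reconstruction step described above reduces to minimizing a non-quadratic cost. Below is a minimal Python sketch of the same pattern: a Gaussian likelihood on raw projections plus a simple prior, minimized by a generic nonlinear conjugate-gradient routine (scipy's CG, not the monotone CG with suboptimal descent steps proposed in the thesis), with a toy non-linear forward model standing in for the polychromatic physics:

      # Minimal sketch of MAP estimation with a non-quadratic cost solved by
      # nonlinear conjugate gradient. The forward model g() is a toy stand-in
      # for the polychromatic projection model, not the thesis's actual model.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      A = rng.random((40, 10))           # toy projection geometry
      x_true = rng.random(10)            # "fraction" image to recover

      def g(x):                          # toy non-linear (Beer-Lambert-like) model
          return np.exp(-A @ x)

      y = g(x_true) + 0.01 * rng.standard_normal(40)   # noisy raw projections

      def neg_log_posterior(x, lam=0.1):
          # Gaussian likelihood on the raw projections (no negative-log step)
          # plus a simple quadratic prior standing in for the adaptive prior.
          return 0.5 * np.sum((y - g(x)) ** 2) + lam * np.sum(x ** 2)

      x_map = minimize(neg_log_posterior, np.zeros(10), method="CG").x
      print(np.round(x_map - x_true, 2))   # reconstruction error per component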

  12. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  13. Glass transition of soft colloids

    Science.gov (United States)

    Philippe, Adrian-Marie; Truzzolillo, Domenico; Galvan-Myoshi, Julian; Dieudonné-George, Philippe; Trappe, Véronique; Berthier, Ludovic; Cipelletti, Luca

    2018-04-01

    We explore the glassy dynamics of soft colloids using microgels and charged particles interacting by steric and screened Coulomb interactions, respectively. In the supercooled regime, the structural relaxation time τα of both systems grows steeply with volume fraction, reminiscent of the behavior of colloidal hard spheres. Computer simulations confirm that the growth of τα on approaching the glass transition is independent of particle softness. By contrast, softness becomes relevant at very large packing fractions when the system falls out of equilibrium. In this nonequilibrium regime, τα depends surprisingly weakly on packing fraction, and time correlation functions exhibit a compressed exponential decay consistent with stress-driven relaxation. The transition to this novel regime coincides with the onset of an anomalous decrease in local order with increasing density typical of ultrasoft systems. We propose that these peculiar dynamics result from the combination of the nonequilibrium aging dynamics expected in the glassy state and the tendency of colloids interacting through soft potentials to refluidize at high packing fractions.

  14. Error characterization for asynchronous computations: Proxy equation approach

    Science.gov (United States)

    Sallai, Gabriella; Mittal, Ankita; Girimaji, Sharath

    2017-11-01

    Numerical techniques for asynchronous fluid flow simulations are currently under development to enable efficient utilization of massively parallel computers. These numerical approaches attempt to accurately solve the time evolution of transport equations using spatial information at different time levels. The truncation error of asynchronous methods can be divided into two parts: delay-dependent (EA), or asynchronous, error and delay-independent (ES), or synchronous, error. The focus of this study is a specific asynchronous error mitigation technique called the proxy-equation approach. The aim of this study is to examine these errors as a function of the characteristic wavelength of the solution. Mitigation of asynchronous effects requires that the asynchronous error be smaller than the synchronous truncation error. For a simple convection-diffusion equation, proxy-equation error analysis identifies a critical initial wave number, λc. At smaller wave numbers, synchronous errors are larger than asynchronous errors. We examine various approaches to increasing the value of λc in order to improve the range of applicability of the proxy-equation approach.
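
    The distinction the abstract draws between synchronous truncation error and delay-induced asynchronous error can be reproduced in a toy experiment. The Python sketch below solves a 1D convection-diffusion equation with an explicit scheme twice, once fully synchronously and once with a one-step-delayed neighbor value at a single "processor interface", and compares the difference across initial wave numbers; all parameters are illustrative assumptions, not the study's setup:

      # Toy experiment: solve u_t + c u_x = nu u_xx with an explicit FTCS scheme,
      # synchronously and with one interface using a one-step-delayed neighbor
      # value, then compare the resulting solutions for several wave numbers.
      import numpy as np

      def solve(n=128, steps=400, c=1.0, nu=0.01, k=2, delayed=False):
          x = np.linspace(0, 2 * np.pi, n, endpoint=False)
          dx = x[1] - x[0]
          dt = 0.05 * dx ** 2 / nu          # conservative step for stability
          u = np.sin(k * x)
          u_prev = u.copy()
          for _ in range(steps):
              un = np.roll(u, -1)           # u_{i+1}
              up = np.roll(u, 1)            # u_{i-1}
              if delayed:                   # emulate a delayed value at one interface
                  un[n // 2] = np.roll(u_prev, -1)[n // 2]
              u_prev = u.copy()
              u = u - dt * c * (un - up) / (2 * dx) \
                    + dt * nu * (un - 2 * u + up) / dx ** 2
          return u

      for k in (2, 8, 32):
          diff = np.abs(solve(k=k, delayed=True) - solve(k=k, delayed=False)).max()
          print(f"wavenumber {k}: max async-sync difference = {diff:.2e}")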

  15. Computational Approaches for Translational Oncology: Concepts and Patents.

    Science.gov (United States)

    Scianna, Marco; Munaron, Luca

    2016-01-01

    Cancer is a heterogeneous disease, based on an intricate network of processes at different spatiotemporal scales, from the genome to the tissue level; hence the necessity for biomedical and pharmaceutical research to work in a multiscale fashion. In this respect, significant help derives from collaboration with the theoretical sciences. Mathematical models can in fact provide insights into tumor-related processes and support clinical oncologists in the design of treatment regime, dosage, schedule and toxicity. The main objective of this article is to review recent computational-based patents which tackle relevant aspects of tumor treatment. We first analyze a series of patents concerning the purposing or repurposing of anti-tumor compounds. These approaches rely on pharmacokinetics and pharmacodynamics modules that incorporate data obtained in the different phases of clinical trials. Similar methods are also at the basis of other patents included in this paper, which deal with treatment optimization in terms of maximizing therapy efficacy while minimizing side effects on the host. A group of patents predicting drug response and tumor evolution by the use of kinetics graphs are commented on as well. We finally focus on patents that implement informatics tools to map and screen biological, medical, and pharmaceutical knowledge. Despite promising aspects (and an increasing amount of related literature), we found few computational-based patents: significant effort is still needed before modelling approaches can become an integral component of pharmaceutical research.

  16. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk consideration presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can similarly be handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and Conditional Value-at-Risk (CVaR).
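
    The parametric Monte Carlo idea can be illustrated compactly: fix a parametric trading schedule, simulate cost paths, and report the expected cost and CVaR. In the Python sketch below, the arithmetic-random-walk price dynamics and linear impact are toy assumptions, not the authors' market model:

      # Sketch of evaluating a parametric execution strategy by Monte Carlo:
      # expected cost plus CVaR of the simulated cost distribution.
      import numpy as np

      rng = np.random.default_rng(1)
      T, shares, eta, sigma = 10, 1_000.0, 1e-3, 0.5

      def execution_costs(theta, n_paths=20_000):
          # theta parameterizes trade sizes; softmax keeps them positive, summing to 1
          w = np.exp(theta) / np.exp(theta).sum()
          trades = shares * w
          noise = sigma * rng.standard_normal((n_paths, T)).cumsum(axis=1)
          prices = 100.0 + noise + eta * trades.cumsum()    # permanent linear impact
          return (prices * trades).sum(axis=1) - 100.0 * shares

      def cvar(costs, alpha=0.95):
          tail = np.sort(costs)[int(alpha * len(costs)):]   # mean of worst 5%
          return tail.mean()

      costs = execution_costs(np.zeros(T))                  # uniform (TWAP-like) schedule
      print(f"E[cost] = {costs.mean():.1f}, CVaR95 = {cvar(costs):.1f}")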

  17. Augmenting Tertiary Students' Soft Skills Via Multiple Intelligences Instructional Approach: Literature Courses in Focus

    Directory of Open Access Journals (Sweden)

    El Sherief Eman

    2017-01-01

    The second half of the twentieth century witnessed an unprecedented increase in the number of students entering higher education (UNESCO, 2001). Currently, the number of students at Saudi universities and colleges exceeds one million, vis-à-vis 7000 in 1970 (Royal Embassy of Saudi Arabia, Washington). Such an enormous body of learners in higher education is per se diverse enough to embrace distinct learning styles and an assorted repertoire of backgrounds, prior knowledge, experiences, and perspectives; at the same time, they presumably share a common aspiration: securing a compatible post in the labor market upon graduation and subsequently being capable of acting competently in a rigorously competitive workplace environment. A set of capabilities and skills is patently vital for a graduate to reach such a prospect, yet in the conventional undergraduate paradigm of education these skills were given no heed, being postponed instead to the post-graduation phase. The current paper postulates the considerable merits of deploying the Multiple Intelligences theory as a project-based approach within literature classes in higher education; a strategy geared towards reigniting students' engagement, nurturing their critical-thinking capabilities, sustaining their individual dispositions, molding them as inquiry-seekers, and ultimately engendering life-long, autonomous learners, well-armed with the substantial skills needed to traverse the rigorous competition of the future labor market.

  18. Theoretical modeling of electroosmotic flow in soft microchannels: A variational approach applied to the rectangular geometry

    Science.gov (United States)

    Sadeghi, Arman

    2018-03-01

    Modeling of fluid flow in polyelectrolyte layer (PEL)-grafted microchannels is challenging due to their two-layer nature. Hence, the pertinent studies are limited only to circular and slit geometries for which matching the solutions for inside and outside the PEL is simple. In this paper, a simple variational-based approach is presented for the modeling of fully developed electroosmotic flow in PEL-grafted microchannels by which the whole fluidic area is considered as a single porous medium of variable properties. The model is capable of being applied to microchannels of a complex cross-sectional area. As an application of the method, it is applied to a rectangular microchannel of uniform PEL properties. It is shown that modeling a rectangular channel as a slit may lead to considerable overestimation of the mean velocity especially when both the PEL and electric double layer (EDL) are thick. It is also demonstrated that the mean velocity is an increasing function of the fixed charge density and PEL thickness and a decreasing function of the EDL thickness and PEL friction coefficient. The influence of the PEL thickness on the mean velocity, however, vanishes when both the PEL thickness and friction coefficient are sufficiently high.

  19. Limb sparing approach: Adjuvant radiation therapy in adults with intermediate or high-grade limb soft tissue sarcoma

    International Nuclear Information System (INIS)

    Merimsky, Ofer; Soyfer, Vjacheslav; Kovner, Felix; Bickels, Jacob; Issakov, Josephine; Flusser, Gideon; Meller, Isaac; Ofer, Oded; Kollender, Yehuda

    2005-01-01

    Background: Limb soft tissue sarcomas (STS) are currently treated with limb sparing surgery (LSS) followed by radiation therapy (RT). Patients and methods: Between October 1994 and October 2002, 133 adult patients with intermediate or high-grade limb STS were treated with LSS+RT. Results: RT-related toxicity was manageable, with a low rate of severe effects. At a 4-year median follow-up, there were 48 recurrences of any type, 23 of isolated local failure, and 35 of systemic spread w/o local failure. DFS and OS were influenced by disease stage II vs I, primary site in the upper limb vs lower limb, MPNST vs other types, induction therapy vs no induction, adequate resection vs marginal resection or involved margins, and good response to induction therapy vs bad response. Patient's age and sex, tumor depth, acute or late toxicity of RT, and the interval between the date of definitive surgery and the start of RT did not affect DFS or OS. Conclusions: The RT protocol is applicable in the era of complicated, expensive and time-consuming 3D therapy. Our results of LSS+RT in adults with limb HG STS are satisfactory

  20. PEDOT coated Li4Ti5O12 nanorods: Soft chemistry approach synthesis and their lithium storage properties

    International Nuclear Information System (INIS)

    Wang, Xiaoyan; Shen, Laifa; Li, Hongsen; Wang, Jie; Dou, Hui; Zhang, Xiaogang

    2014-01-01

    Spinel Li4Ti5O12 nanorods coated with a poly(3,4-ethylenedioxythiophene) (PEDOT) layer as an anode material for lithium-ion batteries were synthesized by a facile soft chemistry approach. The highly conductive and uniform PEDOT layer coated on the surface of the Li4Ti5O12 nanorods significantly improves the electrochemical performance of the composite, which exhibits higher reversible capacity and better rate capability compared with pure Li4Ti5O12 and the Li4Ti5O12/C composite. The reversible capacity of the Li4Ti5O12/PEDOT nanorods can be up to 171.5 mA h g−1 at a rate of 0.2 C, and a capacity of 168.7 mA h g−1 was retained with only 0.5% capacity loss after 100 charge-discharge cycles at a rate of 1 C, which confirms the good cycling behavior of the Li4Ti5O12/PEDOT nanorods. The superior electrochemical performance of the Li4Ti5O12/PEDOT nanorods can be attributed to the one-dimensional (1D) morphology and the uniform conducting polymer layer, which shorten the lithium-ion diffusion path and improve the electrical conductivity of Li4Ti5O12

  1. Soft matter assemblies as nanomedicine platforms for cancer chemotherapy: a journey from market products towards novel approaches.

    Science.gov (United States)

    Jäger, Eliézer; Giacomelli, Fernando C

    2015-01-01

    The current review aims to outline the likely medical applications of nanotechnology and the potential of the emerging field of nanomedicine. Nanomedicine can be defined as the investigation area encompassing the design of diagnostics and therapeutics at the nanoscale, including nanobots, nanobiosensors, nanoparticles and other nanodevices, for the remediation, prevention and diagnosis of a variety of illnesses. The ultimate goal of nanomedicine is to improve patient quality of life. Because nanomedicine includes the rational design of an enormous number of nanotechnology-based products focused on miscellaneous diseases, a variety of nanomaterials can be employed. Therefore, this review will focus on recent advances in the manufacture of soft matter-based nanomedicines specifically designed to improve diagnostics and cancer chemotherapy efficacy. Particular emphasis is given to liposomes, polymer-drug conjugates, drug-loaded block copolymer micelles and biodegradable polymeric nanoparticles, highlighting current investigations and potential novel approaches towards overcoming the remaining challenges in the field, as well as formulations that are in clinical trials and marketed products.

  2. A machine-learning approach for computation of fractional flow reserve from coronary computed tomography.

    Science.gov (United States)

    Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin

    2016-07-01

    Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994, P < 0.001). Against invasively measured FFR, the machine-learning algorithm achieved a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%; the corresponding correlation was 0.729 (P < 0.001). Execution time was substantially reduced for the machine-learning model on a workstation with a 3.4-GHz Intel i7 8-core processor. Copyright © 2016 the American Physiological Society.
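
    The surrogate-modeling recipe in this record (label synthetically generated anatomies with an expensive physics-based solver, then train a fast regressor) can be sketched in a few lines of Python. The features and the analytic stand-in for the physics model below are invented placeholders, not the study's actual pipeline:

      # Sketch of the surrogate-model idea: generate synthetic "anatomies",
      # label them with an expensive physics-based solver (replaced here by a
      # cheap analytic stand-in), then train a fast regressor on the labels.
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(2)
      X = rng.uniform(size=(5000, 4))   # e.g., stenosis degree, length, radius, flow

      def physics_based_ffr(x):         # placeholder for the CFD computation
          return 1.0 - 0.6 * x[:, 0] ** 2 * x[:, 1] / (x[:, 2] + 0.5)

      y = physics_based_ffr(X)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = GradientBoostingRegressor().fit(X_tr, y_tr)
      print(f"R^2 vs physics-based targets: {model.score(X_te, y_te):.4f}")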

  3. An Innovative Approach to Evaluate the Morphological Patterns of Soft Palate in Oral Submucous Fibrosis Patients: A Digital Cephalometric Study

    Directory of Open Access Journals (Sweden)

    Chintamaneni Raja Lakshmi

    2016-01-01

    Oral submucous fibrosis (OSMF) is a chronic insidious disease affecting the mucosa and submucosa of the oral cavity and soft palate. The present study aimed to evaluate the morphology of the soft palate in normal individuals and OSMF patients using lateral cephalometry, and to compare and correlate these variants of the soft palate with different stages of OSMF. 100 subjects were included in the study and divided into two groups: Group I included 50 subjects with a clinical diagnosis of OSMF and Group II included 50 normal subjects (control group). Using digital lateral cephalometry, velar length and width were measured and soft palatal patterns were categorized based on You et al.'s classification. Leaf and rat-tail patterns of the soft palate were predominant in the control group, whereas butt and crook shaped variants were more frequent in the study group. The anteroposterior (A-P) length of the soft palate was significantly greater in stage I OSMF, while the superoinferior (S-I) width was greater in stage III OSMF. Interestingly, a negative correlation was observed between the staging of OSMF and the A-P dimensions: as the staging of OSMF advances, the A-P length of the soft palate decreases, but the S-I width increases.

  4. Suggested Approaches to the Measurement of Computer Anxiety.

    Science.gov (United States)

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  5. Holiday fun with soft gluons

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Emissions of soft gluons from energetic particles play an important role in collider processes. While the basic physics of soft emissions is simple, it gives rise to a variety of interesting and intricate phenomena (non-global logs, Glauber phases, super-leading logs, factorization breaking). After an introduction, I will review progress in resummation methods such as Soft-Collinear Effective Theory driven by a better understanding of soft emissions. I will also show some new results for computations of soft-gluon effects in gap-between-jets and isolation-cone cross sections.

  6. Computational Diagnostic: A Novel Approach to View Medical Data.

    Energy Technology Data Exchange (ETDEWEB)

    Mane, K. K. (Ketan Kirtiraj); Börner, K. (Katy)

    2007-01-01

    A transition from traditional paper-based medical records to electronic health records is largely underway. The use of electronic records offers tremendous potential to personalize patient diagnosis and treatment. In this paper, we discuss a computational diagnostic tool that uses digital medical records to help doctors gain better insight into a patient's medical condition. The paper details different interactive features of the tool which offer the potential to practice evidence-based medicine and advance patient diagnosis practices. The healthcare industry is a constantly evolving domain. Research from this domain is often translated into a better understanding of different medical conditions. This new knowledge often contributes towards improved diagnosis and treatment solutions for patients. But the healthcare industry has been slow to reap the benefits of this new knowledge, as it still adheres to the traditional paper-based approach to keeping track of medical records. Recently, however, there has been a drive toward the electronic health record (EHR). An EHR stores patient medical records in digital format and offers the potential to replace paper health records. Earlier attempts at an EHR replicated the paper layout on the screen, represented the medical history of a patient in a graphical time-series format, or supported interactive visualization with 2D/3D images generated by an imaging device. But an EHR can be much more than just an 'electronic view' of the paper record or a collection of images from an imaging device. In this paper, we present an EHR called the 'Computational Diagnostic Tool', which provides a novel computational approach to looking at patient medical data. The developed EHR system is knowledge driven and acts as a clinical decision support tool. The EHR tool provides two visual views of the medical data. Dynamic interaction with the data is supported to help doctors practice evidence-based decisions and make judicious

  7. A Computational Drug Repositioning Approach for Targeting Oncogenic Transcription Factors

    Directory of Open Access Journals (Sweden)

    Kaitlyn M. Gayvert

    2016-06-01

    Mutations in transcription factor (TF) genes are frequently observed in tumors, often leading to aberrant transcriptional activity. Unfortunately, TFs are often considered undruggable due to the absence of targetable enzymatic activity. To address this problem, we developed CRAFTT, a computational drug-repositioning approach for targeting TF activity. CRAFTT combines ChIP-seq with drug-induced expression profiling to identify small molecules that can specifically perturb TF activity. Application to ENCODE ChIP-seq datasets revealed known drug-TF interactions, and a global drug-protein network analysis supported these predictions. Application of CRAFTT to ERG, a pro-invasive, frequently overexpressed oncogenic TF, predicted that dexamethasone would inhibit ERG activity. Dexamethasone significantly decreased cell invasion and migration in an ERG-dependent manner. Furthermore, analysis of electronic medical record data indicates a protective role for dexamethasone against prostate cancer. Altogether, our method provides a broadly applicable strategy for identifying drugs that specifically modulate TF activity.
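
    The core scoring idea (connect a TF's ChIP-seq target set with drug-induced expression changes and rank drugs that push the targets down) can be sketched as follows; the data are random toys and the scoring is deliberately simplified relative to the published CRAFTT pipeline:

      # Simplified sketch: rank drugs by how strongly they down-regulate a TF's
      # target genes. Toy data; not the published CRAFTT method.
      import numpy as np

      rng = np.random.default_rng(3)
      genes = [f"g{i}" for i in range(200)]
      tf_targets = set(rng.choice(genes, 30, replace=False))   # from ChIP-seq peaks

      # drug -> induced expression change per gene (e.g., z-scores), toy values
      drug_profiles = {f"drug{j}": dict(zip(genes, rng.standard_normal(200)))
                       for j in range(50)}

      def perturbation_score(profile):
          """Mean induced change over TF targets; strongly negative = inhibitory."""
          return np.mean([profile[g] for g in tf_targets])

      ranked = sorted(drug_profiles, key=lambda d: perturbation_score(drug_profiles[d]))
      print("top candidate inhibitors:", ranked[:3])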

  8. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system level of understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information they contain about diet–oral microbiome–host mucosal transcriptome interactions. In particular we focus on the systems biology of Porphyromonas gingivalis in the context of high-throughput computational methods tightly integrated with translational systems medicine. These approaches have applications both for basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, and for human disease, where we can validate new mechanisms and biomarkers for the prevention and treatment of chronic disorders.

  9. Towards a Resource Reservation Approach for an Opportunistic Computing Environment

    International Nuclear Information System (INIS)

    Gomes, Eliza; Dantas, M A R

    2014-01-01

    Advanced reservation has been used in grid environments to provide quality of service (QoS) and to guarantee that resources are available at execution time. However, in grid subtypes such as opportunistic grid computing, it is a challenge to provide QoS and to guarantee resource availability. In this article, we propose a new advanced reservation approach which offers users the possibility of selecting resources in advance for future utilization. The main goal of this proposal is therefore to offer a best-effort feature to users of an opportunistic configuration. In these types of environments it is usually not possible to provide QoS, because there are no guarantees of resource availability and, consequently, of the execution of user applications. In addition, this research work provides a way to organize executions, which can improve scheduling and system operations. Experimental results, obtained through a case study, show the efficiency and relevance of our proposal.

  10. Computer Modeling of Violent Intent: A Content Analysis Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, that have proved to serve as effective channels in furthering sociopolitical change.
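
    The modeling step, learning a mapping from rhetoric features to a violent-intent label and applying it to unseen text, is essentially supervised text classification. Below is a minimal sketch with scikit-learn; the tiny corpus and labels are toy placeholders, not the study's data or feature set:

      # Sketch of the modeling step as supervised text classification: train on
      # documents labeled for intent indicators, then score unseen text.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      docs = ["we must destroy the oppressors by force",
              "join our peaceful march for justice",
              "armed struggle is the only answer",
              "community dialogue can resolve this dispute"]
      labels = [1, 0, 1, 0]                 # 1 = contentious/violent-intent indicators

      model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
      model.fit(docs, labels)
      print(model.predict_proba(["prepare for the coming battle"])[0, 1])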

  11. Continuous stacking computational approach based automated microscope slide scanner

    Science.gov (United States)

    Murali, Swetha; Adhikari, Jayesh Vasudeva; Jagannadh, Veerendra Kalyan; Gorthi, Sai Siva

    2018-02-01

    Cost-effective and automated acquisition of whole slide images is a bottleneck for wide-scale deployment of digital pathology. In this article, a computation-augmented approach for the development of an automated microscope slide scanner is presented. The realization of a prototype device built using inexpensive off-the-shelf optical components and motors is detailed. The applicability of the developed prototype to clinical diagnostic testing is demonstrated by generating good quality digital images of malaria-infected blood smears. Further, the acquired slide images have been processed to identify and count the number of malaria-infected red blood cells and thereby perform quantitative parasitemia level estimation. The presented prototype would enable cost-effective deployment of slide-based cyto-diagnostic testing in endemic areas.

  12. A computational approach for deciphering the organization of glycosaminoglycans.

    Directory of Open Access Journals (Sweden)

    Jean L Spencer

    2010-02-01

    Increasing evidence has revealed important roles for complex glycans as mediators of normal and pathological processes. Glycosaminoglycans are a class of glycans that bind and regulate the function of a wide array of proteins at the cell-extracellular matrix interface. The specific sequence and chemical organization of these polymers likely define function; however, identification of the structure-function relationships of glycosaminoglycans has been met with challenges associated with the unique level of complexity and the nontemplate-driven biosynthesis of these biopolymers. To address these challenges, we have devised a computational approach to predict fine structure and patterns of domain organization of the specific glycosaminoglycan, heparan sulfate (HS). Using chemical composition data obtained after complete and partial digestion of mixtures of HS chains with specific degradative enzymes, the computational analysis produces populations of theoretical HS chains with structures that meet both biosynthesis and enzyme degradation rules. The model performs these operations through a modular format consisting of input/output sections and three routines called chainmaker, chainbreaker, and chainsorter. We applied this methodology to analyze HS preparations isolated from pulmonary fibroblasts and epithelial cells. Significant differences in the general organization of these two HS preparations were observed, with HS from epithelial cells having a greater frequency of highly sulfated domains. Epithelial HS also showed a higher density of specific HS domains that have been associated with inhibition of neutrophil elastase. Experimental analysis of elastase inhibition was consistent with the model predictions and demonstrated that HS from epithelial cells had greater inhibitory activity than HS from fibroblasts. This model establishes the conceptual framework for a new class of computational tools for assessing patterns of domain organization
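
    A heavily simplified sketch of the chainmaker/chainbreaker logic follows: propose chains from a toy biosynthesis rule, then keep those whose disaccharide composition is consistent with (hypothetical) measured data. The alphabet, weights and tolerances below are invented for illustration only:

      # Toy chainmaker: propose HS chains as strings of disaccharide types and
      # retain chains whose composition matches measured data within tolerance.
      import random
      from collections import Counter

      random.seed(4)
      DISACCHARIDES = ["NA", "NS", "2S", "6S"]            # toy alphabet
      MEASURED = Counter({"NA": 12, "NS": 5, "2S": 2, "6S": 1})

      def chainmaker(length=20):
          # Biosynthesis stand-in: sulfated units cluster (more likely after NS).
          chain, prev = [], "NA"
          for _ in range(length):
              weights = [4, 2, 1, 1] if prev == "NA" else [2, 4, 2, 2]
              prev = random.choices(DISACCHARIDES, weights)[0]
              chain.append(prev)
          return chain

      def matches(chain):
          comp = Counter(chain)
          return all(abs(comp[d] - MEASURED[d]) <= 2 for d in DISACCHARIDES)

      population = [c for c in (chainmaker() for _ in range(5000)) if matches(c)]
      print(f"{len(population)} candidate chains consistent with composition data")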

  13. A computational approach to climate science education with CLIMLAB

    Science.gov (United States)

    Rose, B. E. J.

    2017-12-01

    CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: a full-featured, well-documented, interactive implementation of a widely-used radiation model (RRTM); packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows and Linux; interfacing with xarray for i/o and graphics with gridded model data; and a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.
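
    For readers who want to try it, a minimal usage sketch follows (the class and method names below match recent climlab releases, but check the documentation for your version):

      # Minimal CLIMLAB usage sketch: build a 1D energy-balance model, step it
      # toward equilibrium, and inspect the surface temperature.
      import climlab

      model = climlab.EBM(num_lat=90)   # zonal-mean energy balance model
      model.integrate_years(5)          # step forward to (near) equilibrium
      print(model.Ts.mean())            # global-mean surface temperature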

  14. A flexible approach to client-server computing.

    Science.gov (United States)

    van Mulligen, E M

    1995-01-01

    The number of standards for client-server computing almost equals the number of client-server applications. Although there are some standardization efforts, there is hardly any experience with applying these solutions to (medical) practice. The layering approach of the integration architecture HERMES anticipates the inclusion of commercial standards (when available and evaluated); it already supports the development of client-server applications. The toolkit provides support for the development of the communication between client and server through a callback mechanism. Stubs created with the HERMES toolkit contain a number of built-in callbacks that manage sessions between clients and services. An important feature of HERMES is the ability to include existing legacy systems as if they were true open services. In this way, the growth path from stand-alone to client-server computing can be shortened and the implementation of the client-server architecture can begin immediately. Moreover, the existing legacy systems remain available as a stand-alone solution for daily practice. Clients and services are connected through the HERMES kernel. This kernel uses a database to find the best server match for a request from a client. Moreover, it completes requests with mandatory data and tries to optimize performance. Special features are included to minimize the memory burden of the session-oriented client-server model. Currently, a system for the cardiology outpatient clinic, occupational health care, and clinical data analysis is available.

  15. Computational approach in estimating the need of ditch network maintenance

    Science.gov (United States)

    Lauren, Ari; Hökkä, Hannu; Launiainen, Samuli; Palviainen, Marjo; Repo, Tapani; Leena, Finer; Piirainen, Sirpa

    2015-04-01

    Ditch network maintenance (DNM), implemented annually over a 70 000 ha area in Finland, is the most controversial of all forest management practices. Nationwide, it is estimated to increase forest growth by 1–3 million m3 per year, but simultaneously to cause the export of 65 000 tons of suspended solids and 71 tons of phosphorus (P) to water courses. A systematic approach that allows simultaneous quantification of the positive and negative effects of DNM is required. Excess water in the rooting zone slows gas exchange and decreases biological activity, interfering with forest growth in boreal forested peatlands. DNM is needed when: 1) excess water in the rooting zone restricts forest growth before the DNM, 2) after the DNM the growth restriction ceases or decreases, and 3) the benefits of DNM are greater than the adverse effects caused. Aeration in the rooting zone can be used as a drainage criterion. Aeration is affected by several factors such as meteorological conditions, tree stand properties, hydraulic properties of peat, ditch depth, and ditch spacing. We developed a 2-dimensional DNM simulator that allows the user to adjust these factors and to evaluate their effect on soil aeration at different distances from the drainage ditch. The DNM simulator computes hydrological processes and soil aeration along a water flowpath between two ditches. Applying a daily time step, it calculates evapotranspiration, snow accumulation and melt, infiltration, soil water storage, ground water level, soil water content, air-filled porosity and runoff. The model's hydrological performance has been tested against independent high-frequency field monitoring data. Soil aeration at different distances from the ditch is computed under a steady-state assumption using an empirical oxygen consumption model, simulated air-filled porosity, and diffusion coefficients at different depths in the soil. Aeration is adequate and forest growth rate is not limited by poor aeration if the

  16. Computer-Aided Approaches for Targeting HIVgp41

    Directory of Open Access Journals (Sweden)

    William J. Allen

    2012-08-01

    Virus-cell fusion is the primary means by which the human immunodeficiency virus-1 (HIV) delivers its genetic material into the human T-cell host. Fusion is mediated in large part by the viral glycoprotein 41 (gp41), which advances through four distinct conformational states: (i) native, (ii) pre-hairpin intermediate, (iii) fusion active (fusogenic), and (iv) post-fusion. The pre-hairpin intermediate is a particularly attractive step for therapeutic intervention given that the gp41 N-terminal heptad repeat (NHR) and C-terminal heptad repeat (CHR) domains are transiently exposed prior to the formation of a six-helix bundle required for fusion. Most peptide-based inhibitors, including the FDA-approved drug T20, target the intermediate, and there are significant efforts to develop small-molecule alternatives. Here, we review current approaches to studying interactions of inhibitors with gp41 with an emphasis on atomic-level computer modeling methods including molecular dynamics, free energy analysis, and docking. Atomistic modeling yields a unique level of structural and energetic detail, complementary to experimental approaches, which will be important for the design of improved next-generation anti-HIV drugs.

  17. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    The growing need for rapid and accurate approaches for the large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which warrants much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and the exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  18. Soft Connected Spaces and Soft Paracompact Spaces

    OpenAIRE

    Fucai Lin

    2013-01-01

    Soft topological spaces are considered as mathematical tools for dealing with uncertainties, and a fuzzy topological space is a special case of the soft topological space. The purpose of this paper is to study soft topological spaces. We introduce some new concepts in soft topological spaces such as soft closed mapping, soft open mappings, soft connected spaces and soft paracompact spaces. We also redefine the concept of soft points such that it is reasonable in soft topological spaces. Mo...

  19. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    One common research goal in systems neuroscience is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback of this approach is that much information is lost by averaging heterogeneous voxels, and therefore a functional relationship between an ROI pair that evolves at a spatial scale much finer than the ROIs remains undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in
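
    A toy sketch of the evolutionary search follows: evolve boolean voxel masks for the two ROIs so as to maximize the session-to-session change in correlation between the masked mean signals. The data are random, the fitness is simplified, and the spatial-connectivity constraint of the real procedure is omitted:

      # Toy evolutionary search for voxel subsets whose mean-signal coupling
      # changes most between two sessions ("pre" and "post").
      import numpy as np

      rng = np.random.default_rng(5)
      t, n_vox = 200, 60
      roi_a = {s: rng.standard_normal((t, n_vox)) for s in ("pre", "post")}
      roi_b = {s: rng.standard_normal((t, n_vox)) for s in ("pre", "post")}

      def fitness(mask_a, mask_b):
          def corr(s):
              return np.corrcoef(roi_a[s][:, mask_a].mean(1),
                                 roi_b[s][:, mask_b].mean(1))[0, 1]
          return abs(corr("post") - corr("pre"))     # gain/loss of coupling

      pop = [(rng.random(n_vox) < 0.3, rng.random(n_vox) < 0.3) for _ in range(40)]
      for _ in range(100):                           # mutation-only evolution
          pop.sort(key=lambda p: fitness(*p), reverse=True)
          parents = pop[:20]
          children = [(a ^ (rng.random(n_vox) < 0.02),   # flip ~2% of voxels
                       b ^ (rng.random(n_vox) < 0.02)) for a, b in parents]
          pop = parents + children
      print(f"best plasticity score: {fitness(*pop[0]):.3f}")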

  20. Achieving high-resolution soft-tissue imaging with cone-beam CT: a two-pronged approach for modulation of x-ray fluence and detector gain

    Science.gov (United States)

    Graham, S. A.; Siewerdsen, J. H.; Moseley, D. J.; Keller, H.; Shkumat, N. A.; Jaffray, D. A.

    2005-04-01

    Cone-beam computed tomography (CBCT) presents a highly promising and challenging advanced application of flat-panel detectors (FPDs). The great advantage of this adaptable technology is in the potential for sub-mm 3D spatial resolution in combination with soft-tissue detectability. While the former is achieved naturally by CBCT systems incorporating modern FPD designs (e.g., 200–400 μm pixel pitch), the latter presents a significant challenge due to limitations in FPD dynamic range, large field of view, and elevated levels of x-ray scatter in typical CBCT configurations. We are investigating a two-pronged strategy for maximizing soft-tissue detectability in CBCT: 1) front-end solutions, including novel beam modulation designs (viz., spatially varying compensators) that alleviate detector dynamic range requirements, reduce x-ray scatter, and better distribute imaging dose in a manner suited to soft-tissue visualization throughout the field of view; and 2) back-end solutions, including implementation of an advanced FPD design (Varian PaxScan 4030CB) that features dual-gain and dynamic gain switching, effectively extending detector dynamic range to 18 bits. These strategies are explored quantitatively on CBCT imaging platforms developed in our laboratory, including a dedicated CBCT bench and a mobile isocentric C-arm (Siemens PowerMobil). Pre-clinical evaluation of improved soft-tissue visibility was carried out in phantom and patient imaging with the C-arm device. Incorporation of these strategies begins to reveal the full potential of CBCT for soft-tissue visualization, an essential step in realizing the broad utility of this adaptable technology for diagnostic and image-guided procedures.

  1. Environmental Informatics and Soft Computing Paradigm: Processing of Cocos Nucifera Shell Derived Activated Carbon for Treatment of Distillery Spent Wash—A Solution to Environmental Issue

    Directory of Open Access Journals (Sweden)

    N. B. Raut

    2014-01-01

    Soft computing techniques are much needed in the design of environment-related systems these days. Soft computing (SC) is a set of computational methods that attempt to determine satisfactory approximate solutions and to find models for real-world problems. Techniques such as artificial neural networks, fuzzy logic, and genetic algorithms can be used in solving complex environmental problems. A self-organizing feature map (SOFM) model is proposed for monitoring and collecting the data, comprising real-time and static datasets acquired through pollution monitoring sensors and stations in the distilleries. In environmental monitoring systems the ultimate requirement is to establish controls for the sensor-based data acquisition systems, together with interactive and dynamic reporting services. SOFM techniques are used for data analysis and processing, and the processed data feed the control system and even the treatment systems. Cocos nucifera activated carbon, commonly known as coconut shell activated carbon (CSC), was utilized for the treatment of distillery spent wash. Batch and column studies were done to investigate the kinetics and the effect of operating parameters on the rate of adsorption. Since the quantum of spent water generated from the sugar-industry-allied distillery units is huge, this low-cost adsorbent is found to be an attractive economic option. Equilibrium adsorption data were generated to plot Langmuir and Temkin adsorption isotherms. The investigation reveals that, though with lower adsorption capacities, CSC seems to be a technically feasible solution for treating sugar distillery spent wash. Efforts are made in this paper to build informatics for the derived activated carbon for solving the problem of treatment of distillery spent wash. Capsule: Coconut shell derived activated carbon was synthesized, characterized, and successfully employed as a low-cost adsorbent for treatment of distillery spent wash.
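
    The batch-equilibrium analysis mentioned above typically means fitting the Langmuir isotherm q_e = q_max * K * C_e / (1 + K * C_e) to measured (C_e, q_e) pairs. A Python sketch with scipy follows, using illustrative numbers rather than the study's measurements:

      # Sketch of fitting the Langmuir isotherm to batch equilibrium data;
      # the concentration/uptake values below are illustrative placeholders.
      import numpy as np
      from scipy.optimize import curve_fit

      def langmuir(ce, q_max, K):
          return q_max * K * ce / (1.0 + K * ce)

      ce = np.array([50, 120, 260, 500, 900.0])     # equilibrium conc. [mg/L]
      qe = np.array([8.2, 14.5, 21.0, 25.1, 27.3])  # adsorbed amount [mg/g]

      (q_max, K), _ = curve_fit(langmuir, ce, qe, p0=(30.0, 0.01))
      print(f"q_max = {q_max:.1f} mg/g, K = {K:.4f} L/mg")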

  2. Soft leptogenesis

    International Nuclear Information System (INIS)

    D'Ambrosio, Giancarlo; Giudice, Gian F.; Raidal, Martti

    2003-01-01

    We study 'soft leptogenesis', a new mechanism of leptogenesis which does not require flavour mixing among the right-handed neutrinos. Supersymmetry soft-breaking terms give a small mass splitting between the CP-even and CP-odd right-handed sneutrino states of a single generation and provide a CP-violating phase sufficient to generate a lepton asymmetry. The mechanism is successful if the lepton-violating soft bilinear coupling is unconventionally (but not unnaturally) small. The values of the right-handed neutrino masses predicted by soft leptogenesis can be low enough to evade the cosmological gravitino problem

  3. Soft morphological image resizing

    Science.gov (United States)

    Maltseff, Pavel A.

    1997-04-01

    One important problem in computer vision and image processing is image resizing. Current techniques are generally based on different interpolation methods. These methods are convenient, but the downsampled or upsampled image will include new gray values which are not present in the original image. Soft morphological interpolation is a new technique for resampling discrete data. The soft morphological operations are an alternative to the standard morphological operations. A generic description of hierarchical soft morphological transformations was given previously. The further development of soft morphological operations by a hierarchical structural system relaxes the requirement that the result of the operation must be the r-th largest or smallest value of the corresponding multiset, where r is the order index of the internal hard center; we assume instead that any reasonable integer value is acceptable. The purpose of this paper is to derive the soft morphological convolution and compare the result of this convolution with the cubic convolution and the Gaussian pyramid.
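
    The r-th order-statistic rule described above is easy to state in code. The Python sketch below implements a 1D soft erosion with a hard core inside a soft boundary, where core samples are repeated r times in the multiset; the window sizes and the test signal are illustrative:

      # Sketch of 1D soft morphological erosion: the output is the r-th smallest
      # element of a multiset in which hard-core samples are repeated r times.
      import numpy as np

      def soft_erode(signal, core=1, soft=2, r=2):
          out = np.empty_like(signal)
          n = len(signal)
          for i in range(n):
              multiset = []
              for off in range(-soft, soft + 1):
                  v = signal[min(max(i + off, 0), n - 1)]   # clamp at borders
                  multiset += [v] * (r if abs(off) <= core else 1)
              out[i] = sorted(multiset)[r - 1]              # r-th smallest
          return out

      x = np.array([5, 5, 9, 5, 5, 1, 5, 5], dtype=float)
      print(soft_erode(x))   # impulses are suppressed more gently than in hard erosion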

  4. Liu Na’ou And The 1930s Soft Film Movement: A New Approach In Revisionist Chinese Film Historiography

    Directory of Open Access Journals (Sweden)

    Donna Ong

    2013-06-01

    This essay analyzes how the 1930s Chinese “Soft Film” movement emerged and developed in film historiography, and finds it is a discursive formation by the Leftists to create an ideological enemy that serves to define its own group’s identity through a struggle against an “other”. It challenges the naming of “Soft Film” through examining documents beyond the official archive. Unearthing the film writings of Liu Na’ou as the movement’s leading figure is a good entry point into excavating the history of the people and films associated with the label “Soft Film”. Reconstructing this “reactionary cinema” will reveal previously unknown cultural connections with classical and avant-garde Western film theories, and more importantly renovate the established Chinese film canon of the 1930s.

  5. Comparison of distal soft-tissue procedures combined with a distal chevron osteotomy for moderate to severe hallux valgus: first web-space versus transarticular approach.

    Science.gov (United States)

    Park, Yu-Bok; Lee, Keun-Bae; Kim, Sung-Kyu; Seon, Jong-Keun; Lee, Jun-Young

    2013-11-06

    There are two surgical approaches for distal soft-tissue procedures for the correction of hallux valgus: the dorsal first web-space approach and the medial transarticular approach. The purpose of this study was to compare the outcomes achieved after use of either of these approaches combined with a distal chevron osteotomy in patients with moderate to severe hallux valgus. One hundred and twenty-two female patients (122 feet) who underwent a distal chevron osteotomy as part of a distal soft-tissue procedure for the treatment of symptomatic unilateral moderate to severe hallux valgus constituted the study cohort. The 122 feet were randomly divided into two groups: namely, a dorsal first web-space approach (group D; sixty feet) and a medial transarticular approach (group M; sixty-two feet). The clinical and radiographic results of the two groups were compared at a mean follow-up time of thirty-eight months. The American Orthopaedic Foot & Ankle Society (AOFAS) hindfoot scale hallux metatarsophalangeal-interphalangeal scores improved from a mean and standard deviation of 55.5 ± 12.8 points preoperatively to 93.5 ± 6.3 points at the final follow-up in group D and from 54.9 ± 12.6 points preoperatively to 93.6 ± 6.2 points at the final follow-up in group M. The mean hallux valgus angle in groups D and M was reduced from 32.2° ± 6.3° and 33.1° ± 8.4° preoperatively to 10.5° ± 5.5° and 9.9° ± 5.5°, respectively, at the time of final follow-up. The mean first intermetatarsal angle in groups D and M was reduced from 15.0° ± 2.8° and 15.3° ± 2.7° preoperatively to 6.5° ± 2.2° and 6.3° ± 2.4°, respectively, at the final follow-up. The clinical and radiographic outcomes were not significantly different between the two groups. The final clinical and radiographic outcomes between the two approaches for distal soft-tissue procedures were comparable and equally successful. Accordingly, the results of this study suggest that the medial transarticular

  6. Liu Na’ou And The 1930s Soft Film Movement: A New Approach In Revisionist Chinese Film Historiography

    OpenAIRE

    Donna Ong

    2013-01-01

    This essay analyzes how the 1930s Chinese “Soft Film” movement emerged and developed in film historiography, and finds it is a discursive formation by the Leftists to create an ideological enemy that serves to define its own group’s identity through a struggle against an “other”. It challenges the naming of “Soft Film” through examining documents beyond the official archive. Unearthing the film writings of Liu Na’ou as the movement’s leading figure is a good entry point into excavating the hi...

  7. Teaching Pervasive Computing to CS Freshmen: A Multidisciplinary Approach

    NARCIS (Netherlands)

    Silvis-Cividjian, Natalia

    2015-01-01

    Pervasive Computing is a growing area in research and commercial reality. Despite this extensive growth, there is no clear consensus on how and when to teach it to students. We report on an innovative attempt to teach this subject to first year Computer Science students. Our course combines computer

  8. Computational Approaches for Modeling the Multiphysics in Pultrusion Process

    Directory of Open Access Journals (Sweden)

    P. Carlone

    2013-01-01

    Pultrusion is a continuous manufacturing process used to produce high-strength composite profiles with constant cross section. The mutual interactions between heat transfer, resin flow and cure reaction, variation in the material properties, and stress/distortion evolutions strongly affect the process dynamics together with the mechanical properties and the geometrical precision of the final product. In the present work, pultrusion process simulations are performed for a unidirectional (UD) graphite/epoxy composite rod including several processing physics, such as fluid flow, heat transfer, chemical reaction, and solid mechanics. The pressure increase and the resin flow at the tapered inlet of the die are calculated by means of a computational fluid dynamics (CFD) finite volume model. Several models, based on different homogenization levels and solution schemes, are proposed and compared for the evaluation of the temperature and the degree-of-cure distributions inside the heating die and at the post-die region. The transient stresses, distortions, and pull force are predicted using a sequentially coupled three-dimensional (3D) thermochemical analysis together with a 2D plane-strain mechanical analysis using the finite element method and compared with results obtained from a semianalytical approach.

  9. On Soft Biometrics

    DEFF Research Database (Denmark)

    Nixon, Mark; Correia, Paulo; Nasrollahi, Kamal

    2015-01-01

    Innovation has formed much of the rich history in biometrics. The field of soft biometrics was originally aimed at augmenting the recognition process by fusion of metrics that were sufficient to discriminate populations rather than individuals. This was later refined to use measures that could be used to discriminate individuals, especially using descriptions that can be perceived using human vision and in surveillance imagery. A further branch of this new field concerns approaches to estimate soft biometrics, either using conventional biometrics approaches or just from images alone. These three strands combine to form what is now known as soft biometrics. We survey the achievements that have been made in recognition by and in estimation of these parameters, describing how these approaches can be used and where they might lead to. The approaches lead to a new type of recognition, and one similar...

  10. A comparison of monthly precipitation point estimates at 6 locations in Iran using integration of soft computing methods and GARCH time series model

    Science.gov (United States)

    Mehdizadeh, Saeid; Behmanesh, Javad; Khalili, Keivan

    2017-11-01

    Precipitation plays an important role in determining the climate of a region. Precise estimation of precipitation is required to manage and plan water resources, as well as other related applications such as hydrology, climatology, meteorology and agriculture. Time series of hydrologic variables such as precipitation are composed of deterministic and stochastic parts. Despite this fact, the stochastic part of precipitation data is not usually considered in the modeling of the precipitation process. As an innovation, the present study introduces three new hybrid models by integrating soft computing methods, including multivariate adaptive regression splines (MARS), Bayesian networks (BN) and gene expression programming (GEP), with a time series model, namely generalized autoregressive conditional heteroscedasticity (GARCH), for modeling of the monthly precipitation. For this purpose, the deterministic (obtained by soft computing methods) and stochastic (obtained by the GARCH time series model) parts are combined with each other. To carry out this research, monthly precipitation data of the Babolsar, Bandar Anzali, Gorgan, Ramsar, Tehran and Urmia stations, with different climates in Iran, were used during the period 1965-2014. Root mean square error (RMSE), relative root mean square error (RRMSE), mean absolute error (MAE) and the determination coefficient (R2) were employed to evaluate the performance of the conventional/single MARS, BN and GEP models, as well as the proposed MARS-GARCH, BN-GARCH and GEP-GARCH hybrid models. It was found that the proposed novel models are more precise than the single MARS, BN and GEP models. Overall, the MARS-GARCH and BN-GARCH models yielded better accuracy than GEP-GARCH. The results of the present study confirm the suitability of the proposed methodology for precise modeling of precipitation.
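
    The hybrid construction (deterministic part from a soft computing model, stochastic part from GARCH) can be sketched as follows. A plain linear seasonal regression stands in for MARS/BN/GEP, the arch package fits the GARCH(1,1), and the monthly data are synthetic, not the Iranian station records:

      # Sketch of the hybrid idea: seasonal regression for the deterministic
      # signal, GARCH(1,1) on its residuals for the stochastic part.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from arch import arch_model

      rng = np.random.default_rng(6)
      months = np.arange(600)
      season = 50 + 30 * np.sin(2 * np.pi * months / 12)
      precip = season + rng.standard_normal(600) * (5 + 3 * np.abs(np.sin(months / 40)))

      X = np.column_stack([np.sin(2 * np.pi * months / 12),
                           np.cos(2 * np.pi * months / 12)])
      det = LinearRegression().fit(X, precip)   # stand-in for MARS/BN/GEP
      resid = precip - det.predict(X)

      garch = arch_model(resid, p=1, q=1).fit(disp="off")   # stochastic part
      cond_mean = garch.params["mu"]            # GARCH mean of the residual part
      hybrid_fit = det.predict(X) + cond_mean
      rmse = np.sqrt(np.mean((precip - hybrid_fit) ** 2))
      print(f"in-sample RMSE of hybrid fit: {rmse:.2f}")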

  11. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

  12. A new approach in development of data flow control and investigation system for computer networks

    International Nuclear Information System (INIS)

    Frolov, I.; Vaguine, A.; Silin, A.

    1992-01-01

    This paper describes a new approach to the development of a data flow control and investigation system for computer networks. The approach was developed and applied at the Moscow Radiotechnical Institute for control and investigation of the Institute's computer network, where it allowed current network problems to be solved successfully. The approach is described below, along with the most interesting results of the work. (author)

  13. Soft systems methodology as a potential approach to understanding non-motorised transport users in South Africa

    CSIR Research Space (South Africa)

    Van Rooyen, CE

    2016-07-01

    Full Text Available The purpose of this paper is to show the potential of using systems thinking, and more particularly Soft Systems Methodology (SSM), as a practical and beneficial instrument that will guide BEPDPs through the ongoing learning process of understanding NMT users and their specific...

  14. Interdisciplinary approach to enhance the esthetics of maxillary anterior region using soft- and hard-tissue ridge augmentation in conjunction with a fixed partial prosthesis

    Directory of Open Access Journals (Sweden)

    Shaleen Khetarpal

    2018-01-01

    Full Text Available Favorable esthetics is one of the most important treatment outcomes in dentistry, and to achieve it, interdisciplinary approaches are often required. Ridge deficiencies can be corrected for both soft- and hard-tissue discrepancies. To overcome such defects, a variety of prosthetic options are at our disposal, and several periodontal plastic surgical techniques are available as well. Various techniques have been described and revised over the years to correct ridge defects. For enhancing soft-tissue contours in the anterior region, the subepithelial connective tissue graft is the treatment of choice. A combination of an alloplastic bone graft adjunct to a connective tissue graft optimizes ridge augmentation and minimizes defects. The present case report describes the use of a vascular interpositional connective tissue graft in combination with an alloplastic bone graft for correction of a Seibert's Class III ridge deficiency, followed by a fixed partial prosthesis to achieve a better esthetic outcome.

  15. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein-coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single-locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date several attempts have been made toward predicting the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which
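
    The conservation signal that SIFT-type methods exploit can be sketched in a few lines: score each column of a multiple alignment by normalized Shannon entropy, with near-zero entropy marking positions likely intolerant to substitution. The toy alignment below is invented for illustration and is not SIFT itself.

    ```python
    # Toy sketch of the per-position conservation signal used by SIFT-style
    # predictors: normalized Shannon entropy per alignment column (0 = conserved).
    import math
    from collections import Counter

    alignment = ["MKTAYIA", "MKTAHIA", "MKSAYLA", "MKTAYIA"]   # made-up sequences

    def column_entropy(col):
        counts = Counter(col)
        total = len(col)
        h = -sum((c / total) * math.log2(c / total) for c in counts.values())
        return h / math.log2(20)      # normalize by max entropy over 20 amino acids

    for i, col in enumerate(zip(*alignment)):
        h = column_entropy(col)
        flag = "likely intolerant to change" if h < 0.1 else "tolerant"
        print(f"position {i + 1}: entropy {h:.2f} ({flag})")
    ```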

  16. A computational intelligence approach to the Mars Precision Landing problem

    Science.gov (United States)

    Birge, Brian Kent, III

    Various proposed Mars missions, such as the Mars Sample Return Mission (MRSR) and the Mars Smart Lander (MSL), require precise re-entry terminal position and velocity states. This is to achieve mission objectives including rendezvous with a previously landed mission, or reaching a particular geographic landmark. The current state-of-the-art footprint is on the order of kilometers. For this research a Mars Precision Landing is achieved with a landed footprint of no more than 100 meters, for a set of initial entry conditions representing worst-guess dispersions. Obstacles to reducing the landed footprint include trajectory dispersions due to initial atmospheric entry conditions (entry angle, parachute deployment height, etc.), environment (wind, atmospheric density, etc.), parachute deployment dynamics, unavoidable injection error (propagated error from launch on), etc. Weather and atmospheric models have been developed. Three descent scenarios have been examined. First, terminal re-entry is achieved via a ballistic parachute with concurrent thrusting events while on the parachute, followed by a gravity turn. Second, terminal re-entry is achieved via a ballistic parachute followed by a gravity turn to hover and then thrust vectoring to the desired location. Third, a guided parafoil approach followed by vectored thrusting to reach terminal velocity is examined. The guided parafoil is determined to be the best architecture. The purpose of this study is to examine the feasibility of using a computational intelligence strategy to facilitate precision planetary re-entry, specifically to take an approach that is somewhat more intuitive and less rigid, and see where it leads. The test problems used for all research are variations on proposed Mars landing mission scenarios developed by NASA. A relatively recent method of evolutionary computation is Particle Swarm Optimization (PSO), which can be considered to be in the same general class as Genetic Algorithms. An improvement over
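
    For readers unfamiliar with PSO, a minimal global-best variant is sketched below on a toy two-dimensional cost standing in for a landing miss distance; the swarm size and the inertia/acceleration constants are conventional illustrative values, not those used in the thesis.

    ```python
    # Minimal global-best particle swarm optimization sketch on a toy 2-D cost.
    import numpy as np

    rng = np.random.default_rng(1)
    cost = lambda p: np.sum((p - np.array([3.0, -2.0])) ** 2, axis=1)  # toy target

    n, dim, w, c1, c2 = 30, 2, 0.7, 1.5, 1.5
    pos = rng.uniform(-10, 10, (n, dim))
    vel = np.zeros((n, dim))
    pbest, pbest_val = pos.copy(), cost(pos)
    gbest = pbest[np.argmin(pbest_val)]

    for _ in range(200):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        val = cost(pos)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)]

    print("best point found:", gbest.round(3))   # converges near (3, -2)
    ```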

  17. Computing by Carving with P Systems. A First Approach

    OpenAIRE

    Sempere, José M.

    2008-01-01

    In this work, we propose a P system which carries out computing by carving. Computing by carving was proposed by Gh. Păun as a technique to generate formal languages which can even be non-recursively enumerable; hence, it can be considered a hypercomputational technique. Here, we propose a first scheme based on P systems for performing computing by carving on any formal language. The paper thus shows indirectly that these systems, under certain assumptions, can be considered ...

  18. PREFACE: 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics & 38th National Conference on Theoretical Physics

    Science.gov (United States)

    2014-09-01

    This volume contains selected papers presented at the 38th National Conference on Theoretical Physics (NCTP-38) and the 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics (IWTCP-1). Both the conference and the workshop were held from 29 July to 1 August 2013 in Pullman hotel, Da Nang, Vietnam. The IWTCP-1 was a new activity of the Vietnamese Theoretical Physics Society (VTPS) organized in association with the 38th National Conference on Theoretical Physics (NCTP-38), the most well-known annual scientific forum dedicated to the dissemination of the latest development in the field of theoretical physics within the country. The IWTCP-1 was also an External Activity of the Asia Pacific Center for Theoretical Physics (APCTP). The overriding goal of the IWTCP is to provide an international forum for scientists and engineers from academia to share ideas, problems and solution relating to the recent advances in theoretical physics as well as in computational physics. The main IWTCP motivation is to foster scientific exchanges between the Vietnamese theoretical and computational physics community and world-wide scientists as well as to promote high-standard level of research and education activities for young physicists in the country. About 110 participants coming from 10 countries participated in the conference and the workshop. 4 invited talks, 18 oral contributions and 46 posters were presented at the conference. In the workshop we had one keynote lecture and 9 invited talks presented by international experts in the fields of theoretical and computational physics, together with 14 oral and 33 poster contributions. The proceedings were edited by Nguyen Tri Lan, Trinh Xuan Hoang, and Nguyen Ai Viet. We would like to thank all invited speakers, participants and sponsors for making the conference and the workshop successful. Nguyen Ai Viet Chair of NCTP-38 and IWTCP-1

  19. An HCI Approach to Computing in the Real World

    Science.gov (United States)

    Yardi, Sarita; Krolikowski, Pamela; Marshall, Taneshia; Bruckman, Amy

    2008-01-01

    We describe the implementation of a six-week course to teach Human-Computer Interaction (HCI) to high school students. Our goal was to explore the potential of HCI in motivating students to pursue future studies in related computing fields. Participants in our course learned to make connections between the types of technology they use in their…

  20. Gesture Recognition by Computer Vision : An Integral Approach

    NARCIS (Netherlands)

    Lichtenauer, J.F.

    2009-01-01

    The fundamental objective of this Ph.D. thesis is to gain more insight into what is involved in the practical application of a computer vision system, when the conditions of use cannot be controlled completely. The basic assumption is that research on isolated aspects of computer vision often leads

  1. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  2. Reading Emotion From Mouse Cursor Motions: Affective Computing Approach.

    Science.gov (United States)

    Yamauchi, Takashi; Xiao, Kunchen

    2017-11-13

    Affective computing research has advanced emotion recognition systems using facial expressions, voices, gaits, and physiological signals, yet these methods are often impractical. This study integrates mouse cursor motion analysis into affective computing and investigates the idea that movements of the computer cursor can provide information about the emotion of the computer user. We extracted 16-26 trajectory features during a choice-reaching task and examined the link between emotion and cursor motions. Positive or negative emotions were induced in participants by music, film clips, or emotional pictures, and participants indicated their emotions with questionnaires. Our 10-fold cross-validation analysis shows that statistical models formed from "known" participants (training data) could predict nearly 10%-20% of the variance of the positive affect and attentiveness ratings of "unknown" participants, suggesting that cursor movement patterns such as the area under the curve and direction changes help infer the emotions of computer users. © 2017 Cognitive Science Society, Inc.
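
    The evaluation protocol described here, cross-validated prediction of an affect rating from trajectory features, can be sketched as follows; the features and ratings are synthetic stand-ins, and Ridge regression is an assumed model choice rather than the paper's.

    ```python
    # Sketch of k-fold cross-validated prediction of an affect rating from
    # cursor-trajectory features (area under curve, direction changes, ...).
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    n = 200
    features = rng.normal(size=(n, 20))                    # synthetic feature matrix
    rating = features[:, 0] * 0.5 + rng.normal(0, 1.2, n)  # weakly predictable affect

    r2 = cross_val_score(Ridge(alpha=1.0), features, rating, cv=10, scoring="r2")
    print(f"mean 10-fold R^2: {r2.mean():.2f}")  # ~0.1-0.2, echoing 10-20% variance
    ```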

  3. Implementation and Analysis for APR1400 Soft Control System

    International Nuclear Information System (INIS)

    2015-01-01

    Due to the rapid advancement of digital technology, the definite technical advantages of digital control systems over analog control systems are accelerating the adoption of advanced distributed digital control systems in nuclear power plants. One of the major capabilities a digital control system enables is the Soft Control System. The design of the Soft Control System for the Man-Machine Interface System (MMIS) of the Advanced Power Reactor 1400 (APR1400) plant is based on fully digital technologies to enhance reliability, operability and maintainability. A computer-based compact workstation has been adopted in the APR1400 Main Control Room (MCR) to provide a convenient working environment. This paper introduces the approaches and methodologies of the Soft Control System for the Advanced Control Room (ACR). It also explains the major design features for operation and display of the Soft Control System and its implementation to meet regulatory requirements. (authors)

  4. Computer aided display of multiple soft tissue anatomical surfaces for simultaneous structural and area-dose appreciation in 3D-radiationtherapy planning. 115

    International Nuclear Information System (INIS)

    Moore, C.J.; Mott, D.J.; Wilkinson, J.M.

    1987-01-01

    For radiotherapy applications a 3D display that includes soft tissues is required but the presentation of all anatomical structures is often unnecessary and is potentially confusing. A tumour volume and a small number of critical organs, usually embedded within other soft tissue anatomy, are likely to be all that can be clearly displayed when presented in a 3D format. The inclusion of dose data (in the form of isodose lines or surfaces) adds to the complication of any 3D display. A solution to this problem is to incorporate the presentation of dose distribution into the technique used to provide the illusion of 3D. This illusion can be provided by either depth cueing or by the hypothetical illumination of spatially defined object surfaces. The dose distribution from irradiation fields or, in the case of brachytherapy from radioactive sources, can be regarded as a source of illumination for tumour and critical organs. The intensity of illumination at any point on a tissue surface represents the dose at that point. Such an approach also allows the variation of dose over a given surface (and by extension, over the corresponding volume) to be quantified using histogram techniques. This may be of value in analysing and comparing techniques in which vulnerable tissue surfaces are irradiated. The planning of intracavitary treatments for cervical cancer is one application which might benefit from the display approach described above. Here the variation of dose over the mucosal surfaces of the bladder and the rectum is of particular interest, since dose related morbidity has often been reported following these treatments. 7 refs.; 8 figs
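
    The dose-surface histogram idea mentioned above generalizes readily to code: given a dose value and an area per surface element, report the fraction of the organ surface receiving at least each dose level. The sketch below uses synthetic values, not patient data.

    ```python
    # Sketch of a dose-surface histogram: fraction of organ surface receiving
    # at least each dose level. Dose and area values are synthetic.
    import numpy as np

    rng = np.random.default_rng(3)
    dose = rng.uniform(20, 70, 5000)    # Gy per surface element (synthetic)
    area = rng.uniform(0.5, 1.5, 5000)  # mm^2 per element (synthetic)

    total = area.sum()
    for d in np.arange(20, 71, 10):
        frac = area[dose >= d].sum() / total
        print(f">= {d:2d} Gy: {100 * frac:5.1f}% of surface")
    ```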

  5. Computer Tutors: An Innovative Approach to Computer Literacy. Part I: The Early Stages.

    Science.gov (United States)

    Targ, Joan

    1981-01-01

    In Part I of this two-part article, the author describes the evolution of the Computer Tutor project in Palo Alto, California, and the strategies she incorporated into a successful student-taught computer literacy program. Journal availability: Educational Computer, P.O. Box 535, Cupertino, CA 95015. (Editor/SJL)

  6. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify of the formation technique of representation of modeling methodology at computer science lessons. The necessity of studying computer modeling is that the current trends of strengthening of general education and worldview functions of computer science define the necessity of additional research of the…

  7. Food Sustainable Model Development: An ANP Approach to Prioritize Sustainable Factors in the Romanian Natural Soft Drinks Industry Context

    Directory of Open Access Journals (Sweden)

    Răzvan Cătalin Dobrea

    2015-07-01

    Full Text Available The latest developments in natural soft drinks on the Romanian market signal significant changes in consumers' perceptions of the sustainability concept. While the necessity of preserving natural resources and ensuring a decent level of healthiness seems to be steadily embraced by Romanian society, the lack of sufficiently long time series to acknowledge this shift renders a traditional econometric validation of these recent trends in economic thinking impossible. The large number of European-funded projects for upgrading technology in the Romanian natural soft drinks sector raises the question whether the learning-by-doing effect has dispersed into Romanian managers' investment decision making from the perspective of both economic and food sustainability. This paper presents the construction and evaluation of an Analytic Network Process (ANP) market share model, which emerged from extended in-depth interviews with 10 managers of the main Romanian natural soft drinks producers. This model differs from traditional market share ANP models in that concepts like food and economic sustainability were considered significant driving factors. The coincidence between the estimated market share and the actual one, expressed by Saaty's compatibility index, validates the model and offers comparative numerical weights of importance for food and economic sustainability.
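
    The eigenvector computation at the heart of ANP/AHP priority derivation is easy to sketch; the pairwise-comparison judgments below are invented, and the random index 0.58 applies to 3x3 matrices.

    ```python
    # Sketch of the priority-derivation step underlying ANP models: weights as
    # the principal eigenvector of a pairwise-comparison matrix, plus Saaty's
    # consistency ratio. The judgments are hypothetical.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])      # invented pairwise judgments

    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)             # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                         # priority weights

    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)    # consistency index
    cr = ci / 0.58                       # random index for n = 3
    print("weights:", w.round(3), f"CR = {cr:.3f}")   # CR < 0.1 is acceptable
    ```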

  8. What Computational Approaches Should be Taught for Physics?

    Science.gov (United States)

    Landau, Rubin

    2005-03-01

    The standard Computational Physics courses are designed for upper-level physics majors who already have some computational skills. We believe that it is important for first-year physics students to learn modern computing techniques that will be useful throughout their college careers, even before they have learned the math and science required for Computational Physics. Teaching such Introductory Scientific Computing courses requires choices as to which subjects and computer languages will be taught. Our survey of colleagues active in Computational Physics and Physics Education shows no predominant choice, with strong positions taken for the compiled languages Java, C, C++ and Fortran90, as well as for problem-solving environments like Maple and Mathematica. Over the last seven years we have developed an Introductory course and have written those courses up as textbooks for others to use. We will describe our model of using both a problem-solving environment and a compiled language. The developed materials are available in both Maple and Mathematica, and in Java and Fortran90 (Princeton University Press, to be published; www.physics.orst.edu/~rubin/IntroBook/).

  9. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced methods

  10. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  11. Soft Robotics.

    Science.gov (United States)

    Whitesides, George M

    2018-04-09

    This description of "soft robotics" is not intended to be a conventional review, in the sense of a comprehensive technical summary of a developing field. Rather, its objective is to describe soft robotics as a new field, one that offers opportunities to chemists and materials scientists who like to make "things" and to work with macroscopic objects that move and exert force. It will give one (personal) view of what soft actuators and robots are, and how this class of soft devices fits into the more highly developed field of conventional "hard" robotics. It will also suggest how and why soft robotics is more than simply a minor technical "tweak" on hard robotics, and propose a unique role for chemistry and materials science in this field. Soft robotics is, at its core, intellectually and technologically different from hard robotics, both because it has different objectives and uses and because it relies on the properties of materials to assume many of the roles played by sensors, actuators, and controllers in hard robotics. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Impact of substituting added sugar in carbonated soft drinks by intense sweeteners in young adults in the Netherlands: example of a benefit-risk approach.

    Science.gov (United States)

    Hendriksen, Marieke A; Tijhuis, Mariken J; Fransen, Heidi P; Verhagen, Hans; Hoekstra, Jeljer

    2011-02-01

    Substituting added sugar in carbonated soft drinks with intense sweeteners may have potential beneficial, but also adverse health effects. This study assessed the benefits and risks associated with substituting added sugar in carbonated soft drinks with intense sweeteners in young adults in the Netherlands. A tiered approach was used analogous to the risk assessment paradigm, consisting of benefit and hazard identification, exposure assessment and finally benefit and risk characterization and comparison. Two extreme scenarios were compared in which all carbonated soft drinks were sweetened with either intense sweeteners or added sugar. National food consumption survey data were used, and intake of added sugar and intense sweeteners was calculated using the food composition table or analytical data for sweetener content. Reduction in dental caries and body weight were identified as benefits of substituting sugar. The mean difference in total energy intake between the scenarios was 542 kJ per day in men and 357 kJ per day in women, under the assumption that no compensation takes place. In the 100% sweetener scenario, the average BMI decreased 1.7 kg/m(2) in men and 1.3 kg/m(2) in women when compared to the 100% sugar scenario. Risks are negligible, as the intake of intense sweeteners remains below the ADI in the substitution scenario. Substitution of added sugar by intense sweeteners in carbonated soft drinks has beneficial effects on BMI and the reduction in dental caries, and does not seem to have adverse health effects in young adults, given the available knowledge and assumptions made.

  13. Computational approaches to the development of perceptual expertise.

    Science.gov (United States)

    Palmeri, Thomas J; Wong, Alan C-N; Gauthier, Isabel

    2004-08-01

    Dog experts, ornithologists, radiologists and other specialists are noted for their remarkable abilities at categorizing, identifying and recognizing objects within their domain of expertise. A complete understanding of the development of perceptual expertise requires a combination of thorough empirical research and carefully articulated computational theories that formalize specific hypotheses about the acquisition of expertise. A comprehensive computational theory of the development of perceptual expertise remains elusive, but we can look to existing computational models from the object-recognition, perceptual-categorization, automaticity and related literatures for possible starting points. Arguably, hypotheses about the development of perceptual expertise should first be explored within the context of existing computational models of visual object understanding before considering the creation of highly modularized adaptations for particular domains of perceptual expertise.

  14. Gender and Computing in Further Education: A Life Story Approach.

    Science.gov (United States)

    Cox, Anne; And Others

    1994-01-01

    Describes biographical/life story methods used to explore the underrepresentation of women in computing courses in the United Kingdom. Explains how interviews were structured and conducted and illustrates the method of transcription used. (SK)

  15. CART - a Scandinavian approach to computer aided radiation therapy

    International Nuclear Information System (INIS)

    Walstam, R.

    1987-01-01

    The CART project, a program of computer-aided radiation therapy developed as a joint venture by Scandinavian countries is described. The history is outlined and the individual areas of the CART program are listed. (L.O.). 3 refs

  16. Computational Approaches for Probing the Formation of Atmospheric Molecular Clusters

    DEFF Research Database (Denmark)

    Elm, Jonas

    This thesis presents the investigation of atmospheric molecular clusters using computational methods. Previous investigations have focused on solving problems related to atmospheric nucleation, and have not been targeted at the performance of the applied methods. This thesis focuses on assessing...... the performance of computational strategies in order to identify a sturdy methodology, which should be applicable for handling various issues related to atmospheric cluster formation. Density functional theory (DFT) is applied to study individual cluster formation steps. Utilizing large test sets of numerous...

  17. AN ETHICAL ASSESSMENT OF COMPUTER ETHICS USING SCENARIO APPROACH

    OpenAIRE

    Maslin Masrom; Zuraini Ismail; Ramlah Hussein

    2010-01-01

    Ethics refers to a set of rules that define right and wrong behavior, used for moral decision making. Computer ethics is accordingly one of the major issues in information technology (IT) and information systems (IS). The ethical behaviour of IT students and professionals needs to be studied in an attempt to reduce unethical practices such as software piracy, hacking, and software intellectual property violations. This paper attempts to address computer-related scenarios that can be used...

  18. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science treats conference publications as first-class research outputs. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or of a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and ...

  19. An Object-Oriented Approach to Writing Computational Electromagnetics Codes

    Science.gov (United States)

    Zimmerman, Martin; Mallasch, Paul G.

    1996-01-01

    Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.

  20. New Approaches to Quantum Computing using Nuclear Magnetic Resonance Spectroscopy

    International Nuclear Information System (INIS)

    Colvin, M; Krishnan, V V

    2003-01-01

    The power of a quantum computer (QC) relies on the fundamental concept of superposition in quantum mechanics, which allows an inherent large-scale parallelization of computation. In a QC, binary information embodied in a quantum system, such as the spin degrees of freedom of a spin-1/2 particle, forms the qubits (quantum mechanical bits), over which appropriate logical gates perform the computation. In classical computers, the basic unit of information is the bit, which can take a value of either 0 or 1. Bits are connected together by logic gates to form logic circuits that implement complex logical operations. The expansion of modern computers has been driven by the development of faster, smaller and cheaper logic gates. As the size of logic gates shrinks toward atomic dimensions, the performance of such a system is no longer classical but is instead governed by quantum mechanics. Quantum computers offer the potentially superior prospect of solving computational problems that are intractable for classical computers, such as efficient database searches and cryptography. A variety of algorithms have been developed recently, most notably Shor's algorithm for factorizing long numbers into prime factors in polynomial time and Grover's quantum search algorithm. These algorithms were of only theoretical interest until several methods were proposed for building an experimental QC. These methods include trapped ions, cavity QED, coupled quantum dots, Josephson junctions, spin resonance transistors, linear optics and nuclear magnetic resonance. Nuclear magnetic resonance (NMR) is uniquely capable of constructing small QCs, and several algorithms have been implemented successfully. NMR-QC differs from other implementations in one important way: it is not a single QC, but a statistical ensemble of them. Thus, quantum computing based on NMR is considered ensemble quantum computing. In NMR quantum computing, the spins with
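
    As a concrete taste of the algorithms mentioned, the sketch below simulates one Grover iteration on a two-qubit statevector, where a single iteration already finds the marked item among four. This is a plain numpy simulation, not an NMR implementation.

    ```python
    # Statevector sketch of Grover's search on 2 qubits: one iteration finds a
    # single marked item among 4 with probability 1.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    H2 = np.kron(H, H)

    marked = 2                                 # hypothetical target state |10>
    oracle = np.eye(4)
    oracle[marked, marked] = -1                # phase-flip the marked state

    s = H2 @ np.array([1, 0, 0, 0], float)     # uniform superposition
    diffusion = 2 * np.outer(s, s) - np.eye(4) # inversion about the mean

    state = diffusion @ (oracle @ s)           # one Grover iteration
    print("measurement probabilities:", np.round(state ** 2, 3))  # peak at index 2
    ```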

  1. Soft tissue tumors - imaging methods

    International Nuclear Information System (INIS)

    Arlart, I.P.

    1985-01-01

    Imaging methods play an important diagnostic role in soft tissue tumors, in the preoperative evaluation of localization, size, topographic relationship, dignity, and metastatic disease. The present paper gives an overview of the diagnostic methods available today, such as ultrasound, thermography, roentgenographic plain films and xeroradiography, radionuclide methods, computed tomography, lymphography, angiography, and magnetic resonance imaging. Besides sonography, computed tomography in particular has the most important diagnostic value in soft tissue tumors. The significance of a recently developed method, magnetic resonance imaging, cannot yet be assessed. (orig.) [de

  2. Computationally efficient approach to three-dimensional point cloud reconstruction from video image sequences

    Science.gov (United States)

    Chang, Chih-Hsiang; Kehtarnavaz, Nasser

    2014-05-01

    This paper presents a computationally efficient solution to three-dimensional point cloud reconstruction from video image sequences that are captured by a hand-held camera. Our solution starts with a frame selection step to remove frames that cause physically nonrealizable reconstruction outcomes. Then, a computationally efficient approach for obtaining the absolute camera pose is introduced based on pairwise relative camera poses. This is followed by a computationally efficient rotation registration to update the absolute camera pose. The reconstruction results obtained based on actual video sequences indicate lower computation times and lower reprojection errors of the introduced approach compared to the conventional approach.
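
    The accumulation of absolute camera poses from pairwise relative poses can be sketched as simple rotation/translation composition; the convention below (R, t mapping the previous frame's coordinates into the current frame's) and the toy motions are assumptions for illustration, not the paper's formulation.

    ```python
    # Sketch: chain pairwise relative camera poses into absolute poses.
    import numpy as np

    def compose(abs_pose, rel_pose):
        # x_i = R_rel x_{i-1} + t_rel, with x_{i-1} = R_abs x_0 + t_abs
        R_abs, t_abs = abs_pose
        R_rel, t_rel = rel_pose
        return R_rel @ R_abs, R_rel @ t_abs + t_rel

    def yaw(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

    pose = (np.eye(3), np.zeros(3))                        # first frame is the origin
    relatives = [(yaw(0.05), np.array([0.1, 0, 0]))] * 10  # toy relative motions

    for rel in relatives:
        pose = compose(pose, rel)
    print("absolute translation after 10 frames:", pose[1].round(3))
    ```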

  3. A simulation model for visitors’ thermal comfort at urban public squares using non-probabilistic binary-linear classifier through soft-computing methodologies

    International Nuclear Information System (INIS)

    Kariminia, Shahab; Shamshirband, Shahaboddin; Hashim, Roslan; Saberi, Ahmadreza; Petković, Dalibor; Roy, Chandrabhushan; Motamedi, Shervin

    2016-01-01

    Outdoor life in cities is declining because of recent rapid urbanisation undertaken without climate-responsive urban design concepts. Such inadvertent climatic modifications have imposed considerable demand on urban energy resources at the indoor level. It is important to provide a comfortable ambient climate at open urban squares, and researchers need to predict the comfortable conditions at such outdoor squares. The main objective of this study is to predict visitors' outdoor comfort indices using a computational model termed SVM-WAVELET (Support Vector Machines combined with the Discrete Wavelet Transform algorithm). For data collection, a field study was conducted in downtown Isfahan, Iran (51°41′ E, 32°37′ N), which has hot and arid summers. Four separate locations with different environmental elements were monitored across two public squares. Meteorological data were measured while simultaneously surveying the visitors' thermal sensations. From the subjects' thermal feeling and their characteristics, their level of comfort was estimated. The adapted computational model was then used to estimate the visitors' thermal sensations in terms of thermal comfort indices. The SVM-WAVELET results indicate that the R2 values for the input parameters Thermal Sensation, PMV (predicted mean vote), PET (physiologically equivalent temperature), SET (standard effective temperature) and Tmrt were estimated at 0.482, 0.943, 0.988, 0.969 and 0.840, respectively. - Highlights: • To explore the visitors' thermal sensation at urban public squares. • This article introduces findings of outdoor comfort prediction. • The developed SVM-WAVELET soft-computing technique was used. • SVM-WAVELET estimation results are more reliable and accurate.
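
    The SVM-WAVELET coupling can be sketched as a two-stage pipeline: expand each input into discrete wavelet coefficients, then regress with an SVM. The data, the 'db4' wavelet and the decomposition level below are arbitrary stand-ins, not the study's choices.

    ```python
    # Sketch of a wavelet + SVM regression pipeline on synthetic data.
    import numpy as np
    import pywt
    from sklearn.svm import SVR

    rng = np.random.default_rng(4)
    signals = rng.normal(size=(100, 64))            # e.g., meteorology windows
    target = signals[:, :8].mean(axis=1) + rng.normal(0, 0.1, 100)  # toy index

    def wavelet_features(x):
        coeffs = pywt.wavedec(x, "db4", level=2)    # [approx, detail2, detail1]
        return np.concatenate(coeffs)

    X = np.array([wavelet_features(s) for s in signals])
    model = SVR(C=10.0).fit(X[:80], target[:80])
    print("test R^2:", round(model.score(X[80:], target[80:]), 3))
    ```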

  4. Soft Interfaces

    International Nuclear Information System (INIS)

    Strzalkowski, Ireneusz

    1997-01-01

    This book presents an extended form of the 1994 Dirac Memorial Lecture delivered by Pierre-Gilles de Gennes at Cambridge University. The main task of the presentation is to show the beauty and richness of the structural forms and phenomena observed at soft interfaces between two media, which are much more complex than the forms and phenomena existing in each phase separately. Problems are discussed using both traditional, classical techniques, such as the contact angle in static and dynamic partial wetting, and the latest research methodology, like 'environmental' scanning electron microscopes. The book is not a systematic lecture on these phenomena, but it can be considered a compact set of essays on topics which particularly fascinate the author. The continuum theory widely used in the book is based on a deep molecular approach. The author is particularly interested in a broad-minded rheology of liquid systems at interfaces, with specific emphasis on polymer melts; to study this, the author has developed a special methodology called anemometry near walls. The second main topic presented in the book is the problem of adhesion: molecular processes, energy transformations and electrostatic interaction are included in an interesting discussion of its many aspects. The third topic concerns welding between two polymer surfaces, such as A/A and A/B interfaces. Of great worth is the presentation of various unsolved, open problems. The choice of topics and the brevity of description indicate that this book is intended for a well-prepared reader. However, for any reader it will present an interesting picture of how many mysterious processes act in the surrounding world and how these phenomena are perceived by a Nobel Laureate who won the prize mainly for his investigations in this field. (book review)

  5. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    OpenAIRE

    J. Bhardwaj; K. K. Gupta; R. Gupta

    2018-01-01

    New concepts and techniques are replacing traditional methods of water quality measurement. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrated with five different water quality parameter sensor no...

  6. Autonomous undulatory serpentine locomotion utilizing body dynamics of a fluidic soft robot.

    Science.gov (United States)

    Onal, Cagdas D; Rus, Daniela

    2013-06-01

    Soft robotics offers the unique promise of creating inherently safe and adaptive systems. These systems bring man-made machines closer to the natural capabilities of biological systems. An important requirement to enable self-contained soft mobile robots is an on-board power source. In this paper, we present an approach to create a bio-inspired soft robotic snake that can undulate in a similar way to its biological counterpart using pressure for actuation power, without human intervention. With this approach, we develop an autonomous soft snake robot with on-board actuation, power, computation and control capabilities. The robot consists of four bidirectional fluidic elastomer actuators in series to create a traveling curvature wave from head to tail along its body. Passive wheels between segments generate the necessary frictional anisotropy for forward locomotion. It takes 14 h to build the soft robotic snake, which can attain an average locomotion speed of 19 mm s⁻¹.

  7. Autonomous undulatory serpentine locomotion utilizing body dynamics of a fluidic soft robot

    International Nuclear Information System (INIS)

    Onal, Cagdas D; Rus, Daniela

    2013-01-01

    Soft robotics offers the unique promise of creating inherently safe and adaptive systems. These systems bring man-made machines closer to the natural capabilities of biological systems. An important requirement to enable self-contained soft mobile robots is an on-board power source. In this paper, we present an approach to create a bio-inspired soft robotic snake that can undulate in a similar way to its biological counterpart using pressure for actuation power, without human intervention. With this approach, we develop an autonomous soft snake robot with on-board actuation, power, computation and control capabilities. The robot consists of four bidirectional fluidic elastomer actuators in series to create a traveling curvature wave from head to tail along its body. Passive wheels between segments generate the necessary frictional anisotropy for forward locomotion. It takes 14 h to build the soft robotic snake, which can attain an average locomotion speed of 19 mm s⁻¹. (paper)
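
    The traveling curvature wave both records describe can be written down directly: each of the four segments tracks a phase-shifted sinusoid. The amplitude, frequency and phase lag below are illustrative, not the robot's actual parameters.

    ```python
    # Sketch of the traveling curvature wave driving serpentine locomotion:
    # each segment tracks a phase-shifted sinusoidal curvature command.
    import numpy as np

    A, f, segments = 0.8, 0.5, 4            # rad, Hz, number of actuators (assumed)
    phase_lag = 2 * np.pi / segments        # head-to-tail phase shift

    def segment_curvatures(t):
        i = np.arange(segments)
        return A * np.sin(2 * np.pi * f * t - i * phase_lag)

    for t in np.arange(0.0, 1.01, 0.25):
        print(f"t={t:4.2f}s curvature commands:", segment_curvatures(t).round(2))
    ```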

  8. Demand side management scheme in smart grid with cloud computing approach using stochastic dynamic programming

    Directory of Open Access Journals (Sweden)

    S. Sofana Reka

    2016-09-01

    Full Text Available This paper proposes a cloud computing framework in a smart grid environment, creating a small integrated energy hub that supports real-time computing for handling large volumes of data. A stochastic programming model is developed within the cloud computing scheme for effective demand side management (DSM) in the smart grid. Simulation results are obtained using a GUI interface and the Gurobi optimizer in Matlab, in order to reduce electricity demand by creating energy networks in a smart hub approach.
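
    A toy version of the stochastic-dynamic-programming idea is sketched below: a battery is scheduled against random prices to serve a fixed demand at minimum expected cost. The state space, prices and probabilities are invented for illustration and are unrelated to the paper's model.

    ```python
    # Toy stochastic DP for demand-side management: backward induction over
    # battery state of charge, with the price observed before each decision.
    import numpy as np

    levels = np.arange(0, 5)            # battery state of charge, 0..4 units
    prices = np.array([1.0, 3.0])       # low/high price scenarios
    prob = np.array([0.5, 0.5])
    demand, horizon = 1, 6              # 1 unit must be served each step

    V = np.zeros(levels.size)           # terminal value
    for _ in range(horizon):
        V_new = np.zeros(levels.size)
        for s in levels:
            expected = 0.0
            for p, pr in zip(prices, prob):
                best = np.inf
                for buy in range(0, 3):          # grid purchase per step
                    s_next = s + buy - demand
                    if 0 <= s_next <= levels[-1]:
                        best = min(best, p * buy + V[s_next])
                expected += pr * best
            V_new[s] = expected
        V = V_new
    print("expected cost-to-go by initial SoC:", V.round(2))
    ```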

  9. miRNA and cancer; computational and experimental approaches.

    Science.gov (United States)

    Tutar, Yusuf

    2014-01-01

    Human genome sequencing was started to decode the four-letter alphabet of the genome and so understand the complex nature of human metabolism. However, after completion of the Human Genome Project, many scientists realized that sequence information alone was not sufficient to resolve the biochemical mechanisms of the organism through classical approaches. Non-coding parts of the genome produce small conserved ribonucleic acids, miRNAs, to control cellular and physiological processes [1, 2]. This breakthrough discovery directed researchers to examine the role of miRNA in cancer, since miRNAs are involved in development, cell differentiation, and regulation of the cell cycle [3]. The first paper of the special issue provides general information on miRNA in cancer research. This thematic issue presents two computational approaches for miRNA identification and their role in cancer. The first comes from Dr. Wang, whose work predicts cancer-related miRNAs by using expression profiles of tumor tissues. The work relies on the R-squared method to investigate miRNA-mRNA regulatory relationships between miRNAs and mRNAs from different tissues, and predicts miRNAs associated with colon, prostate, pancreatic, lung, breast, bladder, and kidney cancer. The second paper, by Allmer et al., examines miRNA-gene regulatory networks and their implications in cancer; their work maps the complex network of expression regulation and miRNAs' role in personalized medicine. miRNAs regulate tumor progression and metastasis by interacting with target genes in the cells. Exosomal shuttle small RNAs mediate cell-to-cell communication and regulate cancer metastasis. The regulation via heterotypic signals in the microenvironment was explained by the groups of Dr. Liang and Dr. Yu. The rest of the issue highlights the roles of miRNAs in multiple myeloma, non-small cell lung cancer, urological malignancies, myeloid leukemia, and laryngeal squamous cell carcinoma. Proliferation of bone marrow of malignant plasma cells

  10. Biobetters From an Integrated Computational/Experimental Approach

    Directory of Open Access Journals (Sweden)

    Serdar Kuyucak

    2017-01-01

    Full Text Available Biobetters are new drugs designed from existing peptide- or protein-based therapeutics by improving their properties, such as affinity and selectivity for the target epitope and stability against degradation. Computational methods can play a key role in such design problems: by predicting the changes that are most likely to succeed, they can drastically reduce the number of experiments to be performed. Here we discuss the computational and experimental methods commonly used in drug design problems, focusing on the inverse relationship between the two, namely, that more accurate computational predictions mean less experimental effort is needed for testing. Examples discussed include efforts to design selective analogs from toxin peptides targeting ion channels for the treatment of autoimmune diseases, and monoclonal antibodies, which are the fastest growing class of therapeutic agents, particularly for cancers and autoimmune diseases.

  11. Computer aided approach for qualitative risk assessment of engineered systems

    International Nuclear Information System (INIS)

    Crowley, W.K.; Arendt, J.S.; Fussell, J.B.; Rooney, J.J.; Wagner, D.P.

    1978-01-01

    This paper outlines a computer aided methodology for determining the relative contributions of various subsystems and components to the total risk associated with an engineered system. Major contributors to overall task risk are identified through comparison of an expected frequency density function with an established risk criterion. Contributions that are inconsistently high are also identified. The results from this analysis are useful for directing efforts for improving system safety and performance. An analysis of uranium hexafluoride handling risk at a gaseous diffusion uranium enrichment plant using a preliminary version of the computer program EXCON is briefly described and illustrated

  12. Environmental sciences and computations: a modular data based systems approach

    International Nuclear Information System (INIS)

    Crawford, T.V.; Bailey, C.E.

    1975-07-01

    A major computer code for environmental calculations is under development at the Savannah River Laboratory. The primary aim is to develop a flexible, efficient capability to calculate, for all significant pathways, the dose to man resulting from releases of radionuclides from the Savannah River Plant and from other existing and potential radioactive sources in the southeastern United States. The environmental sciences programs at SRP are described, with emphasis on the development of the calculational system. It is being developed as a modular data-based system within the framework of the larger JOSHUA Computer System, which provides data management, terminal, and job execution facilities. (U.S.)

  13. Numerical Methods for Stochastic Computations A Spectral Method Approach

    CERN Document Server

    Xiu, Dongbin

    2010-01-01

    The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods are an extension of the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering. The book introduces polynomial approximation theory and probability theory, and describes the basic theory of gPC methods
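
    A small taste of the machinery: for a Gaussian input, expectations under gPC reduce to Gauss-Hermite quadrature, which converges far faster than Monte Carlo for smooth observables. The observable below is an arbitrary example, not one from the book.

    ```python
    # Sketch: estimate E[f(X)] for X ~ N(0,1) with Gauss-Hermite quadrature,
    # the quadrature rule underlying gPC for Gaussian inputs.
    import numpy as np

    nodes, weights = np.polynomial.hermite.hermgauss(10)  # weight exp(-x^2)
    x = np.sqrt(2.0) * nodes                              # rescale to N(0,1)
    w = weights / np.sqrt(np.pi)

    f = lambda x: np.exp(0.5 * x)                         # any smooth observable
    print("quadrature:", np.sum(w * f(x)).round(6))       # exact value is exp(1/8)
    print("exact:     ", np.exp(0.125).round(6))
    ```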

  14. Formal Analysis of Soft Errors using Theorem Proving

    Directory of Open Access Journals (Sweden)

    Sofiène Tahar

    2013-07-01

    Full Text Available Modeling and analysis of soft errors in electronic circuits has traditionally been done using computer simulations. Computer simulations cannot guarantee the correctness of the analysis because they utilize approximate real-number representations and pseudo-random numbers, and thus are not well suited for analyzing safety-critical applications. In this paper, we present a higher-order-logic theorem proving based method for modeling and analysis of soft errors in electronic circuits. Our developed infrastructure includes formalized continuous random variable pairs, their Cumulative Distribution Function (CDF) properties, and independent standard uniform and Gaussian random variables. We illustrate the usefulness of our approach by modeling and analyzing soft errors in commonly used dynamic random access memory sense amplifier circuits.

  15. Thermodynamic and relative approach to compute glass-forming ...

    Indian Academy of Sciences (India)

    This study deals with the evaluation of the glass-forming ability (GFA) of oxides and is a critical reading of the Sun and Rawson thermodynamic approach to quantifying this aptitude. Both approaches are adequate but ambiguous regarding the behaviour of some oxides (tendency to amorphization or crystallization). Indeed ...

  16. National Computing Studies Summit: Open Learning Approaches to Computing Studies--An ACCE Discussion Paper

    Science.gov (United States)

    Webb, Ian

    2008-01-01

    In 2005 the Australian Council for Computers in Education (ACCE) was successful in obtaining a grant from the National Centre of Science, Information and Communication Technology and Mathematics Education for Rural and Regional Australia (SiMERR) to undertake the Computing Studies Teachers Network Rural and Regional Focus Project. The project had five…

  17. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie; Baca, Michael J.; Partridge, L. Donald; Finnegan, Patrick Sean; Wolfley, Steven L.; Dagel, Daryl James; Spahn, Olga Blum; Harper, Jason C.; Pohl, Kenneth Roy; Mickel, Patrick R.; Lohn, Andrew; Marinella, Matthew

    2013-10-01

    The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing > 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.
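
    The quoted 10^12 figure can be checked by back-of-envelope arithmetic from the numbers given in the abstract:

    ```python
    # Back-of-envelope check of the quoted 10^12 efficiency gap, in
    # ops/s per watt per cm^3, using only the figures from the abstract.
    brain = 1e16 / (20 * 1200)           # 10^16 ops/s, 20 W, 1200 cm^3
    super_ = 1e15 / (3e6 * 1500 * 1e6)   # 10^15 ops/s, 3 MW, 1500 m^3 -> cm^3
    print(f"brain/supercomputer: {brain / super_:.1e}")   # ~2e12
    ```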

  18. A New Approach to Concrete Mix Design Using Computer Techniques

    African Journals Online (AJOL)


  19. Computational approaches for efficiently modelling of small atmospheric clusters

    DEFF Research Database (Denmark)

    Elm, Jonas; Mikkelsen, Kurt Valentin

    2014-01-01

    Reducing the basis set used in the geometry and frequency calculation from 6-311++G(3df,3pd) to 6-31++G(d,p) implies a significant speed-up in computational time and only leads to small errors in the thermal contribution to the Gibbs free energy and the subsequent coupled cluster single point energy calculation.

  20. THE NEW APPROACH TO MEASURING COMPUTER LITERACY AT THE UWB

    Directory of Open Access Journals (Sweden)

    Tomáš Pribáň

    2011-06-01

    Full Text Available Nowadays computer literacy is one of the qualifications demanded of anyone who wants to succeed in today's labor market. Unfortunately, the so-called necessary minimum of knowledge is not exactly defined. In the Czech Republic computer literacy is an important part of basic education, and our educators face the difficult issue of assessing the level of computer literacy among our students. This case study compares two tools used for testing computer literacy: the Original Testing System (OTS) and the Modified Testing System (MTS), which we investigated thoroughly by quantitative comparison. The comparison was based on data collected from a freshman-level Word and Excel course in the fall 2010 semester at the University of West Bohemia (UWB), analysing the performance of 138 students attending this course. We analyzed correlations between the scores from the two testing instruments and conducted a paired-sample t-test to compare students' performance between the examination scores. This study has limitations, which are discussed; based on them, recommendations and directions for future work are included.

  1. a new approach to concrete mix design using computer techniques

    African Journals Online (AJOL)

    Engr. Vincent okoloekwe

    aggregates. Mathematical modeling using the data obtained from literature, computational analysis and experimental investigations was done with the aid of the MATLAB and SPSS software packages [6]. THE COMPUTERIZED MIX DESIGN PROCESS. In recent times efforts have been directed towards computerizing the concrete

  2. A Knowledge-Intensive Approach to Computer Vision Systems

    NARCIS (Netherlands)

    Koenderink-Ketelaars, N.J.J.P.

    2010-01-01

    This thesis focusses on the modelling of knowledge-intensive computer vision tasks. Knowledge-intensive tasks are tasks that require a high level of expert knowledge to be performed successfully. Such tasks are generally performed by a task expert. Task experts have a lot of experience in performing

  3. New approach for ensuring cloud computing security: using data ...

    Indian Academy of Sciences (India)

    Cloud computing is one of the largest developments to have occurred in the field of information technology in recent years. This model has become more desirable for institutions, organizations and personal use alike, thanks to the storage of 'valuable information' at low cost and access to such information from anywhere in ...

  4. R for cloud computing an approach for data scientists

    CERN Document Server

    Ohri, A

    2014-01-01

    R for Cloud Computing looks at some of the tasks performed by business analysts on the desktop (PC era) and helps the user navigate the wealth of information in R and its 4000 packages, as well as transition the same analytics to the cloud. With this information the reader can select both cloud vendors within a sometimes confusing cloud ecosystem and the R packages that can help process analytical tasks with minimum effort and cost, and maximum usefulness and customization. The use of graphical user interfaces (GUIs) and step-by-step screenshot tutorials is emphasized in this book to lessen R's famous learning curve and some of the needless confusion created in cloud computing that hinders its widespread adoption. This will help you kick-start analytics on the cloud, including chapters on cloud computing, R, common tasks performed in analytics, scrutiny of big data analytics, and setting up and navigating cloud providers. Readers are exposed to a breadth of cloud computing ch...

  5. Individual Differences in Learning Computer Programming: A Social Cognitive Approach

    Science.gov (United States)

    Akar, Sacide Guzin Mazman; Altun, Arif

    2017-01-01

    The purpose of this study is to investigate and conceptualize the ranks of importance of social cognitive variables on university students' computer programming performances. Spatial ability, working memory, self-efficacy, gender, prior knowledge and the universities students attend were taken as variables to be analyzed. The study has been…

  6. New approach for virtual machines consolidation in heterogeneous computing systems

    Czech Academy of Sciences Publication Activity Database

    Fesl, Jan; Cehák, J.; Doležalová, Marie; Janeček, J.

    2016-01-01

    Roč. 9, č. 12 (2016), s. 321-332 ISSN 1738-9968 Institutional support: RVO:60077344 Keywords : consolidation * virtual machine * distributed Subject RIV: JD - Computer Applications, Robotics http://www.sersc.org/journals/IJHIT/vol9_no12_2016/29.pdf

  7. Preparing Students for Computer Aided Drafting (CAD). A Conceptual Approach.

    Science.gov (United States)

    Putnam, A. R.; Duelm, Brian

    This presentation outlines guidelines for developing and implementing an introductory course in computer-aided drafting (CAD) that is geared toward secondary-level students. The first section of the paper, which deals with content identification and selection, includes lists of mechanical drawing and CAD competencies and a list of rationales for…

  8. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    Science.gov (United States)

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  9. A "Service-Learning Approach" to Teaching Computer Graphics

    Science.gov (United States)

    Hutzel, Karen

    2007-01-01

    The author taught a computer graphics course through a service-learning framework to undergraduate and graduate students in the spring of 2003 at Florida State University (FSU). The students in this course participated in learning a software program along with youths from a neighboring, low-income, primarily African-American community. Together,…

  10. IP Addressing: Problem-Based Learning Approach on Computer Networks

    Science.gov (United States)

    Jevremovic, Aleksandar; Shimic, Goran; Veinovic, Mladen; Ristic, Nenad

    2017-01-01

    The case study presented in this paper describes the pedagogical aspects and experience gathered while using an e-learning tool named IPA-PBL. Its main purpose is to provide additional motivation for adopting theoretical principles and procedures in a computer networks course. In the proposed model, the sequencing of activities of the learning…

  11. Soft Clouding

    DEFF Research Database (Denmark)

    Søndergaard, Morten; Markussen, Thomas; Wetton, Barnabas

    2012-01-01

    Soft Clouding is a blended concept, which describes the aim of a collaborative and transdisciplinary project. The concept is a metaphor implying a blend of cognitive, embodied interaction and the semantic web; furthermore, it describes our attempt at curating a new semantics of sound archiving. The Soft Clouding Project is part of LARM - a major infrastructure combining research in and access to sound and radio archives in Denmark. In 2012 the LARM infrastructure will consist of more than 1 million hours of radio, combined with metadata that describe the content. The idea is to analyse the concepts of ‘infrastructure’ and ‘interface’ through a creative play with the fundamentals of LARM (and any sound archive situation combining many kinds and layers of data and sources). This paper will present and discuss the Soft Clouding Project from the perspective of three practices and competencies...

  12. A new approach for preventing charging up of soft material samples by coating with conducting polymers in SIMS analysis

    International Nuclear Information System (INIS)

    Mise, Takaya; Ishikawa, Makishi; Nishimoto, Kensaku; Meguro, Takashi

    2008-01-01

    Dynamic secondary ion mass spectroscopy (SIMS) analysis of soft materials such as polymers or biomaterials is a challenging subject due to the charge-up effect brought about by the irradiation of a primary ion beam, which hampers the collection of secondary ions. Conventional measures against charging up are electron beam irradiation for charge compensation and surface coating with metal, normally gold. Those methods require a compromised analytical condition, reducing the primary ion beam current to suppress the extent of the charging, which degrades the performance of the SIMS analysis. We have proposed that a thicker conductive layer, capable of delocalizing the charge over the surface, should be put on a soft insulating sample to avoid charging up. The depth profile of a hair sample coated wholly with a polythiophene-based conducting polymer was successfully measured over a longer time without any charging up, even at the maximum current of the oxygen primary ion beam (O2+: 7.5 keV, 400 nA) and without an electron beam compensation system. Thus, the proposed method of coating with a conductive organic polymer to address the charging issue is expected to be a breakthrough in SIMS analysis.

  13. Fundamentals of soft robot locomotion.

    Science.gov (United States)

    Calisti, M; Picardi, G; Laschi, C

    2017-05-01

    Soft robotics and its related technologies enable robot abilities in several robotics domains including, but not exclusively related to, manipulation, manufacturing, human-robot interaction and locomotion. Although field applications have emerged for soft manipulation and human-robot interaction, mobile soft robots appear to remain at the research stage, involving the somewhat conflicting goals of having a deformable body and exerting forces on the environment to achieve locomotion. This paper aims to provide a reference guide for researchers approaching mobile soft robotics, to describe the underlying principles of soft robot locomotion with its pros and cons, and to envisage applications and further developments for mobile soft robotics. © 2017 The Author(s).

  14. Examination of tapered plastic multimode fiber-based sensor performance with silver coating for different concentrations of calcium hypochlorite by soft computing methodologies--a comparative study.

    Science.gov (United States)

    Zakaria, Rozalina; Sheng, Ong Yong; Wern, Kam; Shamshirband, Shahaboddin; Wahab, Ainuddin Wahid Abdul; Petković, Dalibor; Saboohi, Hadi

    2014-05-01

    A soft-computing methodology study has been applied to tapered plastic multimode sensors. This study used a tapered plastic multimode fiber [polymethyl methacrylate (PMMA)] optic as a sensor. The tapered PMMA fiber was fabricated using an etching method involving deionized water and acetone to achieve a waist diameter and length of 0.45 and 10 mm, respectively. In addition, a tapered PMMA probe coated with a silver film was fabricated and demonstrated using a calcium hypochlorite (G70) solution. The working mechanism of such a device is based on the observed increase in the transmission of the sensor when it is immersed in solutions at high concentrations. As the concentration was varied from 0 to 6 ppm, the output voltage of the sensor increased linearly. The silver film coating increased the sensitivity of the proposed sensor because the effective cladding refractive index increases with the coating and thus allows more light to be transmitted through the tapered fiber. In this study, the polynomial and radial basis function (RBF) were applied as the kernel functions of support vector regression (SVR) to estimate and predict the output voltage response of the sensors with and without the silver film according to experimental tests. Instead of minimizing the observed training error, SVR_poly and SVR_rbf were used in an attempt to minimize the generalization error bound so as to achieve generalized performance. An adaptive neuro-fuzzy inference system (ANFIS) approach was also investigated for comparison. The experimental results showed that improvements in predictive accuracy and capacity for generalization can be achieved by the SVR_poly approach in comparison with the SVR_rbf methodology. The same testing errors were found for the SVR_poly and ANFIS approaches.
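
    The SVR_poly/SVR_rbf comparison can be sketched with scikit-learn; the concentration-voltage pairs below are synthetic placeholders with the linear trend the abstract reports, not the paper's measurements:

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.metrics import mean_squared_error

    conc = np.linspace(0.0, 6.0, 25).reshape(-1, 1)   # ppm of calcium hypochlorite
    volt = 1.0 + 0.15 * conc.ravel() + np.random.default_rng(1).normal(0.0, 0.01, 25)

    svr_poly = SVR(kernel="poly", degree=2, C=10.0).fit(conc, volt)   # polynomial kernel
    svr_rbf = SVR(kernel="rbf", gamma="scale", C=10.0).fit(conc, volt)  # RBF kernel
    for name, model in [("SVR_poly", svr_poly), ("SVR_rbf", svr_rbf)]:
        print(name, "training MSE:", mean_squared_error(volt, model.predict(conc)))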

  15. Comparative evaluation of soft and hard tissue dimensions in the anterior maxilla using radiovisiography and cone beam computed tomography: A pilot study

    Directory of Open Access Journals (Sweden)

    Savita Mallikarjun

    2016-01-01

    Full Text Available Aims: To assess and compare the thickness of gingiva in the anterior maxilla using radiovisiography (RVG and cone beam computed tomography (CBCT and its correlation with the thickness of underlying alveolar bone. Settings and Design: This cross-sectional study included 10 male subjects in the age group of 20–45 years. Materials and Methods: After analyzing the width of keratinized gingiva of the maxillary right central incisor, the radiographic assessment was done using a modified technique for RVG and CBCT, to measure the thickness of both the labial gingiva and labial plate of alveolar bone at 4 predetermined locations along the length of the root in each case. Statistical Analysis Used: Statistical analysis was performed using Student's t-test and Pearson's correlation test, with the help of statistical software (SPSS V13. Results: No statistically significant differences were obtained in the measurement made using RVG and CBCT. The results of the present study also failed to reveal any significant correlation between the width of gingiva and the alveolar bone in the maxillary anterior region. Conclusions: Within the limitations of this study, it can be concluded that both CBCT and RVG can be used as valuable tools in the assessment of the soft and hard tissue dimensions.

  16. A Rigorous Approach to Relate Enterprise and Computational Viewpoints

    NARCIS (Netherlands)

    Dijkman, R.M.; Quartel, Dick; Ferreira Pires, Luis; van Sinderen, Marten J.

    Multiviewpoint approaches allow stakeholders to design a system from stakeholder-specific viewpoints. By this, a separation of concerns is achieved, which makes designs more manageable. However, to construct a consistent multiviewpoint design, the relations between viewpoints must be defined

  17. A Discrete Approach to Computer-Oriented Calculus.

    Science.gov (United States)

    Gordon, Sheldon P.

    1979-01-01

    Some of the implications and advantages of an instructional approach using results from the calculus of finite differences and finite sums, both for motivation and as tools leading to applications, are discussed. (MP)

  18. Computational morphology a computational geometric approach to the analysis of form

    CERN Document Server

    Toussaint, GT

    1988-01-01

    Computational Geometry is a new discipline of computer science that deals with the design and analysis of algorithms for solving geometric problems. There are many areas of study in different disciplines which, while being of a geometric nature, have as their main component the extraction of a description of the shape or form of the input data. This notion is more imprecise and subjective than pure geometry. Such fields include cluster analysis in statistics, computer vision and pattern recognition, and the measurement of form and form-change in such areas as stereology and developmental biolo

  19. Soft Neutrosophic Ring and Soft Neutrosophic Field

    Directory of Open Access Journals (Sweden)

    Mumtaz Ali

    2014-04-01

    Full Text Available In this paper we extend the theory of neutrosophic rings and neutrosophic fields to soft sets and construct soft neutrosophic rings and soft neutrosophic fields. We also extend neutrosophic ideal theory to form soft neutrosophic ideals over a neutrosophic ring and soft neutrosophic ideals of a soft neutrosophic ring. We give many examples to illustrate the theory of soft neutrosophic rings and soft neutrosophic fields and display many of their properties. At the end of this paper we give the soft neutrosophic ring homomorphism.

  20. Software approach to automatic patching of analog computer

    Science.gov (United States)

    1973-01-01

    The Automatic Patching Verification program (APV) is described which provides the hybrid computer programmer with a convenient method of performing a static check of the analog portion of his study. The static check insures that the program is patched as specified, and that the computing components being used are operating correctly. The APV language the programmer uses to specify his conditions and interconnections is similar to the FORTRAN language in syntax. The APV control program reads APV source program statements from an assigned input device. Each source program statement is processed immediately after it is read. A statement may select an analog console, set an analog mode, set a potentiometer or DAC, or read from the analog console and perform a test. Statements are read and processed sequentially. If an error condition is detected, an output occurs on an assigned output device. When an end statement is read, the test is terminated.

  1. Computational approach to quantum encoder design for purity optimization

    International Nuclear Information System (INIS)

    Yamamoto, Naoki; Fazel, Maryam

    2007-01-01

    In this paper, we address the problem of designing a quantum encoder that maximizes the minimum output purity of a given decohering channel, where the minimum is taken over all possible pure inputs. This problem is cast as a max-min optimization problem with a rank constraint on an appropriately defined matrix variable. The problem is computationally very hard because it is nonconvex with respect to both the objective function (output purity) and the rank constraint. Despite this difficulty, we provide a tractable computational algorithm that produces the exact optimal solution for a codespace of dimension 2. Moreover, this algorithm is easily extended to cover the general class of codespaces, in which case the solution is suboptimal in the sense that the suboptimized output purity serves as a lower bound on the exact optimal purity. The algorithm consists of a sequence of semidefinite programs and can be performed easily. Two typical quantum error channels are investigated to illustrate the effectiveness of our method.

  2. TOWARD HIGHLY SECURE AND AUTONOMIC COMPUTING SYSTEMS: A HIERARCHICAL APPROACH

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hsien-Hsin S

    2010-05-11

    The overall objective of this research project is to develop novel architectural techniques as well as system software to achieve a highly secure and intrusion-tolerant computing system. Such a system will be autonomous, self-adapting, introspective, with self-healing capability under the circumstances of improper operations, abnormal workloads, and malicious attacks. The scope of this research includes: (1) System-wide, unified introspection techniques for autonomic systems, (2) Secure information-flow microarchitecture, (3) Memory-centric security architecture, (4) Authentication control and its implications for security, (5) Digital rights management, (6) Microarchitectural denial-of-service attacks on shared resources. During the period of the project, we developed several architectural techniques and system software for achieving a robust, secure, and reliable computing system toward our goal.

  3. Designing intelligent computer-based simulations: a pragmatic approach

    Directory of Open Access Journals (Sweden)

    Bernard M. Garrett

    2001-12-01

    Full Text Available There has been great interest in the potential use of multimedia computer-based learning (CBL packages within higher education. The effectiveness of such systems, however, remains controversial. There are suggestions that such multimedia applications may hold no advantage over traditional formats (Barron and Atkins, 1994; Ellis, 1994; Laurillard, 1995; Simms, 1997; Leibowitz, 1999. One area where multimedia CBL may still prove its value is in the simulation of activities where experiential learning is expensive, undesirable or even dangerous.

  4. Analysis of diabetic retinopathy biomarker VEGF gene by computational approaches

    OpenAIRE

    Jayashree Sadasivam; N Ramesh; K Vijayalakshmi; Vinni Viridi; Shiva prasad

    2012-01-01

    Diabetic retinopathy, the most common diabetic eye disease, is caused by changes in the blood vessels of the retina and remains a major cause of vision loss. It is characterized by vascular permeability and increased tissue ischemia and angiogenesis. One of the biomarkers for diabetic retinopathy has been identified as the Vascular Endothelial Growth Factor (VEGF) gene by computational analysis. VEGF is a sub-family of growth factors, belonging to the platelet-derived growth factor family of cystine-knot growth factors...

  5. Assessing Trustworthiness in Social Media: A Social Computing Approach

    Science.gov (United States)

    2015-11-17

    by leveraging data mining and machine learning techniques to enable the computational understanding of distrust with social media data. The first task ... applications such as targeted advertisements or real-time monitoring of political opinions. Huge amounts of data generated by social media users present ... research on information provenance, explores exciting research opportunities to address pressing needs, and shows how data mining can enable a social media ...

  6. Modeling Cu²⁺-Aβ complexes from computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Alí-Torres, Jorge [Departamento de Química, Universidad Nacional de Colombia- Sede Bogotá, 111321 (Colombia); Mirats, Andrea; Maréchal, Jean-Didier; Rodríguez-Santiago, Luis; Sodupe, Mariona, E-mail: Mariona.Sodupe@uab.cat [Departament de Química, Universitat Autònoma de Barcelona, 08193 Bellaterra, Barcelona (Spain)

    2015-09-15

    Amyloid plaque formation and oxidative stress are two key events in the pathology of Alzheimer’s disease (AD), in which metal cations have been shown to play an important role. In particular, the interaction of the redox-active Cu²⁺ metal cation with Aβ has been found to interfere in amyloid aggregation and to lead to reactive oxygen species (ROS). A detailed knowledge of the electronic and molecular structure of Cu²⁺-Aβ complexes is thus important to get a better understanding of the role of these complexes in the development and progression of AD. The computational treatment of these systems requires a combination of several available computational methodologies, because two fundamental aspects have to be addressed: the metal coordination sphere and the conformation adopted by the peptide upon copper binding. In this paper we review the main computational strategies used to deal with Cu²⁺-Aβ coordination and to build plausible Cu²⁺-Aβ models that will afterwards allow determining physicochemical properties of interest, such as their redox potential.

  7. Soft Clouding

    DEFF Research Database (Denmark)

    Søndergaard, Morten; Markussen, Thomas; Wetton, Barnabas

    2012-01-01

    Soft Clouding is a blended concept, which describes the aim of a collaborative and transdisciplinary project. The concept is a metaphor implying a blend of cognitive, embodied interaction and semantic web. Furthermore, it is a metaphor describing our attempt of curating a new semantics of sound a...

  8. When soft controls get slippery: User interfaces and human error

    International Nuclear Information System (INIS)

    Stubler, W.F.; O'Hara, J.M.

    1998-01-01

    Many types of products and systems that have traditionally featured physical control devices are now being designed with soft controls--input formats appearing on computer-based display devices and operated by a variety of input devices. A review of complex human-machine systems found that soft controls are particularly prone to some types of errors and may affect overall system performance and safety. This paper discusses the application of design approaches for reducing the likelihood of these errors and for enhancing usability, user satisfaction, and system performance and safety

  9. Computational approach to the study of thermal spin crossover phenomena

    International Nuclear Information System (INIS)

    Rudavskyi, Andrii; Broer, Ria; Sousa, Carmen; Graaf, Coen de; Havenith, Remco W. A.

    2014-01-01

    The key parameters associated with the thermally induced spin crossover process have been calculated for a series of Fe(II) complexes with mono-, bi-, and tridentate ligands. Combination of density functional theory calculations for the geometries and for normal vibrational modes, and highly correlated wave function methods for the energies, allows us to accurately compute the entropy variation associated with the spin transition and the zero-point corrected energy difference between the low- and high-spin states. From these values, the transition temperature, T1/2, is estimated for different compounds.

  10. Changes to a modelling approach with the use of computer

    DEFF Research Database (Denmark)

    Andresen, Mette

    2006-01-01

    This paper reports on a Ph.D. project, which was part of a larger research and development project (see www.matnatverdensklasse.dk). In the reported part of the project, each student had had a laptop at his disposal for at least two years. The Ph.D. project inquires into the try-out, in four classes, of teaching materials on differential equations. One of the objectives of the project was changes at two levels: 1) changes at curriculum level and 2) changes in the intentions of modelling and using models. The paper relates the changes at these two levels and discusses how the use of the computer can serve ...

  11. A computer simulation approach to measurement of human control strategy

    Science.gov (United States)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.

  12. A Digital Computer Approach to the Unsymmetric Rigid Body Problem.

    Science.gov (United States)

    1982-03-01

    A = (I₂ − I₃)/I₁, B = (I₃ − I₁)/I₂, C = (I₁ − I₂)/I₃; F, G, H: intermediate expressions for the computation of ω₁, ω₂, ω₃; d₁, d₂, e₁, e₂: trial ... Staude problem. 6. Grammel (1948): use of approximating functions in a recursive scheme. A brief description of Grammel’s work can be found in Section ... HAMILTON’S EQUATIONS. Hamilton’s equations are more general than the methods of Euler and Lagrange. The Hamiltonian, H, is related to the Lagrangian, L, by the Legendre transform H = Σᵢ pᵢq̇ᵢ − L.

  13. Diffusive Wave Approximation to the Shallow Water Equations: Computational Approach

    KAUST Repository

    Collier, Nathan

    2011-05-14

    We discuss the use of time adaptivity applied to the one-dimensional diffusive wave approximation to the shallow water equations. A simple and computationally economical error estimator is discussed which enables time-step size adaptivity. This robust adaptive time discretization corrects the initial time step size to achieve a user-specified bound on the discretization error and allows time step size variations of several orders of magnitude. In particular, the one-dimensional results presented in this work feature a change of four orders of magnitude in the time step over the entire simulation.
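
    The error-estimator-driven step-size control described here can be illustrated, in a heavily simplified form, by step doubling on a scalar ODE; the test equation stands in for the spatially discretized diffusive-wave operator, and tolerances and growth factors are illustrative:

    def rhs(t, u):
        return -u  # toy stand-in for the spatially discretized operator

    def euler_step(t, u, dt):
        return u + dt * rhs(t, u)

    def adaptive_integrate(u0, t_end, dt=1e-3, tol=1e-6):
        t, u = 0.0, u0
        while t < t_end:
            dt = min(dt, t_end - t)
            big = euler_step(t, u, dt)                                       # one full step
            half = euler_step(t + dt / 2, euler_step(t, u, dt / 2), dt / 2)  # two half steps
            err = abs(big - half)                                            # local error estimate
            if err <= tol:
                t, u = t + dt, half                                          # accept the finer result
            dt *= min(4.0, max(0.25, 0.9 * (tol / max(err, 1e-30)) ** 0.5))  # resize the step
        return u

    print(adaptive_integrate(1.0, 1.0))  # the step size grows and shrinks by orders of magnitude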

  14. New Approaches to Practical Secure Two-Party Computation

    DEFF Research Database (Denmark)

    Nordholt, Peter Sebastian

    all practical protocols with malicious security were based on Yao’s garbled circuits. We report on an implementation of this protocol demonstrating its high efficiency: for larger circuits it evaluates 20,000 Boolean gates per second; as an example, evaluating one oblivious AES encryption (around 34,000 gates) ... two-party computation protocols based on Yao’s garbled circuit, namely doing the cut-and-choose test on the gate level instead of the circuit level. This idea speeds up the protocol by a factor of the logarithm of the size of the circuit to be evaluated. The resulting protocol, however, was not considered practically ...

  15. Essential algorithms a practical approach to computer algorithms

    CERN Document Server

    Stephens, Rod

    2013-01-01

    A friendly and accessible introduction to the most useful algorithms Computer algorithms are the basic recipes for programming. Professional programmers need to know how to use algorithms to solve difficult programming problems. Written in simple, intuitive English, this book describes how and when to use the most practical classic algorithms, and even how to create new algorithms to meet future needs. The book also includes a collection of questions that can help readers prepare for a programming job interview. Reveals methods for manipulating common data structures s

  16. A complex systems approach to computational molecular biology

    Energy Technology Data Exchange (ETDEWEB)

    Lapedes, A. (Los Alamos National Lab., NM (United States); Santa Fe Inst., NM (United States))

    1993-09-01

    We report on the continuing research program at the Santa Fe Institute that applies complex systems methodology to computational molecular biology. Two aspects stressed here are the use of co-evolving adaptive neural networks for determining predictable protein structure classifications, and the use of information theory to elucidate protein structure and function. A ‘snapshot’ of the current state of research in these two topics is presented, representing the present state of two major research thrusts in the program of Genetic Data and Sequence Analysis at the Santa Fe Institute.

  17. Computational approaches to identify functional genetic variants in cancer genomes

    DEFF Research Database (Denmark)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor, but only a minority of these drive tumor progression. We present the results of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on the cancer phenotype.

  18. The softest computer

    Science.gov (United States)

    Caulfield, H. John; Johnson, John L.

    2000-10-01

    Neural networks, fuzzy systems, and evolutionary computation - the technologies featured in this conference - are often referred to collectively as soft computing. Like fuzzy logic this has an unfortunate overtone, but we are stuck with the term. I argue that the intuition that governed this grouping of technologies is that they are all human mimetic. Then I offer an overview of how to construct an artifact that, like you, uses those approaches to exercise judgement and innovate out of unforeseen problems.

  19. Synergy between experimental and computational approaches to homogeneous photoredox catalysis.

    Science.gov (United States)

    Demissie, Taye B; Hansen, Jørn H

    2016-07-05

    In this Frontiers article, we highlight how state-of-the-art density functional theory calculations can contribute to the field of homogeneous photoredox catalysis. We discuss challenges in the fields and potential solutions to be found at the interface between theory and experiment. The exciting opportunities and insights that can arise through such an interdisciplinary approach are highlighted.

  20. A Computational Approach to the Quantification of Animal Camouflage

    Science.gov (United States)

    2014-06-01

    Geoff Gorman, Wu-Jung Lee, Heather Beem, Audrey Maertens, Kalina Gospodinova, Ankita Jain, Nick Loomis, Stephanie Petillo, Toby Schneider and ... In addition to sexual selection, animals use color vision for behavioral tasks including finding food, recognizing objects and communicating with ... As cuttlefish approach sexual maturity, their skin undergoes a physiological and morphological change where iridophores and leucophores develop to ...

  1. Thermodynamic and relative approach to compute glass-forming ...

    Indian Academy of Sciences (India)

    We present a non-dimensional approach to evaluating the GFA of single oxides by making each of the coefficients dimensionless (without measuring units). Obeying the rules of non-dimensional analysis, we introduce a characteristic neglected in all prior thermodynamic models: the isobaric heat capacity (Cp) of oxides, and execute a ...

  2. A uniform approach for programming distributed heterogeneous computing systems.

    Science.gov (United States)

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-12-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater's performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations.

  3. Computational approaches for modeling human intestinal absorption and permeability.

    Science.gov (United States)

    Subramanian, Govindan; Kitchen, Douglas B

    2006-07-01

    Human intestinal absorption (HIA) is an important roadblock in the formulation of new drug substances. Computational models are needed for the rapid estimation of this property. The measurements are determined via in vivo experiments or in vitro permeability studies. We present several computational models that are able to predict the absorption of drugs by the human intestine and the permeability through human Caco-2 cells. The training and prediction sets were derived from literature sources and carefully examined to eliminate compounds that are actively transported. We compare our results to models derived by other methods and find that the statistical quality is similar. We believe that models derived from both sources of experimental data would provide greater consistency in predictions. The performance of several QSPR models that we investigated to predict outside the training set for either experimental property clearly indicates that caution should be exercised while applying any of the models for quantitative predictions. However, we are able to show that the qualitative predictions can be obtained with close to a 70% success rate.

  4. Establishing a central zone in scaphoid surgery: a computational approach.

    Science.gov (United States)

    Guo, Yang; Tian, Guang Lei; Chen, Shanlin; Tapia, Carla

    2014-01-01

    Scaphoid fractures are commonly fixed with headless cannulated screws positioned centrally in the scaphoid. Judgement of central placement of the screw may be difficult. We generated a central zone using computer analysis of 3D reconstructions of computed tomography (CT) images. As long as the screw axis is completely contained within this central zone, the screw would be considered as centrally placed. Thirty cases of 3D CT reconstructions of normal scaphoids in a computerised operation planning and simulation system (Vxwork software) were obtained. The central zone was established after some distance shrinkage of the original scaphoid surface reconstruction model using the function "erode" in the software. The shape of the central zone was evaluated, and the width of the central zone in the proximal pole, waist portion and distal pole was measured. We also established the long axis of the scaphoid to see whether it stays in the central zone. All central zones could be divided into distal, waist and proximal portions according to the corresponding irregular shape of the scaphoid. As the geometry of the central zone was so irregular and its width very narrow, it was possible to completely contain the screw axis either in the proximal portion alone, waist alone or distal central zone alone. Establishing the central zone of scaphoid 3D CT images provided a baseline for discussion of central placement of a scaphoid screw. The geometry of the scaphoid central zone determined that the screw could hardly be inserted through entire scaphoid central area during surgery.
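
    The "erode" operation used to generate the central zone is a standard morphological erosion; a hedged sketch with scipy.ndimage, using a synthetic ball in place of the segmented scaphoid volume and an illustrative margin:

    import numpy as np
    from scipy import ndimage

    z, y, x = np.ogrid[-20:21, -20:21, -20:21]
    scaphoid = (x**2 + y**2 + z**2) <= 18**2           # placeholder binary segmentation

    margin_voxels = 5                                  # shrink distance (screw radius plus safety)
    central_zone = ndimage.binary_erosion(scaphoid, iterations=margin_voxels)
    print(int(scaphoid.sum()), "->", int(central_zone.sum()), "voxels after erosion")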

  5. Novel approach for dam break flow modeling using computational intelligence

    Science.gov (United States)

    Seyedashraf, Omid; Mehrabi, Mohammad; Akhtari, Ali Akbar

    2018-04-01

    A new methodology based on the computational intelligence (CI) system is proposed and tested for modeling the classic 1D dam-break flow problem. The reason to seek a new solution lies in the shortcomings of the existing analytical and numerical models, including the difficulty of using the exact solutions and the unwanted fluctuations that arise in the numerical results. In this research, the application of radial-basis-function (RBF) and multi-layer-perceptron (MLP) systems is detailed for the solution of twenty-nine dam-break scenarios. The models are developed using seven variables, i.e. the length of the channel, the depths of the up- and downstream sections, time, and distance as the inputs. Moreover, the depths and velocities of each computational node in the flow domain are considered as the model outputs. The models are validated against the analytical solution and the Lax-Wendroff and MacCormack FDM schemes. The findings indicate that the employed CI models are able to replicate the overall shape of the shock and rarefaction waves. Furthermore, the MLP system outperforms RBF and the tested numerical schemes. A new monolithic equation is proposed based on the best-fitting model, which can be used as an efficient alternative to the existing piecewise analytic equations.
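
    A surrogate of this kind can be sketched with scikit-learn's MLPRegressor; the seven-column input matrix and the two outputs (nodal depth and velocity) below are synthetic placeholders, and the architecture is illustrative rather than the paper's:

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(2)
    X = rng.uniform(size=(500, 7))                # 7 scenario descriptors per sample
    y = np.stack([X[:, 1] * np.exp(-X[:, 3]),     # stand-in for depth at a node
                  X[:, 2] * X[:, 4]], axis=1)     # stand-in for velocity at a node

    mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0).fit(X, y)
    print("training R^2:", mlp.score(X, y))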

  6. A 3D computer graphics approach to brachytherapy planning.

    Science.gov (United States)

    Weichert, Frank; Wawro, Martin; Wilke, Carsten

    2004-06-01

    Intravascular brachytherapy (IVB) can significantly reduce the risk of restenosis after interventional treatment of stenotic arteries, if planned and applied correctly. In order to facilitate computer-based IVB planning, a three-dimensional reconstruction of the stenotic artery based on intravascular ultrasound (IVUS) sequences is desirable. For this purpose, the frames of the IVUS sequence are properly aligned in space, and possible gaps in between the IVUS frames are filled by interpolation with radial basis functions known from scattered data interpolation. The alignment procedure uses additional information obtained from biplane X-ray angiography performed simultaneously during the capturing of the IVUS sequence. After IVUS images and biplane angiography data are acquired from the patient, the vessel-wall borders and the IVUS catheter are detected by an active contour algorithm. Next, the twist (relative orientation) between adjacent IVUS frames is determined by a sequential triangulation method. The absolute orientation of each frame is established by a stochastic analysis based on anatomical landmarks. Finally, the reconstructed 3D vessel model is visualized by methods of combined volume and polygon rendering. The reconstruction is then used for the computation of the radiation distribution within the tissue, emitted from a beta-radiation source. All these steps are performed during the percutaneous intervention.
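
    The radial-basis-function gap filling mentioned above is available off the shelf in SciPy (1.7+); a minimal sketch with synthetic sample points standing in for aligned IVUS contour data:

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(3)
    known_xy = rng.uniform(-1.0, 1.0, size=(40, 2))        # sampled contour locations
    known_val = np.sin(known_xy[:, 0]) * known_xy[:, 1]    # wall-position samples

    rbf = RBFInterpolator(known_xy, known_val, kernel="thin_plate_spline")
    gap_xy = np.array([[0.1, 0.2], [-0.3, 0.5]])           # locations between frames
    print(rbf(gap_xy))                                     # interpolated values in the gap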

  7. A Computational Differential Geometry Approach to Grid Generation

    CERN Document Server

    Liseikin, Vladimir D

    2007-01-01

    The process of breaking up a physical domain into smaller sub-domains, known as meshing, facilitates the numerical solution of partial differential equations used to simulate physical systems. This monograph gives a detailed treatment of applications of geometric methods to advanced grid technology. It focuses on and describes a comprehensive approach based on the numerical solution of inverted Beltramian and diffusion equations with respect to monitor metrics for generating both structured and unstructured grids in domains and on surfaces. In this second edition the author takes a more detailed and practice-oriented approach towards explaining how to implement the method by: Employing geometric and numerical analyses of monitor metrics as the basis for developing efficient tools for controlling grid properties. Describing new grid generation codes based on finite differences for generating both structured and unstructured surface and domain grids. Providing examples of applications of the codes to the genera...

  8. An Approach to Dynamic Service Management in Pervasive Computing Systems

    Science.gov (United States)

    2005-01-01

    the state management to the Services themselves, with the Service Manager serving as the cache. The advantage of such an approach is the decreased ... complexity of distributed state management and increased fault tolerance. Even in the event of the Service Manager going down, it can recover easily because ... based services, whereas our system is able to support Services based on any platform, as long as they can communicate with either the Service Manager through

  9. Engineering approach to model and compute electric power markets settlements

    International Nuclear Information System (INIS)

    Kumar, J.; Petrov, V.

    2006-01-01

    Back-office accounting settlement activities are an important part of market operations in Independent System Operator (ISO) organizations. A potential way to measure ISO market design correctness is to analyze how well market price signals create incentives or penalties for creating an efficient market to achieve market design goals. Market settlement rules are an important tool for implementing price signals which are fed back to participants via the settlement activities of the ISO. ISOs are currently faced with the challenge of high volumes of data resulting from the increasing size of markets and ever-changing market designs, as well as the growing complexity of wholesale energy settlement business rules. This paper analyzed the problem and presented a practical engineering solution using an approach based on mathematical formulation and modeling of large scale calculations. The paper also presented critical comments on various differences in settlement design approaches to electrical power market design, as well as further areas of development. The paper provided a brief introduction to the wholesale energy market settlement systems and discussed problem formulation. An actual settlement implementation framework and discussion of the results and conclusions were also presented. It was concluded that a proper engineering approach to this domain can yield satisfying results by formalizing wholesale energy settlements. Significant improvements were observed in the initial preparation phase, scoping and effort estimation, implementation and testing. 5 refs., 2 figs

  10. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    Science.gov (United States)

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  11. Scaling Watershed Models: Modern Approaches to Science Computation with MapReduce, Parallelization, and Cloud Optimization

    Science.gov (United States)

    Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...

  12. Method in computer ethics: Towards a multi-level interdisciplinary approach

    NARCIS (Netherlands)

    Brey, Philip A.E.

    2000-01-01

    This essay considers methodological aspects of computer ethics and argues for a multi-level interdisciplinary approach with a central role for what is called disclosive computer ethics. Disclosive computer ethics is concerned with the moral deciphering of embedded values and norms in computer systems,

  13. A computational approach to the twin paradox in curved spacetime

    Science.gov (United States)

    Fung, Kenneth K. H.; Clark, Hamish A.; Lewis, Geraint F.; Wu, Xiaofeng

    2016-09-01

    Despite being a major component in the teaching of special relativity, the twin ‘paradox’ is generally not examined in courses on general relativity. Due to the complexity of analytical solutions to the problem, the paradox is often neglected entirely, and students are left with an incomplete understanding of the relativistic behaviour of time. This article outlines a project, undertaken by undergraduate physics students at the University of Sydney, in which a novel computational method was derived in order to predict the time experienced by a twin following a number of paths between two given spacetime coordinates. By utilising this method, it is possible to make clear to students that following a geodesic in curved spacetime does not always result in the greatest experienced proper time.
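
    The flat-spacetime core of such a computation is a numerical quadrature of proper time, tau = ∫ sqrt(1 - v(t)^2) dt with c = 1; the curved-spacetime version in the article replaces this integrand with one built from the metric. The speed profile below is purely illustrative:

    import numpy as np

    t = np.linspace(0.0, 10.0, 10001)        # coordinate time of the stay-at-home twin
    v = 0.8 * np.sin(np.pi * t / 10.0)       # travelling twin's speed profile, |v| < 1
    dt = t[1] - t[0]
    tau = float(np.sum(np.sqrt(1.0 - v**2)) * dt)   # rectangle-rule quadrature of proper time
    print("coordinate time:", t[-1], " traveller's proper time:", round(tau, 3))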

  14. Computer aided fixture design - A case based approach

    Science.gov (United States)

    Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom

    2017-11-01

    Automated fixture design plays an important role in process planning and the integration of CAD and CAM. An automated fixture setup design system is developed in which, once fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and placed into position subject to assembly conditions. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a newly developed retrieval system is proposed. The Visual Basic (VB) programming language is used, integrated with the SolidWorks API (Application Programming Interface) module, for a better retrieval procedure that reduces computational time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.

  15. Perturbation approach for nuclear magnetic resonance solid-state quantum computation

    Directory of Open Access Journals (Sweden)

    G. P. Berman

    2003-01-01

    Full Text Available The dynamics of a nuclear-spin quantum computer with a large number (L = 1000) of qubits is considered using a perturbation approach. Small parameters are introduced and used to compute the error in an implementation of entanglement between remote qubits, using a sequence of radio-frequency pulses. The error is computed up to different orders of the perturbation theory and tested using an exact numerical solution.

  16. A Two Layer Approach to the Computability and Complexity of Real Functions

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2003-01-01

    We present a new model for the computability and complexity of real functions together with an implementation that is based on it. The model uses a two-layer approach in which low-type basic objects perform the computation of a real function, but, whenever needed, can be complemented with higher type...

  17. A simplified approach to compute distribution matrices for the mapping method

    NARCIS (Netherlands)

    Singh, M.K.; Galaktionov, O.S.; Meijer, H.E.H.; Anderson, P.D.

    2009-01-01

    The mapping method has proven its efficiency as an analysis and optimization tool for mixing in many different flow devices. In this paper, we present a new approach to compute the coefficients of the distribution matrix which is, in terms of both computational speed and complexity, easier to

  18. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme as an effective approach to medical diagnosis. Having come to know the advantages and disadvantages of each computational intelligence technique over recent years, the time has come ... are generated. Medical experts comment on the nature, the meaning and the usability of the acquired results....

  19. A Computational Approach for Automated Posturing of a Human Finite Element Model

    Science.gov (United States)

    2016-07-01

    ARL-MR-0934, July 2016. US Army Research Laboratory Memorandum Report: A Computational Approach for Automated Posturing of a Human Finite Element Model, by Justin McKee (Bennett Aerospace, Inc., Cary, NC) and Adam Sokolow (Weapons and Materials Research ...

  20. A visibility-based Approach for Occupancy Grid Computation in Disparity Space

    OpenAIRE

    Perrollaz, Mathias; Yoder, John-David; Nègre, Amaury; Spalanzani, Anne; Laugier, Christian

    2012-01-01

    Occupancy grids are a very convenient tool for environment representation in robotics. This paper details a novel approach to computing occupancy grids from stereo-vision and shows its application in the field of intelligent vehicles. In the proposed approach, occupancy is initially computed directly in the stereoscopic sensor's disparity space. The calculation formally accounts for the detection of obstacles and road pixels in disparity space, as well as partial occlusions in the scene. ...
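
    A generic occupancy-grid update in log-odds form (not the paper's disparity-space formulation; the sensor-model probabilities here are illustrative) looks like:

    import numpy as np

    grid = np.zeros((100, 100))                          # log-odds, 0 means p = 0.5 (unknown)
    L_OCC, L_FREE = np.log(0.7 / 0.3), np.log(0.3 / 0.7)

    def update_cell(grid, i, j, hit):
        """Fuse one measurement for cell (i, j); hit=True means an obstacle was observed."""
        grid[i, j] += L_OCC if hit else L_FREE

    update_cell(grid, 50, 60, hit=True)
    update_cell(grid, 50, 61, hit=False)
    prob = 1.0 - 1.0 / (1.0 + np.exp(grid))              # back to occupancy probabilities
    print(prob[50, 60], prob[50, 61])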

  1. The influence of antiscatter grids on soft-tissue detectability in cone-beam computed tomography with flat-panel detectors

    International Nuclear Information System (INIS)

    Siewerdsen, J.H.; Moseley, D.J.; Bakhtiar, B.; Richard, S.; Jaffray, D.A.

    2004-01-01

    The influence of antiscatter x-ray grids on image quality in cone-beam computed tomography (CT) is evaluated through broad experimental investigation for various anatomical sites (head and body), scatter conditions (scatter-to-primary ratio (SPR) ranging from ∼10% to 150%), patient dose, and spatial resolution in three-dimensional reconstructions. Studies involved linear grids in combination with a flat-panel imager on a system for kilovoltage cone-beam CT imaging and guidance of radiation therapy. Grids were found to be effective in reducing x-ray scatter 'cupping' artifacts, with heavier grids providing increased image uniformity. The system was highly robust against ring artifacts that might arise in CT reconstructions as a result of gridline shadows in the projection data. The influence of grids on soft-tissue detectability was evaluated quantitatively in terms of absolute contrast, voxel noise, and contrast-to-noise ratio (CNR) in cone-beam CT reconstructions of 16 cm 'head' and 32 cm 'body' cylindrical phantoms. Imaging performance was investigated qualitatively in observer preference tests based on patient images (pelvis, abdomen, and head-and-neck sites) acquired with and without antiscatter grids. The results suggest that although grids reduce scatter artifacts and improve subject contrast, there is little strong motivation for the use of grids in cone-beam CT in terms of CNR and overall image quality under most circumstances. The results highlight the tradeoffs in contrast and noise imparted by grids, showing improved image quality with grids only under specific conditions of high x-ray scatter (SPR > 100%), high imaging dose (Dcenter > 2 cGy), and low spatial resolution (voxel size ≥ 1 mm).
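
    The contrast-to-noise ratio used in this evaluation is a simple voxel statistic; with hypothetical region-of-interest samples (the means and noise level below are invented), it can be computed as:

    import numpy as np

    rng = np.random.default_rng(4)
    insert_roi = rng.normal(loc=60.0, scale=12.0, size=1000)      # soft-tissue insert voxels
    background_roi = rng.normal(loc=40.0, scale=12.0, size=1000)  # background voxels

    contrast = insert_roi.mean() - background_roi.mean()          # absolute contrast
    noise = background_roi.std(ddof=1)                            # voxel noise
    print("CNR =", contrast / noise)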

  2. Monitoring the Microgravity Environment Quality On-board the International Space Station Using Soft Computing Techniques. Part 2; Preliminary System Performance Results

    Science.gov (United States)

    Jules, Kenol; Lin, Paul P.; Weiss, Daniel S.

    2002-01-01

    and unknown vibratory disturbance sources. Several soft computing techniques such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic were used to design the system.

  3. [18F]-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography of LAPC4-CR Castration-Resistant Prostate Cancer Xenograft Model in Soft Tissue Compartments1

    Science.gov (United States)

    McCall, Keisha C.; Cheng, Su-Chun; Huang, Ying; Kohl, Nancy E.; Tupper, Tanya; Van den Abbeele, Annick D.; Zukotynski, Katherine A.; Sweeney, Christopher J.

    2015-01-01

    Preclinical xenograft models have contributed to advancing our understanding of the molecular basis of prostate cancer and to the development of targeted therapy. However, traditional preclinical in vivo techniques using caliper measurements and survival analysis evaluate the macroscopic tumor behavior, whereas tissue sampling disrupts the microenvironment and cannot be used for longitudinal studies in the same animal. Herein, we present an in vivo study of [18F]-fluorodeoxyglucose (FDG) positron emission tomography (PET)/computed tomography (CT) designed to evaluate the metabolism within the microenvironment of LAPC4-CR, a unique murine model of castration-resistant prostate cancer. Mice bearing LAPC4-CR subcutaneous tumors were administered [18F]-FDG via intravenous injection. After a 60-minute distribution phase, the mice were imaged on a PET/CT scanner with submillimeter resolution; and the fused PET/CT images were analyzed to evaluate tumor size, location, and metabolism across the cohort of mice. The xenograft tumors showed [18F]-FDG uptake that was independent of tumor size and was significantly greater than uptake in skeletal muscle and liver in mice (Wilcoxon signed-rank P values of .0002 and .0002, respectively). [18F]-FDG metabolism of the LAPC4-CR tumors was 2.1 ± 0.8 ID/cm3*wt, with tumor to muscle ratio of 7.4 ± 4.7 and tumor to liver background ratio of 6.7 ± 2.3. Noninvasive molecular imaging techniques such as PET/CT can be used to probe the microenvironment of tumors in vivo. This study showed that [18F]-FDG-PET/CT could be used to image and assess glucose metabolism of LAPC4-CR xenografts in vivo. Further work can investigate the use of PET/CT to quantify the metabolic response of LAPC4-CR to novel agents and combination therapies using soft tissue and possibly bone compartment xenograft models. PMID:26055171

  4. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and  considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  5. Computational Approach to large Scale Process Optimization through Pinch Analysis

    Directory of Open Access Journals (Sweden)

    Nasser Al-Azri

    2015-08-01

    Full Text Available Since its debut in the last quarter of the twentieth century, pinch technology has become an efficient tool for efficient and cost-effective engineering process design. This method allows the integration of mass and heat streams in such a way that minimizes waste and external purchase of mass and utilities. Moreover, integrating process streams internally will minimize fuel consumption and hence carbon emission to the atmosphere. This paper discusses a programmable approach to the design of mass and heat exchange networks that can be used easily for large scale engineering processes.

  6. Mini-Open Sinus Tarsi Approach with Percutaneous Screw Fixation of Displaced Calcaneal Fractures: A Prospective Computed Tomography-Based Study

    NARCIS (Netherlands)

    Nosewicz, Tomasz; Knupp, Markus; Barg, Alexej; Maas, Mario; Bolliger, Lilianna; Goslings, J. Carel; Hintermann, Beat

    2012-01-01

    Background: Open reduction and internal fixation (ORIF) of calcaneal fractures using an extended lateral approach results in soft tissue disruption and theoretically subtalar joint stiffness. A minimally invasive sinus tarsi approach for posterior facet exposure and percutaneous screw fixation of

  7. Soft computing in advanced robotics

    CERN Document Server

    Kobayashi, Ichiro; Kim, Euntai

    2014-01-01

    Intelligent systems and robotics are inevitably bound up: intelligent robots embody system integration by using intelligent systems. Intelligent systems are to cell units what intelligent robots are to body components, and the two technologies have progressed in synchrony. Leveraging robotics and intelligent systems, applications cover a boundless range from our daily life to the space station: manufacturing, healthcare, environment, energy, education, personal assistance, logistics. This book aims at presenting research results relevant to intelligent robotics technology. We propose to researchers and practitioners some methods to advance intelligent systems and apply them to advanced robotics technology. This book consists of 10 contributions that feature mobile robots, robot emotion, electric power steering, multi-agent systems, fuzzy visual navigation, adaptive network-based fuzzy inference systems, swarm EKF localization and inspection robots. Th...

  8. Soft computing in intelligent control

    CERN Document Server

    Jung, Jin-Woo; Kubota, Naoyuki

    2014-01-01

    Nowadays, people tend to be fond of smarter machines that are able to collect data, learn, recognize things, infer meanings, communicate with humans and perform behaviors. Thus, we have built advanced intelligent control affecting all parts of society: automotive, rail, aerospace, defense, energy, healthcare, telecoms and consumer electronics, finance, urbanization. Consequently, users and consumers can have new experiences through the intelligent control systems. We can reshape the technology world and provide new opportunities for industry and business by offering cost-effective, sustainable and innovative business models. We will have to know how to create our own digital life. Intelligent control systems enable people to build complex applications, to implement system integration and to meet society’s demand for safety and security. This book aims at presenting the research results and solutions of applications relevant to intelligent control systems. We propose to researchers ...

  9. Soft computing in machine learning

    CERN Document Server

    Park, Jooyoung; Inoue, Atsushi

    2014-01-01

    As users and consumers now demand smarter devices, intelligent systems are being revolutionized by machine learning. Machine learning, as part of intelligent systems, is already one of the most critical components in everyday tools ranging from search engines and credit card fraud detection to stock market analysis. You can train machines to perform some tasks so that they can automatically detect, diagnose, and solve a variety of problems. Intelligent systems have made rapid progress in advancing the state of the art in machine learning based on smart and deep perception. Using machine learning, intelligent systems find wide application in automated speech recognition, natural language processing, medical diagnosis, bioinformatics, and robot locomotion. This book aims at introducing how to treat substantial amounts of data, to teach machines and to improve decision-making models. The book specializes in the development of advanced intelligent systems through machine learning. It...

  10. Soft computing in artificial intelligence

    CERN Document Server

    Matson, Eric

    2014-01-01

    This book explores the concept of artificial intelligence based on knowledge-based algorithms. Given current hardware and software technologies and artificial intelligence theories, we can consider how efficiently a solution can be provided, how best to implement a model, and how successfully it can be achieved. This edition provides readers with the most recent progress and novel solutions in artificial intelligence. The book aims at presenting research results and solutions of applications relevant to artificial intelligence technologies. We propose to researchers and practitioners some methods to advance intelligent systems and apply artificial intelligence to specific or general purposes. This book consists of 13 contributions that feature fuzzy (r, s)-minimal pre- and β-open sets, handling big co-occurrence matrices, Xie-Beni-type fuzzy cluster validation, fuzzy c-regression models, combination of genetic algorithm and ant colony optimization, building expert systems, fuzzy logic and neural networks, ind...

  11. A computational Bayesian approach to dependency assessment in system reliability

    International Nuclear Information System (INIS)

    Yontay, Petek; Pan, Rong

    2016-01-01

    Due to the increasing complexity of engineered products, it is of great importance to develop a tool to assess reliability dependencies among components and systems under uncertainty about the system reliability structure. In this paper, a Bayesian network approach is proposed for evaluating the conditional probability of failure within a complex system, using a multilevel system configuration. By coupling it with Bayesian inference, the posterior distributions of these conditional probabilities can be estimated by combining failure information and expert opinions at both system and component levels. Three data scenarios are considered in this study, and they demonstrate that, with the quantification of the stochastic relationship of reliability within a system, the dependency structure in system reliability can be gradually revealed by the data collected at different system levels. - Highlights: • A Bayesian network representation of system reliability is presented. • Bayesian inference methods for assessing dependencies in system reliability are developed. • Complete and incomplete data scenarios are discussed. • The proposed approach is able to integrate reliability information from multiple sources at multiple levels of the system.
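
    The record above includes no implementation, but the core idea of combining component-level failure data with a known system structure is easy to make concrete. The Python sketch below is a minimal example of my own, not the authors' model: it assumes a hypothetical two-component series system, Beta(1, 1) priors, and invented test counts, then propagates posterior draws to the system level by Monte Carlo.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical component-level test data: (failures, successes).
    component_data = {"pump": (2, 48), "valve": (1, 99)}

    # Conjugate Beta(1, 1) priors updated with the observed counts.
    posteriors = {name: (1 + f, 1 + s) for name, (f, s) in component_data.items()}

    # Sample each component's failure probability from its posterior and
    # propagate through an assumed series structure: the system fails if
    # any component fails.
    draws = {name: rng.beta(a, b, size=100_000) for name, (a, b) in posteriors.items()}
    p_system = 1.0 - np.prod([1.0 - d for d in draws.values()], axis=0)

    print(f"posterior mean system failure probability: {p_system.mean():.4f}")
    print(f"95% credible interval: {np.percentile(p_system, [2.5, 97.5])}")

    A full Bayesian network, as in the paper, would add conditional probability tables and handle incomplete data; the sketch shows only the simplest complete-data case.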

  12. Granular computing and decision-making interactive and iterative approaches

    CERN Document Server

    Chen, Shyi-Ming

    2015-01-01

    This volume is devoted to interactive and iterative processes of decision-making: I2 Fuzzy Decision Making, in brief. Decision-making is inherently interactive. Fuzzy sets help realize human-machine communication in an efficient way by facilitating two-way interaction in a friendly and transparent manner. Human-centric interaction is of paramount relevance as a leading design principle of decision support systems. The volume provides the reader with updated and in-depth material on the conceptually appealing and practically sound methodology and practice of I2 Fuzzy Decision Making. The book engages a wealth of methods from fuzzy sets and Granular Computing, and brings new concepts, architectures and practices of fuzzy decision-making, providing the reader with various application studies. The book is aimed at a broad audience of researchers and practitioners in numerous disciplines in which decision-making processes play a pivotal role and serve as a vehicle to produce solutions to existing prob...

  13. A functional analytic approach to computer-interactive mathematics.

    Science.gov (United States)

    Ninness, Chris; Rumph, Robin; McCuller, Glen; Harrison, Carol; Ford, Angela M; Ninness, Sharon K

    2005-01-01

    Following a pretest, 11 participants who were naive with regard to various algebraic and trigonometric transformations received an introductory lecture regarding the fundamentals of the rectangular coordinate system. Following the lecture, they took part in a computer-interactive matching-to-sample procedure in which they received training on particular formula-to-formula and formula-to-graph relations as these formulas pertain to reflections and vertical and horizontal shifts. In training A-B, standard formulas served as samples and factored formulas served as comparisons. In training B-C, factored formulas served as samples and graphs served as comparisons. Subsequently, the program assessed for mutually entailed B-A and C-B relations as well as combinatorially entailed C-A and A-C relations. After all participants demonstrated mutual entailment and combinatorial entailment, we employed a test of novel relations to assess 40 different and complex variations of the original training formulas and their respective graphs. Six of 10 participants who completed training demonstrated perfect or near-perfect performance in identifying novel formula-to-graph relations. Three of the 4 participants who made more than three incorrect responses during the assessment of novel relations showed some commonality among their error patterns. Derived transfer of stimulus control using mathematical relations is discussed.
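
    The trained and derived relations described above have a simple relational structure that can be made concrete in a few lines. The sketch below is illustrative only (the study used a matching-to-sample program, not this code); the stimulus names are hypothetical, with A standing for standard formulas, B for factored formulas, and C for graphs.

    # Trained relations: A-B (standard -> factored) and B-C (factored -> graph).
    trained = {("A1", "B1"), ("A2", "B2"),
               ("B1", "C1"), ("B2", "C2")}

    # Mutual entailment: each trained relation also holds in reverse (B-A, C-B).
    mutual = {(b, a) for (a, b) in trained}

    # Combinatorial entailment: compose relations through the shared B term
    # (A-C), then reverse the compositions as well (C-A).
    combinatorial = {(a, c)
                     for (a, b1) in trained
                     for (b2, c) in trained
                     if b1 == b2}
    combinatorial |= {(c, a) for (a, c) in combinatorial}

    print("mutually entailed:", sorted(mutual))
    print("combinatorially entailed:", sorted(combinatorial))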

  14. Strategic cognitive sequencing: a computational cognitive neuroscience approach.

    Science.gov (United States)

    Herd, Seth A; Krueger, Kai A; Kriete, Trenton E; Huang, Tsung-Ren; Hazy, Thomas E; O'Reilly, Randall C

    2013-01-01

    We address strategic cognitive sequencing, the "outer loop" of human cognition: how the brain decides what cognitive process to apply at a given moment to solve complex, multistep cognitive tasks. We argue that this topic has been neglected relative to its importance for systematic reasons but that recent work on how individual brain systems accomplish their computations has set the stage for productively addressing how brain regions coordinate over time to accomplish our most impressive thinking. We present four preliminary neural network models. The first addresses how the prefrontal cortex (PFC) and basal ganglia (BG) cooperate to perform trial-and-error learning of short sequences; the next, how several areas of PFC learn to make predictions of likely reward, and how this contributes to the BG making decisions at the level of strategies. The third model addresses how PFC, BG, parietal cortex, and hippocampus can work together to memorize sequences of cognitive actions from instruction (or "self-instruction"). The last shows how a constraint satisfaction process can find useful plans. The PFC maintains current and goal states and associates from both of these to find a "bridging" state, an abstract plan. We discuss how these processes could work together to produce strategic cognitive sequencing and discuss future directions in this area.
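
    The "bridging" computation in the final model has a compact algorithmic reading: associate outward from the current state and from the goal state, then pick a state linked to both. The toy sketch below is my own paraphrase of that idea, not the authors' network model, and the association table is invented.

    # Invented associative links from states to related states.
    associations = {
        "hungry":     {"kitchen", "restaurant"},
        "have_money": {"restaurant", "shop"},
    }

    def bridging_states(current, goal, assoc):
        """Candidate abstract plan states associated with both current and goal."""
        return assoc.get(current, set()) & assoc.get(goal, set())

    print(bridging_states("hungry", "have_money", associations))  # {'restaurant'}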

  15. Strategic Cognitive Sequencing: A Computational Cognitive Neuroscience Approach

    Directory of Open Access Journals (Sweden)

    Seth A. Herd

    2013-01-01

    Full Text Available We address strategic cognitive sequencing, the “outer loop” of human cognition: how the brain decides what cognitive process to apply at a given moment to solve complex, multistep cognitive tasks. We argue that this topic has been neglected relative to its importance for systematic reasons but that recent work on how individual brain systems accomplish their computations has set the stage for productively addressing how brain regions coordinate over time to accomplish our most impressive thinking. We present four preliminary neural network models. The first addresses how the prefrontal cortex (PFC) and basal ganglia (BG) cooperate to perform trial-and-error learning of short sequences; the next, how several areas of PFC learn to make predictions of likely reward, and how this contributes to the BG making decisions at the level of strategies. The third model addresses how PFC, BG, parietal cortex, and hippocampus can work together to memorize sequences of cognitive actions from instruction (or “self-instruction”). The last shows how a constraint satisfaction process can find useful plans. The PFC maintains current and goal states and associates from both of these to find a “bridging” state, an abstract plan. We discuss how these processes could work together to produce strategic cognitive sequencing and discuss future directions in this area.

  16. Computational Approaches for Modeling the Multiphysics in Pultrusion Process

    DEFF Research Database (Denmark)

    Carlone, P.; Baran, Ismet; Hattel, Jesper Henri

    2013-01-01

    Pultrusion is a continuous manufacturing process used to produce high-strength composite profiles with constant cross section. The mutual interactions between heat transfer, resin flow and cure reaction, variation in the material properties, and stress/distortion evolutions strongly affect the process dynamics together with the mechanical properties and the geometrical precision of the final product. In the present work, pultrusion process simulations are performed for a unidirectional (UD) graphite/epoxy composite rod including several processing physics, such as fluid flow, heat transfer, chemical reaction, and solid mechanics. The pressure increase and the resin flow at the tapered inlet of the die are calculated by means of a computational fluid dynamics (CFD) finite volume model. Several models, based on different homogenization levels and solution schemes, are proposed and compared...
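
    As a rough illustration of the cure-kinetics portion of such a simulation (not the paper's CFD/finite-volume model), the sketch below integrates an Arrhenius autocatalytic cure law for a resin element pulled through the die at constant speed under an assumed wall-temperature ramp; all parameter values are placeholders.

    import math

    A, E, R, n = 1.0e5, 6.0e4, 8.314, 1.5   # assumed kinetics: 1/s, J/mol, J/(mol K), exponent
    pull_speed, die_length = 0.005, 1.0     # m/s, m

    def die_temperature(x):
        """Assumed wall temperature ramp along the die, in kelvin."""
        return 320.0 + 130.0 * min(x / die_length, 1.0)

    dt, t, x, alpha = 0.1, 0.0, 0.0, 0.0
    while x < die_length and alpha < 0.99:
        T = die_temperature(x)
        rate = A * math.exp(-E / (R * T)) * (1.0 - alpha) ** n   # d(alpha)/dt
        alpha = min(alpha + dt * rate, 1.0)
        x += pull_speed * dt
        t += dt

    print(f"exit at x = {x:.2f} m after {t:.0f} s, degree of cure = {alpha:.3f}")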

  17. Promises and Pitfalls of Computer-Supported Mindfulness: Exploring a Situated Mobile Approach

    Directory of Open Access Journals (Sweden)

    Ralph Vacca

    2017-12-01

    Full Text Available Computer-supported mindfulness (CSM) is a burgeoning area filled with varied approaches such as mobile apps and EEG headbands. However, many of these approaches focus on providing meditation guidance. The ubiquity of mobile devices may provide new opportunities to support mindfulness practices that are more situated in everyday life. In this paper, a new situated mindfulness approach is explored through a specific mobile app design. Through an experimental design, the approach is compared to traditional audio-based mindfulness meditation, and a mind-wandering control, over a one-week period. The study demonstrates the viability of a situated mobile mindfulness approach for inducing mindfulness states. However, phenomenological aspects of the situated mobile approach suggest both promises and pitfalls for computer-supported mindfulness using a situated approach.

  18. Further links between the maximum hardness principle and the hard/soft acid/base principle: insights from hard/soft exchange reactions.

    Science.gov (United States)

    Chattaraj, Pratim K; Ayers, Paul W; Melin, Junia

    2007-08-07

    Ayers, Parr, and Pearson recently showed that insight into the hard/soft acid/base (HSAB) principle could be obtained by analyzing the energy of reactions in hard/soft exchange reactions, i.e., reactions in which a soft acid replaces a hard acid or a soft base replaces a hard base [J. Chem. Phys., 2006, 124, 194107]. We show, in accord with the maximum hardness principle, that the hardness increases for favorable hard/soft exchange reactions and decreases when the HSAB principle indicates that hard/soft exchange reactions are unfavorable. This extends the previous work of the authors, which treated only the "double hard/soft exchange" reaction [P. K. Chattaraj and P. W. Ayers, J. Chem. Phys., 2005, 123, 086101]. We also discuss two different approaches to computing the hardness of molecules from the hardness of the composing fragments, and explain how the results differ. In the present context, it seems that the arithmetic mean of fragment softnesses is the preferable definition.
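
    For readers who want the two fragment-combination rules contrasted above made explicit, they can be written out under the standard conventions (this is my sketch, assuming the finite-difference hardness $\eta \approx (I - A)/2$ with ionization energy $I$ and electron affinity $A$, and softness $S = 1/\eta$; some authors use $S = 1/(2\eta)$, which does not change the comparison):

    \[
      \eta_{AB}^{\text{hard}} = \frac{\eta_A + \eta_B}{2},
      \qquad
      S_{AB} = \frac{S_A + S_B}{2}
      \;\Longrightarrow\;
      \eta_{AB}^{\text{soft}} = \left(\frac{S_A + S_B}{2}\right)^{-1}
      = \frac{2\,\eta_A\,\eta_B}{\eta_A + \eta_B}.
    \]

    Averaging softnesses thus amounts to taking the harmonic mean of the fragment hardnesses, which is dominated by the softer fragment; that is the combination rule the abstract reports as preferable.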

  19. Computational and Experimental Approaches to Cancer Biomarker Discovery

    DEFF Research Database (Denmark)

    Krzystanek, Marcin

    ... at least two fundamental mechanisms responsible for the DNA aberrations present in a given tumor: 1) active mutational processes caused either by endogenous or exogenous factors, for example chemical agents such as tobacco smoke or cancer cytotoxics, or by active enzymatic processes such as APOBEC-related ... with a purely biological, experimental approach where the effects of treatment with cytotoxic agents or defects in DNA repair mechanisms can be individually quantified and turned into mutational signatures. In the second part of the thesis I present work towards identification and improvement of the current ... are expected. This work, together with the manifold efforts being made all over the world, is hopefully a step towards the implementation of personalized medicine and better treatments for cancer patients. ...
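
    The mutational-signature idea mentioned above is conventionally implemented as a non-negative matrix factorization of a tumor-by-mutation-context count matrix. The sketch below shows that standard technique with placeholder data; it is not the thesis's actual pipeline, and the dimensions and counts are invented.

    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    n_tumors, n_contexts, n_signatures = 50, 96, 3   # 96 trinucleotide mutation types

    # Placeholder counts; real input would be observed mutation counts per tumor.
    M = rng.poisson(lam=5.0, size=(n_tumors, n_contexts)).astype(float)

    # Factor M ~ exposures @ signatures, with all entries non-negative.
    model = NMF(n_components=n_signatures, init="nndsvda", max_iter=500, random_state=0)
    exposures = model.fit_transform(M)    # shape (n_tumors, n_signatures)
    signatures = model.components_        # shape (n_signatures, n_contexts)

    # Normalize each signature to a probability distribution over contexts.
    signatures /= signatures.sum(axis=1, keepdims=True)
    print(exposures.shape, signatures.shape)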

  20. Computed tomography of the lung. A pattern approach

    Energy Technology Data Exchange (ETDEWEB)

    Verschakelen, J.A.; Wever, W. de [Univ. Hospitals Leuven (Belgium). Dept. of Radiology

    2007-07-01

    As a result of the introduction of multidetector CT, very detailed images of the lungs can be obtained in every patient undergoing chest CT. Interpretation of the findings requires good knowledge and understanding of the CT signs of all the more common pulmonary diseases. In the first part of the book, the main appearance patterns of lung disease are described with the help of many colour drawings and high-quality illustrations. This approach will enable the reader to recognise these patterns and to interpret them in order to reach a diagnosis. In the second part, many typical cases are shown which will assist the reader in applying what was learned in the first part. This concise, easy-to-use and didactic book will help the radiologist in training to learn and understand the CT features of lung disease and is also recommended to more experienced specialists wishing to update their knowledge in the field. (orig.)