WorldWideScience

Sample records for rule-based down-scaling methodology

  1. Fuzzy-Rule-Based Object Identification Methodology for NAVI System

    Directory of Open Access Journals (Sweden)

    Rosalyn R. Porle

    2005-08-01

    Full Text Available We present an object identification methodology applied in a navigation assistance for visually impaired (NAVI) system. The NAVI has a single board processing system (SBPS), a headgear-mounted digital video camera, and a pair of stereo earphones. The captured image from the camera is processed by the SBPS to generate a specially structured stereo sound that helps visually impaired people understand the presence of objects or obstacles in front of them. The image processing stage is designed to identify the objects in the captured image; edge detection and edge-linking procedures are applied in processing the image. A concept of object preference is included in the image processing scheme, and this concept is realized using a fuzzy rule base. Blind users are trained with the stereo sound produced by NAVI to achieve collision-free autonomous navigation.
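
    The fuzzy object-preference idea can be sketched in a few lines. The membership functions, rules and numeric thresholds below are illustrative assumptions, not the NAVI system's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def object_preference(area, row, height=240):
    """Toy fuzzy preference for a detected object: pixel area and image-row
    position (lower in the frame = nearer) are fuzzified, two max-min rules
    fire, and a weighted average defuzzifies the result."""
    near = tri(row, height * 0.4, height, height * 1.6)
    far = 1.0 - near
    large = tri(area, 0, 5000, 10000)
    small = 1.0 - large
    high = min(near, large)   # Rule 1: near AND large -> high preference
    low = min(far, small)     # Rule 2: far AND small  -> low preference
    total = high + low
    # Weighted average of the rule consequents (1.0 for high, 0.2 for low).
    return (high * 1.0 + low * 0.2) / total if total else 0.5
```

    An object low in the frame with a large area gets a preference near 1.0, which a sonification stage could use to emphasize it over distant clutter.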

  2. Rule-based Expert Systems for Selecting Information Systems Development Methodologies

    Directory of Open Access Journals (Sweden)

    Abdel Nasser H. Zaied

    2013-08-01

    Full Text Available Information Systems (IS) are increasingly regarded as crucial to an organization's success. Information Systems Development Methodologies (ISDMs) are used by organizations to structure the information system development process. ISDMs are essential for structuring project participants' thinking and actions and therefore play an important role in achieving successful projects. There are many different ISDMs, and no methodology can claim to be applicable to every organization. The problem facing decision makers is how to select an appropriate development methodology that may increase the probability of system success. This paper takes this issue into account when studying ISDMs and provides a rule-based expert system as a tool for selecting appropriate ISDMs. The proposed expert system consists of three main phases to automate the process of selecting ISDMs. Three approaches were used to test the proposed expert system: face validation through six professors and six IS professionals, predictive validation through twenty-four experts, and blind validation through nine employees working in the IT field. The results show that the proposed system ran without errors, offered a friendly user interface, and produced suggestions matching user expectations in 95.8% of cases. It can help project managers, systems engineers, systems developers, consultants, and planners in the process of selecting a suitable ISDM. Finally, the results show that the proposed rule-based expert system can facilitate the selection process, especially for new users and non-specialists in the information systems field.
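
    A rule-based selection step of the kind the paper automates can be illustrated with a toy forward-matching rule base. The rules, project attributes and methodology names below are invented for illustration and are not taken from the paper's knowledge base:

```python
# Each rule pairs a set of required project characteristics with an ISDM
# recommendation. A rule fires only when all its conditions match the facts.
RULES = [
    ({"requirements": "stable", "size": "large"}, "Waterfall"),
    ({"requirements": "volatile", "size": "small"}, "Agile/XP"),
    ({"requirements": "volatile", "size": "large"}, "Spiral"),
    ({"user_involvement": "high", "size": "small"}, "Prototyping"),
]

def recommend(facts):
    """Return every methodology whose rule conditions all match the facts."""
    return [m for cond, m in RULES
            if all(facts.get(k) == v for k, v in cond.items())]
```

    Several rules can fire at once, so the system returns a candidate list rather than a single verdict, mirroring the idea that more than one ISDM may suit a project.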

  3. Methodology for the Construction of a Rule-Based Knowledge Base Enabling the Selection of Appropriate Bronze Heat Treatment Parameters Using Rough Sets

    Directory of Open Access Journals (Sweden)

    Górny Z.

    2015-04-01

    Full Text Available Decisions regarding appropriate methods for the heat treatment of bronzes affect the final properties obtained in these materials. This study gives an example of the construction of a knowledge base with application of rough set theory. Using relevant inference mechanisms, knowledge stored in the rule-based database allows the selection of appropriate heat treatment parameters to achieve the required properties of bronze. The paper presents the results of exploratory research and the methodology used in the creation of the knowledge base.
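
    The rough-set machinery behind such a knowledge base rests on lower and upper approximations over indiscernibility classes. A minimal sketch, using toy attribute data rather than the paper's bronze measurements:

```python
from collections import defaultdict

def approximations(objects, attrs, target):
    """Rough-set lower/upper approximation of `target` (a set of object ids).
    Objects with identical values on `attrs` are indiscernible and fall into
    the same equivalence class."""
    classes = defaultdict(set)
    for oid, values in objects.items():
        classes[tuple(values[a] for a in attrs)].add(oid)
    lower, upper = set(), set()
    for eq in classes.values():
        if eq <= target:      # class certainly inside the target concept
            lower |= eq
        if eq & target:       # class possibly inside the target concept
            upper |= eq
    return lower, upper
```

    Rules derived from the lower approximation are certain; those from the boundary (upper minus lower) are only possible, which is how rough sets express uncertainty in the stored heat-treatment knowledge.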

  4. Belief rule base inference methodology for two-sided matching decision with multi-attribute (基于置信规则库推理的多属性双边匹配决策方法)

    Institute of Scientific and Technical Information of China (English)

    方志坚; 杨隆浩; 傅仰耿; 陈建华

    2016-01-01

    This thesis presents a tentative study of a new two-sided matching approach, proposed to solve the two-sided matching problem with uncertain information and multiple attributes. The multi-attribute matching decision making (MAMDM) problem is one of the key issues in two-sided matching research and has attracted great attention from scholars in recent years. A belief rule-base inference methodology using the evidential reasoning approach (RIMER) is introduced in this thesis to solve the MAMDM problem. The authors first explain why they choose belief degrees: current research on MAMDM is mainly restricted to two-sided matching whose evaluation information consists of linguistic values or interval values, and the use of belief degrees as evaluation values has been lacking. As belief degrees can be used to deal with different kinds of uncertain and incomplete information, using them as evaluation values may trigger a new breakthrough in the study of MAMDM. Through the analysis of simulation experiment data and the application of RIMER, belief-degree evaluation information is converted into confidence information at different levels. A 0-1 programming model is then built from this confidence information to obtain a final matching scheme. The thesis also points out that an output error may be caused when the BRB (belief rule-base) input is higher than a threshold value. To solve this problem, the authors propose that the input value can be incorporated into the uncertainty by adopting a cutting method; if the cutting method is not suitable, a linear mapping method can be applied to reduce the influence on the results. The case study analysis shows that the proposed approach is feasible and effective for solving the multi-attribute matching decision-making problem.
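
    The final matching step described above, a 0-1 programming model over confidence-derived scores, can be approximated for small instances by exhaustive search. The score matrix below is hypothetical, standing in for the aggregated belief-derived utilities:

```python
from itertools import permutations

def best_matching(score):
    """Exhaustive 0-1 matching: score[i][j] is the aggregated utility of
    pairing left agent i with right agent j; returns the one-to-one
    assignment maximizing total utility. A toy stand-in for solving the
    0-1 programming model exactly."""
    n = len(score)
    best, best_val = None, float("-inf")
    for perm in permutations(range(n)):
        val = sum(score[i][perm[i]] for i in range(n))
        if val > best_val:
            best, best_val = perm, val
    return list(best), best_val
```

    For realistic problem sizes the 0-1 model would be handed to an integer-programming solver; the brute force above only demonstrates the objective being optimized.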

  5. Methods for testing of geometrical down-scaled rotor blades

    DEFF Research Database (Denmark)

    Branner, Kim; Berring, Peter

    as requirements for experimental facilities are very demanding and furthermore the time for performing the experimental test campaign and the cost are not well suitable for most research projects. This report deals with the advantages, disadvantages and open questions of using down-scaled testing on wind turbine...

  6. Rule-Based Semantic Sensing

    CERN Document Server

    Woznowski, Przemyslaw

    2011-01-01

    Rule-Based Systems have been in use for decades to solve a variety of problems but not in the sensor informatics domain. Rules aid the aggregation of low-level sensor readings to form a more complete picture of the real world and help to address 10 identified challenges for sensor network middleware. This paper presents the reader with an overview of a system architecture and a pilot application to demonstrate the usefulness of a system integrating rules with sensor middleware.

  7. A Belief Rule-Based Expert System to Diagnose Influenza

    DEFF Research Database (Denmark)

    Hossain, Mohammad Shahadat; Khalid, Md. Saifuddin; Akter, Shamima

    2014-01-01

    ). The RIMER approach can handle different types of uncertainties, both in knowledge representation and in inference procedures. The knowledge base of this system was constructed using real patient records together with consultation with influenza specialists in Bangladesh. Practical case......, development and application of an expert system to diagnose influenza under uncertainty. The recently developed generic belief rule-based inference methodology using the evidential reasoning (RIMER) approach is employed to develop this expert system, termed the Belief Rule Based Expert System (BRBES...... studies were used to validate the BRBES. The system-generated results are more effective and reliable than those of a manual system in terms of accuracy....

  8. A Constructivist Approach to Rule Bases

    NARCIS (Netherlands)

    Sileno, G.; Boer, A.; van Engers, T.; Loiseau, S.; Filipe, J.; Duval, B.; van den Herik, J.

    2015-01-01

    The paper presents a set of algorithms for the conversion of rule bases between priority-based and constraint-based representations. Inspired by research in precedential reasoning in law, such algorithms can be used for the analysis of a rule base, and for the study of the impact of the introduction

  9. Genetic Programming for the Generation of Crisp and Fuzzy Rule Bases in Classification and Diagnosis of Medical Data

    DEFF Research Database (Denmark)

    Dounias, George; Tsakonas, Athanasios; Jantzen, Jan

    2002-01-01

    This paper demonstrates two methodologies for the construction of rule-based systems in medical decision making. The first approach consists of a method combining genetic programming and heuristic hierarchical rule-base construction. The second model is composed of a strongly-typed genetic progra...

  10. Rule-based Modelling and Tunable Resolution

    Directory of Open Access Journals (Sweden)

    Russ Harmer

    2009-11-01

    Full Text Available We investigate the use of an extension of rule-based modelling for cellular signalling to create a structured space of model variants. This enables the incremental development of rule sets that start from simple mechanisms and which, by a gradual increase in agent and rule resolution, evolve into more detailed descriptions.

  11. Rule-based Modelling and Tunable Resolution

    CERN Document Server

    Harmer, Russ

    2009-01-01

    We investigate the use of an extension of rule-based modelling for cellular signalling to create a structured space of model variants. This enables the incremental development of rule sets that start from simple mechanisms and which, by a gradual increase in agent and rule resolution, evolve into more detailed descriptions.

  12. Evolving Rule-Based Systems in two Medical Domains using Genetic Programming

    DEFF Research Database (Denmark)

    Tsakonas, A.; Dounias, G.; Jantzen, Jan

    2004-01-01

    We demonstrate, compare and discuss the application of two genetic programming methodologies for the construction of rule-based systems in two medical domains: the diagnosis of Aphasia's subtypes and the classification of Pap-Smear Test examinations. The first approach consists of a scheme...... that combines genetic programming and heuristic hierarchical crisp rule-base construction. The second model is composed of a grammar-driven genetic programming system for the generation of fuzzy rule-based systems. Results are also compared for their efficiency, accuracy and comprehensibility to those...... of a standard entropy-based machine learning approach and to those of a standard genetic programming symbolic expression approach. In the diagnosis of subtypes of Aphasia, two models for crisp rule-bases are presented. The first one discriminates between four major types and the second attempts...

  13. Rule-Based Optimization of Reversible Circuits

    CERN Document Server

    Arabzadeh, Mona; Zamani, Morteza Saheb

    2010-01-01

    Reversible logic has applications in various research areas including low-power design and quantum computation. In this paper, a rule-based optimization approach for reversible circuits is proposed which uses both negative and positive control Toffoli gates during the optimization. To this end, a set of rules for removing NOT gates and optimizing sub-circuits with common-target gates is proposed. To evaluate the proposed approach, the best-reported synthesized circuits and the results of a recent synthesis algorithm which uses both negative and positive controls are used. Our experiments reveal the potential of the proposed approach in optimizing synthesized circuits.
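
    One of the simplest rewriting rules of this kind, cancellation of adjacent NOT gates on the same wire, can be sketched as follows. The gate encoding is a toy assumption, not the paper's implementation:

```python
def cancel_not_pairs(gates):
    """Apply one rewriting rule over a gate list: two textually adjacent NOT
    gates on the same wire compose to the identity and are removed. Gates are
    encoded as ('NOT', wire) or ('TOF', controls, target)."""
    out = []
    for g in gates:
        if out and g[0] == "NOT" and out[-1] == g:
            out.pop()          # NOT; NOT == identity, drop the pair
        else:
            out.append(g)
    return out
```

    A full optimizer would also commute gates past each other to expose such pairs and handle the negative/positive-control rules; this sketch shows only the local cancellation step.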

  14. Rule-Based Network Service Provisioning

    Directory of Open Access Journals (Sweden)

    Rudy Deca

    2012-10-01

    Full Text Available Due to the unprecedented development of networks, manual network service provisioning is becoming increasingly risky, error-prone, expensive, and time-consuming. To solve this problem, rule-based methods can provide adequate leverage for automating various network management tasks. This paper presents a rule-based solution for automated network service provisioning. The proposed approach captures configuration data interdependencies using high-level, service-specific, user-configurable rules. We focus on the service validation task, which is illustrated by means of a case study. Based on numerical results, we analyse the influence of the network-level complexity factors and rule descriptive features on the rule efficiency. This analysis shows the operators how to increase rule efficiency while keeping the rules simple and the rule set compact. We present a technique that allows operators to increase the error coverage, and we show that high error coverage scales well when the complexity of networks and services increases. We reassess the correlation function between specific rule efficiency and rule complexity metrics found in previous work, and show that this correlation function holds for various sizes, types, and complexities of networks and services.

  15. Fuzzification of ASAT's rule based aimpoint selection

    Science.gov (United States)

    Weight, Thomas H.

    1993-06-01

    The aimpoint algorithms being developed at Dr. Weight and Associates are based on the concept of fuzzy logic. This approach does not require a particular type of sensor data or algorithm type, but allows the user to develop a fuzzy logic algorithm based on existing aimpoint algorithms and models. This provides an opportunity for the user to upgrade an existing system design to achieve higher performance at minimal cost. Many projects have aimpoint algorithms which are based on 'crisp' logic rule based algorithms. These algorithms are sensitive to glint, corner reflectors, or intermittent thruster firings, and to uncertainties in the a priori estimates of angle of attack. If these projects are continued through to a demonstration involving a launch to hit a target, it is quite possible that the crisp logic approaches will need to be upgraded to handle these important error sources.

  16. Rule Based Shallow Parser for Arabic Language

    Directory of Open Access Journals (Sweden)

    Mona A. Mohammed

    2011-01-01

    Full Text Available Problem statement: Shallow syntactic parsing is a language processing approach that computes a basic analysis of sentence structure rather than attempting full syntactic analysis. It identifies the constituents of a sentence (noun groups, verb groups, prepositional groups) but does not specify their internal structure or their role in the main sentence. The only technique previously used for Arabic shallow parsing is a Support Vector Machine (SVM)-based approach. The problem faced by shallow parser developers is boundary identification, which must be solved to ensure high system accuracy. Approach: The specific objective of the research was to identify the boundaries of all Noun Phrases (NPs), Verb Phrases (VPs) and Prepositional Phrases (PPs) in Arabic. Based on a critical analysis of Arabic sentence architecture, this study discussed various idiosyncrasies of Arabic sentences to derive more accurate rules for detecting the start and end boundaries of each clause in an Arabic sentence. New rules were proposed for the shallow parser features, up to the generation of two levels of the full parse tree. We describe an implementation and evaluation of the rule-based shallow parser that handles chunking of Arabic sentences. Results: The system was tested manually on 70 Arabic sentences comprising 1,776 words, with sentence lengths between 4 and 50 words. The result obtained was significantly better than state-of-the-art published Arabic results, achieving an F-score of 97%. Conclusion: The main achievement is the development of an Arabic shallow parser based on rule-based approaches. Chunking, which constitutes the main contribution, is achieved in two successive stages that include grouped sequences of

  17. A Rule-Based System Implementing a Method for Translating FOL Formulas into NL Sentences

    Science.gov (United States)

    Mpagouli, Aikaterini; Hatzilygeroudis, Ioannis

    In this paper, we present the implementation of a system that translates first order logic (FOL) formulas into natural language (NL) sentences. The motivation comes from an intelligent tutoring system teaching logic as a knowledge representation language, where the translation is used as a means of feedback to the student users. FOL to NL conversion is achieved using a rule-based approach that exploits the pattern matching capabilities of rules, so the system consists of rule-based modules corresponding to the phases of our translation methodology. Facts are used in a lexicon providing the lexical and grammatical information that helps in producing the NL sentences. The whole system is implemented in Jess, a Java-implemented rule-based programming tool. Experimental results confirm the success of our choices.
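
    The pattern-matching flavor of FOL-to-NL rules can be illustrated outside Jess with regex-based rules. The two patterns and templates below are assumptions covering only toy formula shapes, not the system's actual rule set:

```python
import re

# Each rule pairs a regex over a FOL string with an NL template.
RULES = [
    (re.compile(r"forall x \((\w+)\(x\) -> (\w+)\(x\)\)"),
     "every {0} is a {1}"),
    (re.compile(r"exists x \((\w+)\(x\) & (\w+)\(x\)\)"),
     "some {0} is {1}"),
]

def fol_to_nl(formula):
    """Return the NL rendering of the first rule whose pattern matches,
    or None when no rule applies."""
    for pattern, template in RULES:
        m = pattern.fullmatch(formula)
        if m:
            return template.format(*(g.lower() for g in m.groups()))
    return None
```

    A real system also needs a lexicon (as the paper describes) to inflect predicates correctly; here the templates simply lowercase the predicate names.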

  18. Evolving Rule-Based Systems in two Medical Domains using Genetic Programming

    DEFF Research Database (Denmark)

    Tsakonas, A.; Dounias, G.; Jantzen, Jan;

    2004-01-01

    We demonstrate, compare and discuss the application of two genetic programming methodologies for the construction of rule-based systems in two medical domains: the diagnosis of Aphasia's subtypes and the classification of Pap-Smear Test examinations. The first approach consists of a scheme...... the classification between all common types. A third model consisting of a GP-generated fuzzy rule-based system is tested on the same field. In the classification of Pap-Smear Test examinations, a crisp rule-based system is constructed. Results denote the effectiveness of the proposed systems. Comments...... and comparisons are made between the proposed methods and previous attempts on the selected fields of application....

  19. Genetic Programming for the Generation of Crisp and Fuzzy Rule Bases in Classification and Diagnosis of Medical Data

    DEFF Research Database (Denmark)

    Dounias, George; Tsakonas, Athanasios; Jantzen, Jan;

    2002-01-01

    This paper demonstrates two methodologies for the construction of rule-based systems in medical decision making. The first approach consists of a method combining genetic programming and heuristic hierarchical rule-base construction. The second model is composed of a strongly-typed genetic progra...... systems. Comparisons of the system's comprehensibility and transparency are included. For the Aphasia domain, these comparisons include previous work consisting of two neural network models....

  20. A C++ Class for Rule-Base Objects

    Directory of Open Access Journals (Sweden)

    William J. Grenney

    1992-01-01

    Full Text Available A C++ class, called Tripod, was created as a tool to assist with the development of rule-base decision support systems. The Tripod class contains data structures for the rule-base and member functions for operating on the data. The rule-base is defined by three ASCII files. These files are translated by a preprocessor into a single file that is loaded when a rule-base object is instantiated. The Tripod class was tested as part of a prototype decision support system (DSS) for winter highway maintenance in the Intermountain West. The DSS is composed of two principal modules: the main program, called the wrapper, and a Tripod rule-base object. The wrapper is a procedural module that interfaces with remote sensors and an external meteorological database. The rule-base contains the logic for advising an inexperienced user and for assisting with the decision making process.

  1. Fuzzy Rule Base System for Software Classification

    Directory of Open Access Journals (Sweden)

    Adnan Shaout

    2013-07-01

    Full Text Available Given the central role that software development plays in the delivery and application of information technology, managers have been focusing on process improvement in the software development area. This improvement has increased the demand for software measures, or metrics, to manage the process. These metrics provide a quantitative basis for the development and validation of models during the software development process. In this paper a fuzzy rule-based system is developed to classify Java applications using object-oriented (OO) metrics. The system contains the following features: an automated method to extract the OO metrics from the source code; a default/base set of rules that can be easily configured via an XML file, so companies, developers, team leaders, etc. can modify the set of rules according to their needs; a framework so that new metrics, fuzzy sets and fuzzy rules can be added or removed depending on the needs of the end user; general classification of the software application and fine-grained classification of the Java classes based on OO metrics; and two interfaces for the system: GUI and command line.

  2. Automatic Induction of Rule Based Text Categorization

    Directory of Open Access Journals (Sweden)

    D.Maghesh Kumar

    2010-12-01

    Full Text Available The automated categorization of texts into predefined categories has witnessed a booming interest in the last 10 years, due to the increased availability of documents in digital form and the ensuing need to organize them. In the research community the dominant approach to this problem is based on machine learning techniques: a general inductive process automatically builds a classifier by learning, from a set of preclassified documents, the characteristics of the categories. This paper describes a novel method for the automatic induction of rule-based text classifiers. The method supports a hypothesis language of the form "if T1, …, or Tn occurs in document d, and none of Tn+1, …, Tn+m occurs in d, then classify d under category c", where each Ti is a conjunction of terms. The survey also discusses the main approaches to text categorization that fall within the machine learning paradigm, covering three problems in detail: document representation, classifier construction, and classifier evaluation.
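
    The hypothesis language quoted above maps directly onto a small rule matcher. The rule contents below are invented examples, not induced from any corpus:

```python
def matches(rule, doc_terms):
    """Rule form from the hypothesis language: classify under `category` if
    any positive conjunction of terms is fully present in the document and no
    negative conjunction is. Documents are represented as term sets."""
    positives, negatives, category = rule
    fires = any(set(conj) <= doc_terms for conj in positives)
    blocked = any(set(conj) <= doc_terms for conj in negatives)
    return category if fires and not blocked else None
```

    The induction task the paper addresses is learning which conjunctions to put in the positive and negative lists; this sketch shows only how a learned rule is applied.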

  3. A high-speed mixed-signal down-scaling circuit for DAB tuners

    Institute of Scientific and Technical Information of China (English)

    Tang Lu; Wang Zhigong; Xuan Jiahui; Yang Yang; Xu Jian; Xu Yong

    2012-01-01

    A high-speed mixed-signal down-scaling circuit with low power consumption and low phase noise for use in digital audio broadcasting tuners has been realized and characterized. Some new circuit techniques are adopted to improve its performance. A dual-modulus prescaler (DMP) with low phase noise is realized with a kind of improved source-coupled logic (SCL) D-flip-flop (DFF) in the synchronous divider and a kind of improved complementary metal oxide semiconductor master-slave (CMOS MS)-DFF in the asynchronous divider. A new, more accurate wire-load model is used to realize the pulse-swallow counter (PS counter). Fabricated in a 0.18-μm CMOS process, the total chip size is 0.6 × 0.2 mm². The DMP in the proposed down-scaling circuit exhibits a low phase noise of -118.2 dBc/Hz at 10 kHz off the carrier frequency. At a supply voltage of 1.8 V, the power consumption of the down-scaling circuit's core part is only 2.7 mW.

  4. A high-speed mixed-signal down-scaling circuit for DAB tuners

    Science.gov (United States)

    Lu, Tang; Zhigong, Wang; Jiahui, Xuan; Yang, Yang; Jian, Xu; Yong, Xu

    2012-07-01

    A high-speed mixed-signal down-scaling circuit with low power consumption and low phase noise for use in digital audio broadcasting tuners has been realized and characterized. Some new circuit techniques are adopted to improve its performance. A dual-modulus prescaler (DMP) with low phase noise is realized with a kind of improved source-coupled logic (SCL) D-flip-flop (DFF) in the synchronous divider and a kind of improved complementary metal oxide semiconductor master-slave (CMOS MS)-DFF in the asynchronous divider. A new, more accurate wire-load model is used to realize the pulse-swallow counter (PS counter). Fabricated in a 0.18-μm CMOS process, the total chip size is 0.6 × 0.2 mm². The DMP in the proposed down-scaling circuit exhibits a low phase noise of -118.2 dBc/Hz at 10 kHz off the carrier frequency. At a supply voltage of 1.8 V, the power consumption of the down-scaling circuit's core part is only 2.7 mW.

  5. Rule based systems for big data a machine learning approach

    CERN Document Server

    Liu, Han; Cocea, Mihaela

    2016-01-01

    The ideas introduced in this book explore the relationships among rule based systems, machine learning and big data. Rule based systems are seen as a special type of expert systems, which can be built by using expert knowledge or learning from real data. The book focuses on the development and evaluation of rule based systems in terms of accuracy, efficiency and interpretability. In particular, a unified framework for building rule based systems, which consists of the operations of rule generation, rule simplification and rule representation, is presented. Each of these operations is detailed using specific methods or techniques. In addition, this book also presents some ensemble learning frameworks for building ensemble rule based systems.

  6. The Algorithm for Rule-base Refinement on Fuzzy Set

    Institute of Scientific and Technical Information of China (English)

    LI Feng; WU Cui-hong; DING Xiang-wu

    2006-01-01

    In the course of running an artificial intelligence system, many redundant rules are often produced. Refining the knowledge base, i.e., removing the redundant rules, can accelerate reasoning and shrink the rule base. The purpose of the paper is to present thinking on this topic and to design an algorithm that removes redundant rules from the rule base. The "abstraction" of "state variables", redundant rules, and the least rule base are discussed in the paper, and an algorithm for refining the knowledge base is presented.
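
    One simple notion of redundancy, a rule whose conditions strictly contain those of another rule with the same conclusion, can be eliminated as follows. This is a generic sketch of the idea, not the paper's algorithm:

```python
def remove_redundant(rules):
    """Drop any rule whose condition set strictly contains another rule's
    conditions with the same conclusion: the more general rule already fires
    whenever the specific one would. Exact duplicates keep their first copy.
    Rules are (condition_list, conclusion) pairs."""
    kept = []
    for cond, concl in rules:
        dominated = any(
            k_concl == concl and set(k_cond) <= set(cond)
            for k_cond, k_concl in kept
        )
        if not dominated:
            # A newly kept general rule may in turn subsume earlier rules.
            kept = [(c, cc) for c, cc in kept
                    if not (cc == concl and set(cond) < set(c))]
            kept.append((cond, concl))
    return kept
```

    With rules treated as condition sets, the surviving list is a minimal cover in the subsumption sense, which is the flavor of "least rule base" the abstract mentions.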

  7. Reduced rule base self-tuning fuzzy PI controller for TCSC

    Energy Technology Data Exchange (ETDEWEB)

    Hameed, Salman; Das, Biswarup; Pant, Vinay [Department of Electrical Engineering, Indian Institute of Technology, Roorkee, Roorkee - 247 667, Uttarakhand (India)

    2010-11-15

    In this paper, a reduced rule base self-tuning fuzzy PI controller (STFPIC) for thyristor controlled series capacitor (TCSC) is proposed. Essentially, a STFPIC consists of two fuzzy logic controllers (FLC). In this work, for each FLC, 49 rules have been used and as a result, the overall complexity of the STFPIC increases substantially. To reduce this complexity, application of singular value decomposition (SVD) based rule reduction technique is also proposed in this paper. By applying this methodology, the number of rules in each FLC has been reduced from 49 to 9. Therefore, the proposed rule base reduction technique reduces the total number of rules in the STFPIC by almost 80% (from 49 x 2 = 98 to 9 x 2 = 18), thereby reducing the complexity of the STFPIC significantly. The feasibility of the proposed algorithm has been tested on 2-area 4-machine power system and 10-machine 39-bus system through detailed digital simulation using MATLAB/SIMULINK. (author)
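
    The SVD-based reduction idea, approximating a rule-consequent table by a low-rank factorization, can be sketched with NumPy. The table below is a toy, not the paper's 49-rule FLC:

```python
import numpy as np

def reduce_rule_table(F, r):
    """SVD-based rule-reduction sketch: F is a rule-consequent table whose
    rows/columns are indexed by the membership functions of the two FLC
    inputs. Keeping only the r largest singular values yields a rank-r
    approximation, the core idea behind shrinking a 7x7 (49-rule) table
    toward a 3x3 (9-rule) one."""
    U, s, Vt = np.linalg.svd(F, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
```

    In the full method the truncated factors are also converted back into valid membership functions; this sketch shows only the low-rank approximation step.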

  8. Direct Down-scale Experiments of Concentration Column Designs for SHINE Process

    Energy Technology Data Exchange (ETDEWEB)

    Youker, Amanda J. [Argonne National Lab. (ANL), Argonne, IL (United States); Stepinski, Dominique C. [Argonne National Lab. (ANL), Argonne, IL (United States); Vandegrift, George F. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-05-01

    Argonne is assisting SHINE Medical Technologies in their efforts to become a domestic Mo-99 producer. The SHINE accelerator-driven process uses a uranyl-sulfate target solution for the production of fission-product Mo-99. Argonne has developed a molybdenum recovery and purification process for this target solution. The process includes an initial Mo recovery column followed by a concentration column to reduce the product volume from 15-25 L to < 1 L prior to entry into the LEU Modified Cintichem (LMC) process for purification. This report discusses direct down-scale experiments of the plant-scale concentration column design, where the effects of loading velocity and temperature were investigated.

  9. Design High Efficiency-Minimum Rule Base PID Like Fuzzy Computed Torque Controller

    Directory of Open Access Journals (Sweden)

    Alireza Khalilian

    2014-06-01

    Full Text Available The minimum rule base Proportional Integral Derivative (PID) Fuzzy Computed Torque Controller is presented in this research. The popularity of the PID fuzzy computed torque controller can be attributed partly to its robust performance over a wide range of operating conditions and partly to its functional simplicity. Its tuning can be formulated as an optimization task, and over the years the use of intelligent strategies for tuning such controllers has been growing. The PID methodology has three inputs; if each input is described by seven linguistic values and each rule has three conditions, 7^3 = 343 rules are needed, which is far too many to write by hand. In this research the PID-like fuzzy controller is therefore constructed as a parallel structure of a PD-like fuzzy controller and a PI controller, giving a minimum rule base. The computed torque controller works by cancelling the coupling and nonlinear terms of the dynamics of each link; since it is based on the manipulator dynamic model, it is highly sensitive to knowledge of all parameters of the nonlinear robot manipulator's dynamic equation. This research reduces or eliminates that problem by using minimum-rule-base fuzzy logic theory to control a flexible robot manipulator system, with the quality of the process control tested in the MATLAB/SIMULINK simulation environment.
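
    The rule-count arithmetic behind the minimum-rule-base argument is easy to check: a complete fuzzy rule base has one rule per combination of linguistic values over its inputs.

```python
def rule_count(inputs, linguistic_values):
    """Size of a complete fuzzy rule base: one rule per combination of
    linguistic values over all inputs."""
    return linguistic_values ** inputs

# Full PID-like FLC: three inputs (error, its derivative and its integral).
full_pid = rule_count(3, 7)    # 343 rules
# PD-like FLC in the parallel PD + PI structure: only two fuzzy inputs
# (the PI part is a conventional, rule-free controller).
pd_like = rule_count(2, 7)     # 49 rules
```

    Dropping one fuzzy input thus cuts the rule base by a factor of seven, which is exactly the simplification the parallel structure buys.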

  10. A rule-based Afan Oromo Grammar Checker

    Directory of Open Access Journals (Sweden)

    Debela Tesfaye

    2011-08-01

    Full Text Available Natural language processing (NLP) is a subfield of computer science with strong connections to artificial intelligence. One area of NLP is concerned with creating proofing systems, such as grammar checkers. A grammar checker determines the syntactical correctness of a sentence and is mostly used in word processors and compilers. For languages such as Afan Oromo, advanced tools have been lacking and are still in the early stages. In this paper a rule-based grammar checker is presented. The rule base is entirely developed from and dependent on the morphology of the language. The checker has been evaluated and shows promising results.

  11. The evolution of down-scale virus filtration equipment for virus clearance studies.

    Science.gov (United States)

    Wieser, Andreas; Berting, Andreas; Medek, Christian; Poelsler, Gerhard; Kreil, Thomas R

    2015-03-01

    The role of virus filtration in assuring the safety of biopharmaceutical products has gained importance in recent years. This is due to the fundamental advantages of virus filtration, which conceptually can remove all pathogens as long as their size is larger than the biomolecule of commercial interest, while at the same time being neutral to the biological activity of biopharmaceutical compound(s). Major progress has been made in the development of adequate filtration membranes that can remove even smaller viruses, or possibly even all. Establishing down-scaled models for virus clearance studies that are fully equivalent with respect to operating parameters at manufacturing scale is a continuing challenge. This is especially true for virus filtration procedures, where virus clearance studies at small scale determine the operating parameters that can be used at manufacturing scale. This has limited volume-to-filter-area ratios, with significant impact on process economics. An advanced small-scale model of virus filtration, which allows the investigation of the full complexity of these processes, is described here. It includes the automated monitoring and control of all process parameters, as well as an electronic data acquisition system, which is fully compliant with current regulatory requirements for electronic records in a pharmaceutical environment.

  12. ARABIC PERSON NAMES RECOGNITION BY USING A RULE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    Mohammed Aboaoga

    2013-01-01

    Full Text Available Named Entity Recognition is a very important task in many natural language processing applications, such as Machine Translation, Question Answering, Information Extraction, Text Summarization, Semantic Applications and Word Sense Disambiguation. The rule-based approach is one of the techniques used for named entity recognition to identify named entities such as person names, location names and organization names. Recent rule-based methods have been applied to recognize person names in the political domain, ignoring the recognition of other named entity types such as locations and organizations. We have used the rule-based approach for recognizing the named entity type person name for Arabic. We developed four rules for identifying person names depending on the position of the name. We used an in-house Arabic corpus collected from newspaper archives. An evaluation method that compares the results of the system with manually annotated text was applied in order to compute precision, recall and f-measure. In the experiments of this study, the average f-measures for recognizing person names are 92.66, 92.04 and 90.43% in the sport, economic and political domains, respectively. The experimental results showed that our rule-based method achieved the highest f-measure values in the sport domain compared with the political and economic domains.
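    Position-based person-name rules of this kind can be sketched as regular expressions anchored on trigger words. The triggers and examples below are English stand-ins for illustration, not the Arabic rules from the paper:

    ```python
    import re

    # Illustrative position-based rules (English triggers stand in for the
    # Arabic trigger words used in the paper's actual rule base).
    TITLE_TRIGGERS = r"(?:Mr\.|Dr\.|President|Minister)"
    VERB_TRIGGERS = r"(?:said|stated|announced)"

    RULES = [
        # Rule 1: a name follows a person title, e.g. "President <Name>".
        re.compile(TITLE_TRIGGERS + r"\s+([A-Z]\w+(?:\s[A-Z]\w+)?)"),
        # Rule 2: a name precedes a reporting verb, e.g. "<Name> said".
        re.compile(r"([A-Z]\w+(?:\s[A-Z]\w+)?)\s+" + VERB_TRIGGERS),
    ]

    def extract_person_names(text):
        """Apply each positional rule and collect the captured names."""
        names = set()
        for rule in RULES:
            names.update(m.group(1) for m in rule.finditer(text))
        return names
    ```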

  13. Rule-based control of off-grid electrolysis

    DEFF Research Database (Denmark)

    Serna, A.; Tadeo, F.; Normey-Rico, J. E.

    2016-01-01

    This work deals with a rule-based control strategy to produce hydrogen from wind and wave energy on an offshore platform. These renewable energies feed a set of alkaline electrolyzers that produce H2. The proposed control system allows regulating the operation of the electrolyzers, taking into a...
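    A rule-based dispatch strategy for a bank of electrolyzers might look like the following sketch. The unit count, rated power and minimum-load threshold are invented for illustration, not the paper's actual settings:

    ```python
    def dispatch(power_kw, n_units=4, rated_kw=100.0, min_load=0.25):
        """Rule-based dispatch: run as many electrolyzers at rated power as
        the available renewable power allows, and put one unit in part load
        for the remainder if it stays above the minimum safe load.
        All thresholds here are illustrative assumptions."""
        full = min(int(power_kw // rated_kw), n_units)
        remainder = power_kw - full * rated_kw
        if full < n_units and remainder >= min_load * rated_kw:
            part = remainder / rated_kw       # one unit in part load
        else:
            part = 0.0                        # remainder too small: rule rejects it
        return full, round(part, 2)
    ```

    For example, 250 kW of available power runs two units at rated power and a third at 50% load.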

  14. A Rule-Based System for Test Quality Improvement

    Science.gov (United States)

    Costagliola, Gennaro; Fuccella, Vittorio

    2009-01-01

    To correctly evaluate learners' knowledge, it is important to administer tests composed of good-quality question items. By the term "quality" we mean the potential of an item to effectively discriminate between skilled and untrained students and to attain the tutor's desired difficulty level. This article presents a rule-based e-testing system…

  15. Rule based fuzzy logic approach for classification of fibromyalgia syndrome.

    Science.gov (United States)

    Arslan, Evren; Yildiz, Sedat; Albayrak, Yalcin; Koklukaya, Etem

    2016-06-01

    Fibromyalgia syndrome (FMS) is a chronic muscle and skeletal system disease observed mostly in women, manifesting itself as widespread pain and impairing the individual's quality of life. FMS diagnosis is made based on the American College of Rheumatology (ACR) criteria. Recently, however, the applicability and sufficiency of the ACR criteria have come under debate, and several evaluation methods, including clinical evaluation methods, have been proposed by researchers. Accordingly, ACR updated its criteria in 1990, 2010 and 2011. The proposed rule-based fuzzy logic method aims to evaluate FMS from a different angle as well. It contains a rule base derived from the 1990 ACR criteria and the individual experiences of specialists. The study was conducted using data collected from 60 inpatients and 30 healthy volunteers. Several tests and physical examinations were administered to the participants. The fuzzy logic rule base was structured using the parameters of tender point count, chronic widespread pain period, pain severity, fatigue severity and sleep disturbance level, which were deemed important in FMS diagnosis. The fuzzy predictor was generally 95.56% consistent with at least one of the specialists who was not a creator of the fuzzy rule base. Thus, in diagnosis classification, where the severity of FMS was classified as well, consistent findings were obtained by comparing the interpretations and experiences of specialists with the fuzzy logic approach. The study proposes a rule base that could eliminate the shortcomings of the 1990 ACR criteria during the FMS evaluation process. Furthermore, the proposed method presents a classification of the severity of the disease, which was not available with the ACR criteria. The study was not limited to disease classification; the probability of occurrence and the severity were classified at the same time. In addition, those who were not suffering from FMS were
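    A miniature version of such a rule base can be written as min/max (Mamdani-style) inference over fuzzy memberships. The membership functions and rules below are illustrative assumptions, not the clinically derived ones used in the study:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fms_severity(tender_points, pain_0_10):
        """Tiny Mamdani-style rule base over two of the paper's five inputs.
        Returns firing strengths for mild / moderate / severe (illustrative
        membership parameters, invented for this sketch)."""
        tp_low, tp_high = tri(tender_points, -1, 5, 11), tri(tender_points, 7, 13, 19)
        pain_low, pain_high = tri(pain_0_10, -1, 2, 6), tri(pain_0_10, 4, 8, 12)
        return {
            "mild": min(tp_low, pain_low),              # R1: low count AND low pain
            "moderate": max(min(tp_low, pain_high),     # R2: mixed evidence
                            min(tp_high, pain_low)),
            "severe": min(tp_high, pain_high),          # R3: high count AND high pain
        }
    ```

    A patient with 14 tender points and pain 8/10 fires the "severe" rule most strongly, mirroring how the full five-input rule base grades severity.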

  16. Regional and urban down scaling of global climate scenarios for health impact assessments

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J. L.; Perez, L.; Gonzalez, R. M.; Pecci, J.; Garzon, A.; Palacios, M.

    2015-07-01

    In this contribution we have used global climate RCP IPCC scenarios to produce climate and air pollution maps at regional scale (25 km resolution) and at urban scale (200 m spatial resolution) over Europe and five European cities, in order to investigate the impact on meteorological variables and pollutant concentrations. We have used the well-known mesoscale meteorological model WRF-Chem (NOAA, US). We used 2011 as the control (past) year and two RCP scenarios from the CCSM global climate model, with 4.5 W/m2 and 8.5 W/m2, for the years 2030, 2050 and 2100. After running the WRF-Chem model, using the boundary conditions provided by the RCP scenarios with the emissions of 2011, we performed a detailed down-scaling process using the CALMET diagnostic model to obtain full 200 m spatial resolution maps of five European cities (London, Antwerp, Madrid, Milan, and Helsinki). We show the results and the health impacts for future RCP IPCC climate scenarios in comparison with the 2011 control-year information for climate and health indicators. Finally, we have also investigated the impact of aerosol effects on the mean short-wave radiation. Two simulations with the WRF-Chem model were performed over Europe in 2010: a baseline simulation without any feedback effects, and a second simulation including the direct effects on the solar radiation reaching the surface as well as the indirect aerosol effect, with potential impacts on increasing or decreasing precipitation rates. Aerosol effects produce an increase of incoming radiation over the Atlantic Ocean (up to 70%) because the prescribed aerosol concentration in WRF-Chem without feedbacks is substantially higher than the aerosol concentration produced when the feedback effects are activated. The decrease in solar radiation in the Sahara area (10%) occurs because the prescribed aerosol concentration in the no-feedback simulation is lower than when the feedback effects are activated. (Author)

  17. Rule-bases construction through self-learning for a table-based Sugeno-Takagi fuzzy logic control system

    Directory of Open Access Journals (Sweden)

    C. Boldisor

    2009-12-01

    Full Text Available A self-learning based methodology for building the rule base of a fuzzy logic controller (FLC) is presented and verified, aiming to add intelligent characteristics to fuzzy logic control systems. The methodology is a simplified version of those presented in today's literature; some aspects are intentionally ignored, since they rarely appear in control system engineering, and a SISO process is considered here. The fuzzy inference system obtained is of the table-based Sugeno-Takagi type. The system's desired performance is defined by a reference model, and rules are extracted from recorded data after the correct control actions are learned. The presented algorithm is tested by constructing the rule base of a fuzzy controller for a DC drive application. The system's performance and the method's viability are analyzed.
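    The rule-extraction step, learning a table-based rule base from recorded data, can be sketched as follows. Here the fuzzy partitions are simplified to crisp bins (a zero-order Sugeno table), and the bin edges are invented for illustration:

    ```python
    import bisect
    from collections import defaultdict

    def learn_rule_table(samples, bins=(-1.0, -0.3, 0.3, 1.0)):
        """Build a table-based rule base from recorded (error, delta_error,
        control) triples: each partition cell keeps the mean control action
        observed there. Bin edges are illustrative assumptions."""
        acc = defaultdict(list)
        for e, de, u in samples:
            # Map the (error, delta_error) pair to a cell of the rule table.
            cell = (bisect.bisect(bins, e), bisect.bisect(bins, de))
            acc[cell].append(u)
        # Consequent of each rule = mean recorded control action in its cell.
        return {cell: sum(us) / len(us) for cell, us in acc.items()}
    ```

    The learned table is then consulted at run time: the current (error, delta-error) pair selects a cell, whose stored consequent is the control output.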

  18. GRAMMAR RULE BASED INFORMATION RETRIEVAL MODEL FOR BIG DATA

    Directory of Open Access Journals (Sweden)

    T. Nadana Ravishankar

    2015-07-01

    Full Text Available Though Information Retrieval (IR) in big data has been an active field of research for the past few years, the popularity of native languages presents a unique challenge in big data information retrieval systems. There is a need to retrieve information that is present in English and display it in the native language for users. This aim of cross-language information retrieval is complicated by unique features of the native languages, such as morphology, compound word formation, word spelling variations, ambiguity, synonymy and the influence of other languages. To overcome some of these issues, the native language is modeled using a grammar-rule-based approach in this work. The advantage of this approach is that the native language is modeled and its unique features are encoded using a set of inference rules. This rule base, coupled with a customized ontological system, shows considerable potential and achieves better precision and recall.

  19. Hybrid mathematical and rule-based system for transmission network planning in open access schemes

    Energy Technology Data Exchange (ETDEWEB)

    Kandil, M. S. [Electrical Department, Mansura University, (Egypt); EI-Debeiky, S. M. [Electrical Department, Ain Shams University, (Egypt); Hasanien, N. E. [Egyptian Electricity Authority, Studies and Researches Department, (Egypt)

    2001-09-01

    The paper presents a planning methodology using an application of a mathematical and a rule-based expert system (ES) to expand the transmission network in open access schemes. In this methodology, the ES suggests a realistic set of generation additions with proper economic signals to the participants, before proceeding with the transmission expansion. A feasible list of transmission alternatives is then assumed to accommodate the proposals for generation. A mathematical method is performed based on marginal cost allocation to optimise the location for the new generation and its transmission expansion scheme simultaneously for each alternative. The optimum alternative, which minimises the overall system's cost function and satisfies the future demand under different operating conditions, is obtained. The ES interacts with the power system planning tools to produce the optimum expansion plan. A practical application is given to demonstrate the effectiveness of the developed prototype system. (Author)

  20. A New Approach To The Rule-Based Systems Design And Implementation Process

    Directory of Open Access Journals (Sweden)

    Grzegorz J. Nalepa

    2004-01-01

    Full Text Available The paper discusses selected problems encountered in practical rule-based systems (RBS) design and implementation. To solve them, XTT, a new visual knowledge representation, is introduced. Then a complete, integrated RBS design, implementation and analysis methodology is presented. This methodology is supported by a visual CASE tool called Mirella. The main goal is to move the design procedure to a more abstract, logical level, where knowledge specification is based on the use of an abstract rule representation. The design specification is automatically translated into Prolog code, so the designer can focus on the logical specification of safety and reliability. On the other hand, the system's formal aspects are automatically verified on-line during the design, so that its verifiable characteristics are preserved.

  2. Designing Fuzzy Rule Based Expert System for Cyber Security

    OpenAIRE

    Goztepe, Kerim

    2016-01-01

    The state of cyber security has begun to attract more attention and interest outside the community of computer security experts. Cyber security is not a single problem, but rather a group of highly different problems involving different sets of threats. A fuzzy rule-based system for cyber security consists of a rule depository and a mechanism for accessing and running the rules. The depository is usually constructed with a collection of related rule sets. The aim of this study is to...

  3. An Embedded Rule-Based Diagnostic Expert System in Ada

    Science.gov (United States)

    Jones, Robert E.; Liberman, Eugene M.

    1992-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have assumed a growing role in providing human-like reasoning capability for computer systems. The integration of expert system technology with the Ada programming language is discussed, in particular a rule-based expert system using the ART-Ada (Automated Reasoning Tool for Ada) system shell. NASA Lewis was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components (the rule-based expert system, a graphical user interface, and communications software) make up SMART-Ada (Systems fault Management with ART-Ada). The rules were written in the ART-Ada development environment and converted to Ada source code. The graphics interface was developed with the Transportable Application Environment (TAE) Plus, which generates Ada source code to control graphics images. SMART-Ada communicates with a remote host to obtain either simulated or real data. The Ada source code generated with ART-Ada, TAE Plus, and the communications code was incorporated into an Ada expert system that reads data from a power distribution test bed, applies the rules to determine whether a fault exists, and graphically displays it on the screen. The main objective, to conduct a beta test of the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  4. Simulation of large-scale rule-based models

    Energy Technology Data Exchange (ETDEWEB)

    Hlavacek, William S [Los Alamos National Laboratory; Monnie, Michael I [Los Alamos National Laboratory; Colvin, Joshua [NON LANL; Faseder, James [NON LANL

    2008-01-01

    Interactions of molecules, such as signaling proteins, with multiple binding sites and/or multiple sites of post-translational covalent modification can be modeled using reaction rules. Rules comprehensively, but implicitly, define the individual chemical species and reactions that molecular interactions can potentially generate. Although rules can be automatically processed to define a biochemical reaction network, the network implied by a set of rules is often too large to generate completely or to simulate using conventional procedures. To address this problem, we present DYNSTOC, a general-purpose tool for simulating rule-based models. DYNSTOC implements a null-event algorithm for simulating chemical reactions in a homogeneous reaction compartment. The simulation method does not require that a reaction network be specified explicitly in advance, but rather takes advantage of the availability of the reaction rules in a rule-based specification of a network to determine whether a randomly selected set of molecular components participates in a reaction during a time step. DYNSTOC reads reaction rules written in the BioNetGen language (BNGL), which is useful for modeling protein-protein interactions involved in signal transduction. The method of DYNSTOC is closely related to that of STOCHSIM. DYNSTOC differs from STOCHSIM by allowing for model specification in terms of BNGL, which extends the range of protein complexes that can be considered in a model. DYNSTOC enables the simulation of rule-based models that cannot be simulated by conventional methods. We demonstrate the ability of DYNSTOC to simulate models accounting for multisite phosphorylation and multivalent binding processes that are characterized by large numbers of reactions. DYNSTOC is free for non-commercial use. The C source code, supporting documentation and example input files are available at .
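    The null-event idea can be illustrated with a toy one-rule model: draw a random molecule each step, fire the rule if it applies, and count steps where nothing matches as null events in which the clock still advances. This is a deliberately minimal sketch, not BNGL or the DYNSTOC algorithm itself:

    ```python
    import random

    def null_event_sim(n_substrates=100, p_react=0.2, steps=2000, seed=1):
        """Minimal null-event simulation in the spirit of STOCHSIM/DYNSTOC
        (illustrative toy model): one kinase rule,
        K + S(unphos) -> K + S(phos), applied to randomly drawn molecules.
        Steps where the drawn molecule does not match the rule, or the rule
        does not fire, are 'null events': state is unchanged, but the clock
        still advances one step. Returns the final phosphorylated count."""
        random.seed(seed)
        phos = [False] * n_substrates
        for _ in range(steps):
            i = random.randrange(n_substrates)      # draw a random substrate
            if not phos[i] and random.random() < p_react:
                phos[i] = True                      # rule fires
            # otherwise: null event
        return sum(phos)
    ```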

  5. RULE-BASED SENTIMENT ANALYSIS OF UKRAINIAN REVIEWS

    Directory of Open Access Journals (Sweden)

    Mariana Romanyshyn

    2013-07-01

    Full Text Available The last decade witnessed a lot of research in the field of sentiment analysis. Understanding the attitudes and emotions that people express in written text has proved to be important and helpful in sociology, political science, psychology, market research, and, of course, artificial intelligence. This paper demonstrates a rule-based approach to clause-level sentiment analysis of reviews in Ukrainian. The general architecture of the implemented sentiment analysis system is presented, the current stage of research is described and further work is explained. The main emphasis is on the design of rules for computing sentiments.
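    A clause-level rule such as negation flipping can be sketched in a few lines. The lexicon and negator list below are English stand-ins for the Ukrainian resources the system actually uses:

    ```python
    import re

    # Toy sentiment lexicon and negators (illustrative English stand-ins).
    LEXICON = {"good": 1, "great": 2, "bad": -1, "awful": -2}
    NEGATORS = {"not", "never"}

    def clause_sentiment(clause):
        """Sum lexicon scores; a negator directly before a sentiment word
        flips its polarity (one of the simplest composition rules)."""
        words = clause.lower().split()
        score = 0
        for i, w in enumerate(words):
            if w in LEXICON:
                s = LEXICON[w]
                if i > 0 and words[i - 1] in NEGATORS:
                    s = -s
                score += s
        return score

    def review_sentiment(text):
        """Clause-level analysis: score each punctuation-separated clause."""
        clauses = [c for c in re.split(r"[.,;]", text) if c.strip()]
        return [clause_sentiment(c) for c in clauses]
    ```

    For "The food was great, the service was not good." the two clauses score +2 and -1, so the review carries mixed rather than uniform sentiment.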

  6. Rules-based object-relational databases ontology construction

    Institute of Scientific and Technical Information of China (English)

    Chen Jia; Wu Yue

    2009-01-01

    To solve the problems of sharing and reusing information in information systems, a rule-based approach for constructing ontologies from object-relational databases is proposed. A 3-tuple ontology construction model is proposed first. Then, four types of ontology construction rules, covering classes, properties, property characteristics, and property restrictions, are formalized according to the model. Experimental results, described in the Web Ontology Language (OWL), show that the proposed approach is feasible for application in the semantic objects project of the semantic computing laboratory at UC Irvine. Our approach reduces construction time by about twenty percent compared with ontology construction from relational databases.

  7. Acceleration of association-rule based Markov decision processes

    Directory of Open Access Journals (Sweden)

    Ma. de G. García‐Hernández

    2009-12-01

    Full Text Available In this paper, we present a new approach for the estimation of Markov decision processes based on efficient association-rule mining techniques such as Apriori. For the fastest solution of the resulting association-rule based Markov decision process, several accelerating procedures, such as asynchronous updates and prioritization using a static ordering, have been applied. A new criterion for state reordering in decreasing order of maximum reward is also compared with a modified topological reordering algorithm. Experimental results obtained on a finite state- and action-space stochastic shortest path problem demonstrate the feasibility of the new approach.
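    Asynchronous value iteration with a static priority ordering, the acceleration described above, can be sketched on a toy deterministic MDP. The reordering criterion sorts states by decreasing maximum immediate reward; the MDP below is invented for illustration:

    ```python
    def prioritized_value_iteration(R, P, gamma=0.95, sweeps=50):
        """Asynchronous (Gauss-Seidel) value iteration with a static priority
        ordering: states are processed in decreasing order of their maximum
        immediate reward. Toy deterministic MDP, illustrative only:
        R[s][a] = immediate reward, P[s][a] = successor state."""
        n = len(R)
        V = [0.0] * n
        # Static ordering: highest-reward states updated first.
        order = sorted(range(n), key=lambda s: -max(R[s]))
        for _ in range(sweeps):
            for s in order:  # in-place updates propagate values within a sweep
                V[s] = max(R[s][a] + gamma * V[P[s][a]] for a in range(len(R[s])))
        return V
    ```

    Because high-reward states are updated first and the updates are in-place, value information propagates backwards along the shortest path within a single sweep instead of one step per sweep.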

  8. Uncertain rule-based fuzzy systems introduction and new directions

    CERN Document Server

    Mendel, Jerry M

    2017-01-01

    The second edition of this textbook provides a fully updated approach to fuzzy sets and systems that can model uncertainty — i.e., “type-2” fuzzy sets and systems. The author demonstrates how to overcome the limitations of classical fuzzy sets and systems, enabling a wide range of applications from time-series forecasting to knowledge mining to control. In this new edition, a bottom-up approach is presented that begins by introducing classical (type-1) fuzzy sets and systems, and then explains how they can be modified to handle uncertainty. The author covers fuzzy rule-based systems – from type-1 to interval type-2 to general type-2 – in one volume. For hands-on experience, the book provides information on accessing MatLab and Java software to complement the content. The book features a full suite of classroom material. Presents fully updated material on new breakthroughs in human-inspired rule-based techniques for handling real-world uncertainties; Allows those already familiar with type-1 fuzzy se...

  9. ALC: automated reduction of rule-based models

    Directory of Open Access Journals (Sweden)

    Gilles Ernst

    2008-10-01

    Full Text Available Abstract Background: Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction, since the association of a few proteins can give rise to an enormous number of feasible protein complexes. The layer-based approach is an approximative, but accurate, method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results: ALC (Automated Layer Construction) is a computer program that greatly simplifies the building of reduced modular models according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion: ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files.

  10. Rule-based semantic web services matching strategy

    Science.gov (United States)

    Fan, Hong; Wang, Zhihua

    2011-12-01

    With the development of Web services technology, the number of services increases rapidly, and it becomes a challenging task to efficiently discover the services that exactly match a user's requirements from a large-scale service library. Many semantic Web service discovery technologies proposed in the recent literature focus only on keyword-based or shallow-semantics-based service matching. This paper studies a rule-based reasoning service matching algorithm against the background of a large-scale service library. First, formal descriptions of semantic Web services and service matching are presented. Service matching is divided into four levels (Exact, Plugin, Subsume and Fail), each with a formal description. Service matching is then treated as a rule-based reasoning problem: a set of match rules is given, the related services are retrieved from the service ontology base through rule-based reasoning, and their matching levels are determined from the relationships between the service's I/O and the request's I/O. Finally, experiments on two service sets show that the proposed matching strategy enables smart service discovery and achieves higher discovery efficiency than the traditional global traversal strategy.
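    The four matching levels can be illustrated with a toy subclass hierarchy in place of the ontology reasoning the paper uses. The class names and the exact Plugin/Subsume conventions below are illustrative assumptions:

    ```python
    # Toy type hierarchy: child -> parent (the real system reasons over an
    # ontology base; these classes are invented for illustration).
    SUBCLASS = {"Sedan": "Car", "SUV": "Car", "Car": "Vehicle"}

    def is_subclass(a, b):
        """True if a == b or a is a (transitive) subclass of b."""
        while a is not None:
            if a == b:
                return True
            a = SUBCLASS.get(a)
        return False

    def match_level(service_out, request_out):
        """Degree of match between a service output and a requested output."""
        if service_out == request_out:
            return "Exact"
        if is_subclass(service_out, request_out):
            return "Plugin"    # service delivers something more specific
        if is_subclass(request_out, service_out):
            return "Subsume"   # service delivers something more general
        return "Fail"
    ```

    A request for a Vehicle is matched at the Plugin level by a service returning Sedans, while a request for an SUV is only Subsume-matched by a service returning generic Vehicles.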

  11. Integration of Rule Based Expert Systems and Case Based Reasoning in an Acute Bacterial Meningitis Clinical Decision Support System

    CERN Document Server

    Cabrera, Mariana Maceiras

    2010-01-01

    This article presents the results of research carried out on the development of a medical diagnostic system for Acute Bacterial Meningitis, using the Case-Based Reasoning methodology. The research focused on the implementation of the adaptation stage, based on the integration of Case-Based Reasoning and Rule-Based Expert Systems. In this adaptation stage we use a higher-level CBR component that stores and allows reusing change experiences, combined with a classic rule-based inference engine. In order to handle the most evident clinical situations, a pre-diagnosis stage is implemented using a rule engine that, given an evident situation, emits the corresponding diagnosis and avoids the complete process.

  12. An Estimation of Distribution Algorithm with Intelligent Local Search for Rule-based Nurse Rostering

    CERN Document Server

    Uwe, Aickelin; Jingpeng, Li

    2007-01-01

    This paper proposes a new memetic evolutionary algorithm to achieve explicit learning in rule-based nurse rostering, which involves applying a set of heuristic rules for each nurse's assignment. The main framework of the algorithm is an estimation of distribution algorithm, in which an ant-miner methodology improves the individual solutions produced in each generation. Unlike our previous work (where learning is implicit), the learning in the memetic estimation of distribution algorithm is explicit, i.e. we are able to identify building blocks directly. The overall approach learns by building a probabilistic model, i.e. an estimation of the probability distribution of individual nurse-rule pairs that are used to construct schedules. The local search processor (i.e. the ant-miner) reinforces nurse-rule pairs that receive higher rewards. A challenging real world nurse rostering problem is used as the test problem. Computational results show that the proposed approach outperforms most existing approaches. It is ...

  13. Towards a Generic Trace for Rule Based Constraint Reasoning

    CERN Document Server

    Junior, Armando Gonçalves Da Silva; Menezes, Luis-Carlos; Da Silva, Marcos-Aurélio Almeida; Robin, Jacques

    2012-01-01

    CHR is a very versatile programming language that allows programmers to declaratively specify constraint solvers. An important part of the development of such solvers is their testing and debugging phases. Current CHR implementations support those phases by offering tracing facilities with limited information. In this report, we propose a new trace for CHR which contains enough information to analyze any aspect of CHR∨ execution at a useful abstract level, common to several implementations. This approach is based on the idea of a generic trace. Such a trace is formally defined as an extension of the ω_r^∨ semantics of CHR. We show that it can be derived from the SWI-Prolog CHR trace.

  14. A Rule Based System for Speech Language Context Understanding

    Institute of Scientific and Technical Information of China (English)

    Imran Sarwar Bajwa; Muhammad Abbas Choudhary

    2006-01-01

    Speech and natural language content are major tools of communication. This research paper presents a natural language processing based automated system for understanding speech-language text. A new rule-based model is presented for analyzing natural language and extracting the relevant meanings from a given text. The user writes natural language text in simple English in a few paragraphs, and the designed system analyzes the given script. After composite analysis and extraction of the associated information, the system assigns particular meanings to an assortment of speech-language text on the basis of its context. The designed system uses standard speech-language rules that are clearly defined for speech languages such as English, Urdu, Chinese, Arabic and French. The designed system provides a quick and reliable way to comprehend speech-language context and generate the respective meanings.

  15. Design Transformations for Rule-based Procedural Modeling

    KAUST Repository

    Lienhard, Stefan

    2017-05-24

    We introduce design transformations for rule-based procedural models, e.g., for buildings and plants. Given two or more procedural designs, each specified by a grammar, a design transformation combines elements of the existing designs to generate new designs. We introduce two technical components to enable design transformations. First, we extend the concept of discrete rule switching to rule merging, leading to a very large shape space for combining procedural models. Second, we propose an algorithm to jointly derive two or more grammars, called grammar co-derivation. We demonstrate two applications of our work: we show that our framework leads to a larger variety of models than previous work, and we show fine-grained transformation sequences between two procedural models.

  16. Grapheme-color synaesthesia benefits rule-based category learning.

    Science.gov (United States)

    Watson, Marcus R; Blair, Mark R; Kozik, Pavel; Akins, Kathleen A; Enns, James T

    2012-09-01

    Researchers have long suspected that grapheme-color synaesthesia is useful, but research on its utility has so far focused primarily on episodic memory and perceptual discrimination. Here we ask whether it can be harnessed during rule-based category learning. Participants learned through trial and error to classify grapheme pairs that were organized into categories on the basis of their associated synaesthetic colors. The performance of synaesthetes was similar to non-synaesthetes viewing graphemes that were physically colored in the same way. Specifically, synaesthetes learned to categorize stimuli effectively, they were able to transfer this learning to novel stimuli, and they falsely recognized grapheme-pair foils, all like non-synaesthetes viewing colored graphemes. These findings demonstrate that synaesthesia can be exploited when learning the kind of material taught in many classroom settings.

  17. Fuzzy rule-based support vector regression system

    Institute of Scientific and Technical Information of China (English)

    Ling WANG; Zhichun MU; Hui GUO

    2005-01-01

    In this paper, we design a fuzzy rule-based support vector regression system. The proposed system utilizes the advantages of fuzzy models and support vector regression to extract support vectors and generate fuzzy if-then rules from the training data set. Based on the first-order linear Takagi-Sugeno (TS) model, the structure of the rules is identified by support vector regression, and the consequent parameters of the rules are then tuned by the global least-squares method. Our model is applied to a real-world regression task. The simulation results give promising performance in terms of a set of fuzzy rules that can be easily interpreted by humans.

  18. Efficient mining of association rules based on gravitational search algorithm

    Directory of Open Access Journals (Sweden)

    Fariba Khademolghorani

    2011-07-01

    Full Text Available Association rule mining is one of the most used tools to discover relationships among attributes in a database. A lot of algorithms have been introduced for discovering these rules, and they have to mine association rules in two separate stages. Most of them mine occurrence rules which are easily predictable by the users. Therefore, this paper discusses the application of the gravitational search algorithm for discovering interesting association rules. This evolutionary algorithm is based on Newtonian gravity and the laws of motion. Furthermore, contrary to previous methods, the method proposed in this study is able to mine the best association rules without generating frequent itemsets, and is independent of the minimum support and confidence values. The results of applying this method, in comparison with mining association rules based upon particle swarm optimization, show that our method is successful.

  19. A Rule-Based Industrial Boiler Selection System

    Science.gov (United States)

    Tan, C. F.; Khalil, S. N.; Karjanto, J.; Tee, B. T.; Wahidin, L. S.; Chen, W.; Rauterberg, G. W. M.; Sivarao, S.; Lim, T. L.

    2015-09-01

    A boiler is a device used for generating steam for power generation, process use or heating, and hot water for heating purposes. A steam boiler consists of the containing vessel and convection heating surfaces only, whereas a steam generator covers the whole unit, encompassing water wall tubes, superheaters, air heaters and economizers. Selecting the right boiler is very important for running industrial operations successfully. The selection criteria are based on a rule-based expert system and a multi-criteria weighted-average method. The developed system consists of a Knowledge Acquisition Module, a Boiler Selection Module, a User Interface Module and a Help Module. The system is capable of selecting a suitable boiler based on weighted criteria. The main benefit of using the system is reducing the complexity of the decision making involved in selecting the most appropriate boiler for a palm oil process plant.
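The multi-criteria weighted-average step can be sketched as follows; the boiler types, criteria and weights are invented for illustration and are not taken from the developed system:

```python
# Hypothetical criteria weights (must sum to 1) and candidate scores.
weights = {"capacity": 0.4, "efficiency": 0.35, "cost": 0.25}

# Scores on a 1-10 scale for each candidate boiler (higher is better).
candidates = {
    "fire-tube": {"capacity": 6, "efficiency": 7, "cost": 9},
    "water-tube": {"capacity": 9, "efficiency": 8, "cost": 5},
}

def weighted_score(scores):
    # Weighted average of the criterion scores
    return sum(weights[c] * scores[c] for c in weights)

best = max(candidates, key=lambda name: weighted_score(candidates[name]))
print(best)  # the candidate with the highest weighted average
```

In the real system, rule-based screening would first narrow the candidate set (e.g. by fuel type and pressure class) before this scoring step ranks the survivors.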

  20. CRIS: A Rule-Based Approach for Customized Academic Advising

    Directory of Open Access Journals (Sweden)

    Chung-Wei Yeh

    2015-04-01

    Full Text Available This study presents a customized academic e-advising service that uses rule-based technology to recommend courses to individual college students in Taiwan. Academic advising on course selection, in which advisors assist students in achieving their educational, career, and personal goals, is an important part of the higher education system. To enhance the effectiveness of advisors in helping students fit their professional field and improve their learning experience, we propose an application system called CRIS (course recommendation intelligent system). The CRIS consists of six functions: academic profile review, academic interest analysis, career and curriculum matchmaking, course recommendation analysis, department recommendation analysis and record assessment. This work provides the solution in three layers (data layer, processing layer and solution layer) via four steps: (1) database design and data transfer, (2) student profile analysis, (3) customized academic advising generation and (4) solution analysis. A comparison of academic scores and a satisfaction survey combining students' learning interests and academic achievement were conducted to test its effectiveness. The experimental results show that participating college students considered the CRIS helpful in their adjustment to the university and that it increased their success at the university.

  1. Rule-Based Storytelling Text-to-Speech (TTS Synthesis

    Directory of Open Access Journals (Sweden)

    Ramli Izzad

    2016-01-01

    Full Text Available In recent years, various real-life applications such as talking books, gadgets and humanoid robots have drawn attention to research in expressive speech synthesis. Speech synthesis is widely used in various applications, but there is a growing need for expressive speech synthesis, especially for communication and robotics. In this paper, global and local rules are developed to convert neutral speech to storytelling-style speech for the Malay language. To generate the rules, modifications of prosodic parameters such as pitch, intensity, duration, tempo and pauses are considered. These modifications are examined by performing prosodic analysis on stories collected from an experienced female and an experienced male storyteller. The global and local rules are applied at sentence level and synthesized using HNM. Subjective tests are conducted to evaluate the quality of the synthesized storytelling speech under both rule sets in terms of naturalness, intelligibility, and similarity to the original storytelling speech. The results showed that the global rules give a better result than the local rules.

  2. A novel rules based approach for estimating software birthmark.

    Science.gov (United States)

    Nazir, Shah; Shahzad, Sara; Khan, Sher Afzal; Alias, Norma Binti; Anwar, Sajid

    2015-01-01

    A software birthmark is a unique quality of software that can be used to detect software theft. Comparing the birthmarks of two pieces of software can tell us whether one program is a copy of another. Software theft and piracy, the copying, stealing, and misuse of software without the proper permission stated in the license agreement, are rapidly increasing problems. Estimating a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, namely credibility and resilience. For this purpose, soft-computing concepts such as probabilistic and fuzzy computing are taken into account, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule-based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort would be required to detect the originality of the software from its birthmark.
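A toy sketch of how fuzzy rules might grade a birthmark from its credibility and resilience scores; the membership functions, rules and output levels below are assumptions for illustration, not those of the paper:

```python
# Overlapping linear memberships on [0, 1]; purely illustrative shapes.
def low(x):  return 1.0 - x
def high(x): return x

def birthmark_quality(credibility, resilience):
    # Two fuzzy rules: "both properties high -> good birthmark",
    # "either property low -> poor birthmark" (min as AND, max as OR).
    rules = [
        (min(high(credibility), high(resilience)), 1.0),
        (max(low(credibility), low(resilience)), 0.0),
    ]
    # Weighted-average defuzzification of the rule outputs
    num = sum(w * v for w, v in rules)
    den = sum(w for w, v in rules)
    return num / den if den else 0.5

print(round(birthmark_quality(0.9, 0.8), 3))
```

A birthmark scoring near 1.0 would be judged both credible and resilient, i.e. worth trusting when deciding whether two programs share an origin.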

  3. Rainfall events prediction using rule-based fuzzy inference system

    Science.gov (United States)

    Asklany, Somia A.; Elhelow, Khaled; Youssef, I. K.; Abd El-wahab, M.

    2011-07-01

    We are interested in predicting rainfall events by applying rule-based reasoning and fuzzy logic. Five parameters (relative humidity, total cloud cover, wind direction, temperature and surface pressure) are the input variables of our model, each with three membership functions. The data used are twenty years of METAR data [1972-1992] for the Cairo airport station (HECA), 30° 3' 29″ N, 31° 13' 44″ E, and five years of METAR data for the Mersa Matruh station (HEMM), 31° 20' 0″ N, 27° 13' 0″ E. Different models were constructed for each station depending on the available data sets. Among the overall 243 possibilities, we based our models on one hundred eighteen fuzzy IF-THEN rules and fuzzy reasoning. The output variable, which has four membership functions, takes values from zero to one hundred, corresponding to the percentage chance of a rainfall event for each hourly observation. We used two skill scores to verify our results, the Brier score and the Friction score. The results are in high agreement with the recorded data for the stations, with output values increasing toward actual rain events. All implementations were done with MATLAB 7.9.
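A heavily reduced sketch of this style of rule-based fuzzy inference, using two of the five inputs and two rules instead of 118; the membership shapes and output levels are invented for illustration:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def rain_chance(humidity, cloud_cover):
    # Antecedent memberships (humidity in %, cloud cover in oktas 0-8)
    hum_high = tri(humidity, 50, 100, 150)
    cloud_high = tri(cloud_cover, 4, 8, 12)
    hum_low = tri(humidity, -50, 0, 50)
    cloud_low = tri(cloud_cover, -4, 0, 4)
    # Rule firing strengths (min as AND), each mapped to an output level
    rules = [
        (min(hum_high, cloud_high), 90.0),  # IF both high THEN rain ~90%
        (min(hum_low, cloud_low), 10.0),    # IF both low THEN rain ~10%
    ]
    # Weighted-average defuzzification to a 0-100 percentage
    num = sum(w * out for w, out in rules)
    den = sum(w for w, out in rules)
    return num / den if den else 0.0

print(round(rain_chance(90, 7), 1))
```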

  4. A fuzzy rule based framework for noise annoyance modeling.

    Science.gov (United States)

    Botteldooren, Dick; Verkeyn, Andy; Lercher, Peter

    2003-09-01

    Predicting the effect of noise on individual people and small groups is an extremely difficult task due to the influence of a multitude of factors that vary from person to person and from context to context. Moreover, noise annoyance is inherently a vague concept. That is why, in this paper, it is argued that noise annoyance models should identify a fuzzy set of possible effects rather than seek a very accurate crisp prediction. Fuzzy rule based models seem ideal candidates for this task. This paper provides the theoretical background for building these models. Existing empirical knowledge is used to extract a few typical rules that allow making the model more specific for small groups of individuals. The resulting model is tested on two large-scale social surveys augmented with exposure simulations. The testing demonstrates how this new way of thinking about noise effect modeling can be used in practice both in management support as a "noise annoyance adviser" and in social science for testing hypotheses such as the effect of noise sensitivity or the degree of urbanization.

  5. A rule-based stemmer for Arabic Gulf dialect

    Directory of Open Access Journals (Sweden)

    Belal Abuata

    2015-04-01

    Full Text Available Arabic dialects have long been widely used in place of the Modern Standard Arabic language in many settings. The presence of dialects in any language is a big challenge. Dialects add a new set of variational dimensions in fields such as natural language processing, information retrieval and even chatting between Arabs of different nationalities. Spoken dialects have no standard morphology, phonology or lexicon like Modern Standard Arabic. Hence, the objective of this paper is to describe an algorithm by which a stem for the Arabian Gulf dialect can be derived. The algorithm is rule based: special rules are created to remove the suffixes and prefixes of dialect words, and further rules relate to word size and the relation between adjacent letters. The algorithm was tested on a number of words and gave a good correct-stem ratio. It was also compared with two Modern Standard Arabic stemmers. The results showed that the Modern Standard Arabic stemmers performed poorly on the Arabic Gulf dialect, and our algorithm performed poorly when applied to Modern Standard Arabic words.
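The affix-stripping rules with a word-size guard can be sketched as follows; the affix lists and the minimum stem length are illustrative placeholders, not the paper's actual rule set:

```python
# Hypothetical affix lists (definite article, conjunction, common endings).
PREFIXES = ["ال", "و"]
SUFFIXES = ["ات", "ين", "ها"]

def strip_affixes(word, affixes, prefix=True, min_stem=3):
    # Repeatedly strip the longest matching affix, but never below min_stem letters
    changed = True
    while changed:
        changed = False
        for a in sorted(affixes, key=len, reverse=True):
            if prefix and word.startswith(a) and len(word) - len(a) >= min_stem:
                word, changed = word[len(a):], True
                break
            if not prefix and word.endswith(a) and len(word) - len(a) >= min_stem:
                word, changed = word[:-len(a)], True
                break
    return word

def stem(word):
    word = strip_affixes(word, PREFIXES, prefix=True)
    return strip_affixes(word, SUFFIXES, prefix=False)

print(stem("والكتاب"), stem("مكتبات"))
```

The word-size guard is what keeps short dialect words from being stripped down to nothing; the paper's additional adjacent-letter rules would slot in as extra conditions inside the loop.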

  6. ARABIC-MALAY MACHINE TRANSLATION USING RULE-BASED APPROACH

    Directory of Open Access Journals (Sweden)

    Ahmed Jumaa Alsaket

    2014-01-01

    Full Text Available Arabic machine translation has featured in machine translation projects in recent years. This study concentrates on the translation of Arabic text into its equivalent in the Malay language. The problem addressed in this research is the syntactic and morphological differences between Arabic and Malay adjective sentences. The main aim of this study is to design and develop an Arabic-Malay machine translation model. First, we analyze the role of adjectives in the Arabic and Malay languages. Based on this analysis, we identify bilingual transfer rules from the source language to the target language so that translation from the source language to the target language can be performed by computers successfully. Then, we build and implement a machine translation prototype called AMTS to translate from Arabic to Malay using the rule-based approach. The system is evaluated on a set of simple Arabic sentences. The techniques used to evaluate the correctness of the system's translations are the BLEU metric and human judgment. The BLEU results show that the AMTS system performs better than Google in translating Arabic sentences into Malay. In addition, the average accuracy given by human judges is 92.3% for our system and 75.3% for Google.

  7. Rule-based deduplication of article records from bibliographic databases.

    Science.gov (United States)

    Jiang, Yu; Lin, Can; Meng, Weiyi; Yu, Clement; Cohen, Aaron M; Smalheiser, Neil R

    2014-01-01

    We recently designed and deployed a metasearch engine, Metta, that sends queries and retrieves search results from five leading biomedical databases: PubMed, EMBASE, CINAHL, PsycINFO and the Cochrane Central Register of Controlled Trials. Because many articles are indexed in more than one of these databases, it is desirable to deduplicate the retrieved article records. This is not a trivial problem because data fields contain a lot of missing and erroneous entries, and because certain types of information are recorded differently (and inconsistently) in the different databases. The present report describes our rule-based method for deduplicating article records across databases and includes an open-source script module that can be deployed freely. Metta was designed to satisfy the particular needs of people who are writing systematic reviews in evidence-based medicine. These users want the highest possible recall in retrieval, so it is important to err on the side of not deduplicating any records that refer to distinct articles, and it is important to perform deduplication online in real time. Our deduplication module is designed with these constraints in mind. Articles that share the same publication year are compared sequentially on parameters including PubMed ID number, digital object identifier, journal name, article title and author list, using text approximation techniques. In a review of Metta searches carried out by public users, we found that the deduplication module was more effective at identifying duplicates than EndNote without making any erroneous assignments.
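A minimal sketch of such a rule cascade, assuming hypothetical field names: records are only compared within a publication year, exact identifiers decide first, and an approximate title match is the fallback (erring on the side of not deduplicating, as the abstract describes):

```python
from difflib import SequenceMatcher

def same_article(a, b):
    if a.get("year") != b.get("year"):
        return False                      # compare only within a publication year
    for key in ("pmid", "doi"):           # exact identifier rules fire first
        if a.get(key) and b.get(key):
            return a[key] == b[key]
    # Fallback: approximate title match with a conservative threshold
    ratio = SequenceMatcher(None, a["title"].lower(), b["title"].lower()).ratio()
    return ratio > 0.9

r1 = {"year": 2014, "doi": "10.1000/x1", "title": "Rule-based deduplication of article records"}
r2 = {"year": 2014, "doi": "10.1000/x1", "title": "Rule based deduplication of article records."}
r3 = {"year": 2013, "doi": "10.1000/x1", "title": "Rule-based deduplication of article records"}
print(same_article(r1, r2), same_article(r1, r3))
```

The conservative threshold and the strict year rule reflect the stated design goal: a missed duplicate only costs the reviewer a little reading time, while a false merge silently loses a record.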

  8. A Novel Rules Based Approach for Estimating Software Birthmark

    Directory of Open Access Journals (Sweden)

    Shah Nazir

    2015-01-01

    Full Text Available A software birthmark is a unique quality of software that can be used to detect software theft. Comparing the birthmarks of two pieces of software can tell us whether one program is a copy of another. Software theft and piracy, the copying, stealing, and misuse of software without the proper permission stated in the license agreement, are rapidly increasing problems. Estimating a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, namely credibility and resilience. For this purpose, soft-computing concepts such as probabilistic and fuzzy computing are taken into account, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule-based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort would be required to detect the originality of the software from its birthmark.

  9. Feasibility of a down-scaled HEMP-Thruster as possible N-propulsion system for LISA

    Science.gov (United States)

    Keller, A.; Köhler, P.; Gärtner, W.; Hey, F. G.; Berger, M.; Braxmaier, C.; Feili, D.; Weise, D.; Johann, U.

    2013-01-01

    An experimental feasibility study on down-scaling HEMP thrusters to µN thrust levels, as required e.g. for NGO, is presented. Prototypes are used to probe the operation space and to measure the divergence angle of the plume and the ion acceleration voltage by means of Faraday cups and a retarding potential analyser, in order to gain a deeper understanding of the influence of design parameters. From the measured values, thrust and specific impulse are calculated using simple models. Stable operation with a calculated thrust down to 70 µN has been demonstrated, and divergence efficiencies of about 0.5 are observed. Because of the importance of the thrust value and the uncertainties of the models, it is clearly desirable to measure the thrust and thrust noise directly with a thrust balance. Such a device is under construction, with a picometre-noise-level heterodyne interferometer as optical readout.

  10. Rule-based model of vein graft remodeling.

    Directory of Open Access Journals (Sweden)

    Minki Hwang

    Full Text Available When vein segments are implanted into the arterial system for use in arterial bypass grafting, adaptation to the higher pressure and flow of the arterial system is accomplished through wall thickening and expansion. These early remodeling events have been found to be closely coupled to the local hemodynamic forces, such as shear stress and wall tension, and are believed to be the foundation for later vein graft failure. To further our mechanistic understanding of the cellular and extracellular interactions that lead to global changes in tissue architecture, a rule-based modeling method is developed through the application of basic rules of behavior for these molecular and cellular activities. In the current method, smooth muscle cells (SMC), extracellular matrix (ECM), and monocytes are selected as the three components that occupy the elements of a grid system comprising the developing vein graft intima. The probabilities of the cellular behaviors are developed based on data extracted from in vivo experiments. At each time step, the various probabilities are computed and applied to the SMC and ECM elements to determine their next physical state and behavior. One- and two-dimensional models are developed to test and validate the computational approach. The importance of monocyte infiltration, and its associated effect in augmenting extracellular matrix deposition, was evaluated and found to be an important component in model development. Final model validation is performed using an independent set of experiments, where model predictions of intimal growth are evaluated against experimental data obtained from the complex geometry and shear stress patterns offered by a mid-graft focal stenosis; the simulation results show good agreement with the experimental data.
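The grid-based update can be caricatured in one dimension as below; the element types follow the abstract, but the probabilities and growth rules are invented placeholders rather than the calibrated in vivo values:

```python
import random

# Illustrative per-step rule probabilities (placeholders, not calibrated values)
P_DIVIDE = 0.10   # chance that an SMC element divides
P_DEPOSIT = 0.15  # chance that an SMC element deposits ECM beside it

def step(grid, rng):
    # Apply the stochastic rules to every element; insertions thicken the intima
    new_grid = []
    for cell in grid:
        new_grid.append(cell)
        if cell == "SMC":
            if rng.random() < P_DIVIDE:
                new_grid.append("SMC")   # proliferation
            if rng.random() < P_DEPOSIT:
                new_grid.append("ECM")   # matrix deposition
    return new_grid

rng = random.Random(42)
grid = ["SMC"] * 10                      # initial single layer of SMCs
for _ in range(20):
    grid = step(grid, rng)
print(len(grid))                         # intimal thickness in grid elements
```

Monocyte infiltration, in the paper's model, would enter as a third element type that raises the effective ECM deposition probability of neighbouring SMCs.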

  11. Rule-Based and Information-Integration Category Learning in Normal Aging

    Science.gov (United States)

    Maddox, W. Todd; Pacheco, Jennifer; Reeves, Maia; Zhu, Bo; Schnyer, David M.

    2010-01-01

    The basal ganglia and prefrontal cortex play critical roles in category learning. Both regions evidence age-related structural and functional declines. The current study examined rule-based and information-integration category learning in a group of older and younger adults. Rule-based learning is thought to involve explicit, frontally mediated…

  12. A Down-Scale Design Study of the Reactor Core and Fuel of a 100 MWe Pebble Bed Modular Reactor (HTR) Nuclear Power Plant

    Directory of Open Access Journals (Sweden)

    Slamet Parmanto

    2015-04-01

    Full Text Available A study of the core of a 100 MWe Pebble Bed Modular Reactor (PBMR) fuelled with UO2 has been carried out. The reactor uses graphite as the moderator and helium as the coolant. The down-scale study was performed without changing the core or fuel geometry. The parameters analyzed were core criticality, excess reactivity, the fuel, moderator and coolant temperature reactivity coefficients, and fuel economy, with the aim of obtaining a fuel design that is economical and has inherent safety features. The calculations were performed with the SRAC 2003 code. The result is a pebble-type UO2 fuel design with 10% U-235 enrichment and 90 ppm of the burnable poison Gd2O3. The effective multiplication factor keff is 1.01115 at the beginning of life (BOL) and 1.00588 after 2658 days of reactor operation, the end of life (EOL). The total temperature reactivity coefficient is -3.25900E-05 Δk/k/K at BOL and -1.10615E-04 Δk/k/K at EOL. The negative temperature reactivity coefficient indicates that the reactor satisfies inherent safety characteristics. Keywords: PBMR, fuel design, effective multiplication factor, excess reactivity, temperature reactivity coefficient.

  13. Associations between rule-based parenting practices and child screen viewing: A cross-sectional study

    Directory of Open Access Journals (Sweden)

    Joanna M. Kesten

    2015-01-01

    Conclusions: Limit setting is associated with greater screen viewing (SV). Collaborative rule setting may be effective for managing boys' game-console use. More research is needed to understand rule-based parenting practices.

  14. Rule-Based Analytic Asset Management for Space Exploration Systems (RAMSES) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Payload Systems Inc. (PSI) and the Massachusetts Institute of Technology (MIT) were selected to jointly develop the Rule-based Analytic Asset Management for Space...

  15. Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules

    Science.gov (United States)

    Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.

    Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.

  16. Rough set and rule-based multicriteria decision aiding

    Directory of Open Access Journals (Sweden)

    Roman Slowinski

    2012-08-01

    Full Text Available The aim of multicriteria decision aiding is to give the decision maker a recommendation concerning a set of objects evaluated from multiple points of view called criteria. Since a rational decision maker acts with respect to his/her value system, in order to recommend the most-preferred decision, one must identify the decision maker's preferences. In this paper, we focus on preference discovery from data concerning some past decisions of the decision maker. We consider the preference model in the form of a set of "if..., then..." decision rules discovered from the data by inductive learning. To structure the data prior to induction of rules, we use the Dominance-based Rough Set Approach (DRSA). DRSA is a methodology for reasoning about data, which handles ordinal evaluations of objects on the considered criteria and monotonic relationships between these evaluations and the decision. We review applications of DRSA to a large variety of multicriteria decision problems.

  17. A Fuzzy Rule-Based Expert System for Evaluating Intellectual Capital

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Fazel Zarandi

    2012-01-01

    Full Text Available A fuzzy rule-based expert system is developed for evaluating intellectual capital. A fuzzy linguistic approach assists managers in understanding and evaluating the level of each intellectual capital item. The proposed fuzzy rule-based expert system applies fuzzy linguistic variables to express the level of qualitative evaluation and the criteria of experts. The feasibility of the proposed model is demonstrated by the results of an intellectual capital performance evaluation for a sample company.

  18. Genealogical Information Search by Using Parent Bidirectional Breadth Algorithm and Rule Based Relationship

    CERN Document Server

    Nuanmeesri, Sumitra; Meesad, Payung

    2010-01-01

    Genealogical information is among the best historical resources for the study of culture and cultural heritage. Genealogical research generally presents family information and depicts it as a tree diagram. This paper presents the Parent Bidirectional Breadth Algorithm (PBBA) for finding the consanguine relationship between two persons. In addition, the paper utilizes a rule-based system to identify the type of consanguine relationship. The study reveals that PBBA solves the genealogical information search problem quickly and that the rule-based relationship component provides further benefits in blood-relationship identification.
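A sketch of a parent-directed ancestor search plus a simple relationship rule, in the spirit of PBBA; the family data and the classification rule are invented, and the published algorithm may differ in detail:

```python
from collections import deque

# Hypothetical family graph: child -> list of parents
parents = {
    "ali": ["omar", "sara"],
    "mona": ["omar", "sara"],
    "omar": ["hassan"],
    "sara": [],
    "hassan": [],
}

def ancestors_by_level(person):
    # Breadth-first search along parent links, recording generation distance
    seen, frontier, levels = {person}, deque([person]), {person: 0}
    while frontier:
        current = frontier.popleft()
        for p in parents.get(current, []):
            if p not in seen:
                seen.add(p)
                levels[p] = levels[current] + 1
                frontier.append(p)
    return levels

def common_ancestor(a, b):
    # Search from both ends, then take the nearest shared ancestor
    la, lb = ancestors_by_level(a), ancestors_by_level(b)
    shared = set(la) & set(lb)
    return min(sorted(shared), key=lambda p: la[p] + lb[p]) if shared else None

def relationship(a, b):
    # Toy rule: shared ancestor who is one of the two persons -> lineal kin
    ca = common_ancestor(a, b)
    if ca is None:
        return "unrelated"
    return "lineal" if ca in (a, b) else "collateral"

print(common_ancestor("ali", "mona"), relationship("ali", "mona"))
```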

  19. Generating Fuzzy Rule-based Systems from Examples Based on Robust Support Vector Machine

    Institute of Scientific and Technical Information of China (English)

    JIA Jiong; ZHANG Hao-ran

    2006-01-01

    This paper first proposes a new support vector machine regression (SVR) with a robust loss function and designs a gradient-based algorithm for its implementation; it then uses the SVR to extract fuzzy rules and design a fuzzy rule-based system. Simulations show that the fuzzy rule-based system built on robust SVR achieves performance superior to the conventional fuzzy inference method; the proposed method provides satisfactory performance, with better approximation and generalization properties than the existing algorithm.

  20. Scalable rule-based modelling of allosteric proteins and biochemical networks.

    Directory of Open Access Journals (Sweden)

    Julien F Ollivier

    Full Text Available Much of the complexity of biochemical networks comes from the information-processing abilities of allosteric proteins, be they receptors, ion channels, signalling molecules or transcription factors. An allosteric protein can be uniquely regulated by each combination of input molecules that it binds. This "regulatory complexity" causes a combinatorial increase in the number of parameters required to fit experimental data as the number of protein interactions increases. It therefore challenges the creation, updating, and re-use of biochemical models. Here, we propose a rule-based modelling framework that exploits the intrinsic modularity of protein structure to address regulatory complexity. Rather than treating proteins as "black boxes", we model their hierarchical structure and their internal dynamics as conformational changes. By modelling the regulation of allosteric proteins through these conformational changes, we often decrease the number of parameters required to fit data, and so reduce over-fitting and improve the predictive power of a model. Our method is thermodynamically grounded, imposes detailed balance, and also includes molecular cross-talk and the background activity of enzymes. We use our Allosteric Network Compiler to examine how allostery can facilitate macromolecular assembly and how competitive ligands can change the observed cooperativity of an allosteric protein. We also develop a parsimonious model of G protein-coupled receptors that explains functional selectivity and can predict the rank order of potency of agonists acting through a receptor. Our methodology should provide a basis for scalable, modular and executable modelling of biochemical networks in systems and synthetic biology.

  1. SEMICONDUCTOR INTEGRATED CIRCUITS: A low-jitter RF PLL frequency synthesizer with high-speed mixed-signal down-scaling circuits

    Science.gov (United States)

    Lu, Tang; Zhigong, Wang; Hong, Xue; Xiaohu, He; Yong, Xu; Ling, Sun

    2010-05-01

    A low-jitter RF phase-locked loop (PLL) frequency synthesizer with high-speed mixed-signal down-scaling circuits is proposed. Several techniques are proposed to reduce the design complexity and improve the performance of the mixed-signal down-scaling circuit in the PLL. An improved D-latch is proposed to increase the speed and driving capability of the dual-modulus prescaler (DMP) in the down-scaling circuit. By integrating the D-latch with the 'OR' logic for dual-modulus operation, the delays associated with both the 'OR' and D-flip-flop (DFF) operations are reduced, and the complexity of the circuit is also decreased. The programmable frequency divider of the down-scaling circuit is realized with a new method based on deep-submicron CMOS standard cells and a more accurate wire-load model. The charge pump in the PLL is also realized with a novel architecture that improves the current-matching characteristic so as to reduce the jitter of the system. The proposed RF PLL frequency synthesizer is realized in a TSMC 0.18-μm CMOS process. The measured phase noise of the synthesizer output at 100 kHz offset from the center frequency is only -101.52 dBc/Hz. The circuit exhibits a low RMS jitter of 3.3 ps. The power consumption of the PLL frequency synthesizer is as low as 36 mW from a 1.8 V supply.
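Behaviourally, a dual-modulus prescaler enables the classic pulse-swallow division scheme; the following sketch (a generic model, not the paper's circuit) shows why the overall ratio is N*P + S when the prescaler divides by N+1 for S cycles and by N for the remaining P-S cycles:

```python
def pulse_swallow_ratio(n, p, s):
    """Count input cycles per output period of a pulse-swallow divider.

    n: prescaler base modulus (divides by n or n+1)
    p: program-counter length, s: swallow-counter length (s < p)
    """
    assert 0 <= s < p, "swallow count must be smaller than the program count"
    input_cycles = 0
    for cycle in range(p):                 # one output period = p prescaler periods
        modulus = n + 1 if cycle < s else n
        input_cycles += modulus
    return input_cycles                    # equals n * p + s

print(pulse_swallow_ratio(8, 16, 5))
```

Sweeping s from 0 to p-1 gives contiguous division ratios, which is what lets the synthesizer step its output frequency in channel-spacing increments.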

  2. Differential Impact of Visuospatial Working Memory on Rule-based and Information-integration Category Learning.

    Science.gov (United States)

    Xing, Qiang; Sun, Hailong

    2017-01-01

    Previous studies have indicated that category learning is supported by multiple processing systems, and that working memory has different effects on different kinds of category learning. But how does visuospatial working memory affect perceptual category learning? As there is no definite answer to this question, we conducted three experiments. In Experiment 1, the dual-task paradigm with sequential presentation was adopted to investigate the influence of visuospatial working memory on rule-based and information-integration category learning. The results showed that visuospatial working memory interferes with rule-based but not information-integration category learning. In Experiment 2, the dual-task paradigm with simultaneous presentation was used, in which the categorization task was integrated into the visuospatial working memory task. The results indicated that visuospatial working memory affects information-integration but not rule-based category learning. In Experiment 3, the dual-task paradigm with simultaneous presentation was employed, in which visuospatial working memory was integrated into the category learning task. The results revealed that visuospatial working memory interferes with both rule-based and information-integration category learning. Through these three experiments, we found that, for rule-based category learning, working memory load is the main mechanism by which visuospatial working memory influences the discovery of category rules. For information-integration category learning, visuospatial resources mainly operate on the category representation.

  3. Arabic Rule-Based Named Entity Recognition Systems Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Ramzi Esmail Salah

    2017-06-01

    Full Text Available Rule-based approaches use human-crafted rules to extract Named Entities (NEs); alongside machine learning, this is one of the most popular ways to extract NEs. Named Entity Recognition (NER) is the task of identifying personal names, locations, organizations and many other entities. In the Arabic language, Big Data challenges have made Arabic NER develop rapidly so as to extract useful information from texts. The current paper sheds some light on research progress in rule-based NER via a diagnostic comparison of linguistic resources, entity types, domains, and performance. We also highlight the challenges of processing Arabic NEs with rule-based systems. Good NER performance is expected to benefit other modern fields such as semantic web search, question answering, machine translation, information retrieval, and abstracting systems.

  4. DEVELOP-FPS: a First Person Shooter Development Tool for Rule-based Scripts

    Directory of Open Access Journals (Sweden)

    Bruno Correia

    2012-09-01

    Full Text Available We present DEVELOP-FPS, a software tool specially designed for the development of First Person Shooter (FPS) players controlled by rule-based scripts. DEVELOP-FPS may be used by FPS developers to create, debug, maintain and compare rule-based player behaviours, providing a set of useful functionalities: (i) easy preparation of the right scenarios for game debugging and testing; (ii) control of the game execution: users can stop and resume the game at any instant, monitor and control every player in the game, monitor the state of each player and their rule-base activation, and issue commands to control their behaviour; and (iii) automatic running of a given number of game executions, collecting data in order to evaluate and compare player performance across a sufficient number of similar experiments.

  5. Interval Type-II Fuzzy Rule-Based STATCOM for Voltage Regulation in the Power System

    Directory of Open Access Journals (Sweden)

    Ying-Yi Hong

    2015-08-01

    Full Text Available The static synchronous compensator (STATCOM) has recently received much attention owing to its ability to stabilize power systems and mitigate voltage variations. This paper investigates a novel interval type-II fuzzy rule-based PID (proportional-integral-derivative) controller for the STATCOM to mitigate bus voltage variations caused by large changes in load and the intermittent generation of photovoltaic (PV) arrays. The proposed interval type-II fuzzy rule base utilizes the output of the PID controller to tune the signal applied to the STATCOM. The rules involve upper and lower membership functions that ensure the stable responses of the controlled system. The proposed method is implemented using the NEPLAN software package and MATLAB/Simulink with co-simulation. A six-bus system is used to show the effectiveness of the proposed method. Comparative studies show that the proposed method is superior to traditional PID and type-I fuzzy rule-based methods.

  6. Automatic detection of esophageal pressure events. Is there an alternative to rule-based criteria?

    DEFF Research Database (Denmark)

    Kruse-Andersen, S; Rütz, K; Kolberg, Jens Godsk

    1995-01-01

    curves generated by muscular contractions, rule-based criteria do not always select the pressure events most relevant for further analysis. We have therefore been searching for a new concept for automatic event recognition. The present study describes a new system, based on the method of neurocomputing.......79-0.99 and accuracies of 0.89-0.98, depending on the recording level within the esophageal lumen. The neural networks often recognized peaks that clearly represented true contractions but that had been rejected by a rule-based system. We conclude that neural networks have potentials for automatic detections...

  7. Fuzzy rule-based seizure prediction based on correlation dimension changes in intracranial EEG.

    Science.gov (United States)

    Rabbi, Ahmed F; Aarabi, Ardalan; Fazel-Rezai, Reza

    2010-01-01

    In this paper, we present a method for epileptic seizure prediction from intracranial EEG recordings. We applied correlation dimension, a nonlinear dynamics based univariate characteristic measure for extracting features from EEG segments. Finally, we designed a fuzzy rule-based system for seizure prediction. The system is primarily designed based on expert's knowledge and reasoning. A spatial-temporal filtering method was used in accordance with the fuzzy rule-based inference system for issuing forecasting alarms. The system was evaluated on EEG data from 10 patients having 15 seizures.

  8. Automated implementation of rule-based expert systems with neural networks for time-critical applications

    Science.gov (United States)

    Ramamoorthy, P. A.; Huang, Song; Govind, Girish

    1991-01-01

    In fault diagnosis, control, and real-time monitoring, both timing and accuracy are critical for operators or machines to reach proper solutions or take appropriate actions. Expert systems are becoming more popular in the manufacturing community for dealing with such problems. In recent years, neural networks have seen a revival, and their applications have spread to many areas of science and engineering. A method of using neural networks to implement rule-based expert systems for time-critical applications is discussed here. This method can convert a given rule-based system into a neural network with fixed weights and thresholds. The rules governing the translation are presented along with some examples. We also present the results of automated machine implementation of such networks from a given rule base. This significantly simplifies the translation from conventional rule-based systems to neural network expert systems. Results comparing the performance of the proposed neural-network approach with the classical approach are given. The possibility of very large scale integration (VLSI) realization of such neural network expert systems is also discussed.
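The core idea described above, translating propositional rules into a network of units with fixed weights and thresholds, can be illustrated with simple threshold units. This is a sketch of the general principle, not the authors' exact translation scheme, and the fault-diagnosis rules below are hypothetical:

```python
# A rule "IF a AND b THEN c" becomes a threshold unit with fixed weights,
# so firing all rules is a single forward pass through the network.
def threshold_unit(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) > threshold else 0

def and_rule(a, b):   # fires only when both antecedents hold
    return threshold_unit([a, b], [1.0, 1.0], 1.5)

def or_rule(a, b):    # fires when either antecedent holds
    return threshold_unit([a, b], [1.0, 1.0], 0.5)

# Two-layer "expert network" for a hypothetical rule base:
#   fault = (temp_high AND pressure_low) OR sensor_fail
def diagnose(temp_high, pressure_low, sensor_fail):
    hidden = and_rule(temp_high, pressure_low)
    return or_rule(hidden, sensor_fail)

assert diagnose(1, 1, 0) == 1
assert diagnose(1, 0, 0) == 0
assert diagnose(0, 0, 1) == 1
```

Because every weight and threshold is fixed at translation time, inference time is bounded by network depth, which is what makes the approach attractive for time-critical applications.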

  9. Segmentation-based and rule-based spectral mixture analysis for estimating urban imperviousness

    Science.gov (United States)

    Li, Miao; Zang, Shuying; Wu, Changshan; Deng, Yingbin

    2015-03-01

    For detailed estimation of urban imperviousness, numerous image processing methods have been developed and applied to different urban areas with some success. Most of these methods, however, are global techniques. That is, they have been applied to the entire study area without considering spatial and contextual variations. To address this problem, this paper explores whether two spatio-contextual analysis techniques, namely segmentation-based and rule-based analysis, can improve urban imperviousness estimation. These two spatio-contextual techniques were incorporated into a classic urban imperviousness estimation technique, the fully-constrained linear spectral mixture analysis (FCLSMA) method. In particular, image segmentation was applied to divide the image into homogeneous segments, and spatially varying endmembers were chosen for each segment. Then FCLSMA was applied to each segment to estimate the pixel-wise fractional coverage of high-albedo material, low-albedo material, vegetation, and soil. Finally, a rule-based analysis was carried out to estimate the percent impervious surface area (%ISA). The developed technique was applied to a Landsat TM image acquired over the Milwaukee River Watershed, an urbanized watershed in Wisconsin, United States. Results indicate that the developed segmentation-based and rule-based LSMA (S-R-LSMA) outperforms traditional SMA techniques, with a mean average error (MAE) of 5.44% and R2 of 0.88. Further, a comparative analysis shows that, when compared to segmentation, rule-based analysis plays a more essential role in improving the estimation accuracy.

  10. Fuzzy rule-based macroinvertebrate habitat suitability models for running waters

    NARCIS (Netherlands)

    Broekhoven, Van E.; Adriaenssens, V.; Baets, De B.; Verdonschot, P.F.M.

    2006-01-01

    A fuzzy rule-based approach was applied to a macroinvertebrate habitat suitability modelling problem. The model design was based on a knowledge base summarising the preferences and tolerances of 86 macroinvertebrate species for four variables describing river sites in springs up to small rivers in t

  11. Rule-based category learning in children: the role of age and executive functioning.

    Science.gov (United States)

    Rabi, Rahel; Minda, John Paul

    2014-01-01

    Rule-based category learning was examined in 4-11 year-olds and adults. Participants were asked to learn a set of novel perceptual categories in a classification learning task. Categorization performance improved with age, with younger children showing the strongest rule-based deficit relative to older children and adults. Model-based analyses provided insight regarding the type of strategy being used to solve the categorization task, demonstrating that the use of the task appropriate strategy increased with age. When children and adults who identified the correct categorization rule were compared, the performance deficit was no longer evident. Executive functions were also measured. While both working memory and inhibitory control were related to rule-based categorization and improved with age, working memory specifically was found to marginally mediate the age-related improvements in categorization. When analyses focused only on the sample of children, results showed that working memory ability and inhibitory control were associated with categorization performance and strategy use. The current findings track changes in categorization performance across childhood, demonstrating at which points performance begins to mature and resemble that of adults. Additionally, findings highlight the potential role that working memory and inhibitory control may play in rule-based category learning.

  12. A Rule-based Track Anomaly Detection Algorithm for Maritime Force Protection

    Science.gov (United States)

    2014-08-01

    likely to perform better with AIS data than with primary radar data. Rule-based algorithms are transparent, easy to use, and use less computation...

  13. Applications of fuzzy sets to rule-based expert system development

    Science.gov (United States)

    Lea, Robert N.

    1989-01-01

    Problems of implementing rule-based expert systems using fuzzy sets are considered. A fuzzy logic software development shell is used that allows inclusion of both crisp and fuzzy rules in decision making and process control problems. Results are given that compare this type of expert system to a human expert in some specific applications. Advantages and disadvantages of such systems are discussed.
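A shell that mixes crisp and fuzzy rules, as described above, can be approximated in a few lines. The membership function, thresholds, and control rules below are invented placeholders for illustration, not the shell used in the paper:

```python
# Fuzzy membership of "hot": ramps linearly from 0 at 25 °C to 1 at 40 °C.
def mu_hot(temp_c):
    return min(1.0, max(0.0, (temp_c - 25.0) / 15.0))

def control_action(temp_c, door_open):
    # Crisp rule: IF door is open THEN no cooling, regardless of temperature.
    if door_open:
        return 0.0
    # Fuzzy rule: IF temperature is hot THEN cool in proportion to membership.
    return round(mu_hot(temp_c), 2)

assert control_action(40.0, door_open=True) == 0.0
assert control_action(32.5, door_open=False) == 0.5
```

The mix shows the appeal noted in the abstract: crisp rules keep hard constraints exact, while fuzzy rules produce graded outputs instead of abrupt switching.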

  14. Rule-based versus probabilistic selection for active surveillance using three definitions of insignificant prostate cancer

    NARCIS (Netherlands)

    L.D.F. Venderbos (Lionne); M.J. Roobol-Bouts (Monique); C.H. Bangma (Chris); R.C.N. van den Bergh (Roderick); L.P. Bokhorst (Leonard); D. Nieboer (Daan); Godtman, R; J. Hugosson (Jonas); van der Kwast, T; E.W. Steyerberg (Ewout)

    2016-01-01

    To study whether probabilistic selection by the use of a nomogram could improve patient selection for active surveillance (AS) compared to the various sets of rule-based AS inclusion criteria currently used. We studied Dutch and Swedish patients participating in the European Randomized s

  15. Development and Validation of a Rule-Based Strength Scaling Method for Musculoskeletal Modelling

    DEFF Research Database (Denmark)

    Oomen, Pieter; Annegarn, Janneke; Rasmussen, John

    2015-01-01

    Rule based strength scaling is an easy, cheap and relatively accurate technique to personalize musculoskeletal (MS) models. This paper presents a new strength scaling approach for MS models and validates it by maximal voluntary contractions (MVC). A heterogeneous group of 63 healthy subjects...

  16. Effects of Multimedia on Cognitive Load, Self-Efficacy, and Multiple Rule-Based Problem Solving

    Science.gov (United States)

    Zheng, Robert; McAlack, Matthew; Wilmes, Barbara; Kohler-Evans, Patty; Williamson, Jacquee

    2009-01-01

    This study investigates effects of multimedia on cognitive load, self-efficacy and learners' ability to solve multiple rule-based problems. Two hundred twenty-two college students were randomly assigned to interactive and non-interactive multimedia groups. Based on Engelkamp's multimodal theory, the present study investigates the role of…

  17. CT Image Sequence Analysis for Object Recognition - A Rule-Based 3-D Computer Vision System

    Science.gov (United States)

    Dongping Zhu; Richard W. Conners; Daniel L. Schmoldt; Philip A. Araman

    1991-01-01

    Research is now underway to create a vision system for hardwood log inspection using a knowledge-based approach. In this paper, we present a rule-based, 3-D vision system for locating and identifying wood defects using topological, geometric, and statistical attributes. A number of different features can be derived from the 3-D input scenes. These features and evidence...

  18. Haunted by a doppelgänger: irrelevant facial similarity affects rule-based judgments.

    Science.gov (United States)

    von Helversen, Bettina; Herzog, Stefan M; Rieskamp, Jörg

    2014-01-01

    Judging other people is a common and important task. Every day professionals make decisions that affect the lives of other people when they diagnose medical conditions, grant parole, or hire new employees. To prevent discrimination, professional standards require that decision makers render accurate and unbiased judgments solely based on relevant information. Facial similarity to previously encountered persons can be a potential source of bias. Psychological research suggests that people only rely on similarity-based judgment strategies if the provided information does not allow them to make accurate rule-based judgments. Our study shows, however, that facial similarity to previously encountered persons influences judgment even in situations in which relevant information is available for making accurate rule-based judgments and where similarity is irrelevant for the task and relying on similarity is detrimental. In two experiments in an employment context we show that applicants who looked similar to high-performing former employees were judged as more suitable than applicants who looked similar to low-performing former employees. This similarity effect was found despite the fact that the participants used the relevant résumé information about the applicants by following a rule-based judgment strategy. These findings suggest that similarity-based and rule-based processes simultaneously underlie human judgment.

  19. Evaluation of Machine Learning and Rules-Based Approaches for Predicting Antimicrobial Resistance Profiles in Gram-negative Bacilli from Whole Genome Sequence Data

    Directory of Open Access Journals (Sweden)

    Mitchell Pesesky

    2016-11-01

    factors and incomplete genome assembly confounded the rules-based algorithm, resulting in predictions based on gene family, rather than on knowledge of the specific variant found. Low-frequency resistance caused errors in the machine-learning algorithm because those genes were not seen or seen infrequently in the test set. We also identified an example of variability in the phenotype-based results that led to disagreement with both genotype-based methods. Genotype-based antimicrobial susceptibility testing shows great promise as a diagnostic tool, and we outline specific research goals to further refine this methodology.

  20. Evaluation of Machine Learning and Rules-Based Approaches for Predicting Antimicrobial Resistance Profiles in Gram-negative Bacilli from Whole Genome Sequence Data.

    Science.gov (United States)

    Pesesky, Mitchell W; Hussain, Tahir; Wallace, Meghan; Patel, Sanket; Andleeb, Saadia; Burnham, Carey-Ann D; Dantas, Gautam

    2016-01-01

    incomplete genome assembly confounded the rules-based algorithm, resulting in predictions based on gene family, rather than on knowledge of the specific variant found. Low-frequency resistance caused errors in the machine-learning algorithm because those genes were not seen or seen infrequently in the test set. We also identified an example of variability in the phenotype-based results that led to disagreement with both genotype-based methods. Genotype-based antimicrobial susceptibility testing shows great promise as a diagnostic tool, and we outline specific research goals to further refine this methodology.

  1. Spatial Queries Entity Recognition and Disambiguation Using Rule-Based Approach

    Science.gov (United States)

    Hamzei, E.; Hakimpour, F.; Forati, A.

    2015-12-01

    In the digital world, search engines have become a challenging research area. One of the main issues in search-engine studies is query processing, whose aim is to understand the user's needs. If an unsuitable spatial query processing approach is employed, the results will carry a high degree of ambiguity. To avoid such ambiguity, in this paper we present a new algorithm that relies on rule-based systems to process queries. Our algorithm is implemented in three basic steps: deductively and iteratively splitting the query; finding candidates for the location names, the location types and the spatial relationships; and finally checking the relationships logically and conceptually using a rule-based system. As we show in the paper, the proposed method has two major advantages: search engines can provide spatial analysis based on the specific process, and, because of its disambiguation technique, the user reaches more desirable results.
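The three steps above (splitting the query, finding candidates, checking the combination with rules) can be sketched as follows. The vocabularies and the validity rule are invented placeholders, not the authors' linguistic resources:

```python
# Hypothetical candidate lists standing in for real gazetteers and lexicons.
LOCATION_TYPES = {"restaurant", "hotel", "park"}
SPATIAL_RELATIONS = {"near", "in", "inside"}
GAZETTEER = {"tehran", "paris"}

def parse_spatial_query(query):
    tokens = query.lower().split()                 # step 1: split the query
    parsed = {"type": None, "relation": None, "place": None}
    for tok in tokens:                             # step 2: find candidates
        if tok in LOCATION_TYPES:
            parsed["type"] = tok
        elif tok in SPATIAL_RELATIONS:
            parsed["relation"] = tok
        elif tok in GAZETTEER:
            parsed["place"] = tok
    # step 3: a rule checks that the combination is logically complete
    parsed["valid"] = all(parsed.values())
    return parsed

print(parse_spatial_query("restaurant near Tehran"))
```

A query that fails the final rule (e.g. a place name with no relation) would be flagged as ambiguous rather than answered blindly, which is the disambiguation benefit claimed in the abstract.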

  2. Similarity and rules United: similarity- and rule-based processing in a single neural network.

    Science.gov (United States)

    Verguts, Tom; Fias, Wim

    2009-03-01

    A central controversy in cognitive science concerns the roles of rules versus similarity. To gain some leverage on this problem, we propose that rule- versus similarity-based processes can be characterized as extremes in a multidimensional space that is composed of at least two dimensions: the number of features (Pothos, 2005) and the physical presence of features. The transition of similarity- to rule-based processing is conceptualized as a transition in this space. To illustrate this, we show how a neural network model uses input features (and in this sense produces similarity-based responses) when it has a low learning rate or in the early phases of training, but it switches to using self-generated, more abstract features (and in this sense produces rule-based responses) when it has a higher learning rate or is in the later phases of training. Relations with categorization and the psychology of learning are pointed out.

  3. Changing from a Rules-based to a Principles-based Accounting Logic: A Review

    Directory of Open Access Journals (Sweden)

    Marta Silva Guerreiro

    2014-06-01

    Full Text Available We explore influences on unlisted companies when Portugal moved from a code law, rules-based accounting system, to a principles-based accounting system of adapted International Financial Reporting Standards (IFRS). Institutionalisation of the new principles-based system was generally facilitated by a socio-economic and political context that increasingly supported IFRS logic. This helped central actors gain political opportunity, mobilise important allies, and accommodate major protagonists. The preparedness of unlisted companies to adopt the new IFRS-based accounting system voluntarily was explained by their desire to maintain social legitimacy. However, it was affected negatively by the embeddedness of rule-based practices in the ‘old’ prevailing institutional logic.

  4. Knowledge representation and rule-based solution system for dynamic programming model

    Institute of Scientific and Technical Information of China (English)

    胡祥培; 王旭茵

    2003-01-01

    A knowledge representation has been proposed using the state-space theory of Artificial Intelligence for the Dynamic Programming Model, in which a model can be defined as a six-tuple M = (I, G, O, T, D, S). A building-block modelling method uses the modules of the six-tuple to form a rule-based solution model. Moreover, a rule-based system has been designed and set up to solve the Dynamic Programming Model. This knowledge-based representation can easily express symbolic knowledge and the dynamic characteristics of the Dynamic Programming Model, and the inference based on this knowledge in the process of solving the model can also be conveniently realized in a computer.

  5. A Belief Rule Based Expert System to Assess Tuberculosis under Uncertainty.

    Science.gov (United States)

    Hossain, Mohammad Shahadat; Ahmed, Faisal; Fatema-Tuj-Johora; Andersson, Karl

    2017-03-01

    The primary diagnosis of Tuberculosis (TB) is usually carried out by looking at the various signs and symptoms of a patient. However, these signs and symptoms cannot be measured with 100% certainty, since they are associated with various types of uncertainty such as vagueness, imprecision, randomness, ignorance and incompleteness. Consequently, traditional primary diagnosis based on these signs and symptoms, as carried out by physicians, cannot deliver reliable results. Therefore, this article presents the design, development and application of a Belief Rule Based Expert System (BRBES) with the ability to handle various types of uncertainty to diagnose TB. The knowledge base of this system is constructed from experts' suggestions and by analyzing historical data of TB patients. Experiments carried out on data from 100 patients demonstrate that the BRBES's results are more reliable than those of a human expert as well as a fuzzy rule-based expert system.
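A belief rule differs from an ordinary IF-THEN rule in that its consequent is a belief distribution over outcomes rather than a single conclusion. The toy evaluation below is loosely in the BRBES spirit; the matching scheme, weights, and numbers are illustrative, not the system's actual inference (which uses evidential reasoning):

```python
def evaluate_belief_rule(symptom_match_degrees, rule_weights, consequent_beliefs):
    """Combine the antecedent matching degrees into one activation weight,
    then scale the consequent belief distribution by that activation."""
    activation = 1.0
    for degree, weight in zip(symptom_match_degrees, rule_weights):
        activation *= degree ** weight          # weighted multiplicative matching
    return {outcome: round(activation * b, 3)
            for outcome, b in consequent_beliefs.items()}

# Hypothetical rule: IF cough is "high" AND fever is "medium"
# THEN TB is {likely: 0.7, unlikely: 0.3}
beliefs = evaluate_belief_rule(
    symptom_match_degrees=[0.9, 0.6],   # how well the patient matches each antecedent
    rule_weights=[1.0, 1.0],
    consequent_beliefs={"likely": 0.7, "unlikely": 0.3},
)
print(beliefs)
```

Because the output is a distribution, partial and conflicting symptom evidence degrades the conclusion gracefully instead of forcing a binary verdict.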

  6. A Belief Rule Based Expert System to Assess Mental Disorder under Uncertainty

    DEFF Research Database (Denmark)

    Hossain, Mohammad Shahadat; Afif Monrat, Ahmed; Hasan, Mamun;

    2016-01-01

    Mental disorder is a change of mental or behavioral pattern that causes sufferings and impairs the ability to function in ordinary life. In psychopathology, the assessment methods of mental disorder contain various types of uncertainties associated with signs and symptoms. This study identifies...... a method that addresses the issue of uncertainty in assessing mental disorder. The fuzzy logic knowledge representation schema can address uncertainty associated with linguistic terms including ambiguity, imprecision, and vagueness. However, fuzzy logic is incapable of addressing uncertainty due...... to ignorance, incompleteness, and randomness. So, a belief rule-based expert system (BRBES) has been designed and developed with the capability of handling the uncertainties mentioned. Evidential reasoning works as the inference engine and the belief rule base as the knowledge representation schema...

  7. Fifty years of computer analysis in chest imaging: rule-based, machine learning, deep learning.

    Science.gov (United States)

    van Ginneken, Bram

    2017-03-01

    Half a century ago, the term "computer-aided diagnosis" (CAD) was introduced in the scientific literature. Pulmonary imaging, with chest radiography and computed tomography, has always been one of the focus areas in this field. In this study, I describe how machine learning became the dominant technology for tackling CAD in the lungs, generally producing better results than do classical rule-based approaches, and how the field is now rapidly changing: in the last few years, we have seen how even better results can be obtained with deep learning. The key differences among rule-based processing, machine learning, and deep learning are summarized and illustrated for various applications of CAD in the chest.

  8. Clustering and rule-based classifications of chemical structures evaluated in the biological activity space.

    Science.gov (United States)

    Schuffenhauer, Ansgar; Brown, Nathan; Ertl, Peter; Jenkins, Jeremy L; Selzer, Paul; Hamon, Jacques

    2007-01-01

    Classification methods for data sets of molecules according to their chemical structure were evaluated for their biological relevance, including rule-based, scaffold-oriented classification methods and clustering based on molecular descriptors. Three data sets resulting from uniformly determined in vitro biological profiling experiments were classified according to their chemical structures, and the results were compared in a Pareto analysis with the number of classes and their average spread in the profile space as two concurrent objectives which were to be minimized. It has been found that no classification method is overall superior to all other studied methods, but there is a general trend that rule-based, scaffold-oriented methods are the better choice if classes with homogeneous biological activity are required, but a large number of clusters can be tolerated. On the other hand, clustering based on chemical fingerprints is superior if fewer and larger classes are required, and some loss of homogeneity in biological activity can be accepted.

  9. A rule based comprehensive approach for reconfiguration of electrical distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Jizhong [College of Electrical Engineering, Chongqing University (China); AREVA T and D Inc., 10865 Willows Road NE, Redmond, WA 98052 (United States)]; Xiong, Xiaofu; Zhang, Jun [College of Electrical Engineering, Chongqing University (China)]; Shen, Guanquan; Xu, Qiuping; Xue, Yi [Guiyang South Power Supply Bureau, China Southern Power Grid (China)]

    2009-02-15

    This paper proposes a rule based comprehensive approach to study distribution network reconfiguration (DNRC). The DNRC model with line power constraints is set up, in which the objective is to minimize the system power loss. In order to get the precise branch current and system power loss, a power summation based radiation distribution network load flow (PSRDNLF) method is applied in the study. The rules that are used to select the optimal reconfiguration of distribution network are formed based on the system operation experiences. The proposed rule based comprehensive approach is implemented in distribution network in Guiyang South Power Supply Bureau. For the purpose of illustrating the proposed approach, two distribution network systems are tested and analyzed in the paper. (author)

  10. Fuzzy rule-based models for decision support in ecosystem management.

    Science.gov (United States)

    Adriaenssens, Veronique; De Baets, Bernard; Goethals, Peter L M; De Pauw, Niels

    2004-02-05

    To facilitate decision support in the ecosystem management, ecological expertise and site-specific data need to be integrated. Fuzzy logic can deal with highly variable, linguistic, vague and uncertain data or knowledge and, therefore, has the ability to allow for a logical, reliable and transparent information stream from data collection down to data usage in decision-making. Several environmental applications already implicate the use of fuzzy logic. Most of these applications have been set up by trial and error and are mainly limited to the domain of environmental assessment. In this article, applications of fuzzy logic for decision support in ecosystem management are reviewed and assessed, with an emphasis on rule-based models. In particular, the identification, optimisation, validation, the interpretability and uncertainty aspects of fuzzy rule-based models for decision support in ecosystem management are discussed.

  11. Towards a framework for threaded inference in rule-based systems

    Directory of Open Access Journals (Sweden)

    Luis Casillas Santillan

    2013-11-01

    Full Text Available Information and communication technologies have advanced significantly and at a fast pace in both performance and pervasiveness. Knowledge has become a significant asset for organizations, which need to deal with large amounts of data and information to produce valuable knowledge. Dealing with knowledge is becoming the axis for organizations in the new economy. One way to achieve the goal of knowledge management is the use of rule-based systems; this kind of approach is a new opportunity for expert-systems technology. Modern languages and cheap computing allow the implementation of concurrent systems for dealing with huge volumes of information in organizations. The present work proposes the use of contemporary programming elements, such as easy-to-exploit threading, when implementing rule-based treatment of huge data volumes.
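The threading idea above can be sketched minimally: each rule is evaluated against the working memory in its own thread, and fired rules are collected through a thread-safe queue. The facts and rules are invented examples, and a production system would add conflict resolution on top:

```python
import threading
from queue import Queue

# Hypothetical working memory and rule base.
facts = {"temperature": 80, "pressure": 12}
rules = [
    ("overheat", lambda f: f["temperature"] > 75),
    ("low_pressure", lambda f: f["pressure"] < 10),
    ("normal", lambda f: f["temperature"] <= 75 and f["pressure"] >= 10),
]

def fire_matching_rules(facts, rules):
    fired = Queue()                       # Queue.put is thread-safe
    def evaluate(name, condition):
        if condition(facts):
            fired.put(name)
    threads = [threading.Thread(target=evaluate, args=r) for r in rules]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(fired.queue)            # deterministic order for inspection

print(fire_matching_rules(facts, rules))
```

With independent rules the condition checks are embarrassingly parallel, which is exactly the property the article proposes to exploit for large data volumes.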

  12. Auto-control of pumping operations in sewerage systems by rule-based fuzzy neural networks

    OpenAIRE

    Chiang, Y.-M.; Chang, L.-C.; Tsai, M.-J.; Wang, Y. -F.; Chang, F.-J.

    2011-01-01

    Pumping stations play an important role in flood mitigation in metropolitan areas. The existing sewerage systems, however, are facing a great challenge of fast rising peak flow resulting from urbanization and climate change. It is imperative to construct an efficient and accurate operating prediction model for pumping stations to simulate the drainage mechanism for discharging the rainwater in advance. In this study, we propose two rule-based fuzzy neural networks, adaptive neuro-fuzzy infere...

  13. A knowledge representation meta-model for rule-based modelling of signalling networks

    Directory of Open Access Journals (Sweden)

    Adrien Basso-Blandin

    2016-03-01

    Full Text Available The study of cellular signalling pathways and their deregulation in disease states, such as cancer, is a large and extremely complex task. Indeed, these systems involve many parts and processes but are studied piecewise and their literatures and data are consequently fragmented, distributed and sometimes—at least apparently—inconsistent. This makes it extremely difficult to build significant explanatory models with the result that effects in these systems that are brought about by many interacting factors are poorly understood. The rule-based approach to modelling has shown some promise for the representation of the highly combinatorial systems typically found in signalling where many of the proteins are composed of multiple binding domains, capable of simultaneous interactions, and/or peptide motifs controlled by post-translational modifications. However, the rule-based approach requires highly detailed information about the precise conditions for each and every interaction which is rarely available from any one single source. Rather, these conditions must be painstakingly inferred and curated, by hand, from information contained in many papers—each of which contains only part of the story. In this paper, we introduce a graph-based meta-model, attuned to the representation of cellular signalling networks, which aims to ease this massive cognitive burden on the rule-based curation process. This meta-model is a generalization of that used by Kappa and BNGL which allows for the flexible representation of knowledge at various levels of granularity. In particular, it allows us to deal with information which has either too little, or too much, detail with respect to the strict rule-based meta-model. Our approach provides a basis for the gradual aggregation of fragmented biological knowledge extracted from the literature into an instance of the meta-model from which we can define an automated translation into executable Kappa programs.

  14. Spatial Rule-Based Modeling: A Method and Its Application to the Human Mitotic Kinetochore

    Directory of Open Access Journals (Sweden)

    Jan Huwald

    2013-07-01

    Full Text Available A common problem in the analysis of biological systems is the combinatorial explosion that emerges from the complexity of multi-protein assemblies. Conventional formalisms, like differential equations, Boolean networks and Bayesian networks, are unsuitable for dealing with the combinatorial explosion, because they are designed for a restricted state space with fixed dimensionality. To overcome this problem, the rule-based modeling language, BioNetGen, and the spatial extension, SRSim, have been developed. Here, we describe how to apply rule-based modeling to integrate experimental data from different sources into a single spatial simulation model and how to analyze the output of that model. The starting point for this approach can be a combination of molecular interaction data, reaction network data, proximities, binding and diffusion kinetics and molecular geometries at different levels of detail. We describe the technique and then use it to construct a model of the human mitotic inner and outer kinetochore, including the spindle assembly checkpoint signaling pathway. This allows us to demonstrate the utility of the procedure, show how a novel perspective for understanding such complex systems becomes accessible and elaborate on challenges that arise in the formulation, simulation and analysis of spatial rule-based models.

  15. Using reduced rule base with Expert System for the diagnosis of disease in hypertension.

    Science.gov (United States)

    Başçiftçi, Fatih; Eldem, Ayşe

    2013-12-01

    Hypertension, also called the "Silent Killer", is a dangerous and widespread disease that seriously threatens the health of individuals and communities worldwide, often leading to fatal outcomes such as heart attack, stroke, and renal failure. It affects approximately one billion people worldwide with increasing incidence. In Turkey, over 15 million people have hypertension. In this study, a new Medical Expert System (MES) procedure with a reduced rule base was developed to determine hypertension. The aim was to determine the disease by taking all symptoms of hypertension into account in the Medical Expert System (7 symptoms, 2^7 = 128 different conditions). In this new MES procedure, instead of checking all the symptoms, reduced rule bases were used. To obtain the reduced rule bases, the method of two-level simplification of Boolean functions was used. Through this method, instead of assessing all 2^7 = 128 individual conditions over the 7 symptoms of hypertension, reduced cases were evaluated. The average rate of success was 97.6% with the new MES procedure.
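The reduction idea above can be shown on a smaller scale. The example below uses 3 invented symptoms (2^3 = 8 cases) rather than the paper's 7, and a made-up diagnostic function: the exhaustive truth table collapses to two rules that cover all 8 rows:

```python
from itertools import product

def full_table_diagnosis():
    """Exhaustive rule base: one entry per symptom combination (2^3 = 8 rows)."""
    table = {}
    for s1, s2, s3 in product([0, 1], repeat=3):
        table[(s1, s2, s3)] = 1 if (s1 or (s2 and s3)) else 0
    return table

def reduced_diagnosis(s1, s2, s3):
    """Reduced rule base: the implicants s1 and s2*s3 replace all 8 rows."""
    return 1 if (s1 or (s2 and s3)) else 0

# The reduced rules agree with the full table on every combination.
table = full_table_diagnosis()
assert all(reduced_diagnosis(*k) == v for k, v in table.items())
```

Two-level Boolean simplification (e.g. Karnaugh maps or Quine-McCluskey) finds such implicants systematically, which is how 128 conditions can shrink to a handful of reduced rules without changing any diagnosis.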

  16. Strategies for adding adaptive learning mechanisms to rule-based diagnostic expert systems

    Science.gov (United States)

    Stclair, D. C.; Sabharwal, C. L.; Bond, W. E.; Hacke, Keith

    1988-01-01

    Rule-based diagnostic expert systems can be used to perform many of the diagnostic chores necessary in today's complex space systems. These expert systems typically take a set of symptoms as input and produce diagnostic advice as output. The primary objective of such expert systems is to provide accurate and comprehensive advice which can be used to help return the space system in question to nominal operation. The development and maintenance of diagnostic expert systems is time and labor intensive, since the services of both knowledge engineer(s) and domain expert(s) are required. The use of adaptive learning mechanisms to incrementally evaluate and refine rules promises to reduce both the time and labor costs associated with such systems. This paper describes the basic adaptive learning mechanisms of strengthening, weakening, generalization, discrimination, and discovery. Next, basic strategies are discussed for adding these learning mechanisms to rule-based diagnostic expert systems. These strategies support the incremental evaluation and refinement of rules in the knowledge base by comparing the set of advice given by the expert system (A) with the correct diagnosis (C). Techniques are described for selecting those rules in the knowledge base which should participate in adaptive learning. The strategies presented may be used with a wide variety of learning algorithms. Further, these strategies are applicable to a large number of rule-based diagnostic expert systems. They may be used to provide either immediate or deferred updating of the knowledge base.
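
    The strengthening and weakening mechanisms can be sketched as a weight update driven by comparing the advice set (A) with the correct diagnosis (C); the rule structure, field names, and learning rate below are illustrative assumptions, not the paper's representation:

```python
def adapt_rules(rules, advice, correct, step=0.1):
    """Strengthen rules whose conclusion matched the correct diagnosis (C),
    weaken rules that contributed wrong advice to the advice set (A)."""
    for rule in rules:
        if rule['conclusion'] not in advice:
            continue  # rule did not contribute advice; leave it untouched
        if rule['conclusion'] == correct:
            rule['strength'] = min(1.0, rule['strength'] + step)  # strengthen
        else:
            rule['strength'] = max(0.0, rule['strength'] - step)  # weaken
    return rules

# Hypothetical diagnostic rules with confidence weights.
rules = [
    {'conclusion': 'pump_fault',   'strength': 0.5},
    {'conclusion': 'sensor_drift', 'strength': 0.5},
]
adapt_rules(rules, advice={'pump_fault', 'sensor_drift'}, correct='pump_fault')
# pump_fault is strengthened toward 0.6; sensor_drift is weakened toward 0.4
```

    Generalization, discrimination, and discovery would additionally edit the rule conditions themselves; this sketch covers only the two weight-based mechanisms.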

  17. An algebraic approach to revising propositional rule-based knowledge bases

    Institute of Scientific and Technical Information of China (English)

    LUAN ShangMin; DAI GuoZhong

    2008-01-01

    One of the important topics in knowledge base revision is to introduce an efficient implementation algorithm. Algebraic approaches have good characteristics and implementation methods, which makes them a natural choice for this problem. An algebraic approach to revising propositional rule-based knowledge bases is presented in this paper. First, a way is introduced to transform a propositional rule-based knowledge base into a Petri net: the knowledge base is represented by a Petri net, and the facts are represented by the initial marking. Thus, the consistency check of a knowledge base is equivalent to the reachability problem of Petri nets. The reachability of Petri nets can be decided by whether the state equation has a solution; hence the consistency check can also be implemented by an algebraic approach. Furthermore, algorithms are introduced to revise a propositional rule-based knowledge base, as well as extended logic programs. Compared with related works, the algorithms presented in the paper are efficient, and the time complexities of these algorithms are polynomial.
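
    The state-equation check can be sketched as follows; the two-place toy net encoding a single rule a → b is illustrative, not the paper's transformation:

```python
def reached_marking(m0, incidence, firing_counts):
    """Evaluate the Petri-net state equation  m' = m0 + C*x,
    where C is the place-by-transition incidence matrix and
    x counts how often each transition fires."""
    return [m0[p] + sum(incidence[p][t] * firing_counts[t]
                        for t in range(len(firing_counts)))
            for p in range(len(m0))]

# Toy net for the rule "a -> b": one transition consumes a token in
# place a and produces one in place b.
C = [[-1],   # place a: the transition consumes a token here
     [+1]]   # place b: the transition produces a token here
m0 = [1, 0]  # initial marking: the fact a holds
print(reached_marking(m0, C, [1]))  # [0, 1]: the marking where b holds
```

    In general, a solution of the state equation is a necessary condition for reachability; restricting to net classes where it is also sufficient is what makes the purely algebraic consistency check possible.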

  18. Functional network construction in Arabidopsis using rule-based machine learning on large-scale data sets.

    Science.gov (United States)

    Bassel, George W; Glaab, Enrico; Marquez, Julietta; Holdsworth, Michael J; Bacardit, Jaume

    2011-09-01

    The meta-analysis of large-scale postgenomics data sets within public databases promises to provide important novel biological knowledge. Statistical approaches including correlation analyses in coexpression studies of gene expression have emerged as tools to elucidate gene function using these data sets. Here, we present a powerful and novel alternative methodology to computationally identify functional relationships between genes from microarray data sets using rule-based machine learning. This approach, termed "coprediction," is based on the collective ability of groups of genes co-occurring within rules to accurately predict the developmental outcome of a biological system. We demonstrate the utility of coprediction as a powerful analytical tool using publicly available microarray data generated exclusively from Arabidopsis thaliana seeds to compute a functional gene interaction network, termed Seed Co-Prediction Network (SCoPNet). SCoPNet predicts functional associations between genes acting in the same developmental and signal transduction pathways irrespective of the similarity in their respective gene expression patterns. Using SCoPNet, we identified four novel regulators of seed germination (ALTERED SEED GERMINATION5, 6, 7, and 8), and predicted interactions at the level of transcript abundance between these novel and previously described factors influencing Arabidopsis seed germination. An online Web tool to query SCoPNet has been developed as a community resource to dissect seed biology and is available at http://www.vseed.nottingham.ac.uk/.
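
    The coprediction idea of scoring gene pairs by their co-occurrence in accurate rules can be sketched as follows (the rule list, gene symbols, and accuracies are invented for illustration; SCoPNet's actual scoring is more involved):

```python
from collections import Counter
from itertools import combinations

def coprediction_scores(rules):
    """Score gene pairs by how often they co-occur in predictive rules:
    each rule is (set_of_genes, predictive_accuracy). Pairs that keep
    appearing together in accurate rules accumulate high scores."""
    scores = Counter()
    for genes, accuracy in rules:
        for pair in combinations(sorted(genes), 2):
            scores[pair] += accuracy
    return scores

# Hypothetical rules extracted from a trained rule-based classifier.
rules = [({'GA1', 'ABI5'}, 0.9),
         ({'GA1', 'ABI5', 'DOG1'}, 0.8),
         ({'DOG1', 'PHYB'}, 0.7)]
scores = coprediction_scores(rules)
# ('ABI5', 'GA1') accumulates 0.9 + 0.8, the strongest predicted link
```

    Note that the score is driven by joint predictive power, not by expression similarity, which is why coprediction can link genes whose expression patterns differ.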

  19. Rule-based graph theory to enable exploration of the space system architecture design space

    Science.gov (United States)

    Arney, Dale Curtis

    network flow problems in the past, where nodes represent physical locations and edges represent the means by which information or vehicles travel between those locations. In space system architecting, expressing the physical locations (low-Earth orbit, low-lunar orbit, etc.) and steady states (interplanetary trajectory) as nodes and the different means of moving between the nodes (propulsive maneuvers, etc.) as edges formulates a mathematical representation of this design space. The selection of a given system architecture using graph theory entails defining the paths that the systems take through the space system architecture graph. A path through the graph is defined as a list of edges that are traversed, which in turn defines functions performed by the system. A structure to compactly represent this information is a matrix, called the system map, in which the column indices are associated with the systems that exist and row indices are associated with the edges, or functions, to which each system has access. Several contributions have been added to the state of the art in space system architecture analysis. The framework adds the capability to rapidly explore the design space without the need to limit trade options or the need for user interaction during the exploration process. The unique mathematical representation of a system architecture, through the use of the adjacency, incidence, and system map matrices, enables automated design space exploration using stochastic optimization processes. The innovative rule-based graph traversal algorithm ensures functional feasibility of each system architecture that is analyzed, and the automatic generation of the system hierarchy eliminates the need for the user to manually determine the relationships between systems during or before the design space exploration process. 
Finally, the rapid evaluation of system architectures for various mission types enables analysis of the system architecture design space for multiple

  20. Exact hybrid particle/population simulation of rule-based models of biochemical systems.

    Directory of Open Access Journals (Sweden)

    Justin S Hogg

    2014-04-01

    Full Text Available Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This "network-free" approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of "partial network expansion" into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that

  1. Forecasting Peak Load Electricity Demand Using Statistics and Rule Based Approach

    Directory of Open Access Journals (Sweden)

    Z. Ismail

    2009-01-01

    Full Text Available Problem statement: Forecasting of electricity load demand is an essential activity and an important function in power system planning and development. It is a prerequisite to power system expansion planning, as the world of electricity is dominated by substantial lead times between decision making and its implementation. The importance of demand forecasting needs to be emphasized at all levels, as the consequences of under- or over-forecasting the demand are serious and will affect all stakeholders in the electricity supply industry. Approach: If underestimated, the result is serious, since plant installation cannot easily be advanced; this affects the economy and business and leads to loss of time and image. If overestimated, there is a financial penalty for the excess capacity and a waste of resources. Therefore, this study aimed to develop a new forecasting model for electricity load demand which minimizes the forecasting error. In this study, we explored the development of a rule-based method for forecasting electricity peak load demand. The rule-based system synergized the human reasoning style of fuzzy systems through the use of a set of rules consisting of IF-THEN approximators with a learning and connectionist structure. Prior to the implementation of the rule-based models, a SARIMAT model and regression time series were used. Results: Modifying the basic regression model and modeling it using Box-Jenkins autoregressive errors produced a satisfactory and adequate model with a 2.41% forecasting error. With rule-based forecasting, one can apply forecaster expertise and domain knowledge appropriate to the conditions of the time series. Conclusion: This study showed a significant improvement in forecast accuracy when compared with the traditional time series model. Good domain knowledge of the experts contributed to the increase in forecast accuracy.
In general, the improvement will depend on the conditions of the data
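
    The layering of IF-THEN expert rules on top of a statistical base forecast can be sketched like this (the rule conditions and adjustment factors are invented placeholders, not the study's rules):

```python
def rule_based_forecast(base_forecast_mw, context, rules):
    """Apply IF-THEN expert rules on top of a statistical base forecast."""
    forecast = base_forecast_mw
    for condition, adjust in rules:
        if condition(context):       # IF part: does the rule fire for this day?
            forecast = adjust(forecast)  # THEN part: adjust the forecast
    return forecast

# Hypothetical expert rules for peak load demand.
rules = [
    (lambda c: c['is_holiday'],  lambda f: f * 0.85),  # holidays lower peak demand
    (lambda c: c['temp_c'] > 33, lambda f: f * 1.10),  # hot days add cooling load
]
print(rule_based_forecast(1000.0, {'is_holiday': False, 'temp_c': 35}, rules))  # 1100.0
```

    In practice the base forecast would come from the fitted regression or SARIMA-type model, with the rules encoding the forecaster's domain knowledge about calendar and weather conditions.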

  2. Compensatory Processing During Rule-Based Category Learning in Older Adults

    Science.gov (United States)

    Bharani, Krishna L.; Paller, Ken A.; Reber, Paul J.; Weintraub, Sandra; Yanar, Jorge; Morrison, Robert G.

    2016-01-01

    Healthy older adults typically perform worse than younger adults at rule-based category learning, but better than patients with Alzheimer's or Parkinson's disease. To further investigate aging's effect on rule-based category learning, we monitored event-related potentials (ERPs) while younger and neuropsychologically typical older adults performed a visual category-learning task with a rule-based category structure and trial-by-trial feedback. Using these procedures, we previously identified ERPs sensitive to categorization strategy and accuracy in young participants. In addition, previous studies have demonstrated the importance of neural processing in the prefrontal cortex and the medial temporal lobe for this task. In this study, older adults showed lower accuracy and longer response times than younger adults, but there were two distinct subgroups of older adults. One subgroup showed near-chance performance throughout the procedure, never categorizing accurately. The other subgroup reached asymptotic accuracy that was equivalent to that in younger adults, although they categorized more slowly. These two subgroups were further distinguished via ERPs. Consistent with the compensation theory of cognitive aging, older adults who successfully learned showed larger frontal ERPs when compared with younger adults. Recruitment of prefrontal resources may have improved performance while slowing response times. Additionally, correlations of feedback-locked P300 amplitudes with category-learning accuracy differentiated successful younger and older adults. Overall, the results suggest that the ability to adapt one's behavior in response to feedback during learning varies across older individuals, and that the failure of some to adapt their behavior may reflect inadequate engagement of prefrontal cortex. PMID:26422522

  3. Evaluation of a rule base for decision making in general practice.

    Science.gov (United States)

    Essex, B; Healy, M

    1994-01-01

    BACKGROUND. Decision making in general practice relies heavily on judgmental expertise. It should be possible to codify this expertise into rules and principles. AIM. A study was undertaken to evaluate the effectiveness of rules from a rule base designed to improve students' and trainees' management decisions relating to patients seen in general practice. METHOD. The rule base was developed after studying the decisions about and management of thousands of patients seen in one general practice over an eight year period. Vignettes were presented to 93 fourth year medical students and 179 general practitioner trainees. They recorded their perception and management of each case before and after being presented with a selection of relevant rules. Participants also commented on their level of agreement with each of the rules provided with the vignettes. A panel of five independent assessors then rated as good, acceptable or poor the participants' perception and management of each case before and after seeing the rules. RESULTS. Exposure to a few selected rules of thumb improved the problem perception and management decisions of both undergraduates and trainees. The degree of improvement was not related to previous experience or to the stated level of agreement with the proposed rules. The assessors identified difficulties students and trainees experienced in changing their perceptions and management decisions when the rules suggested options they had not considered. CONCLUSION. The rules developed to improve decision making skills in general practice are effective when used with vignettes. The next phase is to transform the rule base into an expert system to train students and doctors to acquire decision making skills. It could also be used to provide decision support when confronted with difficult management decisions in general practice. PMID:8204334

  4. Rule Based Approach for Arabic Part of Speech Tagging and Name Entity Recognition

    Directory of Open Access Journals (Sweden)

    Mohammad Hjouj Btoush

    2016-06-01

    Full Text Available The aim of this study is to build a tool for Part of Speech (POS) tagging and Named Entity Recognition for the Arabic language; the approach used to build this tool is a rule-based technique. The POS tagger contains two phases: the first phase passes each word through a lexicon; the second is the morphological phase. The tagset is (Noun, Verb, and Determiner). The named-entity detector applies rules to the text and assigns the correct label to each word; the labels are Person (PERS), Location (LOC), and Organization (ORG).
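
    The two-phase lexicon-then-morphology design can be sketched as follows; the transliterated lexicon entries and suffix rules are invented placeholders for the tool's actual Arabic resources:

```python
# Phase 1: lexicon lookup; Phase 2: fall back to morphological suffix rules.
LEXICON = {'kitab': 'Noun', 'kataba': 'Verb', 'al': 'Determiner'}  # hypothetical entries
SUFFIX_RULES = [('wn', 'Noun'),   # hypothetical: sound masculine plural ending
                ('at', 'Noun')]   # hypothetical: feminine plural ending

def pos_tag(word):
    if word in LEXICON:               # phase 1: lexicon phase
        return LEXICON[word]
    for suffix, tag in SUFFIX_RULES:  # phase 2: morphological phase
        if word.endswith(suffix):
            return tag
    return 'Noun'                     # default tag for unknown words

print([pos_tag(w) for w in ['al', 'kataba', 'muhandiswn']])
# ['Determiner', 'Verb', 'Noun']
```

    The named-entity rules would run as a further pass over the tagged words, assigning PERS, LOC, or ORG labels.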

  5. HIV-GRADE: a publicly available, rules-based drug resistance interpretation algorithm integrating bioinformatic knowledge.

    Science.gov (United States)

    Obermeier, Martin; Pironti, Alejandro; Berg, Thomas; Braun, Patrick; Däumer, Martin; Eberle, Josef; Ehret, Robert; Kaiser, Rolf; Kleinkauf, Niels; Korn, Klaus; Kücherer, Claudia; Müller, Harm; Noah, Christian; Stürmer, Martin; Thielen, Alexander; Wolf, Eva; Walter, Hauke

    2012-01-01

    Genotypic drug resistance testing provides essential information for guiding treatment in HIV-infected patients. It may either be used for identifying patients with transmitted drug resistance or to clarify reasons for treatment failure and to check for remaining treatment options. While different approaches for the interpretation of HIV sequence information are already available, no other available rules-based systems specifically have looked into the effects of combinations of drugs. HIV-GRADE (Genotypischer Resistenz Algorithmus Deutschland) was planned as a countrywide approach to establish standardized drug resistance interpretation in Germany and also to introduce rules for estimating the influence of mutations on drug combinations. The rules for HIV-GRADE are taken from the literature, clinical follow-up data and from a bioinformatics-driven interpretation system (geno2pheno([resistance])). HIV-GRADE presents the option of seeing the rules and results of other drug resistance algorithms for a given sequence simultaneously. The HIV-GRADE rules-based interpretation system was developed by the members of the HIV-GRADE registered society. For continuous updates, this expert committee meets twice a year to analyze data from various sources. Besides data from clinical studies and the centers involved, published correlations for mutations with drug resistance and genotype-phenotype correlation data information from the bioinformatic models of geno2pheno are used to generate the rules for the HIV-GRADE interpretation system. A freely available online tool was developed on the basis of the Stanford HIVdb rules interpretation tool using the algorithm specification interface. Clinical validation of the interpretation system was performed on the data of treatment episodes consisting of sequence information, antiretroviral treatment and viral load, before and 3 months after treatment change. Data were analyzed using multiple linear regression. As the developed online
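
    A rules-based interpretation of this kind can be sketched as a scored rule table (the penalty scores and cut-offs below are invented for illustration and are not HIV-GRADE's actual rules; M184V, K65R, and M41L/T215Y are real resistance-associated mutations):

```python
# Hypothetical rule table: (required mutation pattern, drug, penalty score).
RULES = [
    ({'M184V'},         'lamivudine', 60),
    ({'K65R'},          'tenofovir',  30),
    ({'M41L', 'T215Y'}, 'zidovudine', 45),
]

def resistance_level(observed_mutations, drug):
    """Sum the penalties of all rules whose mutation pattern is fully
    present in the sequence, then map the total to an interpretation."""
    score = sum(penalty for pattern, d, penalty in RULES
                if d == drug and pattern <= observed_mutations)
    if score >= 60:
        return 'resistant'
    return 'intermediate' if score >= 30 else 'susceptible'

print(resistance_level({'M184V', 'K65R'}, 'lamivudine'))  # resistant
print(resistance_level({'M41L'}, 'zidovudine'))           # susceptible
```

    HIV-GRADE additionally layers rules over drug combinations rather than single drugs, which is the feature distinguishing it from earlier single-drug algorithms.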

  6. Rule-based control of off-grid desalination powered by renewable energies

    Directory of Open Access Journals (Sweden)

    Alvaro Serna

    2015-08-01

    Full Text Available A rule-based control is presented for desalination plants operating under variable, renewable power availability. This control algorithm is based on two sets of rules: first, a list that prioritizes the reverse osmosis (RO units of the plant is created, based on the current state and the expected water demand; secondly, the available energy is then dispatched to these units following this prioritized list. The selected strategy is tested on a specific case study: a reverse osmosis plant designed for the production of desalinated water powered by wind and wave energy. Simulation results illustrate the correct performance of the plant under this control.
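
    The two-step control law, prioritize then dispatch, can be sketched as follows (the unit names, priorities, and power figures are illustrative):

```python
def dispatch(available_kw, ro_units):
    """Dispatch the available renewable power to RO units in priority order:
    a unit is switched on only if the remaining power covers its demand."""
    running, remaining = [], available_kw
    for unit in sorted(ro_units, key=lambda u: u['priority']):
        if unit['demand_kw'] <= remaining:
            running.append(unit['name'])
            remaining -= unit['demand_kw']
    return running, remaining

units = [  # hypothetical plant: priority 1 = most urgent for the expected demand
    {'name': 'RO-1', 'priority': 1, 'demand_kw': 40},
    {'name': 'RO-2', 'priority': 2, 'demand_kw': 40},
    {'name': 'RO-3', 'priority': 3, 'demand_kw': 40},
]
print(dispatch(90, units))  # (['RO-1', 'RO-2'], 10)
```

    In the paper, the priority list itself is rebuilt by the first set of rules from the current plant state and the expected water demand before each dispatch step.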

  7. Introducing the new GRASS module g.infer for data-driven rule-based applications

    Directory of Open Access Journals (Sweden)

    Peter Löwe

    2012-10-01

    Full Text Available This paper introduces the new GRASS GIS add-on module g.infer. The module enables rule-based analysis and workflow management in GRASS GIS, via data-driven inference processes based on the expert system shell CLIPS. The paper discusses the theoretical and developmental background that will help prepare the reader to use the module for Knowledge Engineering applications. In addition, potential application scenarios are sketched out, ranging from the rule-driven formulation of nontrivial GIS-classification tasks and GIS workflows to ontology management and intelligent software agents.

  8. Rule-based query answering method for a knowledge base of economic crimes

    CERN Document Server

    Bak, Jaroslaw

    2011-01-01

    We present a description of the PhD thesis which aims to propose a rule-based query answering method for relational data. In this approach we use additional knowledge which is represented as a set of rules and describes the source data at the concept (ontological) level. Queries are posed in the terms of the abstract level. We present two methods: the first one uses hybrid reasoning, and the second one exploits only forward chaining. These two methods are demonstrated by a prototypical implementation of the system coupled with the Jess engine. Tests are performed on a knowledge base of the selected economic crimes: fraudulent disbursement and money laundering.
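
    The forward-chaining variant can be sketched as a naive fixed-point loop over concept-level rules (the crime-domain atoms below are invented for illustration):

```python
def forward_chain(facts, rules):
    """Naive forward chaining: fire rules until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if set(body) <= facts and head not in facts:
                facts.add(head)
                changed = True
    return facts

# Hypothetical concept-level rules in the economic-crime domain.
rules = [
    (['transfers_to_shell', 'layered_accounts'], 'money_laundering_suspicion'),
    (['money_laundering_suspicion', 'false_invoice'], 'fraudulent_disbursement_suspicion'),
]
derived = forward_chain({'transfers_to_shell', 'layered_accounts', 'false_invoice'}, rules)
# derived now also contains both suspicion concepts
```

    A production engine such as Jess performs the same saturation far more efficiently via the Rete algorithm; the loop above only shows the semantics.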

  9. Rule-based medical device adaptation for the digital operating room.

    Science.gov (United States)

    Franke, Stefan; Neumuth, Thomas

    2015-08-01

    A workflow-driven cooperative operating room needs to be established in order to successfully unburden the surgeon and the operating room staff of very time-consuming information-seeking and configuration tasks. We propose an approach towards the integration of intraoperative surgical workflow management and integration technologies. The concept of rule-based behavior is adapted to situation-aware medical devices. A prototype was implemented, and experiments with sixty recorded brain tumor removal procedures were conducted to test the proposed approach. An analysis of the recordings indicated numerous applications, such as automatic display configuration, room light adaptation, and pre-configuration of medical devices and systems.
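
    Rule-based device adaptation can be sketched as pattern-matching a recognized situation against configuration rules (the phases, devices, and settings below are invented for illustration):

```python
def configure_devices(situation, rules):
    """Apply every rule whose situation pattern is contained in the
    recognized surgical situation; later rules override earlier settings."""
    config = {}
    for pattern, settings in rules:
        if pattern.items() <= situation.items():  # pattern matches the situation
            config.update(settings)
    return config

# Hypothetical situation-aware rules for a tumor-removal work step.
rules = [
    ({'phase': 'resection'},                  {'room_light': 'dim', 'display': 'microscope'}),
    ({'phase': 'resection', 'imaging': True}, {'display': 'navigation'}),
]
print(configure_devices({'phase': 'resection', 'imaging': True}, rules))
# {'room_light': 'dim', 'display': 'navigation'}
```

    The situation description would be produced by the intraoperative workflow-recognition component, with the rules pre-configuring displays, room lights, and devices for the detected work step.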

  10. A rule-based approach to model checking of UML state machines

    Science.gov (United States)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.
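
    A rule-based logical model can be checked against a safety requirement with a simple explicit reachability search, sketched here (the state names and the forbidden state are illustrative; nuXmv performs far more powerful symbolic checks):

```python
from collections import deque

def violates_safety(initial, transitions, bad_states):
    """Explore the state machine's reachable states with BFS and report
    whether any user-forbidden state can be reached."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if state in bad_states:
            return True
        for src, dst in transitions:
            if src == state and dst not in seen:
                seen.add(dst)
                queue.append(dst)
    return False

# Hypothetical controller: grinding must never start before the guard closes.
transitions = [('idle', 'closing'), ('closing', 'closed'),
               ('closed', 'grinding'), ('grinding', 'idle')]
print(violates_safety('idle', transitions, {'grinding_guard_open'}))  # False
```

    The transition relation here plays the role of the rule-based logical model: the same relation that the model checker verifies can later drive logical synthesis into VHDL.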

  11. A Linguistic Evaluation of Rule-Based, Phrase-Based, and Neural MT Engines

    Directory of Open Access Journals (Sweden)

    Burchardt Aljoscha

    2017-06-01

    Full Text Available In this paper, we report an analysis of the strengths and weaknesses of several Machine Translation (MT) engines implementing the three most widely used paradigms. The analysis is based on a manually built test suite that comprises a large range of linguistic phenomena. Two main observations are, on the one hand, the striking improvement of a commercial online system when turning from a phrase-based to a neural engine and, on the other hand, that the successful translations of neural MT systems sometimes bear a resemblance to the translations of a rule-based MT system.

  12. Increasing Supply-Chain Visibility with Rule-Based RFID Data Analysis

    DEFF Research Database (Denmark)

    Ilic, A.; Andersen, Thomas; Michahelles, F.

    2009-01-01

    RFID technology tracks the flow of physical items and goods in supply chains to help users detect inefficiencies, such as shipment delays, theft, or inventory problems. An inevitable consequence, however, is that it generates huge numbers of events. To exploit these large amounts of data, the Supply Chain Visualizer increases supply-chain visibility by analyzing RFID data, using a mix of automated analysis techniques and human effort. The tool's core concepts include rule-based analysis techniques and a map-based representation interface. With these features, it lets users visualize ...
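
    A rule-based analysis of RFID event streams can be sketched as follows (the 48-hour transit rule and the EPC read events are invented for illustration):

```python
from datetime import datetime, timedelta

def flag_delayed_shipments(events, max_transit=timedelta(hours=48)):
    """Rule: flag an item when the gap between two consecutive RFID read
    events exceeds the allowed transit time."""
    flagged, last_seen = set(), {}
    for item, ts in sorted(events, key=lambda e: e[1]):
        if item in last_seen and ts - last_seen[item] > max_transit:
            flagged.add(item)
        last_seen[item] = ts
    return flagged

events = [  # hypothetical (EPC, read-time) pairs from two readers
    ('EPC-1', datetime(2009, 3, 1, 8)), ('EPC-1', datetime(2009, 3, 4, 8)),
    ('EPC-2', datetime(2009, 3, 1, 9)), ('EPC-2', datetime(2009, 3, 2, 9)),
]
print(flag_delayed_shipments(events))  # {'EPC-1'}
```

    Rules of this shape reduce the huge raw event stream to a small set of flagged anomalies that a human can then inspect on the map-based interface.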

  13. The relevance of a rules-based maize marketing policy: an experimental case study of Zambia.

    Science.gov (United States)

    Abbink, Klaus; Jayne, Thomas S; Moller, Lars C

    2011-01-01

    Strategic interaction between public and private actors is increasingly recognised as an important determinant of agricultural market performance in Africa and elsewhere. Trust and consultation tend to positively affect private activity, while uncertainty about government behaviour impedes it. This paper reports on a laboratory experiment based on a stylised model of the Zambian maize market. The experiment facilitates a comparison between discretionary interventionism and a rules-based policy in which the government pre-commits itself to a future course of action. A simple precommitment rule can, in theory, overcome the prevailing strategic dilemma by encouraging private sector participation. Although this result is also borne out in the economic experiment, the improvement in private sector activity is surprisingly small and not statistically significant, due to irrationally cautious choices by experimental governments. Encouragingly, a rules-based policy promotes a much more stable market outcome, thereby substantially reducing the risk of severe food shortages. These results underscore the importance of predictable and transparent rules for the state's involvement in agricultural markets.

  14. A Rule-Based Data Transfer Protocol for On-Demand Data Exchange in Vehicular Environment

    Directory of Open Access Journals (Sweden)

    Liao Hsien-Chou

    2009-01-01

    Full Text Available The purpose of an Intelligent Transport System (ITS) is mainly to increase driving safety and efficiency, and data exchange is an important way to achieve this purpose. On-demand data exchange is especially useful in helping a driver avoid emergency events. In order to handle data exchange under dynamic situations, a rule-based data transfer protocol is proposed in this paper. A set of rules is designed according to the principle of request-forward-reply (RFR); that is, the rules determine the timing of data broadcasting, forwarding, and replying automatically. Two typical situations are used to demonstrate the operation of the rules: one is the front view of a driver occluded by other vehicles; the other is a traffic jam. The proposed protocol is flexible and extensible for unforeseen situations. Three simulation tools were also implemented to demonstrate the feasibility of the protocol and to measure network transmission under a high density of vehicles. The simulation results show that the rule-based protocol achieves efficient data exchange and thereby increases driving safety.

  15. Automated Simulation P2P Botnets Signature Detection by Rule-based Approach

    Directory of Open Access Journals (Sweden)

    Raihana Syahirah Abdullah

    2016-08-01

    Full Text Available The Internet is one of the most salient services in communication, and companies take this opportunity by putting critical resources online for effective business organization. This has given rise to the activities of cyber criminals actuated by botnets. P2P networks have gained popularity through distributed applications such as file-sharing, web caching, and network storage, but with the non-centralized authority of P2P networks it is not easy to guarantee that an exchanged file is not malicious. For this reason, these networks have become a suitable venue for malicious software to spread. It is straightforward for attackers to target the vulnerable hosts in existing P2P networks as bot candidates and build their zombie army: such a host can be compromised and made to become a P2P bot. In order to detect these botnets, a complete flow analysis is necessary. In this paper, we propose automated P2P botnet detection through a rule-based approach which currently focuses on P2P signature illumination. We consider both the synchronisation within a botnet and the malicious behaviour each bot exhibits at the host or network level to recognize the signature and activities in P2P botnet traffic. The rule-based approach has high detection accuracy and a low false-positive rate.
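
    A rule-based signature check over flow statistics can be sketched like this (the feature names, thresholds, and majority-vote combination are invented for illustration, not the paper's signatures):

```python
def is_p2p_bot(flow_stats):
    """Hypothetical rule set combining host- and network-level symptoms:
    many distinct peers, uniformly small payloads, and highly regular
    (synchronised) connection intervals."""
    rules = [
        flow_stats['distinct_peers'] > 50,
        flow_stats['mean_payload_bytes'] < 200,
        flow_stats['connection_interval_stddev_s'] < 2.0,  # synchronised beaconing
    ]
    return sum(rules) >= 2  # fire when a majority of rules match

suspicious = {'distinct_peers': 80, 'mean_payload_bytes': 120,
              'connection_interval_stddev_s': 1.1}
print(is_p2p_bot(suspicious))  # True
```

    The appeal of such rules is their transparency: each flagged flow can be traced back to the specific symptoms that fired, which keeps the false-positive rate auditable.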

  16. Automated detection of pain from facial expressions: a rule-based approach using AAM

    Science.gov (United States)

    Chen, Zhanli; Ansari, Rashid; Wilkie, Diana J.

    2012-02-01

    In this paper, we examine the problem of using video analysis to assess pain, an important problem especially for critically ill, non-communicative patients and people with dementia. We propose and evaluate an automated method to detect the presence of pain manifested in patient videos, using a unique and large collection of cancer patient videos captured in patient homes. The method is based on detecting pain-related facial action units defined in the Facial Action Coding System (FACS), which is widely used for objective assessment in pain analysis. In our research, a person-specific Active Appearance Model (AAM) based on the Project-Out Inverse Compositional method is trained for each patient individually. A flexible representation of the shape model is used in a rule-based method that is better suited than the more commonly used classifier-based methods for application to the cancer patient videos, in which pain-related facial actions occur infrequently and more subtly. The rule-based method relies on feature points that provide facial action cues and are extracted from the shape vertices of the AAM, which have a natural correspondence to facial muscular movement. In this paper, we investigate the detection of a commonly used set of pain-related action units in both the upper and lower face. Our detection results show good agreement with the results obtained by three trained FACS coders who independently reviewed and scored the action units in the cancer patient videos.
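
    A rule over AAM feature points can be sketched as follows (the landmark names, normalisation, and threshold are invented placeholders; real FACS AU4 scoring is defined over muscular appearance changes, not two landmarks):

```python
import math

def brow_lowering_detected(landmarks, threshold=0.18):
    """Hypothetical rule in the spirit of FACS AU4 (brow lowerer): fire when
    the brow-to-eye distance, normalised by the inter-ocular distance,
    drops below a threshold."""
    interocular = math.dist(landmarks['left_eye'], landmarks['right_eye'])
    brow_gap = math.dist(landmarks['left_brow'], landmarks['left_eye'])
    return brow_gap / interocular < threshold

neutral = {'left_eye': (30.0, 50.0), 'right_eye': (70.0, 50.0),
           'left_brow': (30.0, 38.0)}
print(brow_lowering_detected(neutral))  # False: 12 / 40 = 0.30 >= 0.18
```

    Thresholded geometric rules of this kind are what make the method workable when pain-related actions are rare and subtle, since no large set of positive training examples is needed per action unit.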

  17. A QUALITY-BASED ENHANCEMENT OF USER DATA PROTECTION VIA FUZZY RULE-BASED SYSTEMS IN A CLOUD ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    R Poorva Devi

    2016-04-01

    Full Text Available So far in cloud computing, individual customers access and consume an enormous amount of services through the web, offered by a cloud service provider (CSP). Although one of the services the cloud provides to its clients is security-as-a-service, people are still reluctant to use services from cloud vendors. A number of solutions, security components, and measurements have opened new scope for the cloud security issue, yet only a 79.2% security outcome has been obtained by different scientists, researchers, and other cloud-based academic communities. To overcome the problem of cloud security, the proposed model, "Quality-based enhancement of user data protection via fuzzy rule-based systems in a cloud environment", helps cloud clients access cloud resources through remote monitoring management (RMM); the services currently requested and consumed by cloud users can be better analyzed with a managed service provider (MSP) than with a traditional CSP. Normally, people try to secure their own private data by applying key management and cryptographic computations, which again leads back to the security problem. The proposed approach provides a good-quality security result by making use of fuzzy rule-based systems (constraint and conclusion segments) in the cloud environment. By using this technique, users may obtain an efficient security outcome through the cloud simulation tool of the Apache CloudStack simulator.

  18. A rule based fuzzy model for the prediction of petrophysical rock parameters

    Energy Technology Data Exchange (ETDEWEB)

    Finol, J.; Jing, X.D. [T.H. Huxley School of Environment, Earth Sciences and Engineering, Imperial College, Prince Consort Road, SW7 2BP London (United Kingdom); Ke Guo, Y. [Fujitsu Parallel Computing Centre, Department of Computing, Imperial College, SW7 2BZ London (United Kingdom)

    2001-04-01

    A new approach for the prediction of petrophysical rock parameters based on a rule-based fuzzy model is presented. The rule-based fuzzy model corresponds to the Takagi-Sugeno-Kang method of fuzzy reasoning proposed by Sugeno and his co-authors. This fuzzy model is defined by a set of fuzzy implications with linear consequent parts, each of which establishes a local linear input-output relationship between the variables of the model. In this approach, a fuzzy clustering algorithm is combined with the least-square approximation method to identify the structure and parameters of the fuzzy model from sets of numerical data. To verify the effectiveness of the proposed fuzzy modeling method, two examples are developed using core and electrical log data from three oil wells in Ceuta Field, Lake Maracaibo Basin. The numerical results of the fuzzy modelling method are compared with the results of a conventional linear regression model. It is shown that the fuzzy modeling approach is not only more accurate than the conventional regression approach but also provides some qualitative information about the underlying complexities of the porous system.
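The Takagi-Sugeno-Kang scheme this abstract describes can be sketched compactly: each rule pairs fuzzy antecedents with a local linear consequent, and the model output is the firing-strength-weighted average of the local models. The Gaussian membership form, the two-rule setup, and all coefficient values below are illustrative assumptions, not the parameters identified from the well-log data in the paper (where consequents are fit by least squares):

```python
import math

def tsk_predict(x, rules):
    """First-order Takagi-Sugeno-Kang inference: each rule pairs Gaussian
    antecedents (one (center, sigma) pair per input) with a linear consequent
    y = a0 + a1*x1 + ...; the output is the firing-strength-weighted average
    of the local linear models."""
    num = den = 0.0
    for antecedents, coeffs in rules:
        # rule firing strength: product of the per-input memberships
        w = 1.0
        for xi, (c, s) in zip(x, antecedents):
            w *= math.exp(-((xi - c) ** 2) / (2 * s ** 2))
        # local linear input-output relationship (the rule's consequent)
        y = coeffs[0] + sum(a * xi for a, xi in zip(coeffs[1:], x))
        num += w * y
        den += w
    return num / den

# Two toy 1-D rules: near x=0 predict about 1.0, near x=10 predict about 5.0
rules = [([(0.0, 1.0)], [1.0, 0.0]),
         ([(10.0, 1.0)], [5.0, 0.0])]
```

Between the two rule centers the prediction blends smoothly from one local model to the other, which is the sense in which the fuzzy model is piecewise-linear with soft boundaries.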

  19. RB-ARD: A proof of concept rule-based abort

    Science.gov (United States)

    Smith, Richard; Marinuzzi, John

    1987-01-01

    The Abort Region Determinator (ARD) is a console program in the space shuttle mission control center. During shuttle ascent, the Flight Dynamics Officer (FDO) uses the ARD to determine the possible abort modes and make abort calls for the crew. The goal of the Rule-based Abort Region Determinator (RB-ARD) project was to test the concept of providing an onboard ARD for the shuttle or an automated ARD for the mission control center (MCC). A proof-of-concept rule-based system was developed on an LMI Lambda computer using PICON, a knowledge-based system shell. Knowledge derived from documented flight rules and ARD operating procedures was coded in PICON rules. These rules, in conjunction with modules of conventional code, enable the RB-ARD to carry out key parts of the ARD task. Current capabilities of the RB-ARD include: continuous updating of the available abort modes, recognition of a limited number of main engine faults, and recommendation of safing actions. Safing actions recommended by the RB-ARD concern the Space Shuttle Main Engine (SSME) limit shutdown system and powerdown of the SSME AC buses.

  20. A multilayer perceptron solution to the match phase problem in rule-based artificial intelligence systems

    Science.gov (United States)

    Sartori, Michael A.; Passino, Kevin M.; Antsaklis, Panos J.

    1992-01-01

    In rule-based AI planning, expert, and learning systems, it is often the case that the left-hand-sides of the rules must be repeatedly compared to the contents of some 'working memory'. The traditional approach to solve such a 'match phase problem' for production systems is to use the Rete Match Algorithm. Here, a new technique using a multilayer perceptron, a particular artificial neural network model, is presented to solve the match phase problem for rule-based AI systems. A syntax for premise formulas (i.e., the left-hand-sides of the rules) is defined, and working memory is specified. From this, it is shown how to construct a multilayer perceptron that finds all of the rules which can be executed for the current situation in working memory. The complexity of the constructed multilayer perceptron is derived in terms of the maximum number of nodes and the required number of layers. A method for reducing the number of layers to at most three is also presented.
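For premise formulas restricted to conjunctions of working-memory facts, the construction reduces to setting threshold-unit weights directly from the rule left-hand sides (no training); the extra layers the paper mentions are needed only for richer premise syntax such as disjunction and negation. A minimal sketch, assuming working memory is encoded as a fixed-length binary vector of fact indicators:

```python
def build_match_network(premises, n_facts):
    """Construct (not train) a single layer of threshold perceptron units whose
    output j is 1 iff every fact index in premises[j] is present in working
    memory, i.e. rule j can be executed in the current situation."""
    weights, biases = [], []
    for prem in premises:
        # weight 1 on each fact the premise requires, 0 elsewhere
        weights.append([1.0 if i in prem else 0.0 for i in range(n_facts)])
        # unit fires only when all len(prem) required bits are 1
        biases.append(-(len(prem) - 0.5))

    def match(memory_bits):
        return [int(sum(wi * xi for wi, xi in zip(w, memory_bits)) + b > 0)
                for w, b in zip(weights, biases)]
    return match
```

Calling the returned `match` on a working-memory vector yields the set of fireable rules in one forward pass, which is the match-phase computation the network replaces.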

  1. A Rule-Based Model for Bankruptcy Prediction Based on an Improved Genetic Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yudong Zhang

    2013-01-01

    In this paper, we proposed a hybrid system to predict corporate bankruptcy. The whole procedure consists of the following four stages: first, sequential forward selection was used to extract the most important features; second, a rule-based model was chosen to fit the given dataset, since it can present physical meaning; third, a genetic ant colony algorithm (GACA) was introduced; the fitness scaling strategy and the chaotic operator were incorporated with GACA, forming a new algorithm, fitness-scaling chaotic GACA (FSCGACA), which was used to seek the optimal parameters of the rule-based model; and finally, the stratified K-fold cross-validation technique was used to enhance the generalization of the model. Simulation experiments on data from 1000 corporations collected from 2006 to 2009 demonstrated that the proposed model was effective. It selected the 5 most important factors as "net income to stock broker's equality," "quick ratio," "retained earnings to total assets," "stockholders' equity to total assets," and "financial expenses to sales." The total misclassification error of the proposed FSCGACA was only 7.9%, improving on the results of the genetic algorithm (GA), ant colony algorithm (ACA), and GACA. The average computation time of the model is 2.02 s.

  2. Feedback can be superior to observational training for both rule-based and information-integration category structures.

    Science.gov (United States)

    Edmunds, C E R; Milton, Fraser; Wills, Andy J

    2015-01-01

    The effects of two different types of training on rule-based and information-integration category learning were investigated in two experiments. In observational training, a category label is presented, followed by an example of that category and the participant's response. In feedback training, the stimulus is presented, and the participant assigns it to a category and then receives feedback about the accuracy of that decision. Ashby, Maddox, and Bohil (2002. Observational versus feedback training in rule-based and information-integration category learning. Memory & Cognition, 30, 666-677) reported that feedback training was superior to observational training when learning information-integration category structures, but that training type had little effect on the acquisition of rule-based category structures. These results were argued to support the COVIS (competition between verbal and implicit systems) dual-process account of category learning. However, a number of nonessential differences between their rule-based and information-integration conditions complicate interpretation of these findings. Experiment 1 controlled between-category structures for participant error rates, category separation, and the number of stimulus dimensions relevant to the categorization. Under these more controlled conditions, rule-based and information-integration category structures both benefited from feedback training to a similar degree. Experiment 2 maintained this difference in training type when learning a rule-based category that had otherwise been matched, in terms of category overlap and overall performance, with the rule-based categories used in Ashby et al. These results indicate that differences in dimensionality between the category structures in Ashby et al. are a more likely explanation for the interaction between training type and category structure than the dual-system explanation that they offered.

  3. Choosing goals, not rules: deciding among rule-based action plans.

    Science.gov (United States)

    Klaes, Christian; Westendorff, Stephanie; Chakrabarti, Shubhodeep; Gail, Alexander

    2011-05-12

    In natural situations, movements are often directed toward locations different from that of the evoking sensory stimulus. Movement goals must then be inferred from the sensory cue based on rules. When there is uncertainty about the rule that applies for a given cue, planning a movement involves both choosing the relevant rule and computing the movement goal based on that rule. Under these conditions, it is not clear whether primates compute multiple movement goals based on all possible rules before choosing an action, or whether they first choose a rule and then only represent the movement goal associated with that rule. Supporting the former hypothesis, we show that neurons in the frontoparietal reach areas of monkeys simultaneously represent two different rule-based movement goals, which are biased by the monkeys' choice preferences. Apparently, primates choose between multiple behavioral options by weighing against each other the movement goals associated with each option.

  4. ChemTok: A New Rule Based Tokenizer for Chemical Named Entity Recognition

    Directory of Open Access Journals (Sweden)

    Abbas Akkasi

    2016-01-01

    Named Entity Recognition (NER) from text constitutes the first step in many text mining applications. The most important preliminary step for NER systems using machine learning approaches is tokenization, where raw text is segmented into tokens. This study proposes an enhanced rule-based tokenizer, ChemTok, which utilizes rules extracted mainly from the training data set. The main novelty of ChemTok is the use of the extracted rules to merge the tokens split in the previous steps, thus producing longer and more discriminative tokens. ChemTok is compared to the tokenization methods utilized by ChemSpot and tmChem. Support vector machines and conditional random fields are employed as the learning algorithms. The experimental results show that the classifiers trained on the output of ChemTok outperform all classifiers trained on the output of the other two tokenizers, both in terms of classification performance and in the number of incorrectly segmented entities.
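The merging idea can be illustrated as a second pass over a naive base tokenization. The single hyphen-rejoining rule below is a hypothetical stand-in for the rules ChemTok extracts from its training data, and the base tokenizer is a deliberately crude regex, not ChemSpot's or tmChem's:

```python
import re

# Hypothetical merge rule in the spirit of ChemTok: a predicate over adjacent
# tokens (a, b, c) that signals the base tokenizer over-split an entity.
MERGE_RULES = [
    # re-join hyphenated chemical fragments, e.g. "2" "-" "acetylaminofluorene"
    lambda a, b, c: c is not None and b == "-" and a.isalnum() and c.isalpha(),
]

def base_tokenize(text):
    """Naive base tokenizer: runs of word characters, or single punctuation."""
    return re.findall(r"\w+|[^\w\s]", text)

def merge_tokens(tokens):
    """Second pass: merge any triple (a, b, c) flagged by a merge rule into
    one longer, more discriminative token."""
    out, i = [], 0
    while i < len(tokens):
        a = tokens[i]
        b = tokens[i + 1] if i + 1 < len(tokens) else None
        c = tokens[i + 2] if i + 2 < len(tokens) else None
        if b is not None and any(rule(a, b, c) for rule in MERGE_RULES):
            out.append(a + b + c)
            i += 3
        else:
            out.append(a)
            i += 1
    return out
```

On `"2-acetylaminofluorene is toxic"` the base pass splits the chemical name into three tokens and the merge pass restores it to one, which is the kind of token a downstream SVM or CRF can use as a single entity candidate.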

  5. Fuzzy Rule-based Analysis of Promotional Efficiency in Vietnam’s Tourism Industry

    Directory of Open Access Journals (Sweden)

    Nguyen Quang VINH

    2015-06-01

    This study aims to determine an effective method of measuring the efficiency of promotional strategies for tourist destinations. Complicating factors that influence promotional efficiency (PE), such as promotional activities (PA), destination attributes (DA), and destination image (DI), make it difficult to evaluate the effectiveness of PE. This study develops a rule-based decision support mechanism using fuzzy set theory and the Analytic Hierarchy Process (AHP) to evaluate the effectiveness of promotional strategies. Additionally, a statistical analysis is conducted using SPSS (Statistical Package for the Social Sciences) to confirm the results of the fuzzy AHP analysis. This study finds that government policy is the most important factor for PE and that service staff ("internal beauty") are more important than tourism infrastructure ("external beauty") in terms of customer satisfaction and long-term strategy in PE. With respect to DI, experts are concerned first with tourists' perceived value, second with tourist satisfaction, and finally with tourist loyalty.

  6. A General Attribute and Rule Based Role-Based Access Control Model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Growing numbers of users, and the many access control policies that involve many different resource attributes in service-oriented environments, bring various problems in protecting resources. This paper analyzes the relationships of resource attributes to user attributes in all policies, and proposes a general attribute- and rule-based role-based access control (GAR-RBAC) model to meet these security needs. The model can dynamically assign users to roles via rules to accommodate growing numbers of users. These rules use different attribute expressions and permissions as part of the authorization constraints, and are defined by analyzing the relations of resource attributes to user attributes in the many access policies defined by the enterprise. The model is a general access control model that can support many access control policies and can also be applied more widely to services. The paper also describes how to use the GAR-RBAC model in Web service environments.

  7. A Rule-Based Modeling for the Description of Flexible and Self-healing Business Processes

    Science.gov (United States)

    Boukhebouze, Mohamed; Amghar, Youssef; Benharkat, Aïcha-Nabila; Maamar, Zakaria

    In this paper we discuss the importance of ensuring that business processes are both robust and agile. To this end, we consider reviewing the way business processes are managed. For instance, we consider offering a flexible way to model processes so that changes in regulations are handled through some self-healing mechanisms. These changes may raise exceptions at run-time if not properly reflected in the processes. We therefore propose a new rule-based model that adopts ECA rules and is built upon formal tools. The business logic of a process can be summarized as a set of rules that implement an organization's policies. Each business rule is formalized using our ECAPE formalism (Event-Condition-Action-Postcondition-post Event). This formalism allows translating a process into a graph of rules that is analyzed in terms of reliability and flexibility.
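The shape of an ECAPE rule and its firing cycle can be sketched as follows. The order-handling rules, the state dictionary, and the exception-on-failed-postcondition behavior are invented for illustration; the paper's formalism additionally covers translation into a rule graph and its analysis, which are omitted here:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class EcapeRule:
    """One ECAPE rule: Event, Condition, Action, Postcondition, post Event."""
    event: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], None]
    postcondition: Callable[[dict], bool]
    post_event: Optional[str] = None

def fire(rules, event, state):
    """Process an event queue: fire each rule matching the event whose
    condition holds, check its postcondition (a hook where self-healing
    could intervene), and enqueue any post-event it raises."""
    queue = [event]
    while queue:
        ev = queue.pop(0)
        for r in rules:
            if r.event == ev and r.condition(state):
                r.action(state)
                if not r.postcondition(state):
                    raise RuntimeError(f"postcondition failed for event {ev!r}")
                if r.post_event:
                    queue.append(r.post_event)
    return state
```

A two-rule chain ("order" decrements stock and raises "ship"; "ship" marks the order shipped) shows how post-events link rules into a process graph.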

  8. Multi-machine power system stabilizer design by rule based bacteria foraging

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, S.; Tripathy, M.; Nanda, J. [Department of Electrical Engineering, Indian Institute of Technology, Delhi (India)

    2007-10-15

    Several power system stabilizers (PSSs) connected to machines in a multi-machine power system pose the problem of appropriately tuning their parameters so that overall system dynamic stability can be improved in a robust way. Based on the foraging behavior of Escherichia coli bacteria in the human intestine, this paper attempts to simultaneously optimize three constants for each of the several PSSs present in a multi-machine power system. The tuning uses an objective function that incorporates multiple operating conditions, consisting of the nominal condition and various changed conditions. The convergence of the proposed rule-based bacteria foraging (RBBF) optimization technique is superior to that of the conventional and genetic algorithm (GA) techniques. Robustness of the tuning with the proposed method was verified with transient stability analysis of the system by time-domain simulations, subjecting the power system to different types of disturbances. (author)

  9. Modeling for (physical) biologists: an introduction to the rule-based approach.

    Science.gov (United States)

    Chylek, Lily A; Harris, Leonard A; Faeder, James R; Hlavacek, William S

    2015-07-16

    Models that capture the chemical kinetics of cellular regulatory networks can be specified in terms of rules for biomolecular interactions. A rule defines a generalized reaction, meaning a reaction that permits multiple reactants, each capable of participating in a characteristic transformation and each possessing certain, specified properties, which may be local, such as the state of a particular site or domain of a protein. In other words, a rule defines a transformation and the properties that reactants must possess to participate in the transformation. A rule also provides a rate law. A rule-based approach to modeling enables consideration of mechanistic details at the level of functional sites of biomolecules and provides a facile and visual means for constructing computational models, which can be analyzed to study how system-level behaviors emerge from component interactions.

  10. A fuzzy rule based genetic algorithm and its application in FMS

    Institute of Scientific and Technical Information of China (English)

    Li Shugang; Wu Zhiming; Pang Xiaohong

    2005-01-01

    Most FMS (flexible manufacturing system) problems are NP-hard (non-polynomial hard); the facility layout problem and the job-shop scheduling problem are such examples. A GA (genetic algorithm) can be applied to obtain an optimal solution. However, traditional GAs are usually of low efficiency because of early convergence. To overcome this shortcoming of the GA, a fuzzy-rule-based GA is proposed, in which a fuzzy logic controller is introduced to adjust the values of crossover probability, mutation probability, and crossover length. The HGA (hybrid genetic algorithm), which is integrated with a fuzzy logic controller, can avoid premature convergence and greatly improve efficiency. Finally, simulation results for the facility layout problem and the job-shop scheduling problem are given. The results show that the new genetic algorithm integrated with a fuzzy logic controller is excellent in searching efficiency.
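The controller's role can be sketched as fuzzy rules that map a population-diversity measure to adjustments of the GA's crossover and mutation probabilities. The triangular membership functions, the two rules, the adjustment magnitudes, and the clamping bounds below are all illustrative assumptions, not the paper's controller:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adjust_ga_params(diversity, pc, pm):
    """Illustrative fuzzy rule base for steering a GA:
      IF diversity is LOW  THEN raise mutation, lower crossover
                           (escape premature convergence);
      IF diversity is HIGH THEN raise crossover, lower mutation
                           (exploit good building blocks).
    `diversity` is assumed normalized to [0, 1]."""
    low = tri(diversity, -0.4, 0.0, 0.4)
    high = tri(diversity, 0.6, 1.0, 1.4)
    # weighted combination of the rule outputs (simple Sugeno-style defuzzification)
    d_pm = 0.05 * low - 0.05 * high
    d_pc = -0.1 * low + 0.1 * high
    # clamp to plausible operating ranges
    return min(max(pc + d_pc, 0.5), 1.0), min(max(pm + d_pm, 0.001), 0.2)
```

Called once per generation, this nudges the GA toward exploration when the population collapses and toward exploitation when it is still diverse, which is the premature-convergence remedy the abstract describes.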

  11. The diagnosis of microcytic anemia by a rule-based expert system using VP-Expert.

    Science.gov (United States)

    O'Connor, M L; McKinney, T

    1989-09-01

    We describe our experience in creating a rule-based expert system for the interpretation of microcytic anemia using the expert system development tool, VP-Expert, running on an IBM personal computer. VP-Expert processes data (complete blood cell count results, age, and sex) according to a set of user-written logic rules (our program) to reach conclusions as to the following causes of microcytic anemia: alpha- and beta-thalassemia trait, iron deficiency, and anemia of chronic disease. Our expert system was tested using previously interpreted complete blood cell count data. In most instances, there was good agreement between the expert system and its pathologist-author, but many discrepancies were found in the interpretation of anemia of chronic disease. We conclude that VP-Expert has a useful level of power and flexibility, yet is simple enough that individuals with modest programming experience can create their own expert systems. Limitations of such expert systems are discussed.
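The flavor of such user-written logic rules can be sketched as a few chained IF-THEN tests over CBC-derived values. The rule set and every threshold below are simplified illustrations (the Mentzer index cut-off is a textbook screening heuristic), not the authors' VP-Expert rule base and not clinical guidance:

```python
def classify_microcytic_anemia(mcv, rbc, ferritin):
    """Toy forward chain over IF-THEN rules in the spirit of a VP-Expert
    knowledge base. mcv in fL, rbc in 10^12/L, ferritin in ng/mL.
    All thresholds are illustrative placeholders, NOT clinical values."""
    if mcv >= 80:                          # rule 1: not microcytic at all
        return "not microcytic"
    if ferritin < 12:                      # rule 2: depleted iron stores
        return "iron deficiency suspected"
    mentzer = mcv / rbc                    # Mentzer index, a classic screen
    if mentzer < 13:                       # rule 3: low index favors trait
        return "thalassemia trait suspected"
    return "anemia of chronic disease suspected"  # default conclusion
```

Each `if` corresponds to one rule's premise; the first rule whose premise holds supplies the conclusion, mirroring the sequential rule evaluation of a simple backward/forward-chaining shell.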

  12. Fuzzy-Rule-Based Approach for Modeling Sensory Acceptability of Food Products

    Directory of Open Access Journals (Sweden)

    Olusegun Folorunso

    2009-04-01

    The prediction of product acceptability is often an additive effect of individual fuzzy impressions developed by a consumer on certain underlying attributes characteristic of the product. In this paper, we present the development of a data-driven fuzzy-rule-based approach for predicting the overall sensory acceptability of food products, in this case composite cassava-wheat bread. The model was formulated using the Takagi-Sugeno-Kang (TSK) fuzzy modeling approach. Experiments with the model derived from sampled data were simulated on Windows 2000/XP running on an Intel 2 GHz machine. The fuzzy membership function for the sensory scores is implemented in MATLAB 6.0 using the Fuzzy Logic Toolbox, and weights of each linguistic attribute were obtained using a correlation coefficient formula. The results obtained are compared to those of human judgments. Overall assessments suggest that, if implemented, this approach will facilitate a better acceptability of cassava bread as well as nutritionally improved food.

  14. Auto-control of pumping operations in sewerage systems by rule-based fuzzy neural networks

    Science.gov (United States)

    Chiang, Y.-M.; Chang, L.-C.; Tsai, M.-J.; Wang, Y.-F.; Chang, F.-J.

    2011-01-01

    Pumping stations play an important role in flood mitigation in metropolitan areas. The existing sewerage systems, however, are facing a great challenge of fast-rising peak flow resulting from urbanization and climate change. It is imperative to construct an efficient and accurate operating prediction model for pumping stations to simulate the drainage mechanism for discharging the rainwater in advance. In this study, we propose two rule-based fuzzy neural networks, an adaptive neuro-fuzzy inference system (ANFIS) and a counterpropagation fuzzy neural network (CFNN), for online prediction of the number of open and closed pumps at a pivotal pumping station in Taipei city up to a lead time of 20 min. The performance of ANFIS outperforms that of CFNN in terms of model efficiency, accuracy, and correctness. Furthermore, the results not only show that the predicted water levels contribute to successfully operating pumping stations but also demonstrate the applicability and reliability of ANFIS in automatically controlling urban sewerage systems.

  15. Association Rule-based Predictive Model for Machine Failure in Industrial Internet of Things

    Science.gov (United States)

    Kwon, Jung-Hyok; Lee, Sol-Bee; Park, Jaehoon; Kim, Eui-Jik

    2017-09-01

    This paper proposes an association rule-based predictive model for machine failure in industrial Internet of things (IIoT), which can accurately predict the machine failure in real manufacturing environment by investigating the relationship between the cause and type of machine failure. To develop the predictive model, we consider three major steps: 1) binarization, 2) rule creation, 3) visualization. The binarization step translates item values in a dataset into one or zero, then the rule creation step creates association rules as IF-THEN structures using the Lattice model and Apriori algorithm. Finally, the created rules are visualized in various ways for users’ understanding. An experimental implementation was conducted using R Studio version 3.3.2. The results show that the proposed predictive model realistically predicts machine failure based on association rules.
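The binarization and rule-creation steps can be sketched with a tiny hand-rolled Apriori. The sensor names, thresholds, and minimum support/confidence values are invented for illustration, and this sketch omits the Lattice-model details and the visualization step (the paper's implementation uses R):

```python
from itertools import combinations

def binarize(records, thresholds):
    """Binarization step: map each numeric reading to item present (1) or
    absent (0), represented here as a set of present items."""
    return [frozenset(k for k, v in r.items() if v >= thresholds[k])
            for r in records]

def apriori_rules(transactions, min_support=0.5, min_conf=0.8):
    """Rule-creation step: frequent itemsets by level-wise Apriori search,
    then IF-THEN rules (antecedent, consequent, confidence)."""
    n = len(transactions)
    items = {i for t in transactions for i in t}
    support = {}
    for k in range(1, len(items) + 1):
        found = False
        for cand in combinations(sorted(items), k):
            s = sum(1 for t in transactions if set(cand) <= t) / n
            if s >= min_support:
                support[frozenset(cand)] = s
                found = True
        if not found:          # no frequent itemsets of size k: stop growing
            break
    rules = []
    for iset, s in support.items():
        if len(iset) < 2:
            continue
        for r in range(1, len(iset)):
            for ante in combinations(sorted(iset), r):
                a = frozenset(ante)
                conf = s / support[a]   # subsets of frequent sets are frequent
                if conf >= min_conf:
                    rules.append((a, iset - a, conf))
    return rules
```

On a toy failure log, a rule such as IF {failure} THEN {high_temp} with confidence 1.0 falls out directly; in the IIoT setting the interesting direction is the reverse (cause items predicting the failure item), which the same machinery produces when confidence permits.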

  16. Rule-Based vs. Behavior-Based Self-Deployment for Mobile Wireless Sensor Networks

    Science.gov (United States)

    Urdiales, Cristina; Aguilera, Francisco; González-Parada, Eva; Cano-García, Jose; Sandoval, Francisco

    2016-01-01

    In mobile wireless sensor networks (MWSN), nodes are allowed to move autonomously for deployment. This process is meant (i) to achieve good coverage and (ii) to distribute the communication load as homogeneously as possible. Rather than optimizing deployment, reactive algorithms are based on a set of rules or behaviors, so nodes can determine when to move. This paper presents an experimental evaluation of both reactive deployment approaches: rule-based and behavior-based. Specifically, we compare a backbone dispersion algorithm with a social potential fields algorithm. Most tests are done in simulation for a large number of nodes in environments with and without obstacles. Results are validated using a small robot network in the real world. Our results show that behavior-based deployment tends to provide better coverage and communication balance, especially for large numbers of nodes in areas with obstacles. PMID:27399709

  17. Implementation of Rule Based Algorithm for Sandhi-Vicheda Of Compound Hindi Words

    CERN Document Server

    Gupta, Priyanka

    2009-01-01

    Sandhi means joining two or more words to coin a new word. Sandhi literally means `putting together' or combining (of sounds); it denotes all combinatory sound-changes effected (spontaneously) for ease of pronunciation. Sandhi-vicheda describes [5] the reverse process, by which one letter (whether single or conjoined) is broken to form two words: part of the broken letter remains as the last letter of the first word, and part of it forms the first letter of the next word. Sandhi-vicheda is an easy and interesting device that can add an entirely new dimension to the traditional approach to Hindi teaching. In this paper, using a rule-based algorithm, we report an accuracy of 60-80% depending upon the number of rules implemented.
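The rule-based split can be sketched as a table of expansion rules tried at every position of the compound, accepting the first split whose two halves are both attested in a lexicon. The romanization scheme (capital `A` standing in for long ā), the three toy rules, and the tiny lexicon are illustrative assumptions, not the paper's rule set:

```python
# Illustrative romanized sandhi rules: how one joined letter expands into the
# final letter of the first word plus the first letter of the second word.
SPLIT_RULES = {
    "A": [("a", "A"), ("A", "a"), ("a", "a"), ("A", "A")],  # a/A + a/A -> A
    "e": [("a", "i"), ("a", "I")],                          # a + i/I -> e
    "o": [("a", "u")],                                      # a + u -> o
}

def sandhi_vicheda(word, lexicon):
    """Try every position; expand the joining letter by each applicable rule
    and accept the first split whose parts are both attested words."""
    for i, ch in enumerate(word):
        for left, right in SPLIT_RULES.get(ch, []):
            w1, w2 = word[:i] + left, right + word[i + 1:]
            if w1 in lexicon and w2 in lexicon:
                return w1, w2
    return None
```

With `{"hima", "Alaya"}` in the lexicon, `himAlaya` splits back into `hima` + `Alaya`; adding rules is what drives the accuracy range the abstract reports.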

  18. iTriplet, a rule-based nucleic acid sequence motif finder

    Directory of Open Access Journals (Sweden)

    Gunderson Samuel I

    2009-10-01

    Background: With the advent of high-throughput sequencing techniques, large amounts of sequencing data are readily available for analysis. Natural biological signals are intrinsically highly variable, making their complete identification a computationally challenging problem. Many attempts using statistical or combinatorial approaches have been made with great success in the past. However, identifying highly degenerate and long (>20 nucleotides) motifs still remains an unmet challenge, as high degeneracy diminishes the statistical significance of biological signals and increasing motif size causes combinatorial explosion. In this report, we present a novel rule-based method that is focused on finding degenerate and long motifs. Our proposed method, named iTriplet, avoids the costly enumeration present in existing combinatorial methods and is amenable to parallel processing. Results: We have conducted a comprehensive assessment of the performance and sensitivity-specificity of iTriplet in analyzing artificial and real biological sequences in various genomic regions. The results show that iTriplet is able to solve challenging cases. Furthermore, we have confirmed the utility of iTriplet by showing it accurately predicts polyA-site-related motifs using a dual luciferase reporter assay. Conclusion: iTriplet is a novel rule-based combinatorial (enumerative) motif finding method that is able to process highly degenerate and long motifs that have resisted analysis by other methods. In addition, iTriplet is distinguished from other methods of the same family by its parallelizability, which allows it to leverage the power of today's readily available high-performance computing systems.

  19. The effects of age on associative and rule-based causal learning and generalization.

    Science.gov (United States)

    Mutter, Sharon A; Plumlee, Leslie F

    2014-06-01

    We assessed how age influences associative and rule-based processes in causal learning using the Shanks and Darby (1998) concurrent patterning discrimination task. In Experiment 1, participants were divided into groups based on their learning performance after 6 blocks of training trials. High discrimination mastery young adults learned the patterning discrimination more rapidly and accurately than moderate mastery young adults. They were also more likely to induce the patterning rule and use this rule to generate predictions for novel cues, whereas moderate mastery young adults were more likely to use cue similarity as the basis for their predictions. Like moderate mastery young adults, older adults used similarity-based generalization for novel cues, but they did not achieve the same level of patterning discrimination. In Experiment 2, young and older adults were trained to the same learning criterion. Older adults again showed deficits in patterning discrimination and, in contrast to young adults, even when they reported awareness of the patterning rule, they used only similarity-based generalization in their predictions for novel cues. These findings suggest that it is important to consider how the ability to code or use cue representations interacts with the requirements of the causal learning task. In particular, age differences in causal learning seem to be greatest for tasks that require rapid coding of configural representations to control associative interference between similar cues. Configural coding may also be related to the success of rule-based processes in these types of learning tasks. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  20. Accurate crop classification using hierarchical genetic fuzzy rule-based systems

    Science.gov (United States)

    Topaloglou, Charalampos A.; Mylonas, Stelios K.; Stavrakoudis, Dimitris G.; Mastorocostas, Paris A.; Theocharis, John B.

    2014-10-01

    This paper investigates the effectiveness of an advanced classification system for accurate crop classification using very high resolution (VHR) satellite imagery. Specifically, a recently proposed genetic fuzzy rule-based classification system (GFRBCS) is employed, namely, the Hierarchical Rule-based Linguistic Classifier (HiRLiC). HiRLiC's model comprises a small set of simple IF-THEN fuzzy rules, easily interpretable by humans. One of its most important attributes is that its learning algorithm requires minimum user interaction, since the most important learning parameters affecting the classification accuracy are determined by the learning algorithm automatically. HiRLiC is applied in a challenging crop classification task, using a SPOT5 satellite image over an intensively cultivated area in a lake-wetland ecosystem in northern Greece. A rich set of higher-order spectral and textural features is derived from the initial bands of the (pan-sharpened) image, resulting in an input space comprising 119 features. The experimental analysis proves that HiRLiC compares favorably to other interpretable classifiers in the literature, both in terms of structural complexity and classification accuracy. Its testing accuracy was very close to that obtained by complex state-of-the-art classification systems, such as the support vector machine (SVM) and random forest (RF) classifiers. Nevertheless, visual inspection of the derived classification maps shows that HiRLiC is characterized by higher generalization properties, providing more homogeneous classifications than the competitors. Moreover, the runtime requirements for producing the thematic map were orders of magnitude lower than those of the competitors.

  1. PRINCIPLES- AND RULES-BASED ACCOUNTING DEBATE. IMPLICATIONS FOR AN EMERGENT COUNTRY

    Directory of Open Access Journals (Sweden)

    Deaconu Adela

    2011-07-01

    By means of a qualitative analysis, this research examines whether a principles-based system, or a mixed version of it with the rules-based system, applied in Romania (an emergent country) is appropriate, taking into account the mentalities, traditions, and other cultural elements that were typical of a rules-based system. We support the statement that, even if certain contextual variables are common to other developed countries, their environments significantly differ. To be effective, financial reporting must reflect the context in which the firm is functioning. The research takes a deductive approach based on the analysis of cultural factors and their influence in recent years. For Romania, we argue there is lower accounting professionalism, associated with a low level of ambiguity tolerance. For the stage analysed in this study (after the year 2005), professional reasoning (a proxy for accounting professional behaviour) took into consideration fiscal and legal requirements rather than accounting principles and judgments. The research suggests that Romanian accounting practice and professionals are not fully prepared for a principles-based system environment, associated with the ability to find undisclosed events, face ambiguity, identify inferred relationships, use intuition, and work with uncertainty. We therefore reach the conclusion that in Romania institutional amendments affecting professional expertise would be needed. Accounting regulations must be chosen with great caution, and they must answer and/or be adjusted, even if the process is delayed, to national values, the behaviour of companies, and individual expertise and beliefs. Secondly, the benefits of applying accounting reasoning in this country may be enhanced through a better understanding of its content and through practical exercise. Here regulatory bodies may intervene by organizing professional training programs and acting

  2. TAKTAG Two-phase learning method for hybrid statistical/rule-based part-of-speech disambiguation

    CERN Document Server

    Lee, G; Shin, S; Lee, Geunbae; Lee, Jong-Hyeok; Shin, Sanghyun

    1995-01-01

    Both statistical and rule-based approaches to part-of-speech (POS) disambiguation have their own advantages and limitations. Especially for Korean, the narrow windows provided by hidden Markov models (HMM) cannot cover the necessary lexical and long-distance dependencies for POS disambiguation. On the other hand, rule-based approaches are neither accurate nor flexible with respect to new tag-sets and languages. In this regard, a statistical/rule-based hybrid method that can take advantage of both approaches is called for to achieve robust and flexible POS disambiguation. We present one such method: a two-phase learning architecture for hybrid statistical/rule-based POS disambiguation, especially for Korean. In this method, the statistical learning of morphological tagging is error-corrected by the rule-based learning of a Brill [1992] style tagger. We also design a hierarchical and flexible Korean tag-set to cope with multiple tagging applications, each of which requires a different tag-set. Our experiments s...

  3. A Methodology for Implementing Clinical Algorithms Using Expert-System and Database Tools

    OpenAIRE

    Rucker, Donald W.; Shortliffe, Edward H.

    1989-01-01

    The HyperLipid Advisory System is a combination of an expert system and a database that uses an augmented transition network methodology for implementing clinical algorithms. These algorithms exist as tables from which the separate expert-system rule base sequentially extracts the steps in the algorithm. The rule base assumes that the algorithm has a binary branching structure and models episodes of clinical care, but otherwise makes no assumption regarding the specific clinical domain. Hyper...

  4. The Applicability of the Density Rule of Pathwardhan and Kumer and the Rule Based on Linear Isopiestic Relation

    Institute of Scientific and Technical Information of China (English)

    胡玉峰

    2001-01-01

    The applicability of the density rule of Pathwardhan and Kumer and the rule based on the linear isopiestic relation is studied by comparison with experimental density data in the literature. Predicted and measured values for 18 electrolyte mixtures are compared. The two rules are good for mixtures with and without common ions, including those containing associating ions. The deviations of the rule based on the linear isopiestic relation are slightly higher for the mixtures involving very strong ion complexes, but the predictions are still quite satisfactory. The density rule of Pathwardhan and Kumer is more accurate for these mixtures. However, it is not applicable for mixtures containing non-electrolytes. The rule based on the linear isopiestic relation is extended to mixtures involving non-electrolytes. The predictions for the mixtures containing both electrolytes and non-electrolytes and the non-electrolyte mixtures are accurate. All these results indicate that this rule is a widely applicable approach.

  6. Fuzzylot: a novel self-organising fuzzy-neural rule-based pilot system for automated vehicles.

    Science.gov (United States)

    Pasquier, M; Quek, C; Toh, M

    2001-10-01

    This paper presents part of our research work concerned with the realisation of an Intelligent Vehicle and the technologies required for its routing, navigation, and control. An automated driver prototype has been developed using a self-organising fuzzy rule-based system (POPFNN-CRI(S)) to model and subsequently emulate human driving expertise. The ability of fuzzy logic to represent vague information using linguistic variables makes it a powerful tool to develop rule-based control systems when an exact working model is not available, as is the case for any vehicle-driving task. Designing a fuzzy system, however, is a complex endeavour, due to the need to define the variables and their associated fuzzy sets, and determine a suitable rule base. Many efforts have thus been devoted to automating this process, yielding the development of learning and optimisation techniques. One of them is the family of POP-FNNs, or Pseudo-Outer Product Fuzzy Neural Networks (TVR, AARS(S), AARS(NS), CRI, Yager). These generic self-organising neural networks developed at the Intelligent Systems Laboratory (ISL/NTU) are based on formal fuzzy mathematical theory and are able to objectively extract a fuzzy rule base from training data. In this application, a driving simulator has been developed that integrates a detailed model of the car dynamics, complete with engine characteristics and environmental parameters, and an OpenGL-based 3D-simulation interface coupled with driving wheel and accelerator/brake pedals. The simulator has been used on various road scenarios to record, from a human pilot, driving data consisting of steering and speed control actions associated with road features. Specifically, the POPFNN-CRI(S) system is used to cluster the data and extract a fuzzy rule base modelling the human driving behaviour. Finally, the effectiveness of the generated rule base has been validated using the simulator in autopilot mode.

  7. Classification Approach Based on Improved Belief Rule-Base Reasoning

    Institute of Scientific and Technical Information of China (English)

    叶青青; 杨隆浩; 傅仰耿; 陈晓聪

    2016-01-01

    This paper proposes a new classification approach based on improved belief rule-base reasoning by introducing a linear combination mode, setting the number of rules equal to the number of classes, and improving the method of calculating the individual matching degree. Compared with the traditional belief rule-base inference methodology, the number of rules in the proposed method does not depend on the number of antecedent attributes or their referential values; it is related only to the number of classes. In this way, the new method ensures applicability to complex problems. In the experiments, a differential evolution algorithm is applied to train the parameters, including rule weights, attribute weights, referential values of antecedent attributes, and belief degrees, to obtain the optimal parameter combination. Three commonly used public classification datasets were employed to validate the proposed method; ideal classification accuracy was obtained in each case, which shows that the proposed method is reasonable and effective.
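    The belief rule-base inference described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the referential values, rule weights, and belief degrees below are invented, and the evidential-reasoning combination is simplified to a normalized weighted sum.

```python
# Minimal sketch of belief rule-base (BRB) classification.
# All referential values, weights and belief degrees are illustrative.

def matching_degrees(x, refs):
    """Individual matching degree of input x to each referential value,
    using the standard piecewise-linear (triangular) scheme."""
    degrees = [0.0] * len(refs)
    for i in range(len(refs) - 1):
        lo, hi = refs[i], refs[i + 1]
        if lo <= x <= hi:
            degrees[i] = (hi - x) / (hi - lo)
            degrees[i + 1] = (x - lo) / (hi - lo)
    return degrees

def brb_classify(x, refs, rule_beliefs, rule_weights):
    """Activate one rule per referential value and combine class beliefs
    by a weighted sum (a simplification of evidential reasoning)."""
    alpha = matching_degrees(x, refs)
    total = sum(a * w for a, w in zip(alpha, rule_weights))
    n_classes = len(rule_beliefs[0])
    combined = [0.0] * n_classes
    for a, w, beliefs in zip(alpha, rule_weights, rule_beliefs):
        for k in range(n_classes):
            combined[k] += (a * w / total) * beliefs[k]
    return combined.index(max(combined))

refs = [0.0, 0.5, 1.0]                               # referential values of the attribute
rule_beliefs = [[0.9, 0.1], [0.5, 0.5], [0.1, 0.9]]  # belief over 2 classes per rule
rule_weights = [1.0, 1.0, 1.0]
print(brb_classify(0.8, refs, rule_beliefs, rule_weights))  # → 1
```

    Note how the rule count equals the number of referential values of the single attribute here; the paper's contribution is to tie it to the number of classes instead, which this toy does not capture.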

  8. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-level Rule-based Models in Cell Biology.

    Science.gov (United States)

    Bittig, Arne; Uhrmacher, Adelinde

    2016-08-03

    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial Stochastic Simulation Algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt their spatial resolution.

  9. Using rule-based shot dose assignment in model-based MPC applications

    Science.gov (United States)

    Bork, Ingo; Buck, Peter; Wang, Lin; Müller, Uwe

    2014-10-01

    Shrinking feature sizes and the need for tighter CD (Critical Dimension) control require the introduction of new technologies in mask making processes. One of those methods is the dose assignment of individual shots on VSB (Variable Shaped Beam) mask writers to compensate CD non-linearity effects and improve dose edge slope. Using increased dose levels only for most critical features, generally only for the smallest CDs on a mask, the change in mask write time is minimal while the increase in image quality can be significant. This paper describes a method combining rule-based shot dose assignment with model-based shot size correction. This combination proves to be very efficient in correcting mask linearity errors while also improving dose edge slope of small features. Shot dose assignment is based on tables assigning certain dose levels to a range of feature sizes. The dose to feature size assignment is derived from mask measurements in such a way that shape corrections are kept to a minimum. For example, if a 50nm drawn line on mask results in a 45nm chrome line using nominal dose, a dose level is chosen which is closest to getting the line back on target. Since CD non-linearity is different for lines, line-ends and contacts, different tables are generated for the different shape categories. The actual dose assignment is done via DRC rules in a pre-processing step before executing the shape correction in the MPC engine. Dose assignment to line ends can be restricted to critical line/space dimensions since it might not be required for all line ends. In addition, adding dose assignment to a wide range of line ends might increase shot count which is undesirable. The dose assignment algorithm is very flexible and can be adjusted based on the type of layer and the best balance between accuracy and shot count. These methods can be optimized for the number of dose levels available for specific mask writers. The MPC engine now needs to be able to handle different dose
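    The table-driven dose assignment described above amounts to a range lookup per shape category. The sketch below is hypothetical: the ranges and relative dose values are invented, and a production flow would derive them from mask measurements and apply them via DRC rules before model-based correction.

```python
# Illustrative rule-based shot dose assignment: feature sizes mapped to
# dose levels via per-category lookup tables. Ranges/doses are invented.

DOSE_TABLE = {          # category -> [(min_nm, max_nm, relative dose), ...]
    "line":     [(0, 60, 1.20), (60, 80, 1.10), (80, 10**9, 1.00)],
    "line_end": [(0, 70, 1.25), (70, 10**9, 1.00)],
    "contact":  [(0, 90, 1.15), (90, 10**9, 1.00)],
}

def assign_dose(shape_category, feature_size_nm):
    """Return the relative dose for a shot, by shape category and drawn CD."""
    for lo, hi, dose in DOSE_TABLE[shape_category]:
        if lo <= feature_size_nm < hi:
            return dose
    return 1.00  # nominal dose if no rule matches
```

    Separate tables per category mirror the paper's observation that CD non-linearity differs for lines, line-ends, and contacts.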

  10. Application of rule-based data mining techniques to real time ATLAS Grid job monitoring data

    CERN Document Server

    Ahrens, R; The ATLAS collaboration; Kalinin, S; Maettig, P; Sandhoff, M; dos Santos, T; Volkmer, F

    2012-01-01

    The Job Execution Monitor (JEM) is a job-centric grid job monitoring software developed at the University of Wuppertal and integrated into the pilot-based “PanDA” job brokerage system leveraging physics analysis and Monte Carlo event production for the ATLAS experiment on the Worldwide LHC Computing Grid (WLCG). With JEM, job progress and grid worker node health can be supervised in real time by users, site admins and shift personnel. Imminent error conditions can be detected early and countermeasures can be initiated by the job’s owner immediately. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job and grid worker node misbehaviour. Shifters can use the same aggregated data to react quickly to site error conditions and broken production tasks. In this work, the application of novel data-centric rule-based methods and data-mining techniques to the real-time monitoring data is discussed. The usage of such automatic inference techniques on monitorin...

  11. A rule-based verification and control framework in ATLAS Trigger-DAQ

    CERN Document Server

    Kazarov, A; Lehmann-Miotto, G; Sloper, J E; Ryabov, Yu; Computing In High Energy and Nuclear Physics

    2007-01-01

    In order to meet the requirements of ATLAS data taking, the ATLAS Trigger-DAQ system is composed of O(1000) applications running on more than 2600 computers in a network. At such a system size, software and hardware failures are quite frequent. To minimize system downtime, the Trigger-DAQ control system shall include advanced verification and diagnostics facilities. The operator should use tests and the expertise of the TDAQ and detector developers in order to diagnose and recover from errors, automatically where possible. The TDAQ control system is built as a distributed tree of controllers, where the behaviour of each controller is defined in a rule-based language allowing easy customization. The control system also includes a verification framework which allows users to develop and configure tests for any component in the system with different levels of complexity. It can be used as a stand-alone test facility for a small detector installation, as part of the general TDAQ initialization procedure, and for diagnosing the problems ...

  12. Rule-based learning of regular past tense in children with specific language impairment.

    Science.gov (United States)

    Smith-Lock, Karen M

    2015-01-01

    The treatment of children with specific language impairment was used as a means to investigate whether a single- or dual-mechanism theory best conceptualizes the acquisition of English past tense. The dual-mechanism theory proposes that regular English past-tense forms are produced via a rule-based process whereas past-tense forms of irregular verbs are stored in the lexicon. Single-mechanism theories propose that both regular and irregular past-tense verbs are stored in the lexicon. Five 5-year-olds with specific language impairment received treatment for regular past tense. The children were tested on regular past-tense production and third-person singular "s" twice before treatment and once after treatment, at eight-week intervals. Treatment consisted of one-hour play-based sessions, once weekly, for eight weeks. Crucially, treatment focused on different lexical items from those in the test. Each child demonstrated significant improvement on the untreated past-tense test items after treatment, but no improvement on the untreated third-person singular "s". Generalization to untreated past-tense verbs could not be attributed to a frequency effect or to phonological similarity of trained and tested items. It is argued that the results are consistent with a dual-mechanism theory of past-tense inflection.

  13. Model Servqual Rule Base Asean University Network untuk Penilaian Kualitas Program Studi

    Directory of Open Access Journals (Sweden)

    Esti Wijayanti

    2016-05-01

    Full Text Available AUN (Asean University Network) and ABET (Accreditation Board for Engineering and Technology) are well-known non-profit accreditation organizations. AUN assessment uses fifteen criteria: Expected Learning Outcomes, Programme Specification, Programme Structure and Content, Teaching and Learning Strategy, Student Assessment, Academic Staff Quality, Support Staff Quality, Student Quality, Student Advice and Support, Facilities and Infrastructure, Quality Assurance of Teaching/Learning Process, Staff Development Activities, Stakeholders Feedback, Output, and Stakeholders Satisfaction, scored on a 7-point scale. Here, we map the fifteen AUN criteria onto five servqual dimensions - assurance, empathy, responsiveness, reliability and facility - in order to make the assessment process easier. The research outcome indicates that the proposed method can be used to evaluate an education program. Validation using AUN data shows that the servqual rule base analysis follows almost the same pattern as the AUN assessment, with a correlation value of 0.985; this can be accepted because its validity reaches 97%.

  14. Knowledge Representation and Inference for Analysis and Design of Database and Tabular Rule-Based Systems

    Directory of Open Access Journals (Sweden)

    Antoni Ligeza

    2001-01-01

    Full Text Available Rule-based systems constitute a powerful tool for the specification of knowledge in the design and implementation of knowledge-based systems. They also provide a universal programming paradigm for domains such as intelligent control, decision support, situation classification and operational knowledge encoding. In order to assure safe and reliable performance, such systems should satisfy certain formal requirements, including completeness and consistency. This paper addresses the issue of analysis and verification of selected properties of a class of such systems in a systematic way. A uniform, tabular scheme of single-level rule-based systems is considered. Such systems can be applied as a generalized form of databases for the specification of data patterns (unconditional knowledge), or can be used for defining attributive decision tables (conditional knowledge in the form of rules). They can also serve as lower-level components of hierarchical multi-level control and decision support knowledge-based systems. An algebraic knowledge representation paradigm using extended tabular representation, similar to relational database tables, is presented, and algebraic bases for system analysis, verification and design support are outlined.

  15. On Equivalence of FIS and ELM for Interpretable Rule-Based Knowledge Representation.

    Science.gov (United States)

    Wong, Shen Yuong; Yap, Keem Siah; Yap, Hwa Jen; Tan, Shing Chiang; Chang, Siow Wee

    2015-07-01

    This paper presents a fuzzy extreme learning machine (F-ELM) that embeds fuzzy membership functions and rules into the hidden layer of an extreme learning machine (ELM). Similar to the concept of ELM, which employs the random initialization technique, three parameters of F-ELM are randomly assigned. They are the standard deviation of the membership functions, matrix-C (rule-combination matrix), and matrix-D [don't care (DC) matrix]. Fuzzy if-then rules are formulated by the rule-combination matrix of F-ELM, and a DC approach is adopted to minimize the number of input attributes in the rules. Furthermore, F-ELM utilizes the output weights of the ELM to form the target class and confidence factor for each of the rules. This is to indicate that the corresponding consequent parameters are determined analytically. The operations of F-ELM are equivalent to a fuzzy inference system. Several benchmark data sets and a real-world fault detection and diagnosis problem have been used to empirically evaluate the efficacy of the proposed F-ELM in handling pattern classification tasks. The results show that the accuracy rates of F-ELM are comparable (if not superior) to ELM, with the distinctive ability of providing explicit knowledge in the form of an interpretable rule base.
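    The hidden-layer structure described above - Gaussian memberships selected per rule by a combination matrix C and masked by a don't-care matrix D - can be sketched as follows. This is a toy under stated assumptions: the centres, matrices, and output weights are invented, and the analytic solution of the output weights (done in ELM via a pseudoinverse) is omitted.

```python
import math

# Toy sketch of an F-ELM-style hidden layer: Gaussian fuzzy memberships
# combined into if-then rules via a rule-combination matrix C and a
# don't-care (DC) mask. Output weights BETA are fixed, invented values.

def gauss(x, centre, sigma):
    return math.exp(-((x - centre) ** 2) / (2 * sigma ** 2))

CENTRES = [0.0, 1.0]             # one membership centre per linguistic term
SIGMA = 0.5
C = [[0, 0], [1, 1]]             # rule r uses term C[r][j] for attribute j
DC = [[1, 1], [1, 0]]            # DC[r][j] == 0 -> attribute j is "don't care"
BETA = [[1.0, 0.0], [0.0, 1.0]]  # output weight per rule and class

def firing_strengths(x):
    strengths = []
    for r in range(len(C)):
        s = 1.0
        for j, xj in enumerate(x):
            if DC[r][j]:                      # skip don't-care attributes
                s *= gauss(xj, CENTRES[C[r][j]], SIGMA)
        strengths.append(s)
    return strengths

def classify(x):
    s = firing_strengths(x)
    scores = [sum(s[r] * BETA[r][k] for r in range(len(s)))
              for k in range(len(BETA[0]))]
    return scores.index(max(scores))
```

    Reading rule 1 off the matrices gives the interpretability the paper emphasizes: "if attribute 0 is high then class 1", with attribute 1 marked don't-care.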

  16. An Expert System for Diagnosis of Sleep Disorder Using Fuzzy Rule-Based Classification Systems

    Science.gov (United States)

    Septem Riza, Lala; Pradini, Mila; Fitrajaya Rahman, Eka; Rasim

    2017-03-01

    Sleep disorder is an anomaly that can disturb someone's sleeping pattern. It has become an issue nowadays, since people are busy with their own affairs and have no time to visit a doctor. Therefore, this research aims to develop a system for the diagnosis of sleep disorder using a Fuzzy Rule-Based Classification System (FRBCS). FRBCS is a method based on fuzzy set concepts. It consists of two steps: (i) constructing a model/knowledge base involving a rule base and a database, and (ii) prediction over new data. In this case, the knowledge is obtained from experts, whereas in the prediction stage we perform fuzzification, inference, and classification. A platform implementing the method is built with a combination of PHP and the R programming language using the “Shiny” package. To validate the system, experiments have been done using data from a psychiatric hospital in West Java, Indonesia. The accuracy of the result and the computation time are 84.85% and 0.0133 seconds, respectively.

  17. Mining Interesting Positive and Negative Association Rule Based on Improved Genetic Algorithm (MIPNAR_GA

    Directory of Open Access Journals (Sweden)

    Nikky Suryawanshi Rai

    2014-01-01

    Full Text Available Association rule mining is an efficient technique for finding strong relations between correlated data, and this correlation enables a meaningful extraction process. For mining positive and negative rules, a variety of algorithms are used, such as the Apriori algorithm and tree-based algorithms. A number of these algorithms perform well but produce a large number of negative association rules and also suffer from the multi-scan problem. The idea of this paper is to eliminate these problems and reduce the large number of negative rules. Hence we propose an improved approach to mine interesting positive and negative rules based on genetic and MLMS (multi-level multiple support) algorithms, in which the data table is encoded as 0 and 1; this division reduces the scanning time of the database. The proposed algorithm (MIPNAR_GA) is a combination of MLMS and a genetic algorithm, and mines interesting positive and negative rules from frequent and infrequent pattern sets. The algorithm is accomplished in three phases: (a) extract frequent and infrequent pattern sets using the Apriori method; (b) efficiently generate positive and negative rules; (c) prune redundant rules by applying interestingness measures. Rule optimization is performed by the genetic algorithm, and for evaluation the algorithm was run on real-world datasets such as heart disease data and standard datasets from the UCI machine learning repository.
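    The positive/negative rule generation step can be illustrated with a plain support/confidence miner over item pairs. This is a minimal sketch only: the thresholds and transactions are invented, and the MLMS encoding and genetic optimization stages of MIPNAR_GA are not reproduced.

```python
from itertools import combinations

# Sketch of mining positive (A => B) and negative (A => not B) association
# rules over item pairs by support/confidence thresholds.

def support(transactions, itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

def mine_rules(transactions, min_sup=0.4, min_conf=0.7):
    items = set().union(*transactions)
    rules = []
    for a, b in combinations(sorted(items), 2):
        sup_ab = support(transactions, {a, b})
        sup_a = support(transactions, {a})
        if sup_a == 0:
            continue
        conf = sup_ab / sup_a
        if sup_ab >= min_sup and conf >= min_conf:
            rules.append((a, "=>", b))          # positive rule A => B
        elif sup_a >= min_sup and (sup_a - sup_ab) / sup_a >= min_conf:
            rules.append((a, "=> not", b))      # negative rule A => not B
    return rules

T = [{"milk", "bread"}, {"milk", "bread", "eggs"},
     {"milk"}, {"bread", "eggs"}, {"milk", "bread"}]
print(mine_rules(T))  # → [('bread', '=>', 'milk')]
```

    A genetic algorithm, as in the paper, would then evolve threshold/rule combinations scored by interestingness measures rather than enumerating pairs exhaustively.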

  18. Transfer in Rule-Based Category Learning Depends on the Training Task.

    Science.gov (United States)

    Kattner, Florian; Cox, Christopher R; Green, C Shawn

    2016-01-01

    While learning is often highly specific to the exact stimuli and tasks used during training, there are cases where training results in learning that generalizes more broadly. It has been previously argued that the degree of specificity can be predicted based upon the learning solution(s) dictated by the particular demands of the training task. Here we applied this logic in the domain of rule-based category learning. Participants were presented with stimuli corresponding to four different categories and were asked to perform either a category discrimination task (which permits learning a specific rule to discriminate two categories) or a category identification task (which does not permit learning a specific discrimination rule). In a subsequent transfer stage, all participants were asked to discriminate stimuli belonging to two of the categories which they had seen, but had never directly discriminated before (i.e., this particular discrimination was omitted from training). As predicted, learning in the category-discrimination tasks tended to be specific, while the category-identification task produced learning that transferred to the transfer discrimination task. These results suggest that the discrimination and identification tasks fostered the acquisition of different category representations which were more or less generalizable.

  19. A Web-Based Rice Plant Expert System Using Rule-Based Reasoning

    Directory of Open Access Journals (Sweden)

    Anton Setiawan Honggowibowo

    2009-12-01

    Full Text Available Rice plants can be attacked by various kinds of diseases, which can often be determined from their symptoms. However, to find out the exact type of disease, an agricultural expert's opinion is needed; meanwhile, the number of agricultural experts is limited and there are too many problems to be solved at the same time. This makes a system with the capability of an expert necessary. Such a system must contain the knowledge of the diseases and symptoms of rice plants that an agricultural expert has. This research designs a web-based expert system using rule-based reasoning. The rules are modified from the forward chaining and backward chaining inference methods in order to help farmers diagnose rice plant diseases. The web-based rice plant disease diagnosis expert system has the advantage of being easy to access and use. With its web-based features, it is expected that farmers can access the expert system anywhere to diagnose rice diseases.

  20. A rule-based expert system for chemical prioritization using effects-based chemical categories.

    Science.gov (United States)

    Schmieder, P K; Kolanczyk, R C; Hornung, M W; Tapper, M A; Denny, J S; Sheedy, B R; Aladjov, H

    2014-01-01

    A rule-based expert system (ES) was developed to predict chemical binding to the estrogen receptor (ER), patterned on the research approaches championed by Gilman Veith, to whom this article and journal issue are dedicated. The ERES was built to be mechanistically transparent and meet the needs of a specific application, i.e. predict for all chemicals within two well-defined inventories (industrial chemicals used as pesticide inerts and antimicrobial pesticides). These chemicals all lack structural features associated with high-affinity binders and thus any binding should be low affinity. Similar to the high-quality fathead minnow database upon which Veith QSARs were built, the ERES was derived from what has been termed gold standard data, systematically collected in assays optimized to detect even low-affinity binding and maximizing confidence in the negative determinations. The resultant logic-based decision tree ERES, determined to be a robust model, contains seven major nodes with multiple effects-based chemical categories within each. Predicted results are presented in the context of empirical data within local chemical structural groups, facilitating informed decision-making. Even using optimized detection assays, the ERES applied to two inventories of >600 chemicals resulted in only ~5% of the chemicals predicted to bind ER.

  1. Perceptual learning improves adult amblyopic vision through rule-based cognitive compensation.

    Science.gov (United States)

    Zhang, Jun-Yun; Cong, Lin-Juan; Klein, Stanley A; Levi, Dennis M; Yu, Cong

    2014-04-01

    We investigated whether perceptual learning in adults with amblyopia could be enabled to transfer completely to an orthogonal orientation, which would suggest that amblyopic perceptual learning results mainly from high-level cognitive compensation, rather than plasticity in the amblyopic early visual brain. Nineteen adults (mean age = 22.5 years) with anisometropic and/or strabismic amblyopia were trained following a training-plus-exposure (TPE) protocol. The amblyopic eyes practiced contrast, orientation, or Vernier discrimination at one orientation for six to eight sessions. Then the amblyopic or nonamblyopic eyes were exposed to an orthogonal orientation via practicing an irrelevant task. Training was first performed at a lower spatial frequency (SF), then at a higher SF near the cutoff frequency of the amblyopic eye. Perceptual learning was initially orientation specific. However, after exposure to the orthogonal orientation, learning transferred to an orthogonal orientation completely. Reversing the exposure and training order failed to produce transfer. Initial lower SF training led to broad improvement of contrast sensitivity, and later higher SF training led to more specific improvement at high SFs. Training improved visual acuity by 1.5 to 1.6 lines. These results suggest that perceptual learning in amblyopia may reflect high-level learning of rules for performing a visual discrimination task. These rules are applicable to new orientations to enable learning transfer. Therefore, perceptual learning may improve amblyopic vision mainly through rule-based cognitive compensation.

  2. Auto-control of pumping operations in sewerage systems by rule-based fuzzy neural networks

    Directory of Open Access Journals (Sweden)

    Y.-M. Chiang

    2010-09-01

    Full Text Available Pumping stations play an important role in flood mitigation in metropolitan areas. The existing sewerage systems, however, are facing the great challenge of fast-rising peak flow resulting from urbanization and climate change. It is imperative to construct an efficient and accurate operating prediction model for pumping stations to simulate the drainage mechanism for discharging the rainwater in advance. In this study, we propose two rule-based fuzzy neural networks, the adaptive neuro-fuzzy inference system (ANFIS) and the counterpropagation fuzzy neural network (CFNN), for on-line prediction of the number of open and closed pumps of a pivotal pumping station in Taipei city up to a lead time of 20 min. The performance of ANFIS outperforms that of CFNN in terms of model efficiency, accuracy, and correctness. Furthermore, the results not only show that the predicted water levels contribute to successfully operating pumping stations but also demonstrate the applicability and reliability of ANFIS in automatically controlling urban sewerage systems.
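    The decision the fuzzy neural networks automate - turning a predicted water level into a number of open pumps - can be caricatured as a threshold rule. This is purely illustrative: the thresholds are invented, and the paper learns this mapping from data with ANFIS/CFNN rather than hard-coding it.

```python
# Hypothetical threshold rule mapping a predicted water level to the
# number of pumps to open. Thresholds (in metres) are invented.

THRESHOLDS = [1.0, 1.5, 2.0, 2.5]  # level at which each additional pump opens

def pumps_to_open(predicted_level_m):
    """Open one more pump for every threshold the predicted level exceeds."""
    return sum(predicted_level_m >= t for t in THRESHOLDS)
```

    A fuzzy rule base replaces these hard thresholds with graded memberships, which avoids pumps chattering on and off near a boundary.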

  3. Evolution of Collective Behaviour in an Artificial World Using Linguistic Fuzzy Rule-Based Systems.

    Science.gov (United States)

    Demšar, Jure; Lebar Bajec, Iztok

    2017-01-01

    Collective behaviour is a fascinating and easily observable phenomenon, attractive to a wide range of researchers. In biology, computational models have been extensively used to investigate various properties of collective behaviour, such as: transfer of information across the group, benefits of grouping (defence against predation, foraging), the group decision-making process, and group behaviour types. The question 'why', however, remains largely unanswered. Here the interest lies in which pressures led to the evolution of such behaviour, and evolutionary computational models have already been used to test various biological hypotheses. Most of these models use genetic algorithms to tune the parameters of previously presented non-evolutionary models, but very few attempt to evolve collective behaviour from scratch. Of these last, the successful attempts display clumping or swarming behaviour. Empirical evidence suggests that in fish schools there exist three classes of behaviour: swarming, milling, and polarized. In this paper we present a novel, artificial-life-like evolutionary model, where individual agents are governed by linguistic fuzzy rule-based systems, which is capable of evolving all three classes of behaviour.

  4. A Rules-Based Service for Suggesting Visualizations to Analyze Earth Science Phenomena.

    Science.gov (United States)

    Prabhu, A.; Zednik, S.; Fox, P. A.; Ramachandran, R.; Maskey, M.; Shie, C. L.; Shen, S.

    2016-12-01

    Current Earth Science information systems lack support for new or interdisciplinary researchers, who may be unfamiliar with the domain vocabulary or the breadth of relevant data available. We need to evolve the current information systems to reduce the time required for data preparation, processing and analysis. This can be done by effectively salvaging the "dark" resources in Earth Science. We assert that Earth Science metadata assets are dark resources: information resources that organizations collect, process, and store for regular business or operational activities but fail to utilize for other purposes. In order to use these dark resources effectively, especially for data processing and visualization, we need a combination of domain, data product and processing knowledge, i.e. a knowledge base from which specific data operations can be performed. In this presentation, we describe a semantic, rules-based service that suggests visualizations for Earth Science phenomena, based on the data variables extracted using the "dark" metadata resources. We use Jena rules to make assertions about compatibility between a phenomenon and various visualizations based on multiple factors. We created separate orthogonal rule sets to map each of these factors to the various phenomena. Some of the factors we have considered include measurements, spatial resolution and time intervals. This approach enables easy additions and deletions based on newly obtained domain knowledge or phenomena-related information, thus improving the overall accuracy of the rules service.
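    The compatibility-rule idea can be sketched as an ordered list of condition/suggestion pairs fired against extracted metadata. This is a hypothetical miniature: the rule conditions, metadata fields, and visualization names are invented, and the actual service uses Jena rules over ontologies rather than Python predicates.

```python
# Hypothetical rule service: suggest visualizations from extracted metadata
# (here just dimensionality and number of time steps). Rules are invented.

RULES = [
    (lambda m: m["dims"] == 2 and m["time_steps"] > 1, "animated map"),
    (lambda m: m["dims"] == 2,                         "static map"),
    (lambda m: m["dims"] == 1 and m["time_steps"] > 1, "time series plot"),
    (lambda m: m["dims"] == 0,                         "scalar summary"),
]

def suggest(metadata):
    """Return every visualization whose rule fires, in priority order."""
    return [viz for cond, viz in RULES if cond(metadata)]

print(suggest({"dims": 2, "time_steps": 24}))  # → ['animated map', 'static map']
```

    Keeping each factor (measurement type, resolution, time interval) in its own orthogonal rule set, as the paper does, lets new domain knowledge be added without touching the other rules.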

  5. RFID sensor-tags feeding a context-aware rule-based healthcare monitoring system.

    Science.gov (United States)

    Catarinucci, Luca; Colella, Riccardo; Esposito, Alessandra; Tarricone, Luciano; Zappatore, Marco

    2012-12-01

    Along with the growth of the aging population and the need for efficient wellness systems, there is a mounting demand for new technological solutions able to support remote and proactive healthcare. An answer to this need could be provided by the joint use of emerging Radio Frequency Identification (RFID) technologies and appropriate software architectures. This paper presents a proposal for a context-aware infrastructure for ubiquitous and pervasive monitoring of heterogeneous healthcare-related scenarios, fed by RFID-based wireless sensor nodes. The software framework is based on a general-purpose architecture exploiting three key implementation choices: ontology representation, the multi-agent paradigm and rule-based logic. From the hardware point of view, the sensing and gathering of context data is delegated to a new Enhanced RFID Sensor-Tag. This new device, de facto, makes possible the easy integration between RFID and generic sensors, guaranteeing flexibility and preserving the benefits in terms of simplicity of use and low cost of UHF RFID technology. The system is very efficient and versatile, and its customization to new scenarios requires very little effort, substantially limited to the update or extension of the ontology codification. Its effectiveness is demonstrated by reporting both the customization effort and the performance results obtained from validation in two different healthcare monitoring contexts.

  6. Real-time fault detection method based on belief rule base for aircraft navigation system

    Institute of Scientific and Technical Information of China (English)

    Zhao Xin; Wang Shicheng; Zhang Jinsheng; Fan Zhiliang; Min Haibo

    2013-01-01

    Real-time and accurate fault detection is essential to enhance the reliability and safety of an aircraft navigation system. Existing detection methods based on an analytical model fall short of detecting gradual and sudden faults simultaneously. For this reason, we propose an online detection solution based on a non-analytical model. In this article, the navigation system fault detection model is established based on a belief rule base (BRB), where the system measuring residual and its rate of change are used as the inputs of the BRB model and the fault detection function as the output. To overcome the drawbacks of current parameter optimization algorithms for BRB and to achieve online updating, a recursive parameter estimation algorithm based on the expectation maximization (EM) algorithm is presented for the online BRB detection model. Furthermore, the proposed method is verified by a navigation experiment. Experimental results show that the proposed method is able to effectively realize online parameter estimation in the navigation system fault detection model. The output of the detection model tracks the fault state very well, and faults can be diagnosed accurately and in real time. In addition, the detection ability, especially the probability of false detection, is superior to the offline optimization method, and thus system reliability is greatly improved.
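
    A toy reduction of the BRB idea: the two inputs (residual and its rate of change) activate rules by their matching degrees, and the rules' belief degrees are pooled into a fault score. Real BRB models use the evidential-reasoning combination rule; the kernel, referential values and beliefs below are invented.

```python
# Simplified belief-rule-base (BRB) sketch for fault detection.

def match(x, ref):
    """Matching degree of input x to a referential value (simple kernel)."""
    return max(0.0, 1.0 - abs(x - ref))

# rule: (residual referential value, rate referential value, belief in 'fault')
RULES = [
    (0.0, 0.0, 0.0),   # small residual, stable  -> no fault
    (1.0, 0.0, 0.6),   # large residual, stable  -> likely fault
    (1.0, 1.0, 1.0),   # large residual, growing -> fault
]

def fault_score(residual, rate):
    """Activation-weighted average of the rules' fault beliefs."""
    weights = [match(residual, r) * match(rate, c) for r, c, _ in RULES]
    total = sum(weights) or 1.0
    return sum(w * b for w, (_, _, b) in zip(weights, RULES)) / total

print(round(fault_score(0.05, 0.0), 2))  # small residual -> score near zero
print(round(fault_score(0.95, 0.9), 2))  # large, growing residual -> near one
```

    Online EM estimation, as in the paper, would adjust the rule beliefs from incoming data instead of fixing them by hand.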

  7. Effective Rule Based Classifier using Multivariate Filter and Genetic Miner for Mammographic Image Classification

    Directory of Open Access Journals (Sweden)

    Nirase Fathima Abubacker

    2015-06-01

    Full Text Available Mammography is an important examination in the early detection of breast abnormalities. Automatic classification of mammogram images into normal, benign or malignant would help radiologists in the diagnosis of breast cancer cases. This study investigates the effectiveness of using rule-based classifiers with a multivariate filter and a genetic miner to classify mammogram images. The method discovers association rules with the classes as the consequent and classifies the images based on the Highest Average Confidence (HAvC) of the association rules matched for each class. In the association rule mining stage, Correlation-based Feature Selection (CFS), which greatly reduces the complexity of the image mining process, is used as the feature selection method, and a modified genetic association rule mining technique, GARM, is used to discover the rules. The method is evaluated on a mammogram image dataset of 240 images taken from DDSM. Its performance is compared against other classifiers such as SMO, Naïve Bayes and J48. The performance of the proposed method is promising, with 88% accuracy, and outperforms the other classifiers in the context of mammogram image classification.
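
    The HAvC decision step can be sketched directly: among the association rules whose antecedent features all appear in the image's feature set, average the confidences per class and pick the class with the highest average. The rules and feature names below are made up for illustration.

```python
# Sketch of classification by Highest Average Confidence (HAvC).

RULES = [  # (antecedent feature set, class, confidence)
    ({"high_density", "irregular_margin"}, "malignant", 0.92),
    ({"high_density"}, "malignant", 0.70),
    ({"smooth_margin"}, "benign", 0.85),
    ({"low_density", "smooth_margin"}, "benign", 0.95),
]

def classify(features):
    """Average the confidences of matched rules per class; return the best class."""
    sums, counts = {}, {}
    for antecedent, label, conf in RULES:
        if antecedent <= features:          # all antecedent features present
            sums[label] = sums.get(label, 0.0) + conf
            counts[label] = counts.get(label, 0) + 1
    if not sums:
        return None
    return max(sums, key=lambda c: sums[c] / counts[c])

print(classify({"high_density", "irregular_margin", "noise"}))
```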

  8. Rule-based regulatory and metabolic model for Quorum sensing in P. aeruginosa.

    Science.gov (United States)

    Schaadt, Nadine S; Steinbach, Anke; Hartmann, Rolf W; Helms, Volkhard

    2013-08-21

    In the pathogen P. aeruginosa, the formation of virulence factors is regulated via Quorum sensing signaling pathways. Due to the increasing number of strains that are resistant to antibiotics, there is high interest in developing novel anti-infectives. In the fight against resistant bacteria, selective blockade of bacterial cell-to-cell communication (Quorum sensing) has gained special interest as an anti-virulence strategy. Here, we modeled the las, rhl, and pqs Quorum sensing systems by a multi-level logical approach to analyze how enzyme inhibitors and receptor antagonists affect the formation of autoinducers and virulence factors. Our rule-based simulations fulfill the behavior expected from the literature considering the external level of autoinducers. In the presence of PqsBCD inhibitors, the external HHQ and PQS levels are indeed clearly reduced; the magnitude of this effect strongly depends on the inhibition level. However, it seems that the pyocyanin pathway is incomplete. To match experimental observations, we suggest a modified network topology in which PqsE and PqsR act as receptors, with an autoinducer as ligand, that up-regulate pyocyanin in a concerted manner. While the PQS biosynthesis is the more appropriate target for inhibiting HHQ and PQS formation, blocking the receptor PqsR, which regulates the biosynthesis, reduces the pyocyanin level more strongly.

  9. A self-organized, distributed, and adaptive rule-based induction system.

    Science.gov (United States)

    Rojanavasu, Pornthep; Dam, Hai Huong; Abbass, Hussein A; Lokan, Chris; Pinngern, Ouen

    2009-03-01

    Learning classifier systems (LCSs) are rule-based inductive learning systems that have been widely used in the fields of supervised and reinforcement learning over the last few years. This paper employs the sUpervised Classifier System (UCS), a supervised learning classifier system introduced in 2003 for classification tasks in data mining. We present an adaptive framework of UCS on top of a self-organizing map (SOM) neural network. The overall classification problem is decomposed adaptively and in real time by the SOM into subproblems, each of which is handled by a separate UCS. The framework is also tested by replacing UCS with a feedforward artificial neural network (ANN). Experiments on several synthetic and real data sets, including a very large real data set, show that classification accuracy in the proposed distributed environment is as good as or better than in the nondistributed environment, and execution is faster. In general, each UCS attached to a cell in the SOM has a much smaller population size than a single UCS working on the overall problem; since each data instance is exposed to a smaller population than in the single-population approach, the throughput of the overall system increases. The experiments show that the proposed framework can decompose a problem adaptively into subproblems, maintaining or improving accuracy and increasing speed.

  10. Application of rule-based data mining techniques to real time ATLAS Grid job monitoring data

    CERN Document Server

    Ahrens, R; The ATLAS collaboration; Kalinin, S; Maettig, P; Sandhoff, M; dos Santos, T; Volkmer, F

    2012-01-01

    The Job Execution Monitor (JEM) is a job-centric grid job monitoring software developed at the University of Wuppertal and integrated into the pilot-based "PanDA" job brokerage system, which handles physics analysis and Monte Carlo event production for the ATLAS experiment on the Worldwide LHC Computing Grid (WLCG). With JEM, job progress and grid worker node health can be supervised in real time by users, site admins and shift personnel. Imminent error conditions can be detected early and countermeasures can be initiated by the job's owner immediately. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job and grid worker node misbehaviour. Shifters can use the same aggregated data to react quickly to site error conditions and broken production tasks. In this work, the application of novel data-centric rule-based methods and data-mining techniques to the real time monitoring data is discussed. The usage of such automatic inference techniques on monitorin...

  11. Dynamic Querying of Mass-Storage RDF Data with Rule-Based Entailment Regimes

    Science.gov (United States)

    Ianni, Giovambattista; Krennwallner, Thomas; Martello, Alessandra; Polleres, Axel

    RDF Schema (RDFS) as a lightweight ontology language is gaining popularity and, consequently, tools for scalable RDFS inference and querying are needed. SPARQL has recently become a W3C standard for querying RDF data, but it mostly provides means for querying simple RDF graphs only, whereas querying with respect to RDFS or other entailment regimes is left outside the current specification. In this paper, we show that SPARQL faces certain unwanted ramifications when querying ontologies in conjunction with RDF datasets that comprise multiple named graphs, and we provide an extension for SPARQL that remedies these effects. Moreover, since RDFS inference has a close relationship with logic rules, we generalize our approach to select a custom ruleset for specifying the inferences to be taken into account in a SPARQL query. We show that our extensions are technically feasible by providing benchmark results for RDFS querying in our prototype system GiaBATA, which uses Datalog coupled with a persistent relational database as a back-end for implementing SPARQL with dynamic rule-based inference. By employing different optimization techniques like magic set rewriting, our system remains competitive with state-of-the-art RDFS querying systems.
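
    To make the notion of a rule-based entailment regime concrete, here is a tiny forward-chaining sketch of one RDFS rule (rdfs9: if ?x is of type ?c and ?c is a subclass of ?d, then ?x is of type ?d). This is an illustration of the idea of selectable rulesets, not GiaBATA's actual Datalog engine.

```python
# Forward chaining of the rdfs9 entailment rule to a fixpoint.

TYPE, SUB = "rdf:type", "rdfs:subClassOf"

def entail(triples):
    """Return the closure of the triple set under rdfs9."""
    triples = set(triples)
    changed = True
    while changed:                      # iterate until nothing new is derived
        changed = False
        for s, p, o in list(triples):
            if p == TYPE:
                for c, p2, d in list(triples):
                    if p2 == SUB and c == o and (s, TYPE, d) not in triples:
                        triples.add((s, TYPE, d))
                        changed = True
    return triples

g = {(":alice", TYPE, ":Student"), (":Student", SUB, ":Person"),
     (":Person", SUB, ":Agent")}
print((":alice", TYPE, ":Agent") in entail(g))
```

    A SPARQL query evaluated under this regime would be run against the closure rather than the raw graph; swapping the ruleset changes which answers the same query returns.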

  12. Classification of a set of vectors using self-organizing map- and rule-based technique

    Science.gov (United States)

    Ae, Tadashi; Okaniwa, Kaishirou; Nosaka, Kenzaburou

    2005-02-01

    There exist various objects, such as pictures, music and texts, in our environment. We form a view of these objects by looking, reading or listening. This view is deeply connected with our behaviors and is very important for understanding them. We form a view of an object and decide the next action (data selection, etc.) based on that view; such a series of actions constructs a sequence. Therefore, we propose a method that acquires a view as a vector from several words describing the view, and we apply the vector to sequence generation. We focus on sequences of data that a user selects from a multimedia database containing pictures, music, movies, etc. These data cannot be stereotyped, because each user's view of them is different. Therefore, we represent the structure of the multimedia database by the vector representing the user's view together with the stereotyped vector, and acquire sequences containing this structure as elements. Such vectors can be classified by a self-organizing map (SOM). The hidden Markov model (HMM) is a method for generating sequences; we use an HMM in which each state corresponds to a representative vector of the user's view, and acquire sequences containing the change of the user's view. We call this a Vector-state Markov Model (VMM). We introduce rough set theory as a rule-based technique, which plays the role of classifying sets of data such as the set "Tour".

  13. Auto-control of pumping operations in sewerage systems by rule-based fuzzy neural networks

    Directory of Open Access Journals (Sweden)

    Y.-M. Chiang

    2011-01-01

    Full Text Available Pumping stations play an important role in flood mitigation in metropolitan areas. Existing sewerage systems, however, face the great challenge of fast-rising peak flows resulting from urbanization and climate change. It is imperative to construct an efficient and accurate operating prediction model for pumping stations to simulate the drainage mechanism for discharging rainwater in advance. In this study, we propose two rule-based fuzzy neural networks, the adaptive neuro-fuzzy inference system (ANFIS) and the counterpropagation fuzzy neural network (CFNN), for on-line prediction of the number of open and closed pumps at a pivotal pumping station in Taipei city up to a lead time of 20 min. ANFIS outperforms CFNN in terms of model efficiency, accuracy, and correctness. Furthermore, the results not only show that the predicted water levels contribute to the successful operation of the pumping station, but also demonstrate the applicability and reliability of ANFIS in automatically controlling urban sewerage systems.

  14. Reliability and performance evaluation of systems containing embedded rule-based expert systems

    Science.gov (United States)

    Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.

    1989-01-01

    A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed, which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.
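
    The first stage can be illustrated with a toy discrete-time Markov reliability model in which one key performance parameter of the expert system, here an assumed fault-detection rate, sets a transition probability. States and numbers are invented for illustration.

```python
# Toy Markov reliability model: OK -> DEGRADED -> FAILED (absorbing).

def step(dist, P):
    """One Markov transition: next[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def prob_failed(p_detect, steps=100):
    """Probability of being in FAILED after `steps` transitions.

    With probability (1 - p_detect) a fault in the DEGRADED state is
    missed and the system fails; detection returns the system to OK.
    """
    P = [
        [0.99, 0.01, 0.00],               # OK
        [p_detect, 0.0, 1.0 - p_detect],  # DEGRADED
        [0.0, 0.0, 1.0],                  # FAILED (absorbing)
    ]
    dist = [1.0, 0.0, 0.0]
    for _ in range(steps):
        dist = step(dist, P)
    return dist[2]

print(prob_failed(0.99) < prob_failed(0.50))  # better detection, fewer failures
```

    Stage three would combine this with models of the other subsystems; sensitivity analysis amounts to sweeping `p_detect` and observing the failure probability.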

  15. APLIKASI QUESTION ANSWERING SYSTEM DENGAN METODE RULE-BASED QUESTION ANSWERING SYSTEM PADA ALKITAB

    Directory of Open Access Journals (Sweden)

    Andreas Handojo

    2012-01-01

    Full Text Available The Bible, as the holy book of Christians, is very close to their religious life and serves as a moral guide. The Bible thus becomes a necessity when Christians want to search for specific data or information. But finding the answer to a question can be troublesome, because people often do not know where to look for it among the relatively large number of verses in the Bible. Therefore, an application is needed that can provide answers, drawn from Bible verses, to questions raised by the user, where users can enter questions using the keywords when, where, why, whom and what. The Question Answering System application operates on a digital Bible in the Indonesian language using a rule-based question answering method, and was created using Visual Basic 6.0 and a Microsoft Access 2003 database. Based on application testing, the application was able to find answers to the questions asked according to the keywords. Meanwhile, based on testing with a questionnaire, the application obtained an average approval percentage of 77.2% from the respondents.

  16. Rule-based Cross-matching of Very Large Catalogs in NED

    CERN Document Server

    Ogle, Patrick M; Ebert, Rick; Fadda, Dario; Lo, Tak; Terek, Scott; Schmitz, Marion

    2015-01-01

    The NASA/IPAC Extragalactic Database (NED) has deployed a new rule-based cross-matching algorithm called Match Expert (MatchEx), capable of cross-matching very large catalogs (VLCs) with >10 million objects. MatchEx goes beyond traditional position-based cross-matching algorithms by using other available data together with expert logic to determine which candidate match is the best. Furthermore, the local background density of sources is used to determine and minimize the false-positive match rate and to estimate match completeness. The logical outcome and statistical probability of each match decision is stored in the database, and may be used to tune the algorithm and adjust match parameter thresholds. For our first production run, we cross-matched the GALEX All Sky Survey Catalog (GASC), containing nearly 40 million NUV-detected sources, against a directory of 180 million objects in NED. Candidate matches were identified for each GASC source within a 7.5 arcsecond radius. These candidates were filtered on ...
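
    The core of rule-based cross-matching can be sketched as follows: candidates inside a search radius are ranked by position, and an extra "expert" rule (here, a brightness-similarity cut) filters them first. MatchEx's actual rules, scoring and thresholds are far richer; the distance measure and magnitudes below are illustrative.

```python
# Hedged sketch of rule-based cross-matching with an expert-logic filter.

def dist(a, b):
    """Small-angle flat-sky separation in arcsec (illustrative only)."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def best_match(src, catalog, radius=7.5):
    """Pick the best candidate inside `radius`, preferring similar brightness."""
    cands = [c for c in catalog if dist(src["pos"], c["pos"]) <= radius]
    # expert rule: prefer candidates within 1 magnitude of the source
    good = [c for c in cands if abs(c["mag"] - src["mag"]) <= 1.0]
    pool = good or cands
    return min(pool, key=lambda c: dist(src["pos"], c["pos"])) if pool else None

src = {"pos": (0.0, 0.0), "mag": 18.0}
catalog = [{"pos": (1.0, 0.0), "mag": 24.0},   # closer, but wrong brightness
           {"pos": (3.0, 0.0), "mag": 18.2}]   # satisfies the brightness rule
print(best_match(src, catalog)["pos"])
```

    A purely positional matcher would have taken the closer candidate; the extra rule is what distinguishes this style of matching.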

  17. Assessment of Matrix Multiplication Learning with a Rule-Based Analytical Model--"A Bayesian Network Representation"

    Science.gov (United States)

    Zhang, Zhidong

    2016-01-01

    This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It took rule-based analytical and cognitive task analysis methods specifically to break down operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…

  18. Automatic de-identification of French clinical records: comparison of rule-based and machine-learning approaches.

    Science.gov (United States)

    Grouin, Cyril; Zweigenbaum, Pierre

    2013-01-01

    In this paper, we present a comparison of two approaches to automatically de-identify medical records written in French: a rule-based system and a machine-learning based system using a conditional random fields (CRF) formalism. Both systems have been designed to process nine identifiers in a corpus of medical records in cardiology. We performed two evaluations: first, on 62 documents in cardiology, and on 10 documents in foetopathology - produced by optical character recognition (OCR) - to evaluate the robustness of our systems. We achieved a 0.843 (rule-based) and 0.883 (machine-learning) exact match overall F-measure in cardiology. While the rule-based system allowed us to achieve good results on nominative (first and last names) and numerical data (dates, phone numbers, and zip codes), the machine-learning approach performed best on more complex categories (postal addresses, hospital names, medical devices, and towns). On the foetopathology corpus, although our systems have not been designed for this corpus and despite OCR character recognition errors, we obtained promising results: a 0.681 (rule-based) and 0.638 (machine-learning) exact-match overall F-measure. This demonstrates that existing tools can be applied to process new documents of lower quality.
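
    The rule-based side of such a system is essentially a battery of patterns for the categories on which it performed best (dates, phone numbers, zip codes). The regular expressions below are illustrative French-style formats, not the authors' actual rules.

```python
# Minimal sketch of rule-based de-identification with regular expressions.
import re

RULES = {
    "DATE": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
    "PHONE": re.compile(r"\b0\d(?:[ .]\d{2}){4}\b"),
    "ZIP": re.compile(r"\b\d{5}\b"),
}

def deidentify(text):
    """Replace every matched identifier with a category placeholder."""
    for label, pattern in RULES.items():
        text = pattern.sub(f"<{label}>", text)
    return text

print(deidentify("Vu le 12/03/2012, tel 01 42 34 56 78."))
```

    Nominative identifiers (first and last names) need dictionaries or trigger words rather than pure patterns, which is where the machine-learning approach in the study had the advantage.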

  19. Rule-based expert system to establish the linkage between yarn twist factor and end-use.

    CSIR Research Space (South Africa)

    Dlodlo, N

    2009-09-01

    Full Text Available This paper describes the concepts and development of a rule-based expert system to establish the optimum linkage between the yarn twist factor and end-use of a yarn and determine the appropriate twist for the particular yarn. The quality of a yarn...

  20. Fault tolerant synchronization of chaotic heavy symmetric gyroscope systems versus external disturbances via Lyapunov rule-based fuzzy control.

    Science.gov (United States)

    Farivar, Faezeh; Shoorehdeli, Mahdi Aliyari

    2012-01-01

    In this paper, fault tolerant synchronization of chaotic gyroscope systems versus external disturbances via Lyapunov rule-based fuzzy control is investigated. Taking the general nature of faults in the slave system into account, a new synchronization scheme, namely, fault tolerant synchronization, is proposed, by which the synchronization can be achieved no matter whether the faults and disturbances occur or not. By making use of a slave observer and a Lyapunov rule-based fuzzy control, fault tolerant synchronization can be achieved. Two techniques are considered as control methods: classic Lyapunov-based control and Lyapunov rule-based fuzzy control. On the basis of Lyapunov stability theory and fuzzy rules, the nonlinear controller and some generic sufficient conditions for global asymptotic synchronization are obtained. The fuzzy rules are directly constructed subject to a common Lyapunov function such that the error dynamics of two identical chaotic motions of symmetric gyros satisfy stability in the Lyapunov sense. Two proposed methods are compared. The Lyapunov rule-based fuzzy control can compensate for the actuator faults and disturbances occurring in the slave system. Numerical simulation results demonstrate the validity and feasibility of the proposed method for fault tolerant synchronization.

  1. A comparison of rule-based and machine learning approaches for classifying patient portal messages.

    Science.gov (United States)

    Cronin, Robert M; Fabbri, Daniel; Denny, Joshua C; Rosenbloom, S Trent; Jackson, Gretchen Purcell

    2017-09-01

    Secure messaging through patient portals is an increasingly popular way that consumers interact with healthcare providers. The increasing burden of secure messaging can affect clinic staffing and workflows. Manual management of portal messages is costly and time consuming. Automated classification of portal messages could potentially expedite message triage and delivery of care. We developed automated patient portal message classifiers with rule-based and machine learning techniques using bag of words and natural language processing (NLP) approaches. To evaluate classifier performance, we used a gold standard of 3253 portal messages manually categorized using a taxonomy of communication types (i.e., main categories of informational, medical, logistical, social, and other communications, and subcategories including prescriptions, appointments, problems, tests, follow-up, contact information, and acknowledgement). We evaluated our classifiers' accuracies in identifying individual communication types within portal messages with area under the receiver-operator curve (AUC). Portal messages often contain more than one type of communication. To predict all communication types within single messages, we used the Jaccard Index. We extracted the variables of importance for the random forest classifiers. The best performing approaches to classification for the major communication types were: logistic regression for medical communications (AUC: 0.899); basic (rule-based) for informational communications (AUC: 0.842); and random forests for social communications and logistical communications (AUCs: 0.875 and 0.925, respectively). The best performing classification approach of classifiers for individual communication subtypes was random forests for Logistical-Contact Information (AUC: 0.963). The Jaccard Indices by approach were: basic classifier, Jaccard Index: 0.674; Naïve Bayes, Jaccard Index: 0.799; random forests, Jaccard Index: 0.859; and logistic regression, Jaccard
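
    Since portal messages often carry several communication types at once, the multi-label evaluation above uses the Jaccard Index between the predicted and gold label sets. A minimal sketch, with example labels from the taxonomy:

```python
# Jaccard Index for multi-label message classification.

def jaccard(pred, gold):
    """|intersection| / |union|; defined as 1.0 when both sets are empty."""
    union = pred | gold
    return len(pred & gold) / len(union) if union else 1.0

def mean_jaccard(pairs):
    """Average Jaccard Index over (predicted, gold) label-set pairs."""
    return sum(jaccard(p, g) for p, g in pairs) / len(pairs)

pairs = [
    ({"medical", "logistical"}, {"medical", "logistical"}),   # exact match
    ({"medical"}, {"medical", "social"}),                     # partial match
]
print(mean_jaccard(pairs))  # (1.0 + 0.5) / 2 = 0.75
```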

  2. Lexicon-enhanced sentiment analysis framework using rule-based classification scheme

    Science.gov (United States)

    Khan, Aurangzeb; Ahmad, Shakeel; Qasim, Maria; Khan, Imran Ali

    2017-01-01

    With the rapid increase in social networks and blogs, social media services are increasingly being used by online communities to share their views and experiences about a particular product, policy or event. Due to the economic importance of these reviews, there is a growing trend of writing user reviews to promote a product. Nowadays, users prefer online blogs and review sites when purchasing products; therefore, user reviews are considered an important source of information in Sentiment Analysis (SA) applications for decision making. In this work, we exploit the wealth of user reviews available through online forums to analyze the semantic orientation of words by categorizing them into positive and negative classes, in order to identify and classify the emoticons, modifiers, general-purpose and domain-specific words expressed in the public's feedback about products. However, the unsupervised learning approach employed in previous studies is becoming less efficient due to data sparseness and low accuracy, owing to non-consideration of emoticons and modifiers and the presence of domain-specific words, which may result in inaccurate classification of users' reviews. Lexicon-enhanced sentiment analysis based on a rule-based classification scheme is an alternative approach for improving sentiment classification of users' reviews in online communities. In addition to the sentiment terms used in general-purpose sentiment analysis, we integrate emoticons, modifiers and domain-specific terms to analyze the reviews posted in online communities. To test the effectiveness of the proposed method, we considered users' reviews in three domains. The results obtained from different experiments demonstrate that the proposed method overcomes the limitations of previous methods, and the performance of the sentiment analysis is improved by considering emoticons, modifiers, negations, and domain-specific terms, when compared to baseline methods. PMID:28231286
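
    An illustrative lexicon-plus-rules scorer shows how modifiers, negations and emoticons interact: intensifiers scale the next sentiment word, negations flip it, and emoticons contribute directly. The tiny lexicons below are hypothetical stand-ins for the resources used in the paper.

```python
# Lexicon-enhanced, rule-based sentiment scoring sketch.

LEXICON = {"good": 1, "great": 2, "bad": -1, "terrible": -2, ":)": 1, ":(": -1}
INTENSIFIERS = {"very": 1.5, "extremely": 2.0}
NEGATIONS = {"not", "never"}

def score(tokens):
    total, weight, negate = 0.0, 1.0, False
    for tok in tokens:
        if tok in NEGATIONS:
            negate = True                 # rule: flip the next sentiment word
        elif tok in INTENSIFIERS:
            weight = INTENSIFIERS[tok]    # rule: scale the next sentiment word
        elif tok in LEXICON:
            s = LEXICON[tok] * weight
            total += -s if negate else s
            weight, negate = 1.0, False   # rules apply to one word only
    return total

print(score("not good".split()))        # negation flips polarity -> negative
print(score("very good :)".split()))    # intensifier plus emoticon -> positive
```

    A domain-specific lexicon would simply be merged into `LEXICON` per domain, which is how the approach above extends general-purpose sentiment terms.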

  3. Rule-based models of the interplay between genetic and environmental factors in childhood allergy.

    Directory of Open Access Journals (Sweden)

    Susanne Bornelöv

    Full Text Available Both genetic and environmental factors are important for the development of allergic diseases. However, a detailed understanding of how such factors act together is lacking. To elucidate the interplay between genetic and environmental factors in allergic diseases, we used a novel bioinformatics approach that combines feature selection and machine learning. In two materials, PARSIFAL (a European cross-sectional study of 3113 children) and BAMSE (a Swedish birth cohort including 2033 children), genetic variants as well as environmental and lifestyle factors were evaluated for their contribution to allergic phenotypes. Monte Carlo feature selection and rule-based models were used to identify and rank rules describing how combinations of genetic and environmental factors affect the risk of allergic diseases. Novel interactions between genes were suggested and replicated, such as between ORMDL3 and RORA, where certain genotype combinations gave odds ratios for current asthma of 2.1 (95% CI 1.2-3.6) and 3.2 (95% CI 2.0-5.0) in the BAMSE and PARSIFAL children, respectively. Several combinations of environmental factors appeared to be important for the development of allergic disease in children. For example, use of baby formula and antibiotics early in life was associated with an odds ratio of 7.4 (95% CI 4.5-12.0) of developing asthma. Furthermore, genetic variants together with environmental factors seemed to play a role in allergic diseases, such as the use of antibiotics early in life and COL29A1 variants for asthma, and farm living and NPSR1 variants for allergic eczema. Overall, combinations of environmental and lifestyle factors appeared more frequently in the models than combinations solely involving genes. In conclusion, a new bioinformatics approach is described for analyzing complex data, including extensive genetic and environmental information. Interactions identified with this approach could provide useful hints for further in-depth studies.
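
    For readers unfamiliar with the statistic quoted above, an odds ratio and its 95% confidence interval are computed from a 2x2 table of exposed/unexposed vs. case/control counts. The counts below are invented for illustration, not taken from PARSIFAL or BAMSE.

```python
# Odds ratio and Wald-type 95% CI from a 2x2 contingency table.
import math

def odds_ratio(a, b, c, d):
    """a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    return (a * d) / (b * c)

def ci95(a, b, c, d):
    """95% confidence interval via the standard error of the log odds."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))

print(round(odds_ratio(30, 10, 20, 40), 1))  # -> 6.0
```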

  4. Using rule-based natural language processing to improve disease normalization in biomedical text.

    Science.gov (United States)

    Kang, Ning; Singh, Bharat; Afzal, Zubair; van Mulligen, Erik M; Kors, Jan A

    2013-01-01

    In order for computers to extract useful information from unstructured text, a concept normalization system is needed to link relevant concepts in a text to sources that contain further information about the concept. Popular concept normalization tools in the biomedical field are dictionary-based. In this study we investigate the usefulness of natural language processing (NLP) as an adjunct to dictionary-based concept normalization. We compared the performance of two biomedical concept normalization systems, MetaMap and Peregrine, on the Arizona Disease Corpus, with and without the use of a rule-based NLP module. Performance was assessed for exact and inexact boundary matching of the system annotations with those of the gold standard and for concept identifier matching. Without the NLP module, MetaMap and Peregrine attained F-scores of 61.0% and 63.9%, respectively, for exact boundary matching, and 55.1% and 56.9% for concept identifier matching. With the aid of the NLP module, the F-scores of MetaMap and Peregrine improved to 73.3% and 78.0% for boundary matching, and to 66.2% and 69.8% for concept identifier matching. For inexact boundary matching, performances further increased to 85.5% and 85.4%, and to 73.6% and 73.3% for concept identifier matching. We have shown the added value of NLP for the recognition and normalization of diseases with MetaMap and Peregrine. The NLP module is general and can be applied in combination with any concept normalization system. Whether its use for concept types other than disease is equally advantageous remains to be investigated.

  5. Rule-based Approach on Extraction of Malay Compound Nouns in Standard Malay Document

    Science.gov (United States)

    Abu Bakar, Zamri; Kamal Ismail, Normaly; Rawi, Mohd Izani Mohamed

    2017-08-01

A Malay compound noun is a form of word that arises when two or more words combine into a single syntactic unit with a specific meaning. A compound noun acts as one unit; its components are spelled separately unless it is an established compound written as a single word. The basic characteristics of compound nouns can be observed in Malay sentences through the frequency of the words in the text itself. Extraction of compound nouns is therefore significant for downstream research such as text summarization, grammar checking, sentiment analysis, machine translation and word categorization. Many research efforts have proposed linguistic approaches for extracting Malay compound nouns, most of them addressing the extraction of bi-gram noun+noun compounds; however, their results still leave room for improvement. This paper explores a linguistic method for extracting compound nouns from a standard Malay corpus. A standard dataset is used to provide a common platform for evaluating research on the recognition of compound nouns in Malay sentences. This study therefore proposes a modification of the linguistic approach in order to enhance compound noun extraction. Several pre-processing steps are involved, including normalization, tokenization and tagging. The first step that uses the linguistic approach in this study is Part-of-Speech (POS) tagging. Finally, we describe several rule-based patterns and modify the rules to capture the most relevant relation between the first word and the second word, which assists in solving the problems. The effectiveness of the relations used in our study is measured using recall, precision and F1-score.
Comparison against the baseline values is essential because it shows whether there has been an improvement
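The precision, recall, and F1 evaluation mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's evaluation code; the example word pairs are invented, not drawn from the paper's corpus.

```python
# Hedged sketch: scoring extracted compound-noun candidates against a
# gold-standard set with precision, recall, and F1.

def prf1(extracted, gold):
    """Return (precision, recall, f1) for two sets of candidate compounds."""
    tp = len(extracted & gold)  # true positives: candidates also in the gold set
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Illustrative bi-gram candidates (word pairs), not real experiment data.
gold = {("kereta", "api"), ("rumah", "sakit"), ("kapal", "terbang")}
extracted = {("kereta", "api"), ("rumah", "sakit"), ("meja", "hijau")}

p, r, f = prf1(extracted, gold)
print(f"P={p:.2f} R={r:.2f} F1={f:.2f}")
```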

  6. Using Rule Base System in Mobile Platform to Build Alert System for Evacuation and Guidance

    Directory of Open Access Journals (Sweden)

    Maysoon Fouad Abulkhair

    2016-04-01

Full Text Available The last few years have witnessed the widespread use of mobile technology. Billions of citizens around the world own smartphones, which they use for both personal and business applications; such technologies can therefore help minimize the risk of losing people's lives. The mobile platform is one of the most popular platform technologies, utilized on a wide scale and accessible to a large number of people. There has also been a huge increase in natural and man-made disasters in the last few years. Such disasters can happen anytime and anywhere, causing major damage to people and property. Recent catastrophic events in Jeddah city, where floods caused the sinking and destruction of homes and private property, were worsened by people failing to move to safer places. This paper therefore describes a system that can help determine the affected properties, evacuate them, and provide proper guidance to the users registered in the system. The system notifies mobile phone users by sending guidance messages and sound alerts in real time when disasters (fires, floods) hit. Warnings and tips delivered to the mobile user explain how to react before, during, and after the disaster. The mobile application uses GPS to determine the user's location and guides the user along the best route with the aid of a rule-based system built through interviews with domain experts. Moreover, the user receives Google Maps updates for any added information. The system consists of two subsystems: the first helps students at our university evacuate during a catastrophe, and the second aids all people in the city. Thanks to these features, the system can deliver the required information at the needed time.

  7. Architecture For The Optimization Of A Machining Process In Real Time Through Rule-Based Expert System

    Science.gov (United States)

    Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús

    2009-11-01

Under the SENSOR-IA project, financed by the Order of Incentives to the Regional Technology Centers of the Council of Innovation, Science and Enterprise of Andalusia, an architecture for the optimization of a machining process in real time through a rule-based expert system has been developed. The architecture consists of a sensor data acquisition and processing engine (SATD) and a rule-based expert system (SE) that communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens rule model over goal-oriented rules. The pilot test demonstrated that it is possible to govern the machining process in real time based on rules contained in an SE. The tests have been done with approximate rules. Future work includes an exhaustive collection of data with different tool materials and geometries in a database to extract more precise rules.
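The modus ponens inference described above can be sketched as a small forward-chaining engine. The rules, fact names, and actions below are invented for illustration; the actual SENSOR-IA rule base is not public.

```python
# Hedged sketch of a modus ponens rule engine over goal-oriented rules.
# Each rule: (set of premises, conclusion). If every premise is a known
# fact, the conclusion is asserted (modus ponens), until a fixed point.

rules = [
    ({"vibration_high", "temperature_high"}, "reduce_feed_rate"),
    ({"tool_wear_high"}, "schedule_tool_change"),
    ({"reduce_feed_rate", "surface_finish_poor"}, "reduce_spindle_speed"),
]

def forward_chain(facts):
    """Apply modus ponens repeatedly until no new conclusion fires."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)  # premises hold, so assert the conclusion
                changed = True
    return facts

observed = {"vibration_high", "temperature_high", "surface_finish_poor"}
actions = forward_chain(observed)
print(sorted(actions - observed))
```

Note how the third rule only fires after the first has asserted `reduce_feed_rate`, which is the chaining behaviour an inference engine adds over a single rule lookup.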

  8. Hierarchical rule-based monitoring and fuzzy logic control for neuromuscular block.

    Science.gov (United States)

    Shieh, J S; Fan, S Z; Chang, L W; Liu, C C

    2000-01-01

An important task for anaesthetists is to provide an adequate degree of neuromuscular block during surgical operations, such that it is not difficult to antagonize at the end of surgery. This study therefore examined the application of a simple technique (i.e., fuzzy logic) to an almost ideal muscle relaxant (i.e., rocuronium) during general anaesthesia in order to control the system more easily, efficiently, intelligently and safely during an operation. The characteristics of neuromuscular blockade induced by rocuronium were studied in 10 ASA I or II adult patients anaesthetized with inhalational (i.e., isoflurane) anaesthesia. A Datex Relaxograph was used to monitor neuromuscular block, and the ulnar nerve was stimulated supramaximally with a repeated train-of-four via surface electrodes at the wrist. Initially a notebook personal computer was linked to the Datex Relaxograph to monitor electromyogram (EMG) signals, which were pruned by a three-level hierarchical structure of filters in order to design a controller for administering muscle relaxants. Furthermore, a four-level hierarchical fuzzy logic controller using fuzzy logic and the rule-of-thumb concept was incorporated into the system. Student's t-test was used to compare the variance between the groups. The system achieved control of muscle relaxation with a mean T1% error of -0.19 (SD 0.66)%, accommodating a range in mean infusion rate (MIR) of 0.21-0.49 mg x kg(-1) x h(-1). When these results were compared with our previous ones using the same hierarchical structure applied to mivacurium, less variation in the T1% error was found, while the controller activity of the two drugs showed no significant difference (p > 0.5). However, the consistently medium coefficient of variance (CV) of the MIR of both rocuronium (i.e., 36.13 (SD 9.35)%) and mivacurium (i.e., 34.03 (SD 10.76)%) indicated good controller activity.
The results showed that a hierarchical rule-based monitoring and fuzzy logic control architecture can provide stable control

  9. A Fuzzy Rule-Base Model for Classification of Spirometric FVC Graphs in Chronical Obstructive Pulmonary Diseases

    Science.gov (United States)

    2007-11-02

of distinguishing COPD group diseases (chronic bronchitis, emphysema and asthma) by using fuzzy theory and to put into practice a "fuzzy rule-base ... FVC Plots". Keywords - asthma, chronic bronchitis, COPD (Chronic Obstructive Pulmonary Disease), emphysema, expert systems, FVC (forced vital ... the group of chronic bronchitis, emphysema and asthma because of these reasons [4-7]. Additionally, similar symptoms may cause fuzziness in

  10. Fuzzy rule-based prediction of lovastatin productivity in continuous mode using pellets of Aspergillus terreus in an airlift reactor

    Directory of Open Access Journals (Sweden)

    Kamakshi Gupta

    2009-12-01

Full Text Available Lovastatin production using pellets of Aspergillus terreus was investigated in an airlift reactor, and a fuzzy system was developed for predicting lovastatin productivity. The effect of dilution rate and biomass concentration on lovastatin productivity was analysed, and these two variables were taken as inputs for the fuzzy system. The rule base was developed using conceptions of the developmental processes in lovastatin production, and the fuzzy system was constructed on the basis of experimental results and operator knowledge. The values predicted for lovastatin productivity by the fuzzy system have been compared with the experimental data, and the R-squared value and mean squared error have been calculated to evaluate the quality of the fuzzy system. The performance measures show that the rule-based results of the fuzzy system are in accordance with the experimental results. Use of the fuzzy system increased lovastatin productivity by about 1.3 times compared with previous empirical experimental results. Keywords: Lovastatin, airlift reactor, fuzzy rule-based system, Aspergillus terreus, continuous fermentation, pellets. Received: 27 November 2009 / Received in revised form: 18 January 2010, Accepted: 11 February 2010, Published online: 23 March 2010
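The two-input fuzzy predictor described above can be sketched as follows. The membership functions, rule table, and output levels are invented for illustration (the paper's actual rule base is not reproduced), and a simplified weighted-average defuzzification stands in for full Mamdani centroid defuzzification.

```python
# Hedged sketch of a fuzzy rule-based predictor: dilution rate and biomass
# concentration in, a productivity estimate out.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def predict(dilution, biomass):
    # Fuzzify both inputs (breakpoints are illustrative assumptions).
    low_d = tri(dilution, 0.0, 0.02, 0.06)
    high_d = tri(dilution, 0.02, 0.06, 0.10)
    low_b = tri(biomass, 0.0, 4.0, 10.0)
    high_b = tri(biomass, 4.0, 10.0, 16.0)
    # Rule base: firing strength = min of antecedent memberships (fuzzy AND).
    rules = [
        (min(low_d, low_b), 0.2),    # low dilution, low biomass  -> low output
        (min(low_d, high_b), 0.6),
        (min(high_d, low_b), 0.4),
        (min(high_d, high_b), 0.9),  # high dilution, high biomass -> high output
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0  # weighted-average defuzzification

print(round(predict(0.04, 7.0), 3))
```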

  11. Functional Network Construction in Arabidopsis Using Rule-Based Machine Learning on Large-Scale Data Sets

    Science.gov (United States)

    Bassel, George W.; Glaab, Enrico; Marquez, Julietta; Holdsworth, Michael J.; Bacardit, Jaume

    2011-01-01

    The meta-analysis of large-scale postgenomics data sets within public databases promises to provide important novel biological knowledge. Statistical approaches including correlation analyses in coexpression studies of gene expression have emerged as tools to elucidate gene function using these data sets. Here, we present a powerful and novel alternative methodology to computationally identify functional relationships between genes from microarray data sets using rule-based machine learning. This approach, termed “coprediction,” is based on the collective ability of groups of genes co-occurring within rules to accurately predict the developmental outcome of a biological system. We demonstrate the utility of coprediction as a powerful analytical tool using publicly available microarray data generated exclusively from Arabidopsis thaliana seeds to compute a functional gene interaction network, termed Seed Co-Prediction Network (SCoPNet). SCoPNet predicts functional associations between genes acting in the same developmental and signal transduction pathways irrespective of the similarity in their respective gene expression patterns. Using SCoPNet, we identified four novel regulators of seed germination (ALTERED SEED GERMINATION5, 6, 7, and 8), and predicted interactions at the level of transcript abundance between these novel and previously described factors influencing Arabidopsis seed germination. An online Web tool to query SCoPNet has been developed as a community resource to dissect seed biology and is available at http://www.vseed.nottingham.ac.uk/. PMID:21896882

  12. A simple rule based model for scheduling farm management operations in SWAT

    Science.gov (United States)

    Schürz, Christoph; Mehdi, Bano; Schulz, Karsten

    2016-04-01

For many interdisciplinary questions at the watershed scale, the Soil and Water Assessment Tool (SWAT; Arnold et al., 1998) has become an accepted and widely used tool. Despite its flexibility, the model is highly demanding when it comes to input data. At SWAT's core, the water balance and the modeled nutrient cycles are driven by plant growth (implemented with the EPIC crop growth model). Therefore, land use and crop data with high spatial and thematic resolution, as well as detailed information on cultivation and farm management practices, are required. For many applications of the model, however, these data are unavailable. In order to meet these requirements, SWAT offers the option to trigger scheduled farm management operations by applying the Potential Heat Unit (PHU) concept. The PHU concept takes only the accumulation of daily mean temperature into account for management scheduling. Hence, it contradicts several farming strategies observed in reality: i) planting and harvesting dates are set much too early or too late, as the PHU concept is strongly sensitive to inter-annual temperature fluctuations; ii) fertilizer application in SWAT often occurs simultaneously on the same date in each field; and iii) it can also coincide with precipitation events. The latter two in particular can lead to strong peaks in modeled nutrient loads. To cope with these shortcomings, we propose a simple rule-based model (RBM) to schedule management operations according to realistic farmer management practices in SWAT. The RBM involves simple strategies requiring only data that are initially input into the SWAT model, such as temperature and precipitation data. The user provides boundaries of the time periods in which operation schedules may take place for all crops in the model; these data are readily available from the literature or from crop variety trials. The RBM applies the dates by complying with the following rules: i) Operations scheduled in the
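The kind of rule the proposed RBM applies can be sketched as: schedule an operation inside a user-given window, on the first day that is warm enough and dry. The window, thresholds, and weather series below are invented examples, not the paper's actual rules or data.

```python
# Hedged sketch of rule-based operation scheduling: first day in the
# allowed window with temperature above a minimum and little precipitation.

def schedule(window_days, temp, precip, t_min=10.0, p_max=1.0):
    """Return the first day index in the window satisfying both rules."""
    for day in window_days:
        if temp[day] >= t_min and precip[day] <= p_max:
            return day              # both rules satisfied: schedule here
    return window_days[-1]          # fallback: end of the allowed window

temp   = [8, 9, 12, 13, 11, 14]    # daily mean temperature, deg C (example)
precip = [0, 5,  4,  0,  2,  0]    # daily precipitation, mm (example)
print(schedule(range(6), temp, precip))
```

Day 2 is warm enough but wet, so the rule passes over it and picks day 3; unlike a pure PHU trigger, the precipitation rule keeps the operation off rainy days.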

  13. Tourism Methodologies

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different...... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt...

  14. Conditioning of high voltage radio frequency cavities by using fuzzy logic in connection with rule based programming

    CERN Document Server

    Perréard, S

    1993-01-01

Many processes are controlled by experts using some kind of mental model to decide actions and draw conclusions. This model, based on heuristic knowledge, can often be conveniently represented as rules and need not be particularly accurate. This is the case for the problem of conditioning high-voltage radio-frequency cavities: the expert has to decide, by observing some criteria, whether to increase or decrease the voltage, and by how much. A program has been implemented which can be applied to a class of similar problems. The kernel of the program is a small rule base, which is independent of the kind of cavity. To model a specific cavity, we use fuzzy logic, which is implemented as a separate routine called by the rule base; fuzzy logic is used to translate numeric information into symbolic information. The example we chose for applying this kind of technique can also be implemented by sequential programming, and the two versions exist for comparison. However, we believe that this kind of programming can be powerf...

  15. Guidelines for Creating a Rule-Based Knowledge Learning System and Their Application to a Chinese Business Card Layout Analysis

    Institute of Scientific and Technical Information of China (English)

    PAN Wumo; WANG Qingren

    2001-01-01

Rule selection has long been a problem of great challenge that must be solved when developing a rule-based knowledge learning system. Many methods have been proposed to evaluate the eligibility of a single rule based on some criteria. However, in a knowledge learning system there is usually a set of rules. These rules are not independent, but interactive: they tend to affect each other and form a rule system. In such a case, it is no longer reasonable to isolate each rule from the others for evaluation; the best rule according to a certain criterion is not always the best one for the whole system. Furthermore, the real-world data from which people want to create a learning system are often ill-defined and inconsistent, so the completeness and consistency criteria for rule selection are no longer essential. In this paper, some ideas about how to solve the rule-selection problem in a systematic way are proposed. These ideas have been applied in the design of a Chinese business card layout analysis system and achieved a good result on a training data set of 425 images. The implementation of the system and the results are presented in this paper.

  16. Performance Analysis of Extracted Rule-Base Multivariable Type-2 Self-Organizing Fuzzy Logic Controller Applied to Anesthesia

    Science.gov (United States)

    Fan, Shou-Zen; Shieh, Jiann-Shing

    2014-01-01

We compare type-1 and type-2 self-organizing fuzzy logic controllers (SOFLCs) using expert-initialized and pretrained extracted rule-bases applied to automatic control of anaesthesia during surgery. We perform experimental simulations using a non-fixed patient model and signal noise to account for environmental and patient drug interaction uncertainties. The simulations evaluate the performance of the SOFLCs in their ability to control anesthetic delivery rates to maintain desired physiological set points for muscle relaxation and blood pressure during a multistage surgical procedure. The performance of the SOFLCs is evaluated by measuring steady-state errors and control stability, which indicate the accuracy and precision of the control task. Two sets of comparisons based on using expert-derived and extracted rule-bases are implemented as Wilcoxon signed-rank tests. Results indicate that type-2 SOFLCs outperform type-1 SOFLCs in handling the various sources of uncertainty. SOFLCs using the extracted rules are also shown to outperform those using expert-derived rules in terms of improved control stability. PMID:25587533

  17. A Natural-Rule-Based-Connection (NRBC) Method for River Network Extraction from High-Resolution Imagery

    Directory of Open Access Journals (Sweden)

    Chuiqing Zeng

    2015-10-01

Full Text Available This study proposed a natural-rule-based-connection (NRBC) method to connect river segments after water body detection from remotely sensed imagery. A complete river network is important for many hydrological applications; while water body detection methods using remote sensing are well developed, less attention has been paid to connecting discontinuous river segments to form a complete river network. This study designed an automated NRBC method to extract a complete river network by connecting river segments at the polygon level. With the assistance of an image pyramid, neighbouring river segments are connected based on four criteria: gap width (Tg), river direction consistency (Tθ), river width consistency (Tw), and minimum river segment length (Tl). The sensitivity of these four criteria was tested and analyzed, and suitable criterion values were suggested using image scenes from two diverse river cases. The comparison of NRBC with the alternative morphological method demonstrated NRBC's advantage of natural-rule-based selective connection. We also refined a river centerline extraction method and showed how it outperformed three other existing centerline extraction methods on the test sites. The extracted river polygons and centerlines have a multitude of end uses, including rapidly mapping flood extents, monitoring surface water supply, and providing validation data for simulation models required for water quantity, quality and aquatic biota assessments. The code for the NRBC is available on GitHub.
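The four connection criteria named above can be sketched as a single predicate over two candidate segments. The simplified tuple representation and the threshold values are invented for illustration; the published method operates on polygons via an image pyramid, not on these toy tuples.

```python
# Hedged sketch of the NRBC criteria: gap width (Tg), direction
# consistency (Ttheta), width consistency (Tw), minimum length (Tl).
import math

def connectable(seg_a, seg_b, Tg=30.0, Ttheta=30.0, Tw=0.5, Tl=50.0):
    """Each segment: (length, width, direction_deg, end_point, start_point)."""
    len_a, w_a, dir_a, end_a, _ = seg_a
    len_b, w_b, dir_b, _, start_b = seg_b
    gap = math.dist(end_a, start_b)              # distance across the break
    dtheta = abs(dir_a - dir_b) % 360
    dtheta = min(dtheta, 360 - dtheta)           # smallest angle difference
    width_ratio = abs(w_a - w_b) / max(w_a, w_b)  # relative width change
    return (gap <= Tg and dtheta <= Ttheta and
            width_ratio <= Tw and min(len_a, len_b) >= Tl)

a = (120.0, 8.0, 45.0, (100.0, 100.0), (0.0, 0.0))
b = (200.0, 7.0, 50.0, (300.0, 310.0), (115.0, 112.0))
print(connectable(a, b))
```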

  18. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  19. Design of a Semantic Data Warehouse System Using Ontology and Rule-Based Methods to Process Academic Data of Universitas XYZ in Bali

    Directory of Open Access Journals (Sweden)

    Made Pradnyana Ambara

    2016-06-01

Full Text Available Data warehouses in general, commonly known as traditional data warehouses, have several weaknesses that make the quality of the data they produce neither specific nor effective. A semantic data warehouse system is a solution to the problems of the traditional data warehouse, with advantages that include specific data-quality management with a uniform data format to support good OLAP reporting, and more effective information-retrieval performance using natural-language keywords. Modeling the semantic data warehouse system using an ontology method produces a resource description framework schema (RDFS) logic model that is then transformed into a snowflake schema. The required academic reports are produced through Kimball's nine-step method, and semantic search uses a rule-based method. Testing was carried out with two methods: black-box testing and a checklist questionnaire. From the results of this research it can be concluded that the semantic data warehouse system can support academic data processing and produce quality reports for decision making.

  20. A Rule Based Energy Management System of Experimental Battery/Supercapacitor Hybrid Energy Storage System for Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Qiao Zhang

    2016-01-01

Full Text Available In this paper, a simple and efficient rule-based energy management system for a battery and supercapacitor hybrid energy storage system (HESS) used in electric vehicles is presented. The objective of the proposed energy management system is to exploit the supercapacitor characteristics and to increase battery lifetime and system efficiency. The role of the energy management system is to yield the battery reference current, which is subsequently used by the controller of the DC/DC converter. First, a current controller is designed to realize load current distribution between battery and supercapacitor. Then a voltage controller is designed to ensure that the supercapacitor SOC fluctuates within a preset reasonable variation range. Finally, a commercial experimental platform is developed to verify the proposed control strategy. In addition, energy efficiency and cost analyses of the hybrid system are carried out based on the experimental results to explore the most cost-effective tradeoff.
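A rule-based current split of the kind described above can be sketched as follows: the battery covers a capped base load while the supercapacitor absorbs transients, subject to its SOC band. The current limit and SOC thresholds are invented example values, not the paper's parameters.

```python
# Hedged sketch of a rule-based battery/supercapacitor current split.

def split_current(load_current, sc_soc,
                  batt_max=30.0, soc_lo=0.4, soc_hi=0.9):
    """Return (battery_ref, supercap_ref) currents in amperes."""
    if load_current < 0:                 # regenerative braking
        # send the charge to the supercapacitor unless it is nearly full
        return (load_current, 0.0) if sc_soc >= soc_hi else (0.0, load_current)
    batt = min(load_current, batt_max)   # battery covers the base load, capped
    sc = load_current - batt             # supercapacitor covers the peak
    if sc_soc <= soc_lo:                 # SOC too low: battery takes it all
        batt, sc = load_current, 0.0     # (overload accepted in this sketch)
    return batt, sc

print(split_current(45.0, 0.7))   # peak demand, healthy supercapacitor SOC
print(split_current(45.0, 0.3))   # peak demand, depleted supercapacitor
```

The battery reference current returned here is what the DC/DC converter controller would track; keeping it capped and smooth is what extends battery lifetime.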

  1. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    Science.gov (United States)

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

We present a new intuitionistic fuzzy rule-based decision-making system based on intuitionistic fuzzy sets for the process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of all the processes in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
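The scheduler's final step can be sketched as follows: compute a dynamic priority per process from its nice value and burst time, then serve the ready queue in decreasing dp order. The dp formula here is a crisp stand-in for the paper's intuitionistic fuzzy inference, which is not reproduced; the process data are invented.

```python
# Hedged sketch: dynamic-priority ranking of a ready queue.

def dynamic_priority(nice, burst, nice_max=19.0, burst_max=100.0):
    """Higher dp for a lower nice value and a shorter burst time."""
    favour = 1.0 - nice / nice_max    # higher nice (more polite) -> lower dp
    short = 1.0 - burst / burst_max   # shorter bursts -> higher dp
    return 0.5 * favour + 0.5 * short

# (pid, nice, burst) tuples; values are illustrative only.
ready_queue = [("p1", 10, 80), ("p2", 0, 50), ("p3", 19, 5)]
ready_queue.sort(key=lambda p: dynamic_priority(p[1], p[2]), reverse=True)
print([pid for pid, _, _ in ready_queue])   # head of the list runs first
```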

  2. Critical Stage Rule-Based Real Time Dispatch(RTD)System in Highly-Mixed-Products (HMP) FAB

    Institute of Scientific and Technical Information of China (English)

    YU Xiao-hua; XIANG Yu-qun

    2005-01-01

    Improving the utilization and efficiency of critical equipment in semiconductor wafer fabrication facilities is the concern of this paper. A semiconductor manufacturing FAB is one of the most complicated and cost-sensitive environments, and a good dispatching tool makes a big difference in equipment utilization and FAB output as a whole. The equipment considered in this paper is the In-Line DUV Scanner. Many factors impact the utilization and output of this equipment group; in an HMP environment, key issues are reticle changes in this area and idle counts due to load unbalance between machines. Here we introduce a rule-based RTD system aimed at decreasing the number of recipe changes and idle counts among a group of scanner machines in a highly-mixed-products FAB.

  4. Transcranial infrared laser stimulation improves rule-based, but not information-integration, category learning in humans.

    Science.gov (United States)

    Blanco, Nathaniel J; Saucedo, Celeste L; Gonzalez-Lima, F

    2017-03-01

    This is the first randomized, controlled study comparing the cognitive effects of transcranial laser stimulation on category learning tasks. Transcranial infrared laser stimulation is a new non-invasive form of brain stimulation that shows promise for wide-ranging experimental and neuropsychological applications. It involves using infrared laser to enhance cerebral oxygenation and energy metabolism through upregulation of the respiratory enzyme cytochrome oxidase, the primary infrared photon acceptor in cells. Previous research found that transcranial infrared laser stimulation aimed at the prefrontal cortex can improve sustained attention, short-term memory, and executive function. In this study, we directly investigated the influence of transcranial infrared laser stimulation on two neurobiologically dissociable systems of category learning: a prefrontal cortex mediated reflective system that learns categories using explicit rules, and a striatally mediated reflexive learning system that forms gradual stimulus-response associations. Participants (n=118) received either active infrared laser to the lateral prefrontal cortex or sham (placebo) stimulation, and then learned one of two category structures-a rule-based structure optimally learned by the reflective system, or an information-integration structure optimally learned by the reflexive system. We found that prefrontal rule-based learning was substantially improved following transcranial infrared laser stimulation as compared to placebo (treatment X block interaction: F(1, 298)=5.117, p=0.024), while information-integration learning did not show significant group differences (treatment X block interaction: F(1, 288)=1.633, p=0.202). These results highlight the exciting potential of transcranial infrared laser stimulation for cognitive enhancement and provide insight into the neurobiological underpinnings of category learning.

  5. Comparison of model-based and expert-rule based electrocardiographic identification of the culprit artery in patients with acute coronary syndrome.

    Science.gov (United States)

    Kamphuis, Vivian P; Wagner, Galen S; Pahlm, Olle; Man, Sumche; Olson, Charles W; Bacharova, Ljuba; Swenne, Cees A

    2015-01-01

Assessment of the culprit coronary artery in the triage ECG of patients with suspected acute coronary syndrome (ACS) provides relevant a priori knowledge preceding percutaneous coronary intervention (PCI). We compared a model-based automated method (the Olson method) with an expert-rule-based method for culprit artery assessment. In each of 53 patients who were admitted with a working diagnosis of suspected ACS, scheduled for emergent angiography with a view to revascularization as initial treatment, and subsequently found to have an angiographically documented, completely occluded culprit artery, the culprit artery location was assessed in the preceding ECG by both the model-based Olson method and the expert-rule-based method, the latter considering either visual or computer-measured J-point amplitudes. ECG culprit artery estimations were compared with the angiographic culprit lesion locations, and proportions of correct classifications were compared by a Z test at the 5% significance level. The Olson method performed slightly, but not significantly, better when the expert-rule-based method used visual assessment of J-point amplitudes (88.7% versus 81.1% correct; P=0.28). However, the Olson method performed significantly better when the expert-rule-based method used computer-measured J-point amplitudes (88.7% versus 71.7% correct; P<0.05). The automated model-based Olson method thus performed at least at the level of expert cardiologists using a manual rule-based method. Copyright © 2015 Elsevier Inc. All rights reserved.
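The Z test on proportions used above can be sketched as a two-proportion z-test with a pooled standard error. The counts reproduce the 53-patient comparison of the Olson method (88.7% correct, 47/53) versus the expert rules with computer-measured J points (71.7% correct, 38/53); the pooled-SE formulation is a standard choice and an assumption about the paper's exact computation.

```python
# Hedged sketch of a two-proportion z-test at the 5% level.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Return the z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se

z = two_proportion_z(47, 53, 38, 53)   # 47/53 = 88.7%, 38/53 = 71.7%
print(round(z, 2), abs(z) > 1.96)      # 1.96 is the two-sided 5% cutoff
```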

  6. Optimal Rule-Based Power Management for Online, Real-Time Applications in HEVs with Multiple Sources and Objectives: A Review

    Directory of Open Access Journals (Sweden)

    Bedatri Moulik

    2015-08-01

Full Text Available The field of hybrid vehicles has undergone intensive research and development, primarily due to increasing concern over depleting resources and increasing pollution. In order to investigate further options to optimize the performance of hybrid vehicles with regard to different criteria, such as fuel economy and battery aging, a detailed state-of-the-art review is presented in this contribution. Different power management and optimization techniques are discussed, focusing on rule-based power management and multi-objective optimization techniques. The extent to which rule-based power management and optimization can address battery aging issues is investigated, along with implementation in real-time driving scenarios where no pre-defined drive cycle is followed. The goal of this paper is to illustrate the significance and applications of rule-based power management optimization based on previous contributions.
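The deterministic rule-based power management the review surveys can be sketched as a mode selector driven by battery SOC and power demand. The thresholds and mode names are invented example values, not drawn from any particular controller in the review.

```python
# Hedged sketch: pick an HEV operating mode from power demand and SOC.

def select_mode(p_demand_kw, soc, soc_min=0.3, p_ev_max=20.0):
    if p_demand_kw < 0:
        return "regen_braking"      # recover braking energy into the battery
    if soc <= soc_min:
        return "engine_charge"      # charge-sustaining: engine sustains SOC
    if p_demand_kw <= p_ev_max:
        return "ev_only"            # pure electric drive at moderate demand
    return "hybrid_assist"          # engine and motor share high demand

cases = [(-5, 0.6), (10, 0.6), (10, 0.25), (35, 0.6)]
print([select_mode(p, s) for p, s in cases])
```

Such rules are cheap enough to evaluate online in real time, which is why the review singles them out for scenarios with no pre-defined drive cycle.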

  7. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  8. Multifractal methodology

    CERN Document Server

    Salat, Hadrien; Arcaute, Elsa

    2016-01-01

    Various methods have been developed independently to study the multifractality of measures in many different contexts. Although they all convey the same intuitive idea of giving a "dimension" to sets where a quantity scales similarly within a space, they are not necessarily equivalent on a more rigorous level. This review article aims at unifying the multifractal methodology by presenting the multifractal theoretical framework and principal practical methods, namely the moment method, the histogram method, multifractal detrended fluctuation analysis (MDFA) and modulus maxima wavelet transform (MMWT), with a comparative and interpretative eye.

  9. MODEL OF MOBILE TRANSLATOR APPLICATION OF ENGLISH TO BAHASA INDONESIA WITH RULE-BASED AND J2ME

    Directory of Open Access Journals (Sweden)

    Dian Puspita Tedjosurya

    2014-05-01

Full Text Available Along with the development of information technology in recent years, a number of new applications have emerged, especially on mobile phones. Mobile phones are used not only as communication media but also as learning media, for example through translator applications. A translator application can be a tool for learning a language, such as an English to Bahasa Indonesia translator. The purpose of this research is to allow users to translate English to Bahasa Indonesia easily on a mobile phone. The translator application in this research was developed using the Java programming language (specifically J2ME) because it can run on various operating systems and is open source, so it can be easily developed and distributed. In this research, data collection was done through literature study, observation, and review of similar applications. Development of the system used object-oriented analysis and design, described using use case diagrams, class diagrams, sequence diagrams, and activity diagrams. The translation process used a rule-based method. The result of this research is a Java-based translator application which can translate English sentences into Indonesian sentences. The application can be accessed using a mobile phone with an Internet connection, and it has a spell-check feature that is able to detect misspelled words and suggest alternative words close to the input word. The conclusion of this research is that the application can translate sentences in daily conversation quite well, with sentence structure that corresponds to and stays close to the original meaning.
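The rule-based translation step described above can be sketched as dictionary lookup plus a reordering rule (Indonesian places adjectives after the noun). The tiny lexicon and the single rule are illustrative; the application's actual rule set and dictionary are not public.

```python
# Hedged sketch of rule-based English -> Indonesian translation:
# word lookup with POS tags, then an ADJ+NOUN reordering rule.

LEXICON = {
    "the": "",                       # articles have no Indonesian equivalent
    "red": ("merah", "ADJ"),
    "big": ("besar", "ADJ"),
    "car": ("mobil", "N"),
    "book": ("buku", "N"),
}

def translate(sentence):
    tagged = []
    for word in sentence.lower().split():
        entry = LEXICON.get(word)
        if entry == "":              # drop words mapped to nothing
            continue
        tagged.append(entry if entry else (word, "UNK"))
    # Rule: swap ADJ + N pairs so the adjective follows the noun.
    out, i = [], 0
    while i < len(tagged):
        if (i + 1 < len(tagged) and tagged[i][1] == "ADJ"
                and tagged[i + 1][1] == "N"):
            out += [tagged[i + 1][0], tagged[i][0]]
            i += 2
        else:
            out.append(tagged[i][0])
            i += 1
    return " ".join(out)

print(translate("the red car"))
```

"the red car" becomes "mobil merah": the article is dropped and the adjective moves after the noun, which is the kind of structural rule a purely word-by-word substitution would miss.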

  10. The Relevance of a Rules-Based Freshmilk Price Structure Policy in East Java: An Evidence-Based Assessment

    Directory of Open Access Journals (Sweden)

    Bambang Ali Nugroho

    2011-12-01

    Full Text Available At present, Indonesia still lacks a sufficient fresh-milk supply: domestic fresh-milk production covers only about 30% of the national milk-processing industry's needs, and about 70% of the raw material for the milk industry has to be imported, mainly from New Zealand, Australia, the EU and the USA. The following actors are active in the formal dairy supply chain in Indonesia: (1) milk producers, (2) the primary dairy/village cooperatives (KUD), (3) the overall dairy cooperative (GKSI), (4) the milk processors/dairy industry, and (5) …. Based on that situation, and to improve the development of small-scale dairy farming activities in East Java, this paper examines the relevance of a rules-based fresh-milk price structure policy in East Java. East Java dairy supply chain inefficiencies are reflected in a relatively large difference between the farm-gate milk price and consumer prices of milk products. Factors such as the dependency on imported milk powder and the strongly fluctuating world market prices, the lack of protection against world market fluctuations for local milk producers, the scale and structure of dairy farming, and poor raw-milk quality affect the development of the East Java dairy supply chain.

  11. Rule-based Mamdani-type fuzzy modelling of thermal performance of fintube evaporator under frost conditions

    Directory of Open Access Journals (Sweden)

    Ozen Dilek Nur

    2016-01-01

    Full Text Available Frost formation has an insulating effect on the surface of a heat exchanger and thereby deteriorates its total heat transfer. In this study, a fin-tube evaporator is modeled using Rule-based Mamdani-Type Fuzzy (RBMTF) logic, taking into consideration the total heat transfer, air inlet temperatures of 2 °C to 7 °C, and four fluid speed groups (ua1 = 1; 1.44; 1.88 m s-1, ua2 = 2.32; 2.76 m s-1, ua3 = 3.2; 3.64 m s-1, ua4 = 4.08; 4.52; 4.96 m s-1) for the evaporator. In the developed RBMTF system, the output parameter UA was determined using the input parameters Ta and ua. The RBMTF was trained and tested using the MATLAB® fuzzy logic toolbox. R2 (%) for both the training data and the test data was found to be 99.91%. This study shows that the RBMTF model can be reliably used to determine the total heat transfer of a fin-tube evaporator.
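One Mamdani-type inference step of the general kind used in an RBMTF model can be sketched with triangular memberships, min-implication, max-aggregation, and centroid defuzzification. The membership parameters, the two rules, and the UA universe below are invented placeholders, not the paper's calibrated rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def mamdani_ua(t_air, u_air):
    # Rule 1: IF Ta is low  AND ua is low  THEN UA is low
    # Rule 2: IF Ta is high AND ua is high THEN UA is high
    w1 = min(tri(t_air, 2, 2, 5), tri(u_air, 1.0, 1.0, 3.0))
    w2 = min(tri(t_air, 4, 7, 7), tri(u_air, 3.0, 5.0, 5.0))
    # Aggregate the clipped consequents over a discretized UA universe
    # and take the centroid.
    num = den = 0.0
    for i in range(101):
        ua = 10 + 40 * i / 100                     # UA universe: 10..50
        mu = max(min(w1, tri(ua, 10, 10, 30)),
                 min(w2, tri(ua, 30, 50, 50)))
        num += mu * ua
        den += mu
    return num / den if den else None

print(mamdani_ua(3.0, 1.5))   # low Ta, low ua -> low UA (roughly 17)
print(mamdani_ua(6.0, 4.5))   # high Ta, high ua -> high UA (roughly 43)
```

A real RBMTF model would use many more rules and memberships fitted to data; the structure of the computation is the same.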

  12. ALPHABET SIGN LANGUAGE RECOGNITION USING LEAP MOTION TECHNOLOGY AND RULE BASED BACKPROPAGATION-GENETIC ALGORITHM NEURAL NETWORK (RBBPGANN)

    Directory of Open Access Journals (Sweden)

    Wijayanti Nurul Khotimah

    2017-01-01

    Full Text Available Sign language recognition helps people with normal hearing communicate effectively with the deaf and hearing-impaired. A survey conducted by a Multi-Center Study in Southeast Asia placed Indonesia in the top four countries by number of patients with hearing disability (4.6%). The existence of sign language recognition is therefore important. Research in this field has applied many types of neural network to recognize various sign languages, but their performance still needs improvement. This work focuses on the ASL (Alphabet Sign Language) in SIBI (Sign System of Indonesian Language), which uses one hand and 26 gestures. Thirty-four features were extracted using a Leap Motion controller. A new method, the Rule Based-Backpropagation Genetic Algorithm Neural Network (RB-BPGANN), a combination of rules with a Backpropagation Genetic Algorithm Neural Network (BPGANN), was then used to recognize these sign languages. In experiments, the proposed application recognized sign language with up to 93.8% accuracy. It performed very well on a large multiclass instance set and can be a solution to the overfitting problem in neural network algorithms.

  13. A Comparison of the neural correlates that underlie rule-based and information-integration category learning.

    Science.gov (United States)

    Carpenter, Kathryn L; Wills, Andy J; Benattayallah, Abdelmalek; Milton, Fraser

    2016-10-01

    The influential competition between verbal and implicit systems (COVIS) model proposes that category learning is driven by two competing neural systems: an explicit, verbal system and a procedural-based, implicit system. In the current fMRI study, participants learned either a conjunctive, rule-based (RB) category structure that is believed to engage the explicit system, or an information-integration category structure that is thought to preferentially recruit the implicit system. The RB and information-integration category structures were matched for participant error rate, the number of relevant stimulus dimensions, and category separation. Under these conditions, considerable overlap in brain activation, including the prefrontal cortex, basal ganglia, and the hippocampus, was found between the RB and information-integration category structures. Contrary to the predictions of COVIS, the medial temporal lobes and in particular the hippocampus, key regions for explicit memory, were found to be more active in the information-integration condition than in the RB condition. No regions were more activated in RB than in information-integration category learning. The implications of these results for theories of category learning are discussed. Hum Brain Mapp 37:3557-3574, 2016. © 2016 Wiley Periodicals, Inc.

  14. Research on Association Rule Mining Combined with SOM

    Institute of Scientific and Technical Information of China (English)

    景波; 刘莹; 陈耿

    2014-01-01

    In order to quickly discover audit trails in massive data, the FMA data-mining algorithm is used to rapidly extract association rules from the audited data and from an audit-expert experience base. A self-organizing neural network with an improved CLARANS algorithm then partitions the rules extracted from the expert experience base into groups of similar rules. Finally, by comparing the audited unit's set of association rules with the expert groups of similar rules in terms of relative strength, approach rate and value rate, the final set of audit trails is obtained.
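The rule-extraction step can be illustrated with plain support/confidence counting over toy audit transactions; the FMA algorithm and the CLARANS-based grouping of similar rules from the paper are not reproduced, and the example items are invented:

```python
# Sketch of association-rule extraction for pairs of items, keeping rules
# that meet minimum support and minimum confidence.
from itertools import combinations

def mine_rules(transactions, min_support=0.5, min_confidence=0.8):
    n = len(transactions)
    def support(itemset):
        return sum(itemset <= t for t in transactions) / n
    items = set().union(*transactions)
    rules = []
    for a, b in combinations(sorted(items), 2):
        for lhs, rhs in (({a}, {b}), ({b}, {a})):
            sup = support(lhs | rhs)
            if sup >= min_support and support(lhs) > 0:
                conf = sup / support(lhs)
                if conf >= min_confidence:
                    rules.append((lhs, rhs, sup, conf))
    return rules

# Toy audit "transactions": anomaly flags observed together in records.
audit_logs = [{"late_entry", "no_approval"},
              {"late_entry", "no_approval", "round_amount"},
              {"late_entry"},
              {"no_approval", "round_amount"}]
for lhs, rhs, sup, conf in mine_rules(audit_logs):
    print(lhs, "->", rhs, f"support={sup:.2f} confidence={conf:.2f}")
```

With these toy data the only surviving rule is {round_amount} -> {no_approval}; a real system would mine longer itemsets before the rule-grouping stage.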

  15. A Rules-Based Approach for Configuring Chains of Classifiers in Real-Time Stream Mining Systems

    Directory of Open Access Journals (Sweden)

    Brian Foo

    2009-01-01

    Full Text Available Networks of classifiers can offer improved accuracy and scalability over single classifiers by utilizing distributed processing resources and analytics. However, they also pose a unique combination of challenges. First, classifiers may be located across different sites that are willing to cooperate to provide services, but are unwilling to reveal proprietary information about their analytics, or are unable to exchange their analytics due to the high transmission overheads involved. Furthermore, processing of voluminous stream data across sites often requires load shedding approaches, which can lead to suboptimal classification performance. Finally, real stream mining systems often exhibit dynamic behavior and thus necessitate frequent reconfiguration of classifier elements to ensure acceptable end-to-end performance and delay under resource constraints. Under such informational constraints, resource constraints, and unpredictable dynamics, utilizing a single, fixed algorithm for reconfiguring classifiers can often lead to poor performance. In this paper, we propose a new optimization framework aimed at developing rules for choosing algorithms to reconfigure the classifier system under such conditions. We provide an adaptive, Markov model-based solution for learning the optimal rule when stream dynamics are initially unknown. Furthermore, we discuss how rules can be decomposed across multiple sites and propose a method for evolving new rules from a set of existing rules. Simulation results are presented for a speech classification system to highlight the advantages of using the rules-based framework to cope with stream dynamics.
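The idea of rules that choose a reconfiguration algorithm from the observed stream state can be sketched as a lookup table; the states and algorithm names below are invented, and the paper learns such rules adaptively with a Markov model rather than fixing them by hand:

```python
# Sketch: a rule maps (load level, accuracy trend) to a reconfiguration
# algorithm. In the paper these rules are learned online; here the table
# is hand-written purely for illustration.
RULE_TABLE = {
    ("low",  "stable"):    "keep_configuration",
    ("low",  "degrading"): "retrain_local_classifier",
    ("high", "stable"):    "shed_load_uniformly",
    ("high", "degrading"): "reorder_classifier_chain",
}

def choose_algorithm(load, accuracy_trend):
    return RULE_TABLE[(load, accuracy_trend)]

print(choose_algorithm("high", "degrading"))  # -> reorder_classifier_chain
```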

  16. Application of a rule-based model to estimate mercury exchange for three background biomes in the continental United States.

    Science.gov (United States)

    Hartman, Jelena S; Weisberg, Peter J; Pillai, Rekha; Ericksen, Jody A; Kuiken, Todd; Lindberg, Steve E; Zhang, Hong; Rytuba, James J; Gustin, Mae S

    2009-07-01

    Ecosystems that have low mercury (Hg) concentrations (i.e., not enriched or impacted by geologic or anthropogenic processes) cover most of the terrestrial surface area of the Earth, yet their role as a net source or sink for atmospheric Hg is uncertain. Here we use empirical data to develop a rule-based model implemented within a geographic information system framework to estimate the spatial and temporal patterns of Hg flux for semiarid deserts, grasslands, and deciduous forests representing 45% of the continental United States. This exercise provides an indication of whether these ecosystems are a net source or sink for atmospheric Hg as well as a basis for recommendation of data to collect in future field sampling campaigns. Results indicated that soil alone was a small net source of atmospheric Hg and that emitted Hg could be accounted for based on Hg input by wet deposition. When foliar assimilation and wet deposition are added to the area estimate of soil Hg flux, these biomes are a sink for atmospheric Hg.
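The bookkeeping behind the net source/sink conclusion can be sketched as a per-biome balance aggregated over area; all areas and flux coefficients below are placeholders, not the paper's empirical values:

```python
# Sketch of an area-weighted net Hg balance (arbitrary illustrative units).
def net_hg_flux(biomes):
    """Sum (soil emission - wet deposition - foliar uptake) * area.
    Positive -> net source to the atmosphere; negative -> net sink."""
    total = 0.0
    for b in biomes:
        net_per_km2 = b["soil_emission"] - b["wet_deposition"] - b["foliar_uptake"]
        total += net_per_km2 * b["area_km2"]
    return total

biomes = [
    {"name": "semiarid desert",  "area_km2": 1.0e6,
     "soil_emission": 1.5, "wet_deposition": 1.0, "foliar_uptake": 0.2},
    {"name": "grassland",        "area_km2": 1.2e6,
     "soil_emission": 1.0, "wet_deposition": 1.1, "foliar_uptake": 0.4},
    {"name": "deciduous forest", "area_km2": 0.8e6,
     "soil_emission": 0.8, "wet_deposition": 1.2, "foliar_uptake": 0.9},
]
flux = net_hg_flux(biomes)
print("net sink" if flux < 0 else "net source")
```

With soil emission alone each biome is a source; once wet deposition and foliar uptake enter the balance the total turns negative, mirroring the qualitative conclusion of the abstract.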

  17. The long and the short of it: rule-based relative length discrimination in carrion crows, Corvus corone.

    Science.gov (United States)

    Moll, Felix W; Nieder, Andreas

    2014-09-01

    Birds and other nonhuman animals can choose the larger of two discrete or continuous quantities. However, whether birds possess the conceptual grasp and cognitive control to flexibly switch between relative more-or-less-than judgments remains elusive. We therefore tested carrion crows in a rule-based line-length discrimination task to flexibly select lines presented on a touchscreen according to their relative length. In the first experiment, the crows needed to discriminate a shorter from a longer line, and vice versa. In the second experiment, the crows were required to choose a medium-long line among three lines of different length (intermediate-size task). The crows switched effortlessly between "longer than/shorter than" rules, showing no signs of trial history affecting switching performance. They reliably chose the relatively longer or shorter line length, thus demonstrating a concept of greater than/less than with a continuous magnitude. However, both crows failed to discriminate a line of 'medium' length embedded in longer and shorter lines. These results indicate that relational discriminations impose different cognitive demands. While a greater than/less than concept requires only one relational comparison (with the respectively greater or smaller magnitude), the discrimination of a 'medium' magnitude demands relating two or more comparisons, which might overburden crows, and perhaps animals in general. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Predicting a Containership's Arrival Punctuality in Liner Operations by Using a Fuzzy Rule-Based Bayesian Network (FRBBN)

    Directory of Open Access Journals (Sweden)

    Nurul Haqimin Mohd Salleh

    2017-07-01

    Full Text Available One of the biggest concerns in liner operations is the punctuality of containerships. Managing the time factor has become a crucial issue in today's liner shipping operations. A 2015 statistic showed that containerships reached an overall on-time performance of only 73%. Vessel punctuality is affected by many factors, such as port and vessel conditions and the knock-on effects of delays. As a result, this paper develops a model for analyzing and predicting the arrival punctuality of a liner vessel at ports of call under uncertain environments by using a hybrid decision-making technique, the Fuzzy Rule-Based Bayesian Network (FRBBN). To check the practicability of the model, two container vessels were tested using the proposed model. The results show that the differences between the predicted values and the real arrival times are only 4.2% and 6.6%, which can be considered reasonable. This model is capable of helping liner shipping operators (LSOs) predict the arrival punctuality of their vessels at a particular port of call.

  19. Category Number Impacts Rule-Based "and" Information-Integration Category Learning: A Reassessment of Evidence for Dissociable Category-Learning Systems

    Science.gov (United States)

    Stanton, Roger D.; Nosofsky, Robert M.

    2013-01-01

    Researchers have proposed that an explicit reasoning system is responsible for learning rule-based category structures and that a separate implicit, procedural-learning system is responsible for learning information-integration category structures. As evidence for this multiple-system hypothesis, researchers report a dissociation based on…

  20. Accounting standards and earnings management : The role of rules-based and principles-based accounting standards and incentives on accounting and transaction decisions

    NARCIS (Netherlands)

    Beest, van F.

    2012-01-01

    This book examines the effect that rules-based and principles-based accounting standards have on the level and nature of earnings management decisions. A cherry picking experiment is conducted to test the hypothesis that a substitution effect is expected from accounting decisions to transaction decisions.

  2. NOVEL RULE-BASED STATIC AND DYNAMIC FEATURE EXTRACTION FROM FIGURE COPYING TASKS FOR THE DETECTION OF VISUO-SPATIAL NEGLECT

    NARCIS (Netherlands)

    Guest, R.M.; Fairhurst, M.C.; Potter, J.M.; Donelly, N.

    2004-01-01

    A series of static rule-based assessment criteria and dynamic constructional features are defined and used to analyse the hand-drawn responses from a geometric figure copying task. Assessment subjectivity is removed by the algorithmic definition of analysis criteria and test diagnostic sensitivity t

  3. Research Methodology

    CERN Document Server

    Rajasekar, S; Philomination, P

    2006-01-01

    In this manuscript various components of research are listed and briefly discussed. The topics considered in this write-up cover part of the research methodology paper of the Master of Philosophy (M.Phil.) and Doctor of Philosophy (Ph.D.) courses. The manuscript is intended for students and research scholars of science subjects such as mathematics, physics, chemistry, statistics, biology and computer science. Various stages of research are discussed in detail. Special care has been taken to motivate young researchers to take up challenging problems. Ten assignment works are given. For the benefit of young researchers, a short interview with three eminent scientists is included at the end of the manuscript.

  4. Methodological advances

    Directory of Open Access Journals (Sweden)

    Lebreton, J.-D.

    2004-06-01

    Full Text Available The study of population dynamics has long depended on methodological progress. Among many striking examples, continuous time models for populations structured in age (Sharpe & Lotka, 1911) were made possible by progress in the mathematics of integral equations. Therefore the relationship between population ecology and mathematical and statistical modelling in the broad sense raises a challenge in interdisciplinary research. After the impetus given in particular by Seber (1982), the regular biennial EURING conferences became a major vehicle to achieve this goal. It is thus not surprising that EURING 2003 included a session entitled “Methodological advances”. Even if at risk of heterogeneity in the topics covered and of overlap with other sessions, such a session was a logical way of ensuring that recent and exciting new developments were made available for discussion, further development by biometricians and use by population biologists. The topics covered included several to which full sessions were devoted at EURING 2000 (Anderson, 2001) such as: individual covariates, Bayesian methods, and multi-state models. Some other topics (heterogeneity models, exploited populations and integrated modelling) had been addressed by contributed talks or posters. Their presence among “methodological advances”, as well as in other sessions of EURING 2003, was intended as a response to their rapid development and potential relevance to biological questions. We briefly review all talks here, including those not published in the proceedings. In the plenary talk, Pradel et al. (in prep.) developed GOF tests for multi-state models. Until recently, the only goodness-of-fit procedures for multistate models were ad hoc, and non optimal, involving use of standard tests for single state models (Lebreton & Pradel, 2002). Pradel et al. (2003) proposed a general approach based in particular on mixtures of multinomial distributions. Pradel et al. (in prep.) showed

  5. RISMA: A Rule-based Interval State Machine Algorithm for Alerts Generation, Performance Analysis and Monitoring Real-Time Data Processing

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2013-04-01

    The monitoring of real-time systems is a challenging and complicated process. There is thus a continuous need to improve the monitoring process through new intelligent techniques and algorithms for detecting exceptions and anomalous behaviours and for generating the necessary alerts during the workflow monitoring of such systems. Interval-based or period-based theories have been discussed, analysed, and used by many researchers in Artificial Intelligence (AI), philosophy, and linguistics. As explained by Allen, there are 13 relations between any two intervals. There have also been many studies of interval-based temporal reasoning and logics over the past decades. Interval-based theorems can be used for monitoring real-time interval-based data processing. However, increasing the number of processed intervals makes the implementation of such theorems a complex and time-consuming process, as the relationships between the intervals increase exponentially. To overcome this problem, this paper presents a Rule-based Interval State Machine Algorithm (RISMA) for processing, monitoring, and analysing the behaviour of interval-based data received from real-time sensors. The proposed intelligent algorithm uses the Interval State Machine (ISM) approach to model any number of interval-based data into well-defined states, as well as to infer them. An interval-based state transition model and methodology are presented to identify the relationships between the different states of the proposed algorithm. Using such a model, the unlimited number of relationships between similarly large numbers of intervals can be reduced to only 18 direct relationships using the proposed well-defined states. To test the proposed algorithm, the necessary inference rules and code have been designed and applied to the continuous data received in near real-time from the stations of the International Monitoring System (IMS) by the International Data Centre (IDC) of the Preparatory
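The 13 interval relations attributed to Allen in the abstract can be enumerated with a small classifier over half-open interval endpoints; this sketch covers only that pairwise-relation step, not RISMA's reduction to 18 direct relationships between states:

```python
# Sketch: classify the Allen relation between two intervals (start, end).
def allen_relation(a, b):
    (s1, e1), (s2, e2) = a, b
    if e1 < s2:  return "before"
    if e2 < s1:  return "after"
    if e1 == s2: return "meets"
    if e2 == s1: return "met-by"
    if (s1, e1) == (s2, e2): return "equals"
    if s1 == s2: return "starts" if e1 < e2 else "started-by"
    if e1 == e2: return "finishes" if s1 > s2 else "finished-by"
    if s2 < s1 and e1 < e2: return "during"
    if s1 < s2 and e2 < e1: return "contains"
    return "overlaps" if s1 < s2 else "overlapped-by"

print(allen_relation((0, 2), (2, 5)))   # -> meets
print(allen_relation((1, 4), (0, 9)))   # -> during
print(allen_relation((0, 3), (2, 6)))   # -> overlaps
```

The exponential blow-up the abstract refers to comes from composing such pairwise relations over many intervals, which is what the well-defined states of RISMA are designed to avoid.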

  6. Estimation of Tree Cover in an Agricultural Parkland of Senegal Using Rule-Based Regression Tree Modeling

    Directory of Open Access Journals (Sweden)

    Stefanie M. Herrmann

    2013-10-01

    Full Text Available Field trees are an integral part of the farmed parkland landscape in West Africa and provide multiple benefits to the local environment and livelihoods. While field trees have received increasing interest in the context of strengthening resilience to climate variability and change, the actual extent of farmed parkland and the spatial patterns of tree cover are largely unknown. We used the rule-based predictive modeling tool Cubist® to estimate field tree cover in the west-central agricultural region of Senegal. A collection of rules and associated multiple linear regression models was constructed from (1) a reference dataset of percent tree cover derived from very high spatial resolution data (2 m Orbview) as the dependent variable, and (2) ten years of 10-day 250 m Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) composites and derived phenological metrics as independent variables. Correlation coefficients between modeled and reference percent tree cover of 0.88 and 0.77 were achieved for training and validation data respectively, with absolute mean errors of 1.07 and 1.03 percent tree cover. The resulting map shows a west-east gradient from high tree cover in the peri-urban areas of horticulture and arboriculture to low tree cover in the more sparsely populated eastern part of the study area. A comparison of current (2000s) tree cover along this gradient with historic cover as seen on Corona images reveals dynamics of change but also areas of remarkable stability of field tree cover since 1968. The proposed modeling approach can help to identify locations of high and low tree cover in dryland environments and guide ground studies and management interventions aimed at promoting the integration of field trees in agricultural systems.
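Cubist's core idea, a rule set in which each rule carries its own multiple linear regression model, can be sketched as follows; the NDVI threshold and the coefficients are invented for illustration and are not fitted to the study's data:

```python
# Sketch of a Cubist-style rule set: condition on the predictors paired
# with a per-rule linear model (here a single predictor for brevity).
RULES = [
    # (condition on features, (intercept, coefficient on mean NDVI))
    (lambda f: f["ndvi_mean"] >= 0.45, (5.0, 60.0)),   # greener, peri-urban west
    (lambda f: f["ndvi_mean"] <  0.45, (1.0, 20.0)),   # sparser east
]

def predict_tree_cover(features):
    """Percent tree cover from the first matching rule's linear model."""
    for condition, (b0, b1) in RULES:
        if condition(features):
            return b0 + b1 * features["ndvi_mean"]
    raise ValueError("no rule matched")

print(predict_tree_cover({"ndvi_mean": 0.60}))  # higher cover in the west
print(predict_tree_cover({"ndvi_mean": 0.30}))  # lower cover in the east
```

The real model conditions on many phenological metrics and averages overlapping rules; the split-then-regress structure is what distinguishes Cubist from a plain regression tree with constant leaves.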

  7. A noninvasive method for coronary artery diseases diagnosis using a clinically-interpretable fuzzy rule-based system

    Directory of Open Access Journals (Sweden)

    Hamid Reza Marateb

    2015-01-01

    Full Text Available Background: Coronary heart disease/coronary artery disease (CHD/CAD), the most common form of cardiovascular disease (CVD), is a major cause of death and disability in developing and developed countries. CAD risk factors can be detected by physicians to prevent CAD occurrence in the near future. Invasive coronary angiography, the current diagnostic method, is costly and associated with morbidity and mortality in CAD patients. The aim of this study was to design a computer-based noninvasive CAD diagnosis system with clinically interpretable rules. Materials and Methods: In this study, the Cleveland CAD dataset from the University of California, Irvine (UCI) was used. The interval-scale variables were discretized, with cut points taken from the literature. A fuzzy rule-based system was then formulated based on a neuro-fuzzy classifier (NFC) whose learning procedure was sped up by the scaled conjugate gradient algorithm. Two feature selection (FS) methods, multiple logistic regression (MLR) and sequential FS, were used to reduce the required attributes. The performance of the NFC (without/with FS) was then assessed in a hold-out validation framework. Further cross-validation was performed on the best classifier. Results: In this dataset, 16 complete attributes along with the binary CHD diagnosis (gold standard) for 272 subjects (68% male) were analyzed. MLR + NFC showed the best performance. Its overall sensitivity, specificity, accuracy, type I error (α) and statistical power were 79%, 89%, 84%, 0.1 and 79%, respectively. The selected features were "age and ST/heart rate slope categories," "exercise-induced angina status," fluoroscopy, and thallium-201 stress scintigraphy results. Conclusion: The proposed method showed "substantial agreement" with the gold standard. This algorithm is thus a promising tool for screening CAD patients.
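The discretize-then-classify pipeline described in Materials and Methods can be caricatured with a few hand-written rules; the cut points and the voting rules below are illustrative stand-ins for the paper's fitted neuro-fuzzy classifier, not clinical guidance:

```python
# Sketch: bin interval-scale inputs at cut points, then let a small rule
# base vote CAD / no CAD. All thresholds are invented for illustration.
def discretize(patient):
    return {
        "age_high":     patient["age"] >= 55,
        "angina":       patient["exercise_angina"],
        "vessels":      patient["fluoroscopy_vessels"] >= 1,
        "thallium_abn": patient["thallium_abnormal"],
    }

def diagnose(patient):
    f = discretize(patient)
    # Each satisfied rule adds one vote for CAD.
    votes = sum([f["age_high"] and f["angina"],
                 f["vessels"],
                 f["thallium_abn"]])
    return "CAD" if votes >= 2 else "no CAD"

print(diagnose({"age": 62, "exercise_angina": True,
                "fluoroscopy_vessels": 2, "thallium_abnormal": False}))  # -> CAD
```

The actual system replaces the crisp bins and vote with fuzzy memberships and learned rule weights, which is what makes it both accurate and clinically interpretable.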

  8. Development & optimization of a rule-based energy management strategy for fuel economy improvement in hybrid electric vehicles

    Science.gov (United States)

    Asfoor, Mostafa

    The gradual decline of oil reserves and the increasing demand for energy over the past decades have led automotive manufacturers to seek alternative solutions to reduce the dependency on fossil-based fuels for transportation. A viable technology that enables significant improvements in the overall energy conversion efficiency is the hybridization of conventional vehicle drive systems. This dissertation builds on prior hybrid powertrain development at the University of Idaho. Advanced vehicle models of a passenger car with a conventional powertrain and three different hybrid powertrain layouts were created using GT-Suite. These powertrain models were validated against a variety of standard driving cycles. The overall fuel economy, energy consumption, and losses were monitored, and a comprehensive energy analysis was performed to compare energy sources and sinks. The GT-Suite model was then used to predict the performance of the Formula Hybrid SAE vehicle. Inputs to this model were a numerically predicted engine performance map, an electric motor torque curve, vehicle geometry, and road load parameters derived from a roll-down test. In this case study, the vehicle had a supervisory controller that followed a rule-based energy management strategy to ensure a proper power split during hybrid mode operation. The supervisory controller parameters were optimized using a discrete grid optimization method that minimized the total fuel consumed during a specific urban driving cycle with an average speed of approximately 30 [mph]. More than a 15% increase in fuel economy was achieved by adding supervisory control and managing the power split. The vehicle configuration without the supervisory controller displayed a fuel economy of 25 [mpg]; with the supervisory controller this rose to 29 [mpg]. Wider applications of this research include hybrid vehicle controller designs that can extend the range and survivability of military combat platforms.
Furthermore, the
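A rule-based power-split supervisor of the general kind described can be sketched as a handful of if-then rules; the thresholds below (SOC limits, engine-only power band) are invented tuning values, not the optimized parameters from the dissertation:

```python
# Sketch of a rule-based power-split supervisor for a parallel hybrid.
def power_split(p_demand_kw, soc):
    """Return (engine_kw, motor_kw); negative motor power = charging."""
    if p_demand_kw <= 0:                       # braking: regenerate
        return 0.0, p_demand_kw
    if soc < 0.3:                              # protect the battery
        return p_demand_kw + 5.0, -5.0         # engine also recharges
    if p_demand_kw < 10.0 and soc > 0.5:       # low load: electric only
        return 0.0, p_demand_kw
    if p_demand_kw > 40.0:                     # high load: motor assist
        return 40.0, p_demand_kw - 40.0
    return p_demand_kw, 0.0                    # mid load: engine only

print(power_split(8.0, 0.8))    # -> (0.0, 8.0)   electric launch
print(power_split(55.0, 0.6))   # -> (40.0, 15.0) motor assist
print(power_split(-12.0, 0.6))  # -> (0.0, -12.0) regenerative braking
```

The grid optimization in the dissertation amounts to sweeping thresholds like these over a driving cycle and keeping the combination with the lowest total fuel consumption.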

  9. New Rule-Based Algorithm for Real-Time Detecting Sleep Apnea and Hypopnea Events Using a Nasal Pressure Signal.

    Science.gov (United States)

    Lee, Hyoki; Park, Jonguk; Kim, Hojoong; Lee, Kyoung-Joung

    2016-12-01

    We developed a rule-based algorithm for automatic real-time detection of sleep apnea and hypopnea events using a nasal pressure signal. Our basic premise was that the performance of our new algorithm using the nasal pressure signal would be comparable to that using other sensors, as well as to manual annotation labeled by a technician in a polysomnography (PSG) study. We investigated fifty patients with sleep apnea-hypopnea syndrome (age: 56.8 ± 10.5 years, apnea-hypopnea index (AHI): 36.2 ± 18.1/h) during full-night PSG recordings at the sleep center. The algorithm comprised pre-processing with a median filter, amplitude computation and apnea-hypopnea detection parts. We evaluated the performance of the algorithm using a confusion matrix for each event and statistical analyses for the AHI. Our evaluation achieved good performance, with a sensitivity of 86.4% and a positive predictive value of 84.5% for detection of apnea and hypopnea regardless of AHI severity. Our results indicated a high correlation with the manually labeled apnea-hypopnea events during PSG, with a correlation coefficient of r = 0.94 and a mean difference of -2.9 ± 11.6 events per hour. The proposed new algorithm could provide significant clinical and computational insights for designing a PSG analysis system and a continuous positive airway pressure (CPAP) device for screening sleep quality in patients with sleep apnea-hypopnea syndrome.
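The three-stage pipeline (median-filter pre-processing, amplitude computation, detection) can be sketched on a synthetic signal; the window lengths, the 10% amplitude threshold, and the toy "breathing" waveform below are illustrative, not the paper's clinical parameters:

```python
# Sketch: median-filter the nasal pressure signal, compute a moving
# amplitude, and flag windows whose amplitude collapses below 10% of the
# overall range (a higher hypopnea threshold would be analogous).
def median_filter(x, k=3):
    """Sliding-window median (window truncated at the edges)."""
    h = k // 2
    out = []
    for i in range(len(x)):
        w = sorted(x[max(0, i - h):i + h + 1])
        out.append(w[len(w) // 2])
    return out

def detect_apnea(signal, win=6, threshold=0.1):
    """Return start indices of windows whose amplitude collapses."""
    s = median_filter(signal)
    baseline = max(s) - min(s)
    return [i for i in range(len(s) - win + 1)
            if max(s[i:i + win]) - min(s[i:i + win]) < threshold * baseline]

base = [0, 3, 5, 3, 0, -3, -5, -3]              # one synthetic breath
sig = base * 2 + [0] * 8 + base * 2             # flat stretch imitates apnea
print(detect_apnea(sig))                         # -> [16, 17, 18, 19]
```

The flagged window starts all fall inside the flat stretch; a real detector would additionally enforce minimum event durations and merge adjacent windows into events.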

  10. Simulation Model of the Feed Support System for the FAST Down-Scaled Model at Miyun Observation Station

    Institute of Scientific and Technical Information of China (English)

    丁钰; 朱丽春; 景奉水

    2013-01-01

    A simulation model of the feed support system for the 1:15 down-scaled model of the Five-hundred-meter Aperture Spherical Telescope (FAST) at Miyun Observation Station is proposed. The structure and control scheme of the feed support control system are analysed, and the simulation model is built in MATLAB/Simulink. The simulation model consists of three main modules: the spring-and-damper module, the A-B rotator, and the Stewart platform, all controlled by PID controllers. Both the Miyun model and the simulation model run the same astronomical observation track. Analysis of the output data of the two models shows that the control performance of the simulation model is similar to that of the actual Miyun model and satisfies the pointing accuracy requirement of the telescope.
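The per-axis PID loops mentioned in the abstract can be illustrated with a single-axis toy example; the gains, the damped double-integrator plant, and the setpoint below are invented for illustration and have no connection to the actual FAST model parameters:

```python
# Sketch of one PID position loop driving a toy damped plant.
def simulate_pid(setpoint, kp=2.0, ki=0.5, kd=0.1, dt=0.01, steps=3000):
    pos, vel = 0.0, 0.0
    integral, prev_err = 0.0, setpoint
    for _ in range(steps):
        err = setpoint - pos
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integral + kd * deriv
        acc = u - 1.0 * vel          # toy double integrator with damping
        vel += acc * dt
        pos += vel * dt
    return pos

print(simulate_pid(1.0))  # settles near the 1.0 setpoint
```

In the real system one such loop (with feedforward and coupling terms) runs per actuated degree of freedom of the A-B rotator and the Stewart platform.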

  11. Prediction of ground water quality index to assess suitability for drinking purposes using fuzzy rule-based approach

    Science.gov (United States)

    Gorai, A. K.; Hasni, S. A.; Iqbal, Jawed

    2016-11-01

    Groundwater is the most important natural resource for drinking water to many people around the world, especially in rural areas where the supply of treated water is not available. Drinking water resources cannot be optimally used and sustained unless the quality of water is properly assessed. To this end, an attempt has been made to develop a suitable methodology for the assessment of drinking water quality on the basis of 11 physico-chemical parameters. The present study aims to select the fuzzy aggregation approach for estimation of the water quality index of a sample to check the suitability for drinking purposes. Based on expert's opinion and author's judgement, 11 water quality (pollutant) variables (Alkalinity, Dissolved Solids (DS), Hardness, pH, Ca, Mg, Fe, Fluoride, As, Sulphate, Nitrates) are selected for the quality assessment. The output results of proposed methodology are compared with the output obtained from widely used deterministic method (weighted arithmetic mean aggregation) for the suitability of the developed methodology.

  12. Knowledge-based systems as decision support tools in an ecosystem approach to fisheries: Comparing a fuzzy-logic and rule-based approach

    DEFF Research Database (Denmark)

    Jarre, Astrid; Paterson, B.; Moloney, C.L.

    2008-01-01

    In an ecosystem approach to fisheries (EAF), management must draw on information of widely different types, and information addressing various scales. Knowledge-based systems assist in the decision-making process by summarising this information in a logical, transparent and reproducible way. Both...... decision support tools in our evaluation of the two approaches. With respect to the model objectives, no method clearly outperformed the other. The advantages of numerically processing continuous variables, and interpreting the final output, as in fuzzy-logic models, can be weighed up against...... the advantages of using a few, qualitative, easy-to-understand categories as in rule-based models. The natural language used in rule-based implementations is easily understood by, and communicated among, users of these systems. Users unfamiliar with fuzzy-set theory must "trust" the logic of the model. Graphical

  13. Analysis of Aircraft Control Performance using a Fuzzy Rule Base Representation of the Cooper-Harper Aircraft Handling Quality Rating

    Science.gov (United States)

    Tseng, Chris; Gupta, Pramod; Schumann, Johann

    2006-01-01

    The Cooper-Harper rating of Aircraft Handling Qualities has been adopted as a standard for measuring the performance of aircraft since it was introduced in 1966. Aircraft performance, the ability to control the aircraft, and the degree of pilot compensation needed are the three major factors used in deciding the aircraft handling qualities in the Cooper-Harper rating. We formulate the Cooper-Harper rating scheme as a fuzzy rule-based system and use it to analyze the effectiveness of the aircraft controller. The automatic estimate of the system-level handling quality provides valuable up-to-date information for diagnostics and vehicle health management. Analyzing the performance of a controller requires a set of concise design requirements and performance criteria. In the case of control systems for a piloted aircraft, generally applicable quantitative design criteria are difficult to obtain. The reason for this is that the ultimate evaluation of a human-operated control system is necessarily subjective and, with aircraft, the pilot evaluates the aircraft in different ways depending on the type of the aircraft and the phase of flight. In most aerospace applications (e.g., for flight control systems), performance assessment is carried out in terms of handling qualities. Handling qualities may be defined as those dynamic and static properties of a vehicle that permit the pilot to fully exploit its performance in a variety of missions and roles. Traditionally, handling quality is measured using the Cooper-Harper rating and done subjectively by the human pilot. In this work, we have formulated the rules of the Cooper-Harper rating scheme as fuzzy rules with performance, control, and compensation as the antecedents, and pilot rating as the consequent.
    Appropriate direct measurements on the controller are related to the fuzzy Cooper-Harper rating system: a stability measurement such as the rate of change of the cost function can be used as an indicator of whether the aircraft is under
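The fuzzy rule-based formulation this record describes can be illustrated with a minimal sketch. The triangular membership functions, the four rules, and the weighted-average defuzzification below are invented for illustration; they are not the authors' actual Cooper-Harper rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def rate(performance, compensation):
    """Map performance and pilot-compensation scores in [0, 1] to a 1-10 rating."""
    # Antecedent memberships (illustrative breakpoints).
    perf = {"good": tri(performance, 0.5, 1.0, 1.5),
            "fair": tri(performance, 0.0, 0.5, 1.0),
            "poor": tri(performance, -0.5, 0.0, 0.5)}
    comp = {"low":  tri(compensation, -0.5, 0.0, 0.5),
            "high": tri(compensation, 0.5, 1.0, 1.5)}
    # Rules: antecedent strength (min-conjunction) paired with a rating level.
    rules = [(min(perf["good"], comp["low"]),  2),
             (min(perf["fair"], comp["low"]),  4),
             (min(perf["fair"], comp["high"]), 6),
             (min(perf["poor"], comp["high"]), 9)]
    den = sum(w for w, _ in rules)
    if den == 0:
        return 10.0  # no rule fires: worst rating
    return sum(w * r for w, r in rules) / den  # weighted-average defuzzification

print(rate(1.0, 0.0))  # good performance with little compensation -> 2.0
```

A real system would also include the controllability antecedent and calibrate the memberships against recorded pilot ratings.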

  14. Towards computerizing intensive care sedation guidelines: design of a rule-based architecture for automated execution of clinical guidelines

    Directory of Open Access Journals (Sweden)

    Kerckhove Wannes

    2010-01-01

    Full Text Available Abstract Background Computerized ICUs rely on software services to convey the medical condition of their patients as well as assisting the staff in taking treatment decisions. Such services are useful for following clinical guidelines quickly and accurately. However, the development of services is often time-consuming and error-prone. Consequently, many care-related activities are still conducted based on manually constructed guidelines. These are often ambiguous, which leads to unnecessary variations in treatments and costs. The goal of this paper is to present a semi-automatic verification and translation framework capable of turning manually constructed diagrams into ready-to-use programs. This framework combines the strengths of the manual and service-oriented approaches while decreasing their disadvantages. The aim is to close the gap in communication between the IT and the medical domain. This leads to a less time-consuming and error-prone development phase and a shorter clinical evaluation phase. Methods A framework is proposed that semi-automatically translates a clinical guideline, expressed as an XML-based flow chart, into a Drools Rule Flow by employing semantic technologies such as ontologies and SWRL. An overview of the architecture is given and all the technology choices are thoroughly motivated. Finally, it is shown how this framework can be integrated into a service-oriented architecture (SOA. Results The applicability of the Drools Rule language to express clinical guidelines is evaluated by translating an example guideline, namely the sedation protocol used for the anaesthetization of patients, to a Drools Rule Flow and executing and deploying this Rule-based application as a part of a SOA. The results show that the performance of Drools is comparable to other technologies such as Web Services and increases with the number of decision nodes present in the Rule Flow. Most delays are introduced by loading the Rule Flows

  15. Development of Near Optimal Rule-Based Control for Plug-In Hybrid Electric Vehicles Taking into Account Drivetrain Component Losses

    OpenAIRE

    Hanho Son; Hyunsoo Kim

    2016-01-01

    A near-optimal rule-based mode control (RBC) strategy was proposed for a target plug-in hybrid electric vehicle (PHEV) taking into account the drivetrain losses. Individual loss models were developed for drivetrain components including the gears, planetary gear (PG), bearings, and oil pump, based on experimental data and mathematical governing equations. Also, a loss model for the power electronic system was constructed, including loss from the motor-generator while rotating in the unloaded s...

  16. Identification of organization name variants in large databases using rule-based scoring and clustering: With a case study on the web of science database

    OpenAIRE

    Caron, Emiel; Daniels, Hennie

    2016-01-01

    This research describes a general method to automatically clean organizational and business name variants within large databases, such as patent databases, bibliographic databases, databases in business information systems, or any other database containing organizational name variants. The method clusters name variants of organizations based on similarities of their associated meta-data, such as postal code and email domain data. The method is divided into a rule-base...
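The rule-based scoring and clustering idea can be sketched as follows. The scoring rules, weights, threshold, and sample records are illustrative assumptions, not Caron and Daniels' actual scheme:

```python
import itertools

def score(a, b):
    """Accumulate rule-based evidence that two records name the same organization."""
    s = 0
    if a["postal_code"] == b["postal_code"]:
        s += 2
    if a["email_domain"] == b["email_domain"]:
        s += 3
    if a["name"].split()[0].lower() == b["name"].split()[0].lower():
        s += 1  # crude lexical rule: shared leading token
    return s

def cluster(records, threshold=3):
    """Union-find clustering: merge record pairs whose score reaches the threshold."""
    parent = list(range(len(records)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i, j in itertools.combinations(range(len(records)), 2):
        if score(records[i], records[j]) >= threshold:
            parent[find(i)] = find(j)
    groups = {}
    for i, rec in enumerate(records):
        groups.setdefault(find(i), []).append(rec["name"])
    return list(groups.values())

recs = [
    {"name": "Univ. of Twente", "postal_code": "7522", "email_domain": "utwente.nl"},
    {"name": "University of Twente", "postal_code": "7522", "email_domain": "utwente.nl"},
    {"name": "Tilburg University", "postal_code": "5037", "email_domain": "uvt.nl"},
]
print(cluster(recs))  # the two Twente variants fall into one cluster
```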

  17. Transfer between local and global processing levels by pigeons (Columba livia) and humans (Homo sapiens) in exemplar- and rule-based categorization tasks.

    Science.gov (United States)

    Aust, Ulrike; Braunöder, Elisabeth

    2015-02-01

    The present experiment investigated pigeons' and humans' processing styles (local or global) in an exemplar-based visual categorization task in which category membership of every stimulus had to be learned individually, and in a rule-based task in which category membership was defined by a perceptual rule. Group Intact was trained with the original pictures (providing both intact local and global information), Group Scrambled was trained with scrambled versions of the same pictures (impairing global information), and Group Blurred was trained with blurred versions (impairing local information). Subsequently, all subjects were tested for transfer to the 2 untrained presentation modes. Humans outperformed pigeons regarding learning speed and accuracy as well as transfer performance and showed good learning irrespective of group assignment, whereas the pigeons of Group Blurred needed longer to learn the training tasks than the pigeons of Groups Intact and Scrambled. Also, whereas humans generalized equally well to any novel presentation mode, pigeons' transfer from and to blurred stimuli was impaired. Both species showed faster learning and, for the most part, better transfer in the rule-based than in the exemplar-based task, but there was no evidence that the processing mode used depended on the type of task (exemplar- or rule-based). Whereas pigeons relied on local information throughout, humans did not show a preference for either processing level. Additional tests with grayscale versions of the training stimuli, with versions that were both blurred and scrambled, and with novel instances of the rule-based task confirmed and further extended these findings.

  18. Model-based interpretation of the ECG: a methodology for temporal and spatial reasoning.

    OpenAIRE

    Tong, D. A.; Widman, L. E.

    1992-01-01

    A new software architecture for automatic interpretation of the electrocardiogram is presented. Using the hypothesize-and-test paradigm, a semi-quantitative physiological model and production rule-based knowledge are combined to reason about time- and space-varying characteristics of complex heart rhythms. A prototype system implementing the methodology accepts a semi-quantitative description of the onset and morphology of the P waves and QRS complexes that are observed in the body-surface el...

  19. A Comparison of Rule-based Analysis with Regression Methods in Understanding the Risk Factors for Study Withdrawal in a Pediatric Study.

    Science.gov (United States)

    Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F; Vehik, Kendra; Huang, Shuai

    2016-01-01

    Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions.
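As a toy illustration of the kind of rule-based subgroup analysis described above, the snippet below applies hand-written rules, each naming interacting risk factors and their ranges, to synthetic data and compares subgroup withdrawal rates with the population average. The data and rules are invented for illustration and are not drawn from the TEDDY study:

```python
subjects = [
    {"age": 2, "distance_km": 5,  "withdrew": False},
    {"age": 2, "distance_km": 80, "withdrew": True},
    {"age": 4, "distance_km": 90, "withdrew": True},
    {"age": 3, "distance_km": 10, "withdrew": False},
    {"age": 5, "distance_km": 70, "withdrew": True},
    {"age": 5, "distance_km": 8,  "withdrew": False},
]

# Each rule names the interacting risk factors and their ranges.
rules = {
    "young, lives far":  lambda s: s["age"] <= 3 and s["distance_km"] > 50,
    "older, lives far":  lambda s: s["age"] > 3 and s["distance_km"] > 50,
    "lives near clinic": lambda s: s["distance_km"] <= 50,
}

def withdrawal_rate(group):
    return sum(s["withdrew"] for s in group) / len(group)

overall = withdrawal_rate(subjects)
print(f"overall rate: {overall:.2f}")
for label, rule in rules.items():
    sub = [s for s in subjects if rule(s)]
    print(f"{label}: n={len(sub)} rate={withdrawal_rate(sub):.2f}")
```

A regression model would report one average distance effect; the rules expose that the elevated risk is concentrated entirely in the far-from-clinic subgroups.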

  20. The performance of model-based versus rule-based phase I clinical trials in oncology : A quantitative comparison of the performance of model-based versus rule-based phase I trials with molecularly targeted anticancer drugs over the last 2 years

    NARCIS (Netherlands)

    van Brummelen, E M J; Huitema, A D R; van Werkhoven, E; Beijnen, J H; Schellens, J H M

    2016-01-01

    Phase I studies with anticancer drugs are used to evaluate safety and tolerability and to choose a recommended phase II dose (RP2D). Traditionally, phase I trial designs are rule-based, but for several years there has been a trend towards model-based designs. Simulations have shown that model-based design

  1. Business rule-based individual tax return system

    Institute of Scientific and Technical Information of China (English)

    王伟辉; 耿国华; 周明全

    2011-01-01

    Business rule management techniques are introduced into the individual tax return system to achieve business agility: the frequently changing tax return business rules are split from the application logic and managed separately. A rule-based tax return business domain model is presented, together with the technical architecture, components, and working mechanism of an individual tax return system built on that model. Because the tax business rule set is stored and managed separately from the program logic, the flexibility of tax business management is improved.

  2. Category number impacts rule-based and information-integration category learning: a reassessment of evidence for dissociable category-learning systems.

    Science.gov (United States)

    Stanton, Roger D; Nosofsky, Robert M

    2013-07-01

    Researchers have proposed that an explicit reasoning system is responsible for learning rule-based category structures and that a separate implicit, procedural-learning system is responsible for learning information-integration category structures. As evidence for this multiple-system hypothesis, researchers report a dissociation based on category-number manipulations in which rule-based category learning is worse when the category is composed of 4, rather than 2, response categories; however, information-integration category learning is unaffected by category-number manipulations. We argue that within the reported category-number manipulations, there exists a critical confound: Perceptual clusters used to construct the categories are spread apart in the 4-category condition relative to the 2-category one. The present research shows that when this confound is eliminated, performance on information-integration category learning is worse for 4 categories than for 2 categories, and this finding is demonstrated across 2 different information-integration category structures. Furthermore, model-based analyses indicate that a single-system learning model accounts well for both the original findings and the updated experimental findings reported here.

  3. Spatial rule-based assessment of habitat potential to predict impact of land use changes on biodiversity at municipal scale.

    Science.gov (United States)

    Scolozzi, Rocco; Geneletti, Davide

    2011-03-01

    In human dominated landscapes, ecosystems are under increasing pressures caused by urbanization and infrastructure development. In Alpine valleys remnant natural areas are increasingly affected by habitat fragmentation and loss. In these contexts, there is a growing risk of local extinction for wildlife populations; hence assessing the consequences on biodiversity of proposed land use changes is extremely important. The article presents a methodology to assess the impacts of land use changes on target species at a local scale. The approach relies on the application of ecological profiles of target species for habitat potential (HP) assessment, using high resolution GIS-data within a multiple level framework. The HP, in this framework, is based on a species-specific assessment of the suitability of a site, as well of surrounding areas. This assessment is performed through spatial rules, structured as sets of queries on landscape objects. We show that by considering spatial dependencies in habitat assessment it is possible to perform better quantification of impacts of local-level land use changes on habitats.

  4. CRUDE OIL PRICE FORECASTING WITH TEI@I METHODOLOGY

    Institute of Scientific and Technical Information of China (English)

    WANG Shouyang; YU Lean; K.K.LAI

    2005-01-01

    The difficulty in crude oil price forecasting, due to its inherent complexity, has attracted much attention from academic researchers and business practitioners. Various methods have been tried to solve the problem of forecasting crude oil prices, but none of the existing prediction models can meet practical needs. Very recently, Wang and Yu proposed a new methodology for handling complex systems, the TEI@I methodology, built on a systematic integration of text mining, econometrics and intelligent techniques. Within the framework of the TEI@I methodology, econometric models are used to model the linear components of the crude oil price time series (i.e., main trends), while its nonlinear components (i.e., error terms) are modelled using artificial neural network (ANN) models. In addition, the impact of irregular and infrequent future events on crude oil prices is explored using web-based text mining (WTM) and rule-based expert system (RES) techniques. Thus, a fully novel nonlinear integrated forecasting approach with error correction and judgmental adjustment is formulated to improve prediction performance within the framework of the TEI@I methodology. The proposed methodology and the novel forecasting approach are illustrated via an example.
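The linear-plus-nonlinear decomposition at the heart of the TEI@I framework can be sketched as follows. A least-squares line stands for the econometric trend model, and a simple average of recent residuals stands in for the ANN error corrector; the price series is synthetic and the corrector is a deliberate simplification:

```python
def fit_trend(y):
    """Ordinary least squares line y = a + b * t over t = 0, 1, ..., n-1."""
    n = len(y)
    t = range(n)
    tbar, ybar = sum(t) / n, sum(y) / n
    b = (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
         / sum((ti - tbar) ** 2 for ti in t))
    return ybar - b * tbar, b

def forecast(y, k=3):
    """One-step trend forecast plus a correction from the last k residuals."""
    a, b = fit_trend(y)
    residuals = [yi - (a + b * ti) for ti, yi in enumerate(y)]
    correction = sum(residuals[-k:]) / k  # stand-in for the ANN error model
    return a + b * len(y) + correction

prices = [50.0, 52.1, 53.9, 56.2, 57.8, 60.1]  # synthetic price series
print(round(forecast(prices), 2))
```

In the full methodology the residual model is an ANN rather than a moving average, and the judgmental-adjustment layer (text mining plus expert rules) shifts the forecast further when relevant events are detected.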

  5. THE AGILE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Charul Deewan

    2012-09-01

    Full Text Available The technologies are numerous and software is the one which is most widely used. Some companies have their own customized methodology for developing their software, but the majority speak of two kinds of methodologies: traditional and Agile methodologies. In this paper, we will discuss some of the aspects of what Agile methodology is, how it can be used to get the best results from a project, and how we get it to work in an organization.

  6. Language Policy and Methodology

    Science.gov (United States)

    Liddicoat, Antony J.

    2004-01-01

    The implementation of a language policy is crucially associated with questions of methodology. This paper explores approaches to language policy, approaches to methodology and the impact that these have on language teaching practice. Language policies can influence decisions about teaching methodologies either directly, by making explicit…

  7. Simulation of occupant evacuation using Building GIS and behavioral rule base

    Institute of Scientific and Technical Information of China (English)

    唐方勤; 任爱珠; 徐峰

    2011-01-01

    To describe the spatial distribution of relevant elements effectively, a behavioral model for occupant evacuation was built by combining a building-oriented geographic information system (Building GIS) with a behavioral rule base. CAD data are parsed semantically to capture building features such as the connectivity of spatial units. An evacuation network is established with GIS technology to analyze element distributions and behavior patterns in the building scene in real time. On this basis, a behavioral rule base is designed for the agents to describe occupants' behavioral intentions and strategies during evacuation. A case study was conducted to validate the model against measured data and buildingEXODUS simulation results. The results show that the model can represent the occupant evacuation process in a building scene and reflect how exit evacuation efficiency varies over time.

  8. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  9. Scenario development methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Eng, T. [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hudson, J. [Rock Engineering Consultants, Welwyn Garden City, Herts (United Kingdom); Stephansson, O. [Royal Inst. of Tech., Stockholm (Sweden). Div. of Engineering Geology; Skagius, K.; Wiborgh, M. [Kemakta, Stockholm (Sweden)

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are (a) Event tree analysis, (b) Influence diagrams and (c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs.

  10. LANGUAGE POLICY AND METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Antony J. Liddicoat

    2004-06-01

    Full Text Available The implementation of a language policy is crucially associated with questions of methodology. This paper explores approaches to language policy, approaches to methodology and the impact that these have on language teaching practice. Language policies can influence decisions about teaching methodologies either directly, by making explicit recommendations about the methods to be used in classroom practice, or indirectly, through the conceptualisation of language learning which underlies the policy. It can be argued that all language policies have the potential to influence teaching methodologies indirectly and that those policies which have explicit recommendations about methodology are actually functioning on two levels. This allows for the possibility of conflict between the direct and indirect dimensions of the policy which results from an inconsistency between the explicitly recommended methodology and the underlying conceptualisation of language teaching and learning which informs the policy.

  11. Open verification methodology cookbook

    CERN Document Server

    Glasser, Mark

    2009-01-01

    Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novic

  12. Rough set method for rule reduction in belief rule base

    Institute of Scientific and Technical Information of China (English)

    王应明; 杨隆浩; 常雷雷; 傅仰耿

    2014-01-01

    The number of rules in a belief rule base (BRB) may induce a combinatorial explosion. However, most previous work on rule reduction is based on feature extraction, whose effectiveness depends on expert knowledge. Therefore, an objective method based on rough set theory is proposed that depends on no knowledge beyond the belief rule base itself. The method analyzes each belief rule according to the idea of equivalence-class division and then eliminates redundant candidate referential values. Finally, a numerical case study on armored-equipment capability evaluation is analyzed and compared with a representative subjective method in terms of the number of reduced rules and decision-making accuracy. The results show that the proposed method is feasible and effective, and superior to existing subjective rule-reduction methods.
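The equivalence-class idea behind rough-set reduction can be sketched on a small decision table: an attribute is redundant if dropping it still leaves every indiscernibility class with a single decision. The table, attribute names, and greedy elimination below are illustrative assumptions, not the paper's belief-rule-base algorithm:

```python
def partitions(rows, attrs):
    """Group row indices by their values on `attrs` (indiscernibility classes)."""
    classes = {}
    for i, row in enumerate(rows):
        classes.setdefault(tuple(row[a] for a in attrs), set()).add(i)
    return list(classes.values())

def consistent(rows, attrs, decision):
    """True if every equivalence class under `attrs` has a single decision."""
    return all(len({rows[i][decision] for i in cls}) == 1
               for cls in partitions(rows, attrs))

def reduce_attrs(rows, attrs, decision):
    """Greedily drop attributes whose removal keeps the table consistent."""
    kept = list(attrs)
    for a in attrs:
        trial = [x for x in kept if x != a]
        if trial and consistent(rows, trial, decision):
            kept = trial
    return kept

table = [
    {"firepower": "high", "mobility": "high", "crew": 3, "capable": "yes"},
    {"firepower": "high", "mobility": "low",  "crew": 3, "capable": "yes"},
    {"firepower": "low",  "mobility": "high", "crew": 4, "capable": "no"},
    {"firepower": "low",  "mobility": "low",  "crew": 3, "capable": "no"},
]
print(reduce_attrs(table, ["firepower", "mobility", "crew"], "capable"))
```

On this toy table the decision is fully determined by one attribute, so the other two are eliminated; the BRB setting extends the same idea to referential values inside belief rules.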

  13. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for development. These methodologies could work on structural, procedural, or object-oriented applications, but fail to capture…

  14. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable to gain a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss...

  15. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  16. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  17. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  18. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  19. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works as a b...

  20. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  1. Rapid Dialogue Prototyping Methodology

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Sojka, P.; Rajman, M.; Kopecek, I.; Melichar, M.; Pala, K.

    2004-01-01

    This paper is about the automated production of dialogue models. The goal is to propose and validate a methodology that allows the production of finalized dialogue models (i.e. dialogue models specific for given applications) in a few hours. The solution we propose for such a methodology, called the

  2. Control of Flexible Joint Manipulator via Variable Structure Rule-Based Fuzzy Control and Chaos Anti-Control with Experimental Validation

    Directory of Open Access Journals (Sweden)

    Mojtaba Rostami Kandroodi

    2014-03-01

    Full Text Available This paper presents a variable structure rule-based fuzzy control for trajectory tracking and vibration control of a flexible joint manipulator using chaotic anti-control. Based on Lyapunov stability theory for variable structure control and fuzzy rules, the nonlinear controller and some generic sufficient conditions for global asymptotic control are attained. The fuzzy rules are constructed directly from a Lyapunov function obtained from the variable structure surfaces, such that the error dynamics of the control problem satisfy stability in the Lyapunov sense. In this study, anti-control is also applied to reduce the deflection angle of the flexible joint system. To achieve this goal, chaotic dynamics must be created in the flexible joint system, so the system is synchronized to a chaotic gyroscope system. Control and anti-control concepts are thus applied together to achieve high-quality performance, and a controller is designed that satisfies both aims. The performance of the proposed control is examined in terms of input tracking capability, level of vibration reduction, and time response specifications. Finally, the efficacy of the proposed method is validated through experimentation on QUANSER's flexible-joint manipulator.

  3. The Rule-based Generated Intelligent Components Encoder of ERP

    Institute of Scientific and Technical Information of China (English)

    白俊; 龙伟; 黄敏

    2011-01-01

    To achieve the integration of CPD with CAX/PDM/ERP, manufacturing requires a component coding system based on product families as a basic core module. Aiming at the ERP system for complete sets of oil-drilling equipment, this paper proposes a reference model of rule-based component coding and designs a rule-based generated intelligent component encoder. The encoder can be applied to ERP systems with differently customized coding rules and generates unique component codes.

  4. Expert System Shells for Rapid Clinical Decision Support Module Development: An ESTA Demonstration of a Simple Rule-Based System for the Diagnosis of Vaginal Discharge.

    Science.gov (United States)

    Kamel Boulos, Maged N

    2012-12-01

    This study demonstrates the feasibility of using expert system shells for rapid clinical decision support module development. A readily available expert system shell was used to build a simple rule-based system for the crude diagnosis of vaginal discharge. Pictures and 'canned text explanations' are extensively used throughout the program to enhance its intuitiveness and educational dimension. All the steps involved in developing the system are documented. The system runs under Microsoft Windows and is available as a free download at http://healthcybermap.org/vagdisch.zip (the distribution archive includes both the program's executable and the commented knowledge base source as a text document). The limitations of the demonstration system, such as the lack of provisions for assessing uncertainty or various degrees of severity of a sign or symptom, are discussed in detail. Ways of improving the system, such as porting it to the Web and packaging it as an app for smartphones and tablets, are also presented. An easy-to-use expert system shell enables clinicians to rapidly become their own 'knowledge engineers' and develop concise evidence-based decision support modules of simple to moderate complexity, targeting clinical practitioners, medical and nursing students, as well as patients, their lay carers and the general public (where appropriate). In the spirit of the social Web, it is hoped that an online repository can be created to peer review, share and re-use knowledge base modules covering various clinical problems and algorithms, as a service to the clinical community.
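The kind of rule-based inference an expert system shell packages up can be sketched as a minimal forward-chaining engine. The findings and rules below are toy placeholders, not the clinical knowledge base the article describes:

```python
# Each rule pairs a set of required findings with a conclusion.
RULES = [
    ({"discharge", "itching"}, "candidiasis suspected"),
    ({"discharge", "fishy odour"}, "bacterial vaginosis suspected"),
    ({"candidiasis suspected"}, "recommend antifungal review"),
]

def infer(findings):
    """Fire rules repeatedly until no new conclusions can be added."""
    facts = set(findings)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts - set(findings)  # derived conclusions only

print(sorted(infer({"discharge", "itching"})))
```

A shell such as ESTA adds what this sketch lacks: a dialogue that asks for findings one at a time, canned explanations for each conclusion, and authoring tools for the knowledge base.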

  5. Design and Implementation of Simple Rule Base for Data Cleaning

    Institute of Scientific and Technical Information of China (English)

    陈华; 饶慧

    2012-01-01

    Data cleaning can improve data quality. A method for designing a simple rule base for data cleaning is introduced. The method implements data cleaning using the reflection mechanism of the programming language together with Python scripting, so the problems of erroneous data in the cleaning process can be solved conveniently, flexibly, and efficiently. It features strong extensibility, easy implementation, and low development cost.

  6. Usage-based vs. rule-based learning: the acquisition of word order in wh-questions in English and Norwegian.

    Science.gov (United States)

    Westergaard, Marit

    2009-11-01

    This paper discusses different approaches to language acquisition in relation to children's acquisition of word order in wh-questions in English and Norwegian. While generative models assert that children set major word order parameters and thus acquire a rule of subject-auxiliary inversion or generalized verb second (V2) at an early stage, some constructivist work argues that English-speaking children are simply reproducing frequent wh-word+auxiliary combinations in the input. The paper questions both approaches, re-evaluates some previous work, and provides some further data, concluding that the acquisition of wh-questions must be the result of a rule-based process. Based on variation in adult grammars, a cue-based model to language acquisition is presented, according to which children are sensitive to minor cues in the input, called micro-cues. V2 is not considered to be one major parameter, but several smaller-scale cues, which are responsible for children's lack of syntactic (over-)generalization in the acquisition process.

  7. A Rule-Based Energy Management Strategy for a Plug-in Hybrid School Bus Based on a Controller Area Network Bus

    Directory of Open Access Journals (Sweden)

    Jiankun Peng

    2015-06-01

    Full Text Available This paper presents a rule-based energy management strategy for a plug-in hybrid school bus (PHSB). In order to verify the effectiveness and rationality of the proposed energy management strategy, the powertrain and control models were built with MATLAB/Simulink. The PHSB powertrain model includes an engine model, ISG (integrated starter and generator) model, drive motor model, power battery pack model, driver model, and vehicle longitudinal dynamics model. To evaluate controller area network (CAN) bus performance features such as the bus load and signal hysteresis, and to verify the reliability and real-time performance of the CAN bus multi-node control method, a co-simulation platform was built with CANoe and MATLAB/Simulink. The co-simulation results show that the control strategy can meet the requirements of the PHSB's dynamic performance. Meanwhile, the charge-depleting (CD) and charge-sustaining (CS) modes can switch between each other while maintaining a state-of-charge (SoC) of around 30%, indicating that the energy management strategy effectively extends the working period of the CD mode and further improves the fuel economy. The energy consumption per 100 km is 13.7 L of diesel and 10.5 kW·h of electricity with an initial SoC of 75%. The CANoe simulation results show that the bus communication performs well without error frames.
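    The CD/CS switching behaviour described above can be sketched as a simple threshold rule on SoC. The 30% target comes from the abstract; the hysteresis band is an invented assumption to avoid mode chatter.

```python
# Hedged sketch of a CD/CS mode-switching rule: deplete the battery until
# SoC reaches ~30%, then sustain charge around that level.
def select_mode(soc, mode, low=0.30, band=0.02):
    """Return 'CD' or 'CS', with a small hysteresis band to avoid chatter."""
    if mode == "CD" and soc <= low:
        return "CS"
    if mode == "CS" and soc >= low + band:
        return "CD"
    return mode

mode = "CD"
trace = []
for soc in [0.75, 0.50, 0.31, 0.29, 0.30, 0.33]:
    mode = select_mode(soc, mode)
    trace.append(mode)
print(trace)  # → ['CD', 'CD', 'CD', 'CS', 'CS', 'CD']
```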

  8. Proportional Conflict Redistribution Rules Based on Preponderant Belief

    Institute of Scientific and Technical Information of China (English)

    金宏斌; 蓝江桥

    2011-01-01

    Although the proportional conflict redistribution rule is an effective method for combining highly conflicting evidence, its processing is conservative because it does not fully exploit the belief of the evidence. Therefore, an improved rule based on preponderant belief is proposed. A novel belief function is constructed and used as a weight on the initial distributed proportions. This increases the ratio between the two beliefs and raises the weight of the dominant proposition, so that the preponderant belief of the evidence is clearly expressed. Experimental results demonstrate the effectiveness of the rule.
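    As background, the underlying proportional conflict redistribution idea can be sketched for two sources over two exclusive hypotheses. This PCR5-style sketch is illustrative only and does not reproduce the paper's preponderant-belief weighting.

```python
# Illustrative proportional conflict redistribution (PCR5-style) for two
# mass functions over two exclusive hypotheses A and B.
def pcr_combine(m1, m2):
    """m1, m2: dicts {'A': mass, 'B': mass}. Returns the combined masses."""
    out = {"A": m1["A"] * m2["A"], "B": m1["B"] * m2["B"]}
    # The conflicting products m1(A)m2(B) and m1(B)m2(A) are redistributed
    # back to A and B proportionally to the masses that produced them.
    for x, y in (("A", "B"), ("B", "A")):
        k = m1[x] * m2[y]
        if k > 0:
            out[x] += k * m1[x] / (m1[x] + m2[y])
            out[y] += k * m2[y] / (m1[x] + m2[y])
    return out

m = pcr_combine({"A": 0.6, "B": 0.4}, {"A": 0.7, "B": 0.3})
print(round(m["A"] + m["B"], 10))  # combined masses still sum to 1
```

The paper's improvement amounts to replacing the raw masses in the redistribution ratios with a weighted belief function that favours the dominant proposition.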

  9. Development of Near Optimal Rule-Based Control for Plug-In Hybrid Electric Vehicles Taking into Account Drivetrain Component Losses

    Directory of Open Access Journals (Sweden)

    Hanho Son

    2016-05-01

    Full Text Available A near-optimal rule-based mode control (RBC) strategy was proposed for a target plug-in hybrid electric vehicle (PHEV), taking into account the drivetrain losses. Individual loss models were developed for drivetrain components including the gears, planetary gear (PG), bearings, and oil pump, based on experimental data and mathematical governing equations. A loss model for the power electronic system was also constructed, including the loss from the motor-generator while rotating in the unloaded state. To evaluate the effect of the drivetrain losses on the operating mode control strategy, backward simulations were performed using dynamic programming (DP). DP selects the operating mode which provides the highest efficiency for given driving conditions. It was found that the operating mode selection changes when drivetrain losses are included, depending on driving conditions. An operating mode schedule was developed with respect to the wheel power and vehicle speed, and based on this schedule an RBC was obtained that can be implemented in an on-line application. To evaluate the performance of the RBC, a forward simulator was constructed for the target PHEV. The simulation results show near-optimal performance of the RBC compared with dynamic-programming-based mode control in terms of mode operation time and fuel economy. The RBC developed with drivetrain losses taken into account showed a 4%–5% improvement in fuel economy over a similar RBC that neglected the drivetrain losses.
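    The resulting rule-based controller is essentially a lookup over wheel power and vehicle speed. A hedged sketch follows, with invented thresholds and mode names rather than the paper's DP-calibrated schedule.

```python
# Illustrative rule-based mode schedule keyed on vehicle speed and wheel
# power; the thresholds and mode names are assumptions, not the paper's.
def rbc_mode(speed_kph, wheel_power_kw):
    if wheel_power_kw <= 0:
        return "REGEN"        # braking / coasting: recover energy
    if speed_kph < 40 and wheel_power_kw < 20:
        return "EV"           # low speed, low load: motor only
    if wheel_power_kw > 60:
        return "HEV_BOOST"    # high load: engine plus motor
    return "HEV"              # otherwise: engine-dominant hybrid

print(rbc_mode(30, 10))   # → EV
print(rbc_mode(90, 70))   # → HEV_BOOST
```

In the paper the region boundaries are extracted from DP results, which is what makes the simple lookup "near optimal".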

  10. Rule-based Model Automatic Conversion Method

    Institute of Scientific and Technical Information of China (English)

    杨鹤标; 石云

    2011-01-01

    In order to realize automatic conversion of a Task Model (TM) into an Abstract User Interface (AUI) model, this paper presents a rule-based model conversion method. The TM meta-model of the ConcurTaskTrees notation and the AUI meta-model are constructed. A rule representation based on the Object Constraint Language (OCL) is used to define the mapping rules from TM to AUI, and the Extensible Markup Language (XML) is used to describe the task model and the mapping rules, yielding an XML file of the abstract user interface. An example of a virtual survey on job stress demonstrates the feasibility and usability of the method.
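    The mapping-rule idea can be sketched as a table from task types to abstract interactors applied over the TM's XML. The element and attribute names below are invented for illustration; the paper's OCL rules are richer.

```python
# Toy TM -> AUI conversion: walk the task-model XML and emit one abstract
# interactor per task, chosen by a rule table keyed on the task type.
import xml.etree.ElementTree as ET

RULES = {"input": "textfield", "select": "choice", "action": "button"}

def tm_to_aui(tm_xml: str) -> str:
    tm = ET.fromstring(tm_xml)
    aui = ET.Element("aui")
    for task in tm.iter("task"):
        interactor = RULES.get(task.get("type"), "label")  # default rule
        ET.SubElement(aui, interactor, name=task.get("name"))
    return ET.tostring(aui, encoding="unicode")

src = '<tm><task type="input" name="age"/><task type="action" name="ok"/></tm>'
print(tm_to_aui(src))  # prints an <aui> document with a textfield and a button
```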

  11. Transforming Adverse Cognition on the Path of Bhakti: Rule-Based Devotion, “My-Ness,” and the Existential Condition of Bondage

    Directory of Open Access Journals (Sweden)

    Travis Chilcott

    2016-05-01

    Full Text Available Early Gauḍīya Vaiṣṇava theologians developed a unique path of Hindu devotion during the 16th century through which an aspirant cultivates a rapturous form of selfless love (premā) for Kṛṣṇa, who is recognized as the supreme and personal deity. In the course and consequence of cultivating this selfless love, the recommended practices of devotion are claimed to free one from the basic existential condition of bondage that concerns a wide range of South Asian religious and philosophical traditions. One of the principal cognitive tendencies characterizing this condition is to have thoughts and feelings of possessiveness over objects of the world, a state referred to as "my-ness" (mamatā), e.g., my home, my children, or my wealth. Using the therapeutic model of schema therapy as a heuristic analogue, this article explores the relationship between recommended practices of rule-based devotion (vaidhi-bhakti) and the modulation of thoughts and feelings of possessiveness towards mundane objects. I argue that such practices function as learning strategies that can systematically rework and modulate how one relates and responds to these objects in theologically desirable ways. I conclude by suggesting that connectionist theories of cognition and learning may offer a promising explanatory framework for understanding the dynamics of this kind of relationship.

  12. Using hierarchical cluster models to systematically identify groups of jobs with similar occupational questionnaire response patterns to assist rule-based expert exposure assessment in population-based studies

    NARCIS (Netherlands)

    Friesen, Melissa C; Shortreed, Susan M; Wheeler, David C; Burstyn, Igor; Vermeulen, Roel; Pronk, Anjoeka; Colt, Joanne S; Baris, Dalsu; Karagas, Margaret R; Schwenn, Molly; Johnson, Alison; Armenti, Karla R; Silverman, Debra T; Yu, Kai

    2015-01-01

    OBJECTIVES: Rule-based expert exposure assessment based on questionnaire response patterns in population-based studies improves the transparency of the decisions. The number of unique response patterns, however, can be nearly equal to the number of jobs. An expert may reduce the number of patterns t

  13. Control of Angra 1's PZR by a fuzzy rule base built through genetic programming

    Energy Technology Data Exchange (ETDEWEB)

    Caldas, Gustavo Henrique Flores; Schirru, Roberto [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear

    2002-07-01

    There is an optimum pressure for the normal operation of nuclear power plant reactors, and there are thresholds that must be respected during transients, which makes the pressurizer an important control mechanism. Inside a pressurizer there are heaters and a shower. Through their actuation levels, they control the vapor pressure inside the pressurizer and, consequently, inside the primary circuit. Therefore, controlling the pressurizer consists in controlling the actuation levels of the heaters and of the shower. In the present work this function is implemented through a fuzzy controller. Besides being an efficient way of exerting control, this approach offers the possibility of extracting knowledge about how the control is being done. A fuzzy controller consists basically of an inference machine and a rule base, the latter constructed from specialized knowledge. In some circumstances, however, this knowledge is not accurate and may lead to inefficient results. With the development of artificial intelligence techniques, methods were found to substitute for specialists by simulating their knowledge. Genetic programming is an evolutionary algorithm particularly efficient at manipulating rule base structures. In this work genetic programming was used as a substitute for the specialist. The goal is to test whether an irrational object, a computer, is capable, by itself, of finding a rule base that reproduces a pre-established actuation level profile. The result is positive, with the discovery of a fuzzy rule base presenting an insignificant error, a remarkable result that proves the efficiency of the approach. (author)
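    For orientation, a fuzzy rule base for heater actuation can be sketched with triangular memberships and weighted-average defuzzification. The memberships and rules below are invented illustrations, not the controller evolved for Angra 1.

```python
# Minimal fuzzy rule base mapping pressure error to a heater actuation level
# in [0, 1], using triangular membership functions.
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def heater_level(pressure_error):
    # Each rule: (firing strength, recommended heater output).
    rules = [
        (tri(pressure_error, -2.0, -1.0, 0.0), 1.0),  # pressure low  -> heat
        (tri(pressure_error, -1.0,  0.0, 1.0), 0.5),  # near setpoint -> hold
        (tri(pressure_error,  0.0,  1.0, 2.0), 0.0),  # pressure high -> off
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.5  # neutral output if no rule fires

print(heater_level(-1.0))  # → 1.0 (pressure well below setpoint)
print(heater_level(0.0))   # → 0.5
```

In the paper, genetic programming searches over structures like this rule list so that the controller's output profile matches a pre-established one.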

  14. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    ABSTRACT Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 1960s) and original articulation of teachable and learnable methods. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions and by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers.

  15. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much-needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on the best currently available evidence. Our concerns fall into two main categories: the rigor of development, including the methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  16. Multimodal hybrid reasoning methodology for personalized wellbeing services.

    Science.gov (United States)

    Ali, Rahman; Afzal, Muhammad; Hussain, Maqbool; Ali, Maqbool; Siddiqi, Muhammad Hameed; Lee, Sungyoung; Ho Kang, Byeong

    2016-02-01

    A wellness system provides wellbeing recommendations to support experts in promoting a healthier lifestyle and inducing individuals to adopt healthy habits. Adopting physical activity effectively promotes a healthier lifestyle. A physical activity recommendation system assists users in adopting daily routines that form a best practice of life by involving themselves in healthy physical activities. Traditional physical activity recommendation systems focus on general recommendations applicable to a community of users rather than to specific individuals. These recommendations are general in nature and fit the community at a certain level, but they are not relevant to every individual with specific requirements and personal interests. To cover this aspect, we propose a multimodal hybrid reasoning methodology (HRM) that generates personalized physical activity recommendations according to the user's specific needs and personal interests. The methodology integrates the rule-based reasoning (RBR), case-based reasoning (CBR), and preference-based reasoning (PBR) approaches in a linear combination that enables personalization of recommendations. RBR uses explicit knowledge rules from physical activity guidelines, CBR uses implicit knowledge from experts' past experiences, and PBR uses users' personal interests and preferences. To validate the methodology, a weight management scenario is considered and experimented with. The RBR part of the methodology generates goal, weight status, and plan recommendations, the CBR part suggests the top three relevant physical activities for executing the recommended plan, and the PBR part filters out irrelevant recommendations from the suggested ones using the user's personal preferences and interests. To evaluate the methodology, a baseline-RBR system is developed, which is improved first using ranged rules and ultimately using a hybrid-CBR. A comparison of the results of these systems shows that hybrid-CBR outperforms the
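    The linear combination of the three reasoners can be sketched as a weighted score per candidate activity, with preferences also acting as a filter. The weights and scores below are invented assumptions, not the paper's HRM formulation.

```python
# Toy hybrid ranker: combine rule-based (RBR), case-based (CBR) and
# preference-based (PBR) scores linearly, then filter by preference.
def hybrid_rank(candidates, rbr, cbr, pbr, w=(0.4, 0.3, 0.3)):
    """candidates: activity names; rbr/cbr/pbr: name -> score in [0, 1]."""
    scored = [
        (w[0] * rbr.get(c, 0) + w[1] * cbr.get(c, 0) + w[2] * pbr.get(c, 0), c)
        for c in candidates
    ]
    # Drop activities the user's preferences rule out entirely.
    return [c for s, c in sorted(scored, reverse=True) if pbr.get(c, 0) > 0]

acts = ["walking", "swimming", "cycling"]
rbr = {"walking": 0.9, "swimming": 0.8, "cycling": 0.7}   # guideline fit
cbr = {"walking": 0.6, "swimming": 0.9, "cycling": 0.8}   # past-case fit
pbr = {"walking": 0.8, "swimming": 0.0, "cycling": 0.9}   # dislikes swimming
print(hybrid_rank(acts, rbr, cbr, pbr))  # → ['cycling', 'walking']
```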

  17. Rule-based Recursive Selective Disassembly Sequence Planning

    Institute of Scientific and Technical Information of China (English)

    丁勇; 孙有朝

    2011-01-01

    Disassembly sequence planning plays a significant role in the maintenance planning of products' maintainability design. However, complete disassembly is often not practical. Selective disassembly sequence planning focuses on disassembling only one or more selected components from a product for maintenance. This paper presents a rule-based recursive method for finding a disassembly sequence. Most prior methods enumerate all solutions or use stochastic methods to generate them; in contrast, the proposed method establishes heuristic disassembly rules to eliminate uncommon or unrealistic solutions. Finally, an application instance, the disassembly sequence of a power brake, illustrates the validity of the method.
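    The recursive rule-based idea can be sketched with precedence rules: to remove a target part, first remove whatever blocks it. The assembly below is a made-up example, not the paper's power brake.

```python
# Recursive selective disassembly: only the parts on the target's blocking
# chain are removed, so the product is never fully disassembled.
def disassembly_sequence(target, blockers, seq=None):
    """blockers: part -> list of parts that must come off first."""
    if seq is None:
        seq = []
    for b in blockers.get(target, []):
        if b not in seq:
            disassembly_sequence(b, blockers, seq)  # recurse on blockers
    if target not in seq:
        seq.append(target)
    return seq

blockers = {"pad": ["cover"], "piston": ["pad", "seal"], "seal": ["cover"]}
print(disassembly_sequence("piston", blockers))
# → ['cover', 'pad', 'seal', 'piston']
```

Heuristic rules of the kind the paper proposes would further prune which blocker orderings are explored.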

  18. Nodule Detection in a Lung Region Segmented Using Genetic Cellular Neural Networks, and 3D Template Matching with Fuzzy Rule-Based Thresholding

    Energy Technology Data Exchange (ETDEWEB)

    Ozekes, Serhat; Osman, Onur; Ucan, N. [Istanbul Commerce University, Ragip Gumuspala Cad. No: 84 34378 Eminonu, Istanbul (Turkey)]

    2008-02-15

    The purpose of this study was to develop a new method for automated lung nodule detection in serial-section CT images, using the characteristics of the 3D appearance of the nodules that distinguish them from the vessels. Lung nodules were detected in four steps. First, to reduce the number of regions of interest (ROIs) and the computation time, the lung regions of the CTs were segmented using Genetic Cellular Neural Networks (G-CNN). Then, for each lung region, ROIs were specified using the 8-directional search; +1 or -1 values were assigned to each voxel. The 3D ROI image was obtained by combining all the 2-dimensional (2D) ROI images. A 3D template was created to find the nodule-like structures in the 3D ROI image. Convolution of the 3D ROI image with the proposed template strengthens the shapes that are similar to those of the template and weakens the others. Finally, fuzzy rule-based thresholding was applied and the ROIs were found. To test the system's efficiency, we used 16 cases with a total of 425 slices, taken from the Lung Image Database Consortium (LIDC) dataset. The computer-aided diagnosis (CAD) system achieved 100% sensitivity with 13.375 FPs per case when the nodule thickness was greater than or equal to 5.625 mm. Our results indicate that the detection performance of our algorithm is satisfactory, and this may well improve the performance of computer-aided detection of lung nodules.

  19. RED: A Java-MySQL Software for Identifying and Visualizing RNA Editing Sites Using Rule-Based and Statistical Filters.

    Science.gov (United States)

    Sun, Yongmei; Li, Xing; Wu, Di; Pan, Qi; Ji, Yuefeng; Ren, Hong; Ding, Keyue

    2016-01-01

    RNA editing is one of the post- or co-transcriptional processes that can lead to amino acid substitutions in protein sequences, alternative pre-mRNA splicing, and changes in gene expression levels. Although several methods have been suggested to identify RNA editing sites, challenges remain in distinguishing true RNA editing sites from their genomic counterparts and from technical artifacts. In addition, a software framework to identify and visualize potential RNA editing sites has been lacking. Here, we presented RED (RNA Editing sites Detector), a software tool for the identification of RNA editing sites that integrates multiple rule-based and statistical filters. The potential RNA editing sites can be visualized at the genome and site levels through a graphical user interface (GUI). To improve performance, we used the MySQL database management system (DBMS) for high-throughput data storage and querying. We demonstrated the validity and utility of RED by identifying the presence and absence of experimentally validated C→U RNA editing sites, in comparison with REDItools, a command-line tool for high-throughput investigation of RNA editing. In an analysis of a sample data set with 28 experimentally validated C→U RNA editing sites, RED had a sensitivity of 0.64 and a specificity of 0.5. In comparison, REDItools had better sensitivity (0.75) but similar specificity (0.5). RED is an easy-to-use, platform-independent Java-based software package that can be applied to RNA-seq data with or without DNA sequencing data. The package is freely available under the GPLv3 license at http://github.com/REDetector/RED or https://sourceforge.net/projects/redetector.
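    The rule-based filtering stage can be sketched as a chain of predicates applied to each candidate site. Field names and thresholds are invented, and RED's statistical filter is replaced here by a simple editing-ratio cutoff.

```python
# Toy filter chain over candidate RNA-editing sites: a site survives only
# if every rule passes. (RED applies a statistical test as well; here a
# plain editing-ratio threshold stands in for it.)
def pass_filters(site, known_snps, min_depth=10, min_qual=20, min_ratio=0.1):
    rules = [
        site["depth"] >= min_depth,                       # coverage rule
        site["qual"] >= min_qual,                         # base-quality rule
        (site["chrom"], site["pos"]) not in known_snps,   # exclude known SNPs
        site["edited"] / site["depth"] >= min_ratio,      # editing ratio
    ]
    return all(rules)

sites = [
    {"chrom": "1", "pos": 100, "depth": 30, "qual": 35, "edited": 9},
    {"chrom": "1", "pos": 200, "depth": 8,  "qual": 35, "edited": 4},
]
kept = [s["pos"] for s in sites if pass_filters(s, known_snps={("1", 300)})]
print(kept)  # → [100]
```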

  20. A Belief K-means Clustering Algorithm for Structure Identification of a Belief-rule-base

    Institute of Scientific and Technical Information of China (English)

    李彬; 王红卫; 杨剑波; 祁超; 郭敏

    2011-01-01

    A belief K-means clustering algorithm is proposed to identify the structure of a belief-rule-base when belief-rule-based reasoning is used as a system controller. After the inference framework of the belief-rule-base is constructed, the algorithm generates a reasonable structure for the belief-rule-base and improves inference accuracy and decision quality by mining historical data on the antecedent input variables. Compared with traditional expert-knowledge-based methods for determining the structure of a belief-rule-base, the new algorithm has the following characteristics: the optimal clustering is directly proportional to the distance between adjacent evaluation grades, which is consistent with human cognition, and it ensures that the sampling points lie close to the evaluation grades with minimum distances, that is, that the input variables approximate the antecedents of the belief rules as closely as possible. A case study applying belief-rule-based reasoning to aggregate production planning demonstrates the rationality and effectiveness of the proposed algorithm.
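    The clustering step can be sketched with a plain one-dimensional K-means that places referential values (evaluation grades) of an antecedent from historical input data. The data and initial grades below are invented; the paper's belief-weighted variant is not reproduced.

```python
# Plain 1-D K-means: each centre plays the role of an evaluation grade of a
# belief-rule antecedent, refined from historical input samples.
def kmeans_1d(data, centers, iters=20):
    for _ in range(iters):
        groups = {c: [] for c in centers}
        for x in data:
            nearest = min(centers, key=lambda c: abs(x - c))  # nearest grade
            groups[nearest].append(x)
        centers = [sum(g) / len(g) if g else c for c, g in groups.items()]
    return sorted(centers)

data = [0.1, 0.2, 0.15, 0.5, 0.55, 0.9, 0.95]
grades = kmeans_1d(data, centers=[0.0, 0.5, 1.0])
print([round(c, 3) for c in grades])  # → [0.15, 0.525, 0.925]
```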

  1. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
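    The core combination step, weighting building protection by where people are, can be sketched as follows. The protection factors and population fractions are invented illustrations, not the report's country-specific estimates.

```python
# Toy version of the combination step: weight each shelter category's dose
# reduction by the fraction of the population in that category.
def effective_exposure_fraction(categories):
    """categories: list of (population_fraction, protection_factor).

    A protection factor PF reduces the outdoor dose by a factor 1/PF, so the
    population-averaged exposure is the PF-weighted mixture below.
    """
    return sum(frac / pf for frac, pf in categories)

cats = [(0.3, 2), (0.5, 10), (0.2, 40)]  # light, moderate, good shelter
print(round(effective_exposure_fraction(cats), 3))  # → 0.205
```

Varying the category mix by time of day or warning posture, as the methodology describes, just changes the `(fraction, PF)` pairs.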

  2. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal-oriented decomposition of the plant purpose into the means of achieving that purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follows directly. The functional HAZOP methodology lends itself directly to implementation in a computer-aided reasoning tool to perform root cause and consequence analysis. Such a tool can facilitate finding causes and/or consequences far away from the site of the deviation. A functional HAZOP assistant is proposed and investigated in a HAZOP study of an industrial-scale Indirect Vapor Recompression Distillation pilot Plant (IVaRDiP) at DTU Chemical and Biochemical Engineering. The study shows that the functional HAZOP methodology provides a very efficient paradigm for facilitating HAZOP studies and for enabling reasoning.

  3. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching, writing speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher: as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l

  4. Methodology for research I.

    Science.gov (United States)

    Garg, Rakesh

    2016-09-01

    The conduct of research requires a systematic approach involving diligent planning and its execution as planned. It comprises various essential predefined components such as aims, population, conduct/technique, outcome and statistical considerations. These need to be objective, reliable and in a repeatable format. Hence, the understanding of the basic aspects of methodology is essential for any researcher. This is a narrative review and focuses on various aspects of the methodology for conduct of a clinical research. The relevant keywords were used for literature search from various databases and from bibliographies of the articles.

  5. Rule-based Web3D Forest Lightweight Construction

    Institute of Scientific and Technical Information of China (English)

    戴唯艺; 杨阳; 贾金原

    2012-01-01

    Tree modeling and simulation has always been a challenging research topic in WebVR due to trees' complex structure and large amount of data. Nowadays, with the rapid expansion of 3D virtual scenes on the web, the contradiction between clients' real-time download requests for massive virtual forest scenes and limited network bandwidth has become ever more prominent. This paper proposes a rule-based method for constructing lightweight Web3D forests. The method builds on the self-similarity of L-systems and the reuse-based lightweight modeling idea in WebVR. It successfully converts tree/forest models into very lightweight Web3D files, sometimes only a few kB in size, so that the virtual scene is suitable for online real-time transmission and browsing even under low network bandwidth, which to a certain extent resolves this bottleneck.
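    The self-similarity the method exploits is that of L-system rewriting, which a few lines can illustrate. The axiom and rule below are a textbook branching example, not the paper's tree models.

```python
# Parallel string rewriting of an L-system: every symbol with a production
# rule is replaced at each step, so a tiny grammar encodes a large tree.
def lsystem(axiom, rules, n):
    s = axiom
    for _ in range(n):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

rules = {"F": "F[+F]F[-F]F"}  # classic branching "grass" production
print(lsystem("F", rules, 1))        # → F[+F]F[-F]F
print(len(lsystem("F", rules, 2)))   # → 61: rapid growth from a tiny rule
```

This compression, shipping the grammar plus reuse of geometry instead of full meshes, is what makes kB-scale Web3D forest files possible.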

  6. A methodological approach to the design of optimising control strategies for sewer systems

    DEFF Research Database (Denmark)

    Mollerup, Ane Loft; Mikkelsen, Peter Steen; Sin, Gürkan

    2016-01-01

    This study focuses on designing an optimisation-based control for a sewer system in a methodological way and linking it to a regulatory control. Optimisation-based design is found to depend on the proper choice of a model, the formulation of the objective function and the tuning of optimisation parameters. The design is benchmarked against a reference control, a rule-based expert system. On the other hand, compared with a regulatory control technique designed earlier in Mollerup et al. (2015), the optimisation showed similar performance with respect to minimising overflow volume. Hence, for the operation of small sewer systems, regulatory control strategies can offer promising potential and should be considered alongside more advanced strategies when identifying novel solutions.

  7. A methodology for exploiting the tolerance for imprecision in genetic fuzzy systems and its application to characterization of rotor blade leading edge materials

    Science.gov (United States)

    Sánchez, Luciano; Couso, Inés; Palacios, Ana M.; Palacios, José L.

    2013-05-01

    A methodology for obtaining fuzzy rule-based models from uncertain data is proposed. The granularity of the linguistic discretization is decided with the help of a new estimation of the mutual information between ill-known random variables, and a combination of boosting and genetic algorithms is used for discovering new rules. This methodology has been applied to predict whether the coating of a helicopter rotor blade is adequate, considering the shear adhesion strength of ice to different materials. The discovered knowledge is intended to increase the level of post-processing interpretation accuracy of experimental data obtained during the evaluation of ice-phobic materials for rotorcraft applications.

  8. Down-scaling LUCC based on the histo-variogram

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hao; CAO ChunXiang; LI GuoPing; YANG Hua; LI XiaoWen; QIN Jun

    2009-01-01

    In remote sensing applications, accurate extraction of land type area after classification is very important. But for images of land use/cover change (LUCC) obtained from remote sensing data of a specific spatial resolution, it is of great significance to obtain land type area information with higher resolution by first making use of the spatial distribution characteristics of the land type itself and then scaling down within a given scale threshold on the basis of the existing spatial resolution data. An explicit expression of the relationship between the measurement scale, the global fractal dimension and the land type area corresponding to different measurement scales is obtained on the basis of the authors' histo-variogram, using the standardized area index (SAI). A good attempt has been made to obtain higher-resolution land type area information by merely using the spatial distribution characteristics of the land type in the image itself and scaling down within a given scale threshold on the basis of the existing spatial resolution data.

  10. The methodological cat

    Directory of Open Access Journals (Sweden)

    Marin Dinu

    2014-03-01

    Full Text Available Economics understands action as having the connotation of here and now, the proof being that it excessively uses, for explicative purposes, two limitations of sense: space is seen as the place with a private destination (through the cognitive dissonance of methodological individualism), and time is seen as the short term (through the dystopia of rational markets).

  11. Video: Modalities and Methodologies

    Science.gov (United States)

    Hadfield, Mark; Haw, Kaye

    2012-01-01

    In this article, we set out to explore what we describe as the use of video in various modalities. For us, modality is a synthesizing construct that draws together and differentiates between the notion of "video" both as a method and as a methodology. It encompasses the use of the term video as both product and process, and as a data collection…

  12. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  13. Methodology, Meditation, and Mindfulness

    Directory of Open Access Journals (Sweden)

    Balveer Singh Sikh

    2016-04-01

    Full Text Available Understanding the nondualistic nature of mindfulness is a complex and challenging task, particularly when most clinical psychology draws from Western methodologies and methods. In this article, we argue that the integration of philosophical hermeneutics with Eastern philosophy and practices may provide a methodology and methods to research mindfulness practice. Mindfulness hermeneutics brings together the nondualistically aligned Western philosophies of Heidegger and Gadamer and selected Eastern philosophies and practices in an effort to bridge the gap between these differing worldviews. Based on the following: (1) fusion of horizons, (2) being in a hermeneutic circle, (3) understanding as intrinsic to awareness, and (4) the ongoing practice of meditation, a mindfulness hermeneutic approach was used to illuminate deeper understandings of mindfulness practice in ways that are congruent with its underpinning philosophies.

  14. METHODOLOGICAL BASES OF OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Lanskaya D. V.

    2014-09-01

    Full Text Available Outsourcing is investigated, within the institutional theory, from the position of finding steady and unique competitive advantages for a public corporation by attracting carriers of the unique intellectual and social capital of specialized companies. Key researchers and events in the history of outsourcing are marked out, and the existing approaches to defining the concept of outsourcing, along with the advantages and risks of applying outsourcing technology, are considered. It is established that the differences between outsourcing, subcontracting and cooperation lie not in the nature of the functional relations but in the depth of the economic terms and phenomena considered. The methodology of outsourcing is considered as a part of the methodology of cooperation of enterprise innovative structures in the emerging knowledge-economy sector.

  15. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  16. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation, to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, core tenets described through a wide range of settings.

  17. Tobacco documents research methodology.

    Science.gov (United States)

    Anderson, Stacey J; McCandless, Phyra M; Klausner, Kim; Taketa, Rachel; Yerger, Valerie B

    2011-05-01

    Tobacco documents research has developed into a thriving academic enterprise since its inception in 1995. The technology supporting tobacco documents archiving, searching and retrieval has improved greatly since that time, and consequently tobacco documents researchers have considerably more access to resources than was the case when researchers had to travel to physical archives and/or electronically search poorly and incompletely indexed documents. The authors of the papers presented in this supplement all followed the same basic research methodology. Rather than leave the reader of the supplement to read the same discussion of methods in each individual paper, presented here is an overview of the methods all authors followed. In the individual articles that follow in this supplement, the authors present the additional methodological information specific to their topics. This brief discussion also highlights technological capabilities in the Legacy Tobacco Documents Library and updates methods for organising internal tobacco documents data and findings.

  18. Land evaluation methodology

    OpenAIRE

    Lustig, Thomas

    1998-01-01

    This paper reviews non-computerised and computerised land evaluation methods or methodologies, and recognises the difficulty of incorporating biophysical and socioeconomic factors from different levels. Therefore, this paper theorises an alternative land evaluation approach, which is tested and elaborated in an agricultural community in the north of Chile. The basis of the approach relies on holistic thinking and attempts to evaluate the potential for improving assumed unsustainable goat manage...

  19. Pipeline ADC Design Methodology

    OpenAIRE

    Zhao, Hui

    2012-01-01

    Demand for high-performance analog-to-digital converter (ADC) integrated circuits (ICs) with optimal combined specifications of resolution, sampling rate and power consumption becomes dominant due to emerging applications in wireless communications, broadband transceivers, digital-intermediate frequency (IF) receivers and countless digital devices. This research is dedicated to developing a pipeline ADC design methodology with minimum power dissipation, while keeping relatively high speed an...

  20. Albert Einstein's Methodology

    OpenAIRE

    Weinstein, Galina

    2012-01-01

    This paper discusses Einstein's methodology. 1. Einstein characterized his work as a theory of principle and reasoned that beyond kinematics, the 1905 heuristic relativity principle could offer new connections between non-kinematical concepts. 2. Einstein's creativity and inventiveness and process of thinking; invention or discovery. 3. Einstein considered his best friend Michele Besso as a sounding board and his class-mate from the Polytechnic Marcel Grossman, as his active partner. Yet, Ein...

  1. Pipeline ADC Design Methodology

    OpenAIRE

    Zhao, Hui

    2012-01-01

    Demand for high-performance analog-to-digital converter (ADC) integrated circuits (ICs) with optimal combined specifications of resolution, sampling rate and power consumption becomes dominant due to emerging applications in wireless communications, broadband transceivers, digital-intermediate frequency (IF) receivers and countless digital devices. This research is dedicated to developing a pipeline ADC design methodology with minimum power dissipation, while keeping relatively high speed an...

  2. Differing antidepressant maintenance methodologies.

    Science.gov (United States)

    Safer, Daniel J

    2017-10-01

    The principal evidence that antidepressant medication (ADM) is an effective maintenance treatment for adults with major depressive disorder (MDD) comes from placebo substitution trials. These trials enter responders from ADM efficacy trials into randomized, double-blind placebo-controlled (RDBPC) effectiveness trials to measure the rate of MDD relapse over time. However, other randomized maintenance trial methodologies merit consideration and comparison. A systematic review of ADM randomized maintenance trials included research reports from multiple databases. Relapse rate was the main effectiveness outcome assessed. Five ADM randomized maintenance methodologies for MDD responders are described and compared for outcome. These effectiveness trials include: placebo substitution, ADM/placebo extension, ADM extension, ADM vs. psychotherapy, and treatment as usual. The placebo substitution trials for those abruptly switched to placebo resulted in unusually high (46%) rates of relapse over 6-12 months, twice the continuing-ADM rate. These trials were characterized by selective screening, high attrition, anxious anticipation of a switch to placebo, and a risk of drug withdrawal symptoms. Selectively screened ADM efficacy responders who entered 4-12 month extension trials experienced relapse rates averaging ~10% with a low attrition rate. Non-industry-sponsored randomized trials of adults with multiple prior MDD episodes who were treated with ADM maintenance for 1-2 years experienced relapse rates averaging 40%. Placebo substitution trial methodology represents only one approach to assessing ADM maintenance. Antidepressant maintenance research for adults with MDD should be evaluated for industry sponsorship, attrition, the impact of the switch to placebo, and major relapse differences in MDD subpopulations. Copyright © 2017. Published by Elsevier Inc.

  3. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
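As a concrete instance of the kind of calculation such a book covers, the classical normal-approximation sample size for estimating a mean with known standard deviation can be sketched as follows (a generic textbook formula, not an excerpt from this book):

```python
import math

def sample_size_for_mean(sigma, margin, z=1.96):
    """Smallest n such that a z-based confidence interval for a mean with
    known standard deviation sigma has half-width <= margin:
    n >= (z * sigma / margin) ** 2."""
    return math.ceil((z * sigma / margin) ** 2)

# Example: sigma = 10, and we want the 95% CI half-width to be at most 2.
print(sample_size_for_mean(10, 2))  # (1.96 * 10 / 2)^2 = 96.04 -> 97
```

Tightening the margin or raising the confidence level (a larger z) both grow n quadratically, which is why margin choices dominate survey cost.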

  4. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Editi

  5. Internalism as Methodology

    Directory of Open Access Journals (Sweden)

    Terje Lohndal

    2009-10-01

    Full Text Available This paper scrutinizes the recent proposal made by Lassiter (2008) that the dichotomy between Chomskyan internalism and Dummett-type externalism is misguided and should be overcome by an approach that incorporates sociolinguistic concepts such as speakers’ dispositions to defer. We argue that Lassiter’s arguments are flawed and based on a serious misunderstanding of the internalist approach to the study of natural language, failing to appreciate its methodological nature, and we conclude that Lassiter’s socio-linguistic approach is just another instance of externalist attempts with little hope of scientific achievement.

  6. The New Methodology

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In the past few years there's been a rapidly growing interest in “lightweight” methodologies. Alternatively characterized as an antidote to bureaucracy or a license to hack, they've stirred up interest all over the software landscape. In this essay I explore the reasons for lightweight methods, focusing not so much on their weight but on their adaptive nature and their people-first orientation. I also give a summary and references to the processes in this school and consider the factors that should influence your choice of whether to go down this newly trodden path.

  7. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    Bachelor thesis "Scrum methodology in banking environment" is focused on one of agile methodologies called Scrum and description of the methodology used in banking environment. Its main goal is to introduce the Scrum methodology and outline a real project placed in a bank focused on software development through a case study, address problems of the project, propose solutions of the addressed problems and identify anomalies of Scrum in software development constrained by the banking environmen...

  8. Use of rule-based systems and case-based systems for worm gear design

    Directory of Open Access Journals (Sweden)

    Jorge Laureano Moya‐Rodríguez

    2012-01-01

    Full Text Available Nowadays, Artificial Intelligence techniques are applied successfully to different engineering problems, especially Knowledge-Based Systems. Among the latter, the most common are Frame-Based Systems, Rule-Based Systems, Case-Based Systems and Hybrid Systems. Case-Based Systems (CBS) analyze problems already solved in an application domain and, by means of a process of adaptation, find the solution to a new problem. These systems can be used successfully for the design of gears, particularly for designing worm gears; nevertheless, this constitutes an as-yet unexplored field of application of Artificial Intelligence. In this paper, the use of Rule-Based Systems and Case-Based Systems for worm gear design is compared, and the results of applying a rule-based system to the design of a particular worm gear drive are presented. Keywords: worm gear, gears, case-based systems, rule-based systems, artificial intelligence.
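The case-based approach this record describes (retrieve a solved case, then adapt it) can be sketched in a few lines. All case data, attribute names and the adaptation rule below are hypothetical illustrations, not values from the paper:

```python
# Hypothetical case base of solved worm gear designs (illustrative values,
# not data from the paper): requirements -> chosen gear parameters.
CASES = [
    {"ratio": 20, "power": 1.5, "solution": {"starts": 2, "teeth": 40, "module": 4}},
    {"ratio": 40, "power": 3.0, "solution": {"starts": 1, "teeth": 40, "module": 5}},
    {"ratio": 30, "power": 2.0, "solution": {"starts": 1, "teeth": 30, "module": 4}},
]

def retrieve(query, cases):
    """Retrieve the nearest solved case by a normalized distance over
    the requirement attributes (transmission ratio, input power)."""
    def dist(case):
        return (abs(case["ratio"] - query["ratio"]) / 40.0
                + abs(case["power"] - query["power"]) / 3.0)
    return min(cases, key=dist)

def adapt(case, query):
    """Toy adaptation rule: keep the retrieved worm starts and module,
    rescale the wheel teeth so teeth / starts matches the required ratio."""
    solution = dict(case["solution"])
    solution["teeth"] = round(query["ratio"] * solution["starts"])
    return solution

query = {"ratio": 25, "power": 1.8}
print(adapt(retrieve(query, CASES), query))
```

A rule-based system would instead encode the design procedure directly as if-then rules; the CBS route trades that explicit knowledge for a library of precedents plus an adaptation step.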

  9. A New Algorithm of Mining Classification Rules Based on Decision Table

    Institute of Scientific and Technical Information of China (English)

    谢娟英; 冯德民

    2003-01-01

    The mining of classification rules is an important field in Data Mining. The decision table of rough sets theory is an efficient tool for mining classification rules. The elementary concepts corresponding to the decision table of Rough Sets Theory are introduced in this paper. A new algorithm for mining classification rules based on the Decision Table is presented, along with a discernibility function for the reduction of attribute values, and a new principle for the accuracy of rules. An example of its application to a car classification problem is included, and the accuracy of the rules discovered is analyzed. The potential fields for its application in data mining are also discussed.
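A minimal sketch of mining classification rules from a decision table is shown below. It is not the paper's algorithm: it simply finds, for each row, the shortest condition-attribute subsets whose value combination determines the decision with accuracy 1.0 (certain rules, in rough-set terms), using a hypothetical car table:

```python
from itertools import combinations

# Toy decision table: condition attributes and a decision attribute.
# (Hypothetical car data, not taken from the paper.)
TABLE = [
    {"price": "high", "mileage": "low",  "size": "full",    "decision": "good"},
    {"price": "low",  "mileage": "high", "size": "full",    "decision": "good"},
    {"price": "high", "mileage": "high", "size": "compact", "decision": "poor"},
    {"price": "low",  "mileage": "low",  "size": "compact", "decision": "poor"},
    {"price": "high", "mileage": "low",  "size": "compact", "decision": "good"},
]
CONDITIONS = ["price", "mileage", "size"]

def rule_accuracy(table, cond, decision):
    """Accuracy = rows matching the condition with this decision,
    divided by all rows matching the condition."""
    matching = [r for r in table if all(r[a] == v for a, v in cond.items())]
    agreeing = [r for r in matching if r["decision"] == decision]
    return len(agreeing) / len(matching)

def mine_rules(table):
    """For each row, emit the shortest condition subsets that determine
    the decision with accuracy 1.0 (certain rules)."""
    rules = []
    for row in table:
        for k in range(1, len(CONDITIONS) + 1):
            found = False
            for attrs in combinations(CONDITIONS, k):
                cond = {a: row[a] for a in attrs}
                if rule_accuracy(table, cond, row["decision"]) == 1.0:
                    rule = (tuple(sorted(cond.items())), row["decision"])
                    if rule not in rules:
                        rules.append(rule)
                    found = True
            if found:
                break  # minimal length reached for this row
    return rules

for cond, dec in mine_rules(TABLE):
    print(" AND ".join(f"{a}={v}" for a, v in cond), "=>", dec)
```

The attribute-value reduction step in the paper serves the same purpose as the minimal-subset search here: dropping condition values that are not needed to keep a rule certain.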

  10. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Full Text Available Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is placed on developing rapid data-collection techniques, the difference between consumers and trained panels is becoming blurred, and the role of sensory methodologists is to prepare evaluation procedures by which a lay panel (consumers) can achieve results identical to those of a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanisms are easily understandable and whose results are easily interpretable. This paper deals with a mapping of marginal methods used in sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  11. Albert Einstein's Methodology

    CERN Document Server

    Weinstein, Galina

    2012-01-01

    This paper discusses Einstein's methodology. 1. Einstein characterized his work as a theory of principle and reasoned that beyond kinematics, the 1905 heuristic relativity principle could offer new connections between non-kinematical concepts. 2. Einstein's creativity and inventiveness and process of thinking; invention or discovery. 3. Einstein considered his best friend Michele Besso as a sounding board and his class-mate from the Polytechnic Marcel Grossman, as his active partner. Yet, Einstein wrote to Arnold Sommerfeld that Grossman will never claim to be considered a co-discoverer of the Einstein-Grossmann theory. He only helped in guiding Einstein through the mathematical literature, but contributed nothing of substance to the results of the theory. Hence, Einstein neither considered Besso or Grossmann as co-discoverers of the relativity theory which he invented.

  12. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  13. Rule-based automatic annotation of common auxiliary word usage in modern Chinese

    Institute of Scientific and Technical Information of China (English)

    韩英杰; 昝红英; 张坤丽; 柴玉梅

    2011-01-01

    The existing results of research on auxiliary words can hardly be used directly in automatic annotation for natural language processing. Based on an auxiliary-word knowledge base that integrates a modern Chinese dictionary, a rule base and a corpus, a rule-based method was used for the automatic annotation of common auxiliary word usage in modern Chinese. Experiments comparing the rules before and after optimization show that refining, extending and adjusting the matching order of the rules effectively improves the precision and recall of usage annotation. This also helps to improve the quality of large-scale Chinese corpora, deepen the processing depth, and reduce the manual annotation workload.

  14. Achieving process intensification form the application of a phenomena based synthesis, Design and intensification methodology

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Lutze, Philip; Woodley, John

    A systematic, efficient and flexible PI methodology covering a wider range of applications is able to find truly innovative and predictive solutions, not only using knowledge of the existing methods at the unit-operations level but also operating at a lower level of aggregation (that is, the phenomena level) … of tasks in the most efficient manner. In step 4, the involved phenomena are aggregated and/or connected using a set of connectivity rules based on the operating windows of each phenomenon. Based on this, a large number of flowsheet options are generated, which are subsequently screened for feasibility by applying logical and structural constraints. In step 5, the remaining flowsheet options are fast-screened by constraints for feasibility and for performance using a set of PI performance metrics. The most promising phenomena-based options are transformed into a unit-operation-based flowsheet using a set…

  15. Model-based interpretation of the ECG: a methodology for temporal and spatial reasoning.

    Science.gov (United States)

    Tong, D A; Widman, L E

    1993-06-01

    A new software architecture for automatic interpretation of the electrocardiographic rhythm is presented. Using the hypothesize-and-test paradigm, a semiquantitative physiological model and production rule-based knowledge are combined to reason about time- and space-varying characteristics of complex heart rhythms. A prototype system implementing the methodology accepts a semiquantitative description of the onset and morphology of the P waves and QRS complexes that are observed in the body-surface electrocardiogram. A beat-by-beat explanation of the origin and consequences of each wave is produced. The output is in the standard cardiology laddergram format. The current prototype generates the full differential diagnosis of narrow-complex tachycardia and correctly diagnoses complex rhythms, such as atrioventricular (AV) nodal reentrant tachycardia with either hidden or visible P waves and varying degrees of AV block.

  16. Engineering radioecology: Methodological considerations

    Energy Technology Data Exchange (ETDEWEB)

    Nechaev, A.F.; Projaev, V.V. [St. Petersburg State Inst. of Technology (Russian Federation); Sobolev, I.A.; Dmitriev, S.A. [United Ecologo-Technological and Research Center on Radioactive Waste Management and Environmental Remediation, Moscow (Russian Federation)

    1995-12-31

    The term "radioecology" has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise/generally acknowledged structure, unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to convert the problem of environmental restoration from the scientific sphere in particularly practical terms. Already the first steps clearly showed an imperfection of existing technologies, managerial and regulatory schemes; lack of qualified specialists, relevant methods and techniques; uncertainties in methodology of decision-making, etc. Thus, building up (or maybe, structuring) of special scientific and technological basis, which the authors call "engineering radioecology", seems to be an important task. In this paper they endeavored to substantiate the last thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology.

  17. Cancer cytogenetics: methodology revisited.

    Science.gov (United States)

    Wan, Thomas S K

    2014-11-01

    The Philadelphia chromosome was the first genetic abnormality discovered in cancer (in 1960), and it was found to be consistently associated with CML. The description of the Philadelphia chromosome ushered in a new era in the field of cancer cytogenetics. Accumulating genetic data have been shown to be intimately associated with the diagnosis and prognosis of neoplasms; thus, karyotyping is now considered a mandatory investigation for all newly diagnosed leukemias. The development of FISH in the 1980s overcame many of the drawbacks of assessing the genetic alterations in cancer cells by karyotyping. Karyotyping of cancer cells remains the gold standard since it provides a global analysis of the abnormalities in the entire genome of a single cell. However, subsequent methodological advances in molecular cytogenetics based on the principle of FISH that were initiated in the early 1990s have greatly enhanced the efficiency and accuracy of karyotype analysis by marrying conventional cytogenetics with molecular technologies. In this review, the development, current utilization, and technical pitfalls of both the conventional and molecular cytogenetics approaches used for cancer diagnosis over the past five decades will be discussed.

  18. Scientific methodology applied.

    Science.gov (United States)

    Lussier, A

    1975-04-01

    The subject of this symposium is naproxen, a new drug that resulted from an investigation to find a superior anti-inflammatory agent. It was synthesized by Harrison et al. in 1970 at the Syntex Institute of Organic Chemistry and Biological Sciences. How can we chart the evolution of this or any other drug? Three steps are necessary: first, chemical studies (synthesis, analysis); second, animal pharmacology; third, human pharmacology. The last step can additionally be divided into four phases: metabolism and toxicology of the drug in normal volunteers; dose titration and initial clinical trials with sick subjects (pharmacometry); confirmatory clinical trials when the drug is accepted on the market and revaluation (familiarization trials). To discover the truth about naproxen, we must all participate actively with a critical mind, following the principles of scientific methodology. We shall find that the papers to be presented today all deal with the third step in the evaluation process--clinical pharmacology. It is quite evident that the final and most decisive test must be aimed at the most valuable target: the human being. The end product of this day's work for each of us should be the formation of an opinion based on solid scientific proofs. And let us hope that we will all enjoy fulfilling the symposium in its entire etymological meaning this evening. In vino veritas.

  19. Glycaemic index methodology.

    Science.gov (United States)

    Brouns, F; Bjorck, I; Frayn, K N; Gibbs, A L; Lang, V; Slama, G; Wolever, T M S

    2005-06-01

    The glycaemic index (GI) concept was originally introduced to classify different sources of carbohydrate (CHO)-rich foods, usually having an energy content of >80% from CHO, according to their effect on post-meal glycaemia. It was assumed to apply to foods that primarily deliver available CHO, causing hyperglycaemia. Low-GI foods were classified as being digested and absorbed slowly and high-GI foods as being rapidly digested and absorbed, resulting in different glycaemic responses. Low-GI foods were found to induce benefits on certain risk factors for CVD and diabetes. Accordingly it has been proposed that GI classification of foods and drinks could be useful to help consumers make 'healthy food choices' within specific food groups. Classification of foods according to their impact on blood glucose responses requires a standardised way of measuring such responses. The present review discusses the most relevant methodological considerations and highlights specific recommendations regarding number of subjects, sex, subject status, inclusion and exclusion criteria, pre-test conditions, CHO test dose, blood sampling procedures, sampling times, test randomisation and calculation of the glycaemic response area under the curve. Altogether, these technical recommendations will help to implement or reinforce measurement of GI in laboratories and help to ensure the quality of results. Since there is current international interest in alternative ways of expressing glycaemic responses to foods, some of these methods are discussed.
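The glycaemic response area under the curve mentioned above is conventionally the incremental AUC above the fasting baseline, with area below baseline ignored; GI is then 100 times the ratio of the test food's AUC to the reference food's AUC. A sketch with made-up glucose readings (not data from the review):

```python
def incremental_auc(times, glucose):
    """Trapezoidal incremental area under the curve above the fasting
    baseline; area below baseline is ignored, per the usual GI convention."""
    baseline = glucose[0]
    auc = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        h0 = max(glucose[i - 1] - baseline, 0.0)
        h1 = max(glucose[i] - baseline, 0.0)
        auc += 0.5 * (h0 + h1) * dt  # trapezoid above baseline
    return auc

def glycaemic_index(test_auc, reference_auc):
    """GI of a food = 100 * (test food AUC / reference food AUC);
    a real protocol averages this ratio over subjects."""
    return 100.0 * test_auc / reference_auc

times = [0, 15, 30, 45, 60, 90, 120]                 # minutes
glucose_ref = [5.0, 7.5, 8.5, 7.8, 7.0, 6.0, 5.2]    # mmol/L, glucose drink
glucose_test = [5.0, 6.2, 7.0, 6.8, 6.3, 5.6, 5.1]   # mmol/L, test food

gi = glycaemic_index(incremental_auc(times, glucose_test),
                     incremental_auc(times, glucose_ref))
print(round(gi))
```

The sampling times used here mirror the common 0-120 min schedule; the review's recommendations concern exactly such protocol details (dose, sampling times, subject numbers).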

  20. Qualitative Methodology in Unfamiliar Cultures

    DEFF Research Database (Denmark)

    Svensson, Christian Franklin

    2014-01-01

    Based on a qualitative methodology, conscious reflection on research design and objectivity is important when doing fieldwork. This case study discusses such reflections. Emphasis throughout is given to applied qualitative methodology and its contributions to the social sciences, in particular having to do with relational, emotional, and ethical issues associated with interviewing and personal observation. Although the empirical setting of this case is Southeast Asia, the various discussions and the interrelatedness of methodology, theory, and empirical reflections will prove applicable to field studies throughout…

  1. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of the main characteristics of these paradigms and methodologies, such as the settings where each may be used for best results. Due to their popularity, power-flow-based MW-mile and short-run marginal cost pricing methodologies will be covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies for pricing transmission services in Brazil. (author) 25 refs., 2 tabs.
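The MW-mile idea mentioned above can be sketched in a few lines: a transaction is charged in proportion to the MW flow it causes on each line, times the line length and a per-MW-mile cost. The line data below are made up for illustration:

```python
# Hypothetical line data: length, embedded cost per MW-mile, and the MW
# flow a given transaction causes on each line (e.g. from a power flow).
LINES = [
    {"name": "A-B", "miles": 100, "cost_per_mw_mile": 1.0, "flow_mw": 30},
    {"name": "B-C", "miles": 50,  "cost_per_mw_mile": 1.2, "flow_mw": 10},
]

def mw_mile_charge(lines):
    """Charge = sum over lines of flow (MW) x length (miles) x unit cost."""
    return sum(l["flow_mw"] * l["miles"] * l["cost_per_mw_mile"]
               for l in lines)

print(mw_mile_charge(LINES))  # 30*100*1.0 + 10*50*1.2
```

Short-run marginal cost pricing, by contrast, charges the difference in system marginal costs between injection and withdrawal points rather than allocating embedded line costs.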

  2. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  3. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered by future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. Antarctic glaciers and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods: plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any) in studied samples and to standardize the protocols used, to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind the scenario that living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompanied the fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble, by their morphological properties, those found in natural Earth habitats. Regarding the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  4. Workshops as a Research Methodology

    Science.gov (United States)

    Ørngreen, Rikke; Levinsen, Karin

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…

  5. A methodology for social experimentation

    DEFF Research Database (Denmark)

    Ravn, Ib

    A methodology is outlined whereby one may improve the performance of a social system to the satisfaction of its stakeholders, that is, facilitate desirable social and organizational transformations...

  6. Methodology of Law and Economics

    NARCIS (Netherlands)

    A.M. Pacces (Alessio Maria); L.T. Visscher (Louis)

    2011-01-01

    Introduction A chapter on the methodology of law and economics, i.e. the economic analysis of law, concerns the methodology of economics. The above quote (Becker 1976, 5) shows that economics should not be defined by its subject, but by its method (also Veljanovski 2007, 19). This method

  7. Methodological Pluralism and Narrative Inquiry

    Science.gov (United States)

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  8. Choosing a Methodology: Philosophical Underpinning

    Science.gov (United States)

    Jackson, Elizabeth

    2013-01-01

    As a university lecturer, I find that a frequent question raised by Masters students concerns the methodology chosen for research and the rationale required in dissertations. This paper unpicks some of the philosophical coherence that can inform choices to be made regarding methodology and a well-thought out rationale that can add to the rigour of…

  9. Building ASIPs: the Mescal methodology

    CERN Document Server

    Gries, Matthias

    2006-01-01

    A number of system designers use ASIPs rather than ASICs to implement their system solutions. This book gives a comprehensive methodology for the design of these application-specific instruction processors (ASIPs). It includes demonstrations of applications of the methodologies using the Tipi research framework.

  10. The integration of geophysical and enhanced Moderate Resolution Imaging Spectroradiometer Normalized Difference Vegetation Index data into a rule-based, piecewise regression-tree model to estimate cheatgrass beginning of spring growth

    Science.gov (United States)

    Boyte, Stephen P.; Wylie, Bruce K.; Major, Donald J.; Brown, Jesslyn F.

    2015-01-01

    Cheatgrass exhibits spatial and temporal phenological variability across the Great Basin as described by ecological models formed using remote sensing and other spatial data-sets. We developed a rule-based, piecewise regression-tree model trained on 99 points that used three data-sets – latitude, elevation, and start of season time based on remote sensing input data – to estimate cheatgrass beginning of spring growth (BOSG) in the northern Great Basin. The model was then applied to map the location and timing of cheatgrass spring growth for the entire area. The model was strong (R2 = 0.85) and predicted an average cheatgrass BOSG across the study area of 29 March–4 April. Of early cheatgrass BOSG areas, 65% occurred at elevations below 1452 m. The highest proportion of cheatgrass BOSG occurred between mid-April and late May. Predicted cheatgrass BOSG in this study matched well with previous Great Basin cheatgrass green-up studies.
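The entry above describes a rule-based, piecewise regression-tree model: rules partition the input space (latitude, elevation, start-of-season time) and each partition gets its own regression. A minimal sketch of that structure, with invented thresholds and coefficients (only the 1452 m elevation figure echoes the abstract), might look like:

```python
# Hypothetical rule-based, piecewise regression: each rule covers a region of
# the input space and applies its own linear model. All coefficients are
# invented for illustration; only the 1452 m split comes from the abstract.
def predict_bosg(latitude, elevation, start_of_season):
    """Return an illustrative beginning-of-spring-growth day of year."""
    if elevation < 1452:
        # Rule 1: low elevations, where most early green-up occurred
        return 0.6 * start_of_season + 0.02 * elevation
    # Rule 2: higher elevations green up later and depend more on latitude
    return 0.5 * start_of_season + 0.04 * elevation - 0.5 * (latitude - 40.0)

print(predict_bosg(41.0, 1300.0, 95.0))   # low-elevation rule fires
print(predict_bosg(41.0, 1800.0, 110.0))  # high-elevation rule fires
```

The published model was trained on 99 field points with remote-sensing inputs; the hand-written rules here only illustrate the piecewise form.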

  11. IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Bazjanac, Vladimir

    2008-07-01

    Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).

  12. Scoping studies: advancing the methodology

    Directory of Open Access Journals (Sweden)

    O'Brien Kelly K

    2010-09-01

    Full Text Available Abstract Background Scoping studies are an increasingly popular approach to reviewing health research evidence. In 2005, Arksey and O'Malley published the first methodological framework for conducting scoping studies. While this framework provides an excellent foundation for scoping study methodology, further clarifying and enhancing this framework will help support the consistency with which authors undertake and report scoping studies and may encourage researchers and clinicians to engage in this process. Discussion We build upon our experiences conducting three scoping studies using the Arksey and O'Malley methodology to propose recommendations that clarify and enhance each stage of the framework. Recommendations include: clarifying and linking the purpose and research question (stage one); balancing feasibility with breadth and comprehensiveness of the scoping process (stage two); using an iterative team approach to selecting studies (stage three) and extracting data (stage four); incorporating a numerical summary and qualitative thematic analysis, reporting results, and considering the implications of study findings to policy, practice, or research (stage five); and incorporating consultation with stakeholders as a required knowledge translation component of scoping study methodology (stage six). Lastly, we propose additional considerations for scoping study methodology in order to support the advancement, application and relevance of scoping studies in health research. Summary Specific recommendations to clarify and enhance this methodology are outlined for each stage of the Arksey and O'Malley framework. Continued debate and development about scoping study methodology will help to maximize the usefulness and rigor of scoping study findings within healthcare research and practice.

  13. Methodological practicalities in analytical generalization

    DEFF Research Database (Denmark)

    Halkier, Bente

    2011-01-01

    In this article, I argue that the existing literature on qualitative methodologies tends to discuss analytical generalization at a relatively abstract and general theoretical level. It is, however, not particularly straightforward to “translate” such abstract epistemological principles into more operative methodological strategies for producing analytical generalizations in research practices. Thus, the aim of the article is to contribute to the discussions among qualitatively working researchers about generalizing by way of exemplifying some of the methodological practicalities in analytical … and processes in producing the three different ways of generalizing: ideal typologizing, category zooming, and positioning.

  14. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of the efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
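The QFD-style weighting described above can be sketched as a simple weighted scoring table. Everything below (criteria weights, candidate names, scores) is invented for illustration, not taken from the study:

```python
# Toy QFD-matrix prioritization: candidates are scored against weighted
# criteria, and the weighted totals rank the replacement options.
criteria = {"environmental": 5, "cost": 3, "safety": 5, "reliability": 4, "programmatic": 2}

# Score of each hypothetical candidate on each criterion (1 = poor, 5 = excellent)
candidates = {
    "solvent A": {"environmental": 4, "cost": 2, "safety": 5, "reliability": 3, "programmatic": 4},
    "solvent B": {"environmental": 2, "cost": 5, "safety": 3, "reliability": 4, "programmatic": 3},
}

def weighted_score(scores):
    # Sum of (criterion weight x candidate score) over all criteria
    return sum(criteria[c] * s for c, s in scores.items())

ranked = sorted(candidates, key=lambda name: weighted_score(candidates[name]), reverse=True)
for name in ranked:
    print(name, weighted_score(candidates[name]))
```

A real QFD matrix also records the customer-need-to-engineering-term mapping; this sketch only shows the weighting and ranking step.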

  15. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post) positivistic quantitative paradigm. The increasing...

  16. Nanotoxicology materials, methodologies, and assessments

    CERN Document Server

    Durán, Nelson; Alves, Oswaldo L; Zucolotto, Valtencir

    2014-01-01

    This book begins with a detailed introduction to engineered nanostructures, followed by a section on methodologies used in research on cytotoxicity and genotoxicity, and concluding with evidence for the cyto- and genotoxicity of specific nanoparticles.

  17. No crisis but methodological separatism

    DEFF Research Database (Denmark)

    Erola, Jani; Reimer, David; Räsänen, Pekka;

    2015-01-01

    This article compares methodological trends in nationally and internationally oriented sociology using data from the articles of three Nordic sociological journals: one international (Acta Sociologica), one Finnish (Sosiologia), and one Danish (Dansk Sociologi). The data consists of 943 articles ...

  18. Methodological Reflections: Inter- ethnic Research

    DEFF Research Database (Denmark)

    Singla, Rashmi

    2010-01-01

    This article reflects on the methodological and epistemological aspects of the ethical issues involved in encounters between researcher and research participants with ethnic minority background in contexts with diversity. Specific challenges involved in longitudinal research (10 - 15 years) are a...

  19. Some notes on taxonomic methodology

    NARCIS (Netherlands)

    Hammen, van der L.

    1986-01-01

    The present paper constitutes an introduction to taxonomic methodology. After an analysis of taxonomic practice, and a brief survey of kinds of attributes, the paper deals with observation, description, comparison, arrangement and classification, hypothesis construction, deduction, model, experiment

  20. Methodology and Foreground of Metallomics

    Institute of Scientific and Technical Information of China (English)

    He Bin; Jiang Guibin

    2005-01-01

    Metallomics is proposed as a new omics to follow genomics, proteomics and metabolomics. This paper gives an overview of the development of metallomics based on the introduction of the concept of metallomics and its methodology.

  1. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post) positivistic quantitative paradigm. The increasing recognition of the sociocultural embeddedness of human development, and of the importance to study individuals’ subjective experience, however, calls for adequate methodological procedures that allow for the study of processes of transformation across the life span. The wide range of established procedures...

  2. MOTOR VEHICLE SAFETY RESEARCH METHODOLOGY

    Directory of Open Access Journals (Sweden)

    A. Stepanov

    2015-07-01

    Full Text Available The issues of vehicle safety are considered. The methodology of approach to analyzing and solving the problem of safety management of vehicles and overall traffic is offered. The distinctive features of organization and management of vehicle safety are shown. There has been drawn a conclusion that the methodological approach to solving traffic safety problems is reduced to selection and classification of safety needs.

  3. Methodology of International Law

    OpenAIRE

    Dominicé, Christian

    2014-01-01

    I. DEFINITION Methodology seeks to define the means of acquiring scientific knowledge. There is no generally accepted definition of the methodology of international law. In this article it will be taken to comprise both its wider meaning of the methods used in the acquisition of a scientific knowledge of the international legal system and its narrower and more specialized meaning, the methods used to determine the existence of norms or rules of international law. The correlation of these two ...

  4. Agile Methodology - Past and Future

    Science.gov (United States)

    2011-05-01

    “Agile Methodology – Past and Future”, Warren W. Tignor, SAIC. [Briefing slides; the record text consists of an OCR-garbled Report Documentation Page (Form Approved OMB No. 0704-0188) and slide fragments citing Takeuchi & Nonaka (HBR 1986, p. 139) on rugby vs. waterfall, the Agile Manifesto (2001), and a Scrum graphic adapted from Schwaber (2007).]

  5. 76 FR 71431 - Civil Penalty Calculation Methodology

    Science.gov (United States)

    2011-11-17

    ... TRANSPORTATION Federal Motor Carrier Safety Administration Civil Penalty Calculation Methodology AGENCY: Federal... its civil penalty methodology. Part of this evaluation includes a forthcoming explanation of the... methodology for calculation of certain civil penalties. To induce compliance with federal regulations,...

  6. Q methodology in health economics.

    Science.gov (United States)

    Baker, Rachel; Thompson, Carl; Mannion, Russell

    2006-01-01

    The recognition that health economists need to understand the meaning of data if they are to adequately understand research findings which challenge conventional economic theory has led to the growth of qualitative modes of enquiry in health economics. The use of qualitative methods of exploration and description alongside quantitative techniques gives rise to a number of epistemological, ontological and methodological challenges: difficulties in accounting for subjectivity in choices, the need for rigour and transparency in method, and problems of disciplinary acceptability to health economists. Q methodology is introduced as a means of overcoming some of these challenges. We argue that Q offers a means of exploring subjectivity, beliefs and values while retaining the transparency, rigour and mathematical underpinnings of quantitative techniques. The various stages of Q methodological enquiry are outlined alongside potential areas of application in health economics, before discussing the strengths and limitations of the approach. We conclude that Q methodology is a useful addition to economists' methodological armoury and one that merits further consideration and evaluation in the study of health services.
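At its quantitative core, Q methodology correlates participants' Q-sorts with one another and factor-analyses that by-person correlation matrix to reveal shared viewpoints. A minimal sketch with synthetic sorts (my own illustration, not data from any study):

```python
# Sketch of Q methodology's by-person factor analysis. Six hypothetical
# participants sort 20 statements; three share one viewpoint, three another.
import numpy as np

rng = np.random.default_rng(1)
viewpoint_a = rng.normal(size=20)
viewpoint_b = rng.normal(size=20)
sorts = np.array(
    [viewpoint_a + 0.3 * rng.normal(size=20) for _ in range(3)]
    + [viewpoint_b + 0.3 * rng.normal(size=20) for _ in range(3)]
)

corr = np.corrcoef(sorts)                  # person-by-person correlations
eigvals, eigvecs = np.linalg.eigh(corr)    # eigenvalues in ascending order
loadings = eigvecs[:, -1]                  # loadings on the dominant factor
print(np.round(corr, 2))
```

Participants who made similar sorts correlate highly and load together on a factor, which is then interpreted qualitatively, retaining the mathematical transparency the abstract emphasizes.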

  7. Methodology for ranking restoration options

    DEFF Research Database (Denmark)

    Jensen, Per Hedemann

    1999-01-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128), co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated...

  8. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  9. Reflections on Design Methodology Research

    DEFF Research Database (Denmark)

    2011-01-01

    We shall reflect on the results of Design Methodology research and their impact on design practice. In the past 50 years the number of researchers in the field has expanded enormously – as has the number of publications. During the same period design practice and its products have changed … and produced are also now far more complex and distributed, putting designers under ever increasing pressure. We shall address the question: Are the results of Design Methodology research appropriate and are they delivering the expected results in design practice? In our attempt to answer this question we shall draw on our extensive experience of design research and design teaching, and on the recent book The Future of Design Methodology, edited by Professor Herbert Birkhofer. We shall also refer to a model that links the Results, Practices, Methods, and Sciences of designing. Some initial conclusions...

  10. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies’ findings, we argue...

  11. Acoustic emission methodology and application

    CERN Document Server

    Nazarchuk, Zinoviy; Serhiyenko, Oleh

    2017-01-01

    This monograph analyses in detail the physical aspects of elastic wave radiation during deformation or fracture of materials. It presents the methodological bases for the practical use of acoustic emission devices, and describes the results of theoretical and experimental research on evaluating the crack growth resistance of materials and selecting the useful AE signals. The efficiency of this methodology is shown through the diagnostics of various-purpose industrial objects. The authors obtained their experimental results with the help of new methods and facilities.

  12. Methodological Guidelines for Advertising Research

    DEFF Research Database (Denmark)

    Rossiter, John R.; Percy, Larry

    2017-01-01

    In this article, highly experienced advertising academics and advertising research consultants John R. Rossiter and Larry Percy present and discuss what they believe to be the seven most important methodological guidelines that need to be implemented to improve the practice of advertising research. Their focus is on methodology, defined as first choosing a suitable theoretical framework to guide the research study and then identifying the advertising responses that need to be studied. Measurement of those responses is covered elsewhere in this special issue in the article by Bergkvist and Langner. Most...

  13. Methodological pluralism and narrative inquiry

    Science.gov (United States)

    Michie, Michael

    2013-09-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on `what meaning is being made' rather than `what is happening here' (quadrant 2 rather than quadrant 1). It is suggested that in using the integral theory model, a qualitative research project focuses primarily on one quadrant and is enhanced by approaches suggested in the other quadrants.

  14. New methodologies for patients rehabilitation.

    Science.gov (United States)

    Fardoun, H M; Mashat, A S; Lange, B

    2015-01-01

    The present editorial is part of the focus theme of Methods of Information in Medicine titled "New Methodologies for Patients Rehabilitation", with a specific focus on technologies and human factors related to the use of Information and Communication Technologies (ICT) for improving patient rehabilitation. The focus theme explores different dimensions of empowerment methodologies for disabled people in terms of rehabilitation and health care, and the extent to which ICT is a useful tool in this process. The focus theme lists a set of research papers that present different ways of using ICT to develop advanced systems that help disabled people in their rehabilitation process.

  15. Observational methodology in sport sciences

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2013-11-01

    Full Text Available This paper reviews the conceptual framework, the key literature and the methods (observation tools such as category systems and field formats, coding software, etc.) that should be followed when conducting research from the perspective of observational methodology. The observational designs used by the authors’ research group over the last twenty years are discussed, and the procedures for analysing data and assessing their quality are described. Mention is also made of the latest methodological trends in this field, such as the use of mixed methods.

  16. Rule-based machine translation for Aymara

    NARCIS (Netherlands)

    Coler, Matthew; Homola, Petr; Jones, Mari

    2014-01-01

    This paper presents the ongoing result of an approach developed by the collaboration of a computational linguist with a field linguist that addresses one of the oft-overlooked keys to language maintenance: the development of modern language-learning tools. Although machine translation isn’t commonly

  17. Lumpability Abstractions of Rule-based Systems

    Directory of Open Access Journals (Sweden)

    Jerome Feret

    2010-10-01

    Full Text Available The induction of a signaling pathway is characterized by transient complex formation and mutual posttranslational modification of proteins. To faithfully capture this combinatorial process in a mathematical model is an important challenge in systems biology. Exploiting the limited context on which most binding and modification events are conditioned, attempts have been made to reduce the combinatorial complexity by quotienting the reachable set of molecular species, into species aggregates while preserving the deterministic semantics of the thermodynamic limit. Recently we proposed a quotienting that also preserves the stochastic semantics and that is complete in the sense that the semantics of individual species can be recovered from the aggregate semantics. In this paper we prove that this quotienting yields a sufficient condition for weak lumpability and that it gives rise to a backward Markov bisimulation between the original and aggregated transition system. We illustrate the framework on a case study of the EGF/insulin receptor crosstalk.
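The lumpability idea in this abstract can be illustrated on an ordinary Markov chain. The sketch below is my own toy example and shows strong lumpability (a simpler condition than the weak lumpability the paper treats): an aggregation of states into blocks is lumpable when every state in a block has the same total transition probability into each other block, so the aggregated process is again Markov.

```python
# Strong lumpability check for a 4-state Markov chain aggregated into 2 blocks.
import numpy as np

P = np.array([
    [0.0, 0.5, 0.3, 0.2],
    [0.5, 0.0, 0.2, 0.3],   # states 0 and 1 form one block
    [0.1, 0.4, 0.0, 0.5],
    [0.4, 0.1, 0.5, 0.0],   # states 2 and 3 form another
])
blocks = [[0, 1], [2, 3]]

def is_lumpable(P, blocks):
    # For each source block, every member state must send the same total
    # probability into each target block.
    for block in blocks:
        for target in blocks:
            sums = [P[s, target].sum() for s in block]
            if not np.allclose(sums, sums[0]):
                return False
    return True

print(is_lumpable(P, blocks))
```

For this matrix every block-to-block sum is 0.5, so the 2-state aggregate chain faithfully reproduces the block-level dynamics.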

  18. Rule-based transformations for geometric modelling

    Directory of Open Access Journals (Sweden)

    Thomas Bellet

    2011-02-01

    Full Text Available The context of this paper is the use of formal methods for topology-based geometric modelling. Topology-based geometric modelling deals with objects of various dimensions and shapes. Usually, objects are defined by a graph-based topological data structure and by an embedding that associates each topological element (vertex, edge, face, etc. with relevant data as their geometric shape (position, curve, surface, etc. or application dedicated data (e.g. molecule concentration level in a biological context. We propose to define topology-based geometric objects as labelled graphs. The arc labelling defines the topological structure of the object whose topological consistency is then ensured by labelling constraints. Nodes have as many labels as there are different data kinds in the embedding. Labelling constraints ensure then that the embedding is consistent with the topological structure. Thus, topology-based geometric objects constitute a particular subclass of a category of labelled graphs in which nodes have multiple labels.
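A hedged sketch of the labelled-graph encoding described above, with invented data kinds: arcs carry topological labels (here, dimensions), nodes carry one label per embedding data kind, and a simple check stands in for the labelling constraints that ensure the embedding is consistent:

```python
# Toy labelled graph: node labels hold the embedding (geometry plus
# application data), arc labels hold the topological structure.
nodes = {
    "v1": {"position": (0.0, 0.0), "concentration": 0.8},
    "v2": {"position": (1.0, 0.0), "concentration": 0.3},
}
arcs = [("v1", "v2", 0)]   # arc labelled with dimension 0

DATA_KINDS = {"position", "concentration"}  # hypothetical embedding kinds

def labelling_consistent(nodes, arcs):
    # Every node must carry exactly one label per data kind...
    if any(set(labels) != DATA_KINDS for labels in nodes.values()):
        return False
    # ...and every arc must connect existing nodes with an integer dimension.
    return all(a in nodes and b in nodes and isinstance(d, int) for a, b, d in arcs)

print(labelling_consistent(nodes, arcs))
```

The paper formalizes this far more carefully (as a subclass of a category of labelled graphs); the sketch only conveys the two-layer labelling idea.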

  19. Will Rule based BPM obliterate Process Models?

    NARCIS (Netherlands)

    Joosten, S.; Joosten, H.J.M.

    2007-01-01

    Business rules can be used directly for controlling business processes, without reference to a business process model. In this paper we propose to use business rules to specify both business processes and the software that supports them. Business rules expressed in smart mathematical notations bring

  20. Rule-based transformations for geometric modelling

    CERN Document Server

    Bellet, Thomas; Gall, Pascale Le; 10.4204/EPTCS.48.5

    2011-01-01

    The context of this paper is the use of formal methods for topology-based geometric modelling. Topology-based geometric modelling deals with objects of various dimensions and shapes. Usually, objects are defined by a graph-based topological data structure and by an embedding that associates each topological element (vertex, edge, face, etc.) with relevant data as their geometric shape (position, curve, surface, etc.) or application dedicated data (e.g. molecule concentration level in a biological context). We propose to define topology-based geometric objects as labelled graphs. The arc labelling defines the topological structure of the object whose topological consistency is then ensured by labelling constraints. Nodes have as many labels as there are different data kinds in the embedding. Labelling constraints ensure then that the embedding is consistent with the topological structure. Thus, topology-based geometric objects constitute a particular subclass of a category of labelled graphs in which nodes hav...

  1. Autonomous Rule Based Robot Navigation In Orchards

    DEFF Research Database (Denmark)

    Andersen, Jens Christian; Ravn, Ole; Andersen, Nils Axel

    2010-01-01

    Orchard navigation using sensor-based localization and flexible mission management facilitates successful missions independent of the Global Positioning System (GPS). This is especially important while driving between tight tree rows where the GPS coverage is poor. This paper suggests localization …, obstacle avoidance, path planning and drive control. The system is tested successfully using a Hako 20 kW tractor during autonomous missions in both cherry and apple orchards with mission lengths of up to 2.3 km including the headland turns.

  2. Lumpability Abstractions of Rule-based Systems

    CERN Document Server

    Feret, Jerome; Koeppl, Heinz; Petrov, Tatjana; 10.4204/EPTCS.40.10

    2010-01-01

    The induction of a signaling pathway is characterized by transient complex formation and mutual posttranslational modification of proteins. To faithfully capture this combinatorial process in a mathematical model is an important challenge in systems biology. Exploiting the limited context on which most binding and modification events are conditioned, attempts have been made to reduce the combinatorial complexity by quotienting the reachable set of molecular species, into species aggregates while preserving the deterministic semantics of the thermodynamic limit. Recently we proposed a quotienting that also preserves the stochastic semantics and that is complete in the sense that the semantics of individual species can be recovered from the aggregate semantics. In this paper we prove that this quotienting yields a sufficient condition for weak lumpability and that it gives rise to a backward Markov bisimulation between the original and aggregated transition system. We illustrate the framework on a case study of...

  3. Methodology

    OpenAIRE

    Köppel, Johannes

    2011-01-01

    The main research question of this thesis is the following: why did Swiss banks and Swiss authorities obediently accept the dilution of banking privacy in the case of the SWIFT surveillance, when they are usually fierce advocates of banking secrecy? The author initially established three hypotheses: Hypothesis 1 assumes that Switzerland has not opposed the SWIFT program, either publicly or behind the scenes. This implies that Swiss banks and authorities have silently accepted the erosion of...

  4. Computational simulation methodologies for mechanobiological modelling: a cell-centred approach to neointima development in stents.

    Science.gov (United States)

    Boyle, C J; Lennon, A B; Early, M; Kelly, D J; Lally, C; Prendergast, P J

    2010-06-28

    The design of medical devices could be very much improved if robust tools were available for computational simulation of tissue response to the presence of the implant. Such tools require algorithms to simulate the response of tissues to mechanical and chemical stimuli. Available methodologies include those based on the principle of mechanical homeostasis, those which use continuum models to simulate biological constituents, and the cell-centred approach, which models cells as autonomous agents. In the latter approach, cell behaviour is governed by rules based on the state of the local environment around the cell; and informed by experiment. Tissue growth and differentiation requires simulating many of these cells together. In this paper, the methodology and applications of cell-centred techniques--with particular application to mechanobiology--are reviewed, and a cell-centred model of tissue formation in the lumen of an artery in response to the deployment of a stent is presented. The method is capable of capturing some of the most important aspects of restenosis, including nonlinear lesion growth with time. The approach taken in this paper provides a framework for simulating restenosis; the next step will be to couple it with more patient-specific geometries and quantitative parameter data.
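The cell-centred approach described above treats each cell as an autonomous agent whose behaviour follows rules based on its local environment. A toy lattice sketch of that idea (my own illustration, not the authors' restenosis model):

```python
# Toy cell-centred (agent-based) simulation: cells on a lattice divide into
# empty neighbouring sites with some probability, so the colony grows
# nonlinearly over time. All parameters are invented.
import random

random.seed(42)
N = 21
grid = [[0] * N for _ in range(N)]
grid[N // 2][N // 2] = 1          # seed a single cell

def step(grid, p_divide=0.3):
    new = [row[:] for row in grid]
    for i in range(N):
        for j in range(N):
            if grid[i][j] and random.random() < p_divide:
                # Rule: a dividing cell places its daughter in a free neighbour
                nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= i + di < N and 0 <= j + dj < N and not grid[i + di][j + dj]]
                if nbrs:
                    x, y = random.choice(nbrs)
                    new[x][y] = 1
    return new

for t in range(20):
    grid = step(grid)
print("cells after 20 steps:", sum(map(sum, grid)))
```

A restenosis model adds mechanics, chemotaxis and patient-specific geometry on top of this agent loop; the sketch only shows the rule-per-cell structure.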

  5. Rule-based programming and strategies for automated generation of detailed kinetic models for gas phase combustion of polycyclic hydrocarbon molecules

    Energy Technology Data Exchange (ETDEWEB)

    Ibanescu, L.

    2004-06-15

    The primary objective of this thesis is to explore the approach of using rule-based systems and strategies for a complex problem of chemical kinetics: the automated generation of reaction mechanisms. Chemical reactions are naturally expressed as conditional rewriting rules. The control of the chaining of chemical reactions is easy to describe using a strategies language, such as that of the ELAN system, developed in the Protheo team. The thesis presents the basic concepts of chemical kinetics and the chemical and computational problems related to the conception and validation of a reaction mechanism, and gives a general structure for the generator of reaction mechanisms called GasEI. Our research focuses on the primary mechanism generator. We give solutions for encoding the chemical species, the reactions and their chaining, and we present the prototype developed in ELAN. The representation of the chemical species uses the notion of molecular graphs, encoded by a term structure called GasEI terms. The chemical reactions are expressed by rewriting rules on molecular graphs, encoded by a set of conditional rewriting rules on GasEI terms. The strategies language of the ELAN system is used to express the chaining of reactions in the primary mechanism generator. This approach is illustrated by coding ten generic reactions of oxidizing pyrolysis. Qualitative chemical validations of the prototype show that, for acyclic molecules, our approach gives the same results as the existing mechanism generators, and for polycyclic molecules it produces original results.
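
As an illustration of the idea of reactions as conditional rewriting rules on molecular graphs, the following sketch applies one hypothetical rule ("break a C-H bond wherever one exists") to a toy adjacency-list representation of ethane. It mimics the concept only; the actual GasEI term structure and ELAN rule syntax are not reproduced here.

```python
# A species is a dict of atom -> bonded neighbours; the rule below
# (hypothetical, not from the GasEI system) breaks a C-H bond to give
# a radical site, mimicking an H-abstraction initiation step.

def h_abstraction(graph):
    """Apply the rule 'remove one H bonded to a C' at every position
    where the condition (a C-H bond exists) holds; return all products."""
    products = []
    for atom, neighbours in graph.items():
        if atom.startswith("C"):
            for nb in neighbours:
                if nb.startswith("H"):
                    new = {a: list(ns) for a, ns in graph.items()}
                    new[atom].remove(nb)   # break the C-H bond
                    del new[nb]            # the H atom leaves
                    products.append(new)
    return products

ethane = {"C1": ["C2", "H1", "H2", "H3"], "C2": ["C1", "H4", "H5", "H6"],
          "H1": ["C1"], "H2": ["C1"], "H3": ["C1"],
          "H4": ["C2"], "H5": ["C2"], "H6": ["C2"]}
radicals = h_abstraction(ethane)
print(len(radicals))  # 6 possible abstraction sites
```

A real generator would additionally canonicalize the product graphs to merge symmetric results and chain further rules under a strategy, which is what the ELAN strategies language expresses.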

  6. Research on Automatic Generation of the Rule Base in Hospital Global Event Management

    Institute of Scientific and Technical Information of China (English)

    李振叶; 孟瑞祺; 刘峰

    2013-01-01

    With the ongoing construction of the digital hospital, the operation and maintenance of the various IT systems, devices and medical software within a hospital are becoming increasingly complex. Based on analysis and research of event management in hospital IT systems, this paper puts forward the concept of "global event management" and the innovative technique of automatic generation of an event rule base, enabling global, intelligent analysis and mining of massive numbers of events.

  7. Visual Modeling of Combat Entity Behavior Model Rules Based on the Finite State Machine

    Institute of Scientific and Technical Information of China (English)

    孙鹏; 谭玉玺; 汤磊

    2015-01-01

    In order to improve simulation model development efficiency and reduce the maintenance cost of simulation models, starting from the requirements of formalized modeling of model rules, a visual expression model of entity behavior rules based on the finite state machine is put forward. A visual modeling tool framework for rules based on the finite state machine is then designed, and a theoretical exploration of the engineering realization of visual modeling of model rules is conducted.
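
The kind of finite-state-machine rule model described in this abstract can be sketched as a transition table mapping (state, event) pairs to successor states. The states and events below are invented for illustration; the paper's actual rule model is not reproduced here.

```python
# Entity behaviour rules as an explicit FSM transition table
# (hypothetical states and events for a combat entity).

TRANSITIONS = {
    ("patrol", "enemy_detected"): "engage",
    ("engage", "enemy_destroyed"): "patrol",
    ("engage", "low_ammo"): "retreat",
    ("retreat", "resupplied"): "patrol",
}

def run(events, state="patrol"):
    """Drive the FSM through a sequence of events, recording each state."""
    trace = [state]
    for ev in events:
        state = TRANSITIONS.get((state, ev), state)  # unknown events keep the state
        trace.append(state)
    return trace

print(run(["enemy_detected", "low_ammo", "resupplied"]))
# ['patrol', 'engage', 'retreat', 'patrol']
```

Because the table is plain data, a visual modeling tool can render it as a state diagram and regenerate it from edits, which is the point of formalizing behaviour rules this way.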

  8. Philosophy, Methodology and Action Research

    Science.gov (United States)

    Carr, Wilfred

    2006-01-01

    The aim of this paper is to examine the role of methodology in action research. It begins by showing how, as a form of inquiry concerned with the development of practice, action research is nothing other than a modern 20th century manifestation of the pre-modern tradition of practical philosophy. It then draws in Gadamer's powerful vindication of…

  9. Unattended Monitoring System Design Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-07-08

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy-based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations.

  10. A Probabilistic Ontology Development Methodology

    Science.gov (United States)

    2014-06-01

  11. Sustainable Innovation and Entrepreneurship Methodology

    DEFF Research Database (Denmark)

    Celik, Sine; Joore, Peter; Christodoulou, Panayiotis

    or regional “co-creation platform for sustainable solutions” to promote structural innovation. In this manual, the Sustainable Innovation and Entrepreneurship Methodology will be described. The organisational guidelines mainly take point of departure in how Aalborg University (AAU) in Denmark has organised...

  12. Test reactor risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

    A methodology has been developed for the identification of accident initiating events and the fault modeling of systems, including common mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of risks to a loss of primary coolant flow in the Engineering Test Reactor.

  13. Safety at Work : Research Methodology

    NARCIS (Netherlands)

    Beurden, van K. (Karin); Boer, de J. (Johannes); Brinks, G. (Ger); Goering-Zaburnenko, T. (Tatiana); Houten, van Y. (Ynze); Teeuw, W. (Wouter)

    2012-01-01

    In this document, we provide the methodological background for the Safety at Work project. This document combines several project deliverables as defined in the overall project plan: validation techniques and methods (D5.1.1), performance indicators for safety at work (D5.1.2), personal protection equipment

  15. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article deals with the investigation of the theoretical and methodological principles of situational analysis, and the necessity of situational analysis under modern conditions is proved. The notion of "situational analysis" is defined: we have concluded that situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and search functions is proved. The basic methodological elements of situational analysis are grounded. The substantiation of these principal methodological elements will enable the analyst to develop adaptive methods able to take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system, to diagnose such a situation and subject it to systematic, in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  16. The Library Space Utilization Methodology.

    Science.gov (United States)

    Hall, Richard B.

    1978-01-01

    Describes the Library Space Utilization (LSU) methodology, which demonstrates that significant information about the functional requirements of a library can be measured and displayed in a quantitative and graphic form. It measures "spatial" relationships between selected functional divisions; it also determines how many people--staff and…

  17. Feminist Methodologies and Engineering Education Research

    Science.gov (United States)

    Beddoes, Kacey

    2013-01-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory.…

  18. Agile Software Methodologies: Strength and Weakness

    OpenAIRE

    Dr. Adel Hamdan Mohammad; Dr. Tariq Alwada’n; Dr. Jafar "M.Ali" Ababneh

    2013-01-01

    Agile methodologies are great software development methodologies. No doubt these methodologies have a widespread reputation. The core of agile methodologies is people: the customer and each team member in agile development teams are the key success or failure factor in the agile process. In this paper the authors demonstrate strength and weakness points in agile methodologies. The authors also demonstrate how strength and weakness factors can affect the overall results of the agile development process.

  19. Multicriteria methodology for decision aiding

    CERN Document Server

    Roy, Bernard

    1996-01-01

    This is the first comprehensive book to present, in English, the multicriteria methodology for decision aiding. In the foreword, the distinctive features and main ideas of the European School of MCDA are outlined. The twelve chapters are essentially expository in nature, but scholarly in treatment. Some questions which are too often neglected in the literature on decision theory, such as how a decision is made, who the actors are, what a decision aiding model is, and how to define the set of alternatives, are discussed. Examples are used throughout the book to illustrate the various concepts. Ways to model the consequences of each alternative and to build criteria taking into account the inevitable imprecisions, uncertainties and indeterminations are described and illustrated. The three classical operational approaches of MCDA (synthesis in one criterion, including MAUT; synthesis by outranking relations; interactive local judgements) are studied. This methodology tries to be a theoretical or intellectual framework dire...

  20. Methodology of Pilot Performance Measurements

    Directory of Open Access Journals (Sweden)

    Peter Kalavsky

    2017-04-01

    Full Text Available The article is devoted to the development of a methodology for measuring pilot performance under real flight conditions. It provides basic information on a research project carried out to obtain new information regarding the training and education of pilots. The introduction focuses on the analytical part of the project and its outputs in terms of the current state of the art. A detailed view is given of the issue of measuring pilot performance under the specific conditions of the cockpit or the flight simulator. The article then focuses on two selected and developed methods of measuring pilot performance in terms of the defined indicators evaluated, the conditions of compliance for conducting the research, and the procedures of the pilot performance measurement methodology.

  1. Design methodology of Dutch banknotes

    Science.gov (United States)

    de Heij, Hans A. M.

    2000-04-01

    Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; and (iii) a program of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.

  2. TRACG best estimate methodology applications

    Energy Technology Data Exchange (ETDEWEB)

    Hoang, H.

    2014-07-01

    The TRACG model simulates a multi-dimensional vessel and contains a flexible modular structure with control system capability. TRACG has undergone benchmarking qualifications with extensive testing and actual plant data. This best estimate methodology has been used widely in BWR safety analyses as well as in the qualification of the GEH advanced BWR designs. The application of TRACG methodology for loss of coolant accident (LOCA) analyses will provide realistic fuel bundle thermal responses. By taking advantage of these additional margins, the utility owner can justify the optimization of plant-specific emergency core cooling system performance requirements and justify certain equipment declared inoperable. The resulting benefits are improved plant capacity and reliability and improved equipment reliability and lifetime. (Author)

  3. The Silver Lining in Methodology

    Directory of Open Access Journals (Sweden)

    Jose Miguel Castillo

    2016-06-01

    Full Text Available The way in which strategic planning is designed differs from one organization to another. For that reason, no standard procedure can be given for developing strategic planning. However, the scenario analysis method is used in any field or organization. We could define a scenario as a set of variables or events that describes a future situation. Additionally, the continuous irruption of new technologies invites us to revise old methodologies and procedures with the intention of starting an innovation process to make them more efficient. The challenge presented in this article consists of the use of agent technology within a new methodological approach to envision possible future scenarios more quickly and more accurately than with the classical methods currently in use.

  4. ISE System Development Methodology Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  5. Systems engineering agile design methodologies

    CERN Document Server

    Crowder, James A

    2013-01-01

    This book examines the paradigm of the engineering design process. The authors discuss agile systems and engineering design. The book captures the entire design process (function bases), context, and requirements to effect real reuse. It provides a methodology for an engineering design process foundation for modern and future systems design. This book captures design patterns with context for actual Systems Engineering Design Reuse and contains a new paradigm in Design Knowledge Management.

  6. Some methodological issues in biosurveillance.

    Science.gov (United States)

    Fricker, Ronald D

    2011-02-28

    This paper briefly summarizes a short course I gave at the 12th Biennial Centers for Disease Control and Prevention (CDC) and Agency for Toxic Substances and Disease Registry (ATSDR) Symposium held in Decatur, Georgia on April 6, 2009. The goal of this short course was to discuss various methodological issues of biosurveillance detection algorithms, with a focus on the issues related to developing, evaluating, and implementing such algorithms.

  7. Qualitative Research: Methods and Methodology

    OpenAIRE

    Gabb, Jacqui

    2016-01-01

    This entry provides an overview of qualitative LGBTQ research. It begins by mapping out the qualities and character of studies that use this approach with particular attention to psycho-social research. It then highlights how reflexivity, the iterative process of self-identity making, has informed qualitative research, influencing both understandings of sexualities and also the underlying methodologies and research methods used. Finally, it considers how “the everyday” and a practices approac...

  8. [Methods and methodology of pathology].

    Science.gov (United States)

    Lushnikov, E F

    2016-01-01

    The lecture presents the state of the art of the methodology of human pathology, an area of scientific and practical activity in which specialists produce and systematize objective knowledge of pathology and use that knowledge in clinical medicine. It considers the objects and subjects of investigation, the materials and methods of the pathologist, and the results of his or her work.

  9. Methodological remarks on contraction theory

    DEFF Research Database (Denmark)

    Jouffroy, Jerome; Slotine, Jean-Jacques E.

    Because contraction analysis stems from a differential and incremental framework, the nature and methodology of contraction-based proofs are significantly different from those of their Lyapunov-based counterparts. This paper specifically studies this issue, and illustrates it by revisiting some classical examples traditionally addressed using Lyapunov theory. Even in these cases, contraction tools can often yield significantly simplified analysis. The examples include adaptive control, robotics, and a proof of convergence of the deterministic Extended Kalman Filter.

  10. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    OpenAIRE

    Tetyana KOVALCHUK

    2016-01-01

    The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis under modern conditions is proved. The notion of "situational analysis" is defined. We have concluded that situational analysis is a continuous, systematic study whose purpose is to identify dangerous situation signs, to evaluate comprehensively such signs as influenced by a system of objective and subjective factors, and to search for motivated targeted action...

  11. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies’ findings, we argue that workshops provide a platform that can aid researchers in identifying and exploring relevant factors in a given domain by providing means for understanding complex work and knowledge processes that are supported by technology (for example, e-learning). The approach supports identifying factors...

  12. Energy Efficiency Indicators Methodology Booklet

    Energy Technology Data Exchange (ETDEWEB)

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review and methodology guiding principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers to assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on levels of hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated end-use-level metrics -- is presented to help shape the understanding of assessing energy efficiency. In each sector of activity (industry, commercial, residential, agriculture and transport), indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The methodology booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.

  13. Methodology for ranking restoration options

    Energy Technology Data Exchange (ETDEWEB)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128) co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to the optimisation of the protection of the populations being exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included and weighting factors for the different attributes have been determined by the use of scaling constants. (au)

  14. Methodology for astronaut reconditioning research.

    Science.gov (United States)

    Beard, David J; Cook, Jonathan A

    2017-01-01

    Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identification of what aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to the terrestrial equivalent on small populations, such as rare diseases and various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure feasibility of research protocols. There is room for creative and hybrid methodology but careful systematic observation is likely to be more achievable and fruitful than complex trial based comparisons. Multi-space agency collaboration will be critical to pool data from small groups of astronauts with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherent small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised.

  15. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  16. Rule-Based Host Intrusion Defense System Research and Implementation

    Institute of Scientific and Technical Information of China (English)

    黄成荣

    2012-01-01

    Host-based active defense is a new type of single-host virus defense technology: by monitoring process behaviour and, once "illegal" behaviour is found, notifying the user or directly terminating the process, it can defend against unknown viruses. The rule set is the key point and the difficulty of a host intrusion prevention system. This paper studies the rule set of a host intrusion prevention system in depth, covering the basic rule structure, rule definitions, rule priorities, and software restriction policies. A rule-based host intrusion prevention system is then designed and implemented; experiments show that the system provides relatively flexible active defense.
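
A priority-ordered rule set of the sort described can be sketched as a list of (priority, condition, action) triples evaluated from highest priority down, where the first matching rule decides the verdict. All rule contents below are hypothetical examples, not the paper's actual rule definitions.

```python
# Hypothetical host-defense rules: each is (priority, condition, action).
# Higher-priority rules are checked first; a catch-all default ends the list.

RULES = [
    (100, lambda e: e["path"].startswith(r"C:\Windows\System32"), "allow"),
    (50,  lambda e: e["op"] == "write" and e["target"] == "registry_run_key", "block"),
    (10,  lambda e: e["op"] == "exec" and not e["signed"], "ask_user"),
    (0,   lambda e: True, "allow"),  # default rule
]

def decide(event):
    """Return the action of the highest-priority rule matching the event."""
    for _, cond, action in sorted(RULES, key=lambda r: -r[0]):
        if cond(event):
            return action
    return "allow"

evt = {"path": r"C:\Temp\x.exe", "op": "write",
       "target": "registry_run_key", "signed": False}
print(decide(evt))  # "block"
```

Keeping priority explicit in the rule data, rather than relying on list order, is what lets such systems merge user-defined rules with built-in ones predictably.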

  17. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users are able to learn and model with it easily, and it enables better communication between engineers engaged in the same field. However, conventional methodologies find it difficult to cope with dynamic behavior (e.g. operation mode changes or sequence-dependent failure) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities are emerging for improved PSA by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take the technological advantages aforementioned is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, it has attracted less interest from an industrial and regulatory viewpoint. The authors expect this survey can contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, DDET seems to be a backbone for most of the methodologies, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering
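
The discrete dynamic event tree (DDET) idea identified above as the backbone of most dynamic PSA methods can be sketched as a recursive expansion: at fixed branch times the scenario splits on component outcomes, each path carrying its accumulated probability. The branch points and probabilities below are invented for illustration.

```python
# Minimal DDET-style expansion: branchings maps a time step to a dict of
# {outcome: probability}; every leaf records its path and probability.

def expand(time, prob, state, branchings, horizon, leaves):
    if time == horizon:
        leaves.append((prob, state))
        return
    for outcome, p in branchings.get(time, {None: 1.0}).items():
        new_state = state + ([outcome] if outcome else [])
        expand(time + 1, prob * p, new_state, branchings, horizon, leaves)

branchings = {1: {"pump_fails": 0.1, "pump_runs": 0.9},
              2: {"valve_sticks": 0.05, "valve_opens": 0.95}}
leaves = []
expand(0, 1.0, [], branchings, horizon=3, leaves=leaves)
print(len(leaves), round(sum(p for p, _ in leaves), 6))  # 4 1.0
```

In a real DDET tool the branch outcomes would also change the plant-state variables fed to a deterministic simulator, which is how time- and condition-dependent behavior enters the tree.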

  18. Methodologies for tracking learning paths

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth; Gilje, Øystein; Lindstrand, Fredrik

    2009-01-01

    filmmakers: what furthers their interest and/or hinders it, and what learning patterns emerge. The aim of this article is to present and discuss issues regarding the methodology and methods of the study, such as developing a relationship with interviewees when conducting interviews online (using MSN). We...... suggest two considerations about using online interviews: how the interviewees value the given subject of conversation and their familiarity with being online. Communicating online with the young filmmakers offers ease, because it is both practical and appropriates a meeting...... platform that is familiar to our participants....

  19. Methodological problems in Rorschach research

    Directory of Open Access Journals (Sweden)

    Đurić-Jočić Dragana

    2007-01-01

    Full Text Available The Comprehensive System of Rorschach interpretation is considered a nomothetic system that makes it possible to use the projective method in research projects. However, research use of the Rorschach method requires, besides appropriate knowledge of administration procedures and interpretation rules, knowledge of specific methodological issues. The Rorschach indicators are not independent; they are part of a specific network, so in some research it is necessary to control basic variables in order not to obtain artifacts. This applies especially to research in which groups are compared, as well as to normative studies in which Rorschach indicators are compared cross-culturally.

  20. Sustainable Innovation and Entrepreneurship Methodology

    DEFF Research Database (Denmark)

    Celik, Sine; Joore, Peter; Christodoulou, Panayiotis;

    The objective of the InnoLabs project is to facilitate cross-sectoral, multidisciplinary solutions to complex social problems in various European settings. InnoLabs are university-driven physical and/or organizational spaces that function as student innovation laboratories and operate as a local...... or regional “co-creation platform for sustainable solutions” to promote structural innovation. In this manual, the Sustainable Innovation and Entrepreneurship Methodology will be described. The organisational guidelines mainly take point of departure in how Aalborg University (AAU) in Denmark has organised...

  1. Imaginative methodologies in the social sciences

    DEFF Research Database (Denmark)

    Imaginative Methodologies develops, expands and challenges conventional social scientific methodology and language by way of literary, poetic and other alternative sources of inspiration. Sociologists, social workers, anthropologists, criminologists and psychologists all try to rethink, provoke...

  2. Feminist methodologies and engineering education research

    Science.gov (United States)

    Beddoes, Kacey

    2013-03-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory. The paper begins with a literature review that covers a broad range of topics featured in the literature on feminist methodologies. Next, data from interviews with engineering educators and researchers who have engaged with feminist methodologies are presented. The ways in which feminist methodologies shape their research topics, questions, frameworks of analysis, methods, practices and reporting are each discussed. The challenges and barriers they have faced are then discussed. Finally, the benefits of further and broader engagement with feminist methodologies within the engineering education community are identified.

  3. Practical implications of rapid development methodologies

    CSIR Research Space (South Africa)

    Gerber, A

    2007-11-01

    Full Text Available Rapid development methodologies are popular approaches for the development of modern software systems. The goals of these methodologies are the inclusion of the client into the analysis, design and implementation activities, as well...

  4. Psychology and Critical Methodologies for Educational Research

    OpenAIRE

    Carrasco-Aguilar, Claudia Lorena; Baltar-de Andrade, María Julia

    2015-01-01

    Introduction: This thought piece analyzes the role of psychology in the methodology of socio-critical research in education. Methodology: Articles published in 2010 and 2007 are analyzed, linking psychology to education; a theoretical analysis is also made of the main academic exponents of the concepts of emancipation, resistance and sensitization, seeking to establish research methodologies based on critical theory. Result: A proposal is put forward based on critical research methodologies i...

  5. Information technology security system engineering methodology

    Science.gov (United States)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  6. Audit Methodology for IT Governance

    Directory of Open Access Journals (Sweden)

    Mirela GHEORGHE

    2010-01-01

    Full Text Available The continuous development of new IT technologies has been followed by their rapid integration at the organization level. The management of organizations faces a new challenge: structural redefinition of the IT component in order to create added value and to minimize IT risks through efficient management of all IT resources of the organization. These changes have had a great impact on the governance of the IT component. The paper proposes an audit methodology for IT Governance at the organization level. From this point of view, the developed audit strategy is risk-based, enabling the IT auditor to study the efficiency and effectiveness of the IT Governance structure from the best angle. The evaluation of the risks associated with IT Governance is a key process in planning the audit mission, which allows the identification of the segments with increased risk. With no ambition for completeness, the proposed methodology provides the auditor a useful tool in the accomplishment of his mission.

  7. Methodological issues in grounded theory.

    Science.gov (United States)

    Cutcliffe, J R

    2000-06-01

    Examination of the qualitative methodological literature shows that there appear to be conflicting opinions and unresolved issues regarding the nature and process of grounded theory. Researchers proposing to utilize this method would therefore be wise to consider these conflicting opinions. This paper therefore identifies and attempts to address four key issues, namely, sampling, creativity and reflexivity, the use of literature, and precision within grounded theory. The following recommendations are made. When utilizing a grounded method researchers need to consider their research question, clarify what level of theory is likely to be induced from their study, and then decide when they intend to access and introduce the second body of literature. They should acknowledge that in the early stages of data collection, some purposeful sampling appears to occur. In their search for conceptually dense theory, grounded theory researchers may wish to free themselves from the constraints that limit their use of creativity and tacit knowledge. Furthermore, the interests of researchers might be served by attention to issues of precision including, avoiding method slurring, ensuring theoretical coding occurs, and using predominantly one method of grounded theory while explaining and describing any deviation away from this chosen method. Such mindfulness and the resulting methodological rigour is likely to increase the overall quality of the inquiry and enhance the credibility of the findings.

  8. 42 CFR 441.472 - Budget methodology.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets the...

  9. 24 CFR 904.205 - Training methodology.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Training methodology. 904.205... Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the training methodology. Because groups vary, there should be adaptability in the communication and...

  10. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    Full Text Available The paper presents the potentials for the development of software using agile methodologies. Special consideration is devoted to the potentials and advantages of use of the Scrum methodology in the development of software and the relationship between the implementation of agile methodologies and the software development projects.

  11. Mining association rules based on a cultured immune clone algorithm

    Institute of Scientific and Technical Information of China (English)

    杨光军

    2013-01-01

    For the problem of association rule mining, a method based on a cultured immune clone algorithm is proposed. The method embeds an immune clone algorithm into the framework of a cultural algorithm, using a two-layer evolutionary mechanism: the intelligent search ability of the immune clone algorithm and the commonly accepted beliefs formed in the belief space of the cultural algorithm guide the rule mining. The situational knowledge and historical knowledge of the cultural algorithm are redefined, and a new mutation operator is designed that adaptively adjusts the mutation scale to improve the global search ability of the immune clone algorithm. Experiments show that the new algorithm is superior to the immune clone algorithm in running speed and in the accuracy of the mined association rules.
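Whatever the search metaheuristic, an association-rule miner ultimately scores candidate rules by support and confidence. A minimal sketch of those two measures, on a hypothetical transaction set (not data from the paper):

```python
# Support and confidence: the fitness ingredients any association-rule miner
# must evaluate; the market-basket transactions below are hypothetical.

transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def confidence(antecedent, consequent):
    """P(consequent | antecedent), estimated from the transactions."""
    return support(antecedent | consequent) / support(antecedent)

# Rule {milk} -> {bread}
print(support({"milk", "bread"}))          # support of the whole rule
print(confidence({"milk"}, {"bread"}))     # conditional strength of the rule
```

An evolutionary miner like the one described above would encode candidate rules as individuals and use measures such as these in its fitness function.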

  12. Methodology for Benchmarking IPsec Gateways

    Directory of Open Access Journals (Sweden)

    Adam Tisovský

    2012-08-01

    Full Text Available The paper analyses the forwarding performance of an IPsec gateway over a range of offered loads. It focuses on the forwarding rate and packet loss, particularly at the gateway’s performance peak and in the state of gateway overload. It explains the possible performance degradation when the gateway is overloaded by excessive offered load. The paper further evaluates different approaches for obtaining forwarding performance parameters: the widely used throughput described in RFC 1242, the maximum forwarding rate with zero packet loss, and our proposed equilibrium throughput. According to our observations, equilibrium throughput may be the most universal parameter for benchmarking security gateways, as the others can depend on the duration of test trials. Employing equilibrium throughput would also greatly shorten the time required for benchmarking. Lastly, the paper presents a methodology and a hybrid step/binary search algorithm for obtaining the value of equilibrium throughput.
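A hybrid step/binary search of the kind described can be sketched as follows: a coarse step phase brackets the offered load at which the forwarding rate first falls below the load, then a binary search refines the point. The gateway model here is a toy stand-in for real test trials, with a hypothetical capacity and overload behavior.

```python
# Hedged sketch of a hybrid step/binary search for equilibrium throughput.
# forwarding_rate() is a hypothetical gateway model, not a measurement.

CAPACITY = 400.0  # Mbit/s, hypothetical

def forwarding_rate(offered):
    """Toy gateway: forwards everything up to capacity, degrades when overloaded."""
    if offered <= CAPACITY:
        return offered
    return CAPACITY * (CAPACITY / offered)  # throughput collapses under overload

def equilibrium_throughput(step=100.0, tol=1e-6):
    # Phase 1: coarse step search until the forwarding rate falls below the load.
    lo, hi = 0.0, step
    while forwarding_rate(hi) >= hi - tol:
        lo, hi = hi, hi + step
    # Phase 2: binary search inside the bracketing interval [lo, hi].
    for _ in range(50):
        mid = (lo + hi) / 2
        if forwarding_rate(mid) >= mid - tol:
            lo = mid
        else:
            hi = mid
    return lo

print(round(equilibrium_throughput(), 1))
```

In a real benchmark, each call to `forwarding_rate` would be a test trial at that offered load, so halving the search interval instead of stepping linearly is what shortens the benchmarking time.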

  13. Messy Methodological Musings: Engaging in

    Directory of Open Access Journals (Sweden)

    ALANA FERGUSON

    2009-10-01

    Full Text Available Participatory research projects incorporating non-traditional, creative, and qualitative methodologies can produce results which are unexpected or divergent from original research proposals. These results are highly meaningful, yet challenging to express to an audience when the expectation is to write the findings in a linear and traditional format, such as in a graduate thesis. Within this article, we use an autoethnographic approach to describe our experiences with ethnodrama, from our perspectives as a graduate student and supervisor. We focus on a planned breast cancer ethnodrama pilot project, which developed into a healing yoga program instead. We question the traditional notion of successful research as being a linear, straightforward process. In doing so, we hope to create dialogue and support mentorship which acknowledges the "messiness" of research projects. We also assert that there is a need to embrace non-traditional methods for disseminating our "messy" research outcomes.

  14. Methodology for flammable gas evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Hopkins, J.D., Westinghouse Hanford

    1996-06-12

    There are 177 radioactive waste storage tanks at the Hanford Site. The waste generates flammable gases. The waste releases gas continuously, but in some tanks the waste has shown a tendency to trap these flammable gases. When enough gas is trapped in a tank's waste matrix, it may be released in a way that renders part or all of the tank atmosphere flammable for a period of time. Tanks must be evaluated against previously defined criteria to determine whether they can present a flammable gas hazard. This document presents the methodology for evaluating tanks in two areas of concern in the tank headspace: the steady-state flammable-gas concentration resulting from continuous release, and the concentration resulting from an episodic gas release.
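For the continuous-release case, a steady-state headspace concentration follows from a simple balance of gas generation against ventilation removal in a well-mixed volume. The numbers below are hypothetical illustrations, not Hanford tank data or the document's actual criteria.

```python
# Illustrative well-mixed steady-state balance for a continuously released
# flammable gas; all rates and the LFL value are hypothetical.

generation_rate = 0.02   # m^3/h of flammable gas released by the waste
ventilation_rate = 10.0  # m^3/h of headspace ventilation flow

# At steady state, release is balanced by removal: C_ss = G / Q
c_ss = generation_rate / ventilation_rate
lfl_fraction = c_ss / 0.04  # vs. a hypothetical 4 vol% lower flammability limit

print(f"steady-state concentration: {c_ss:.1%} of headspace")
print(f"fraction of LFL: {lfl_fraction:.2f}")
```

A screening criterion of the kind the document describes would then compare `lfl_fraction` against a defined threshold; the episodic-release case requires a separate analysis of the trapped gas volume.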

  15. Lean methodology in health care.

    Science.gov (United States)

    Kimsey, Diane B

    2010-07-01

    Lean production is a process management philosophy that examines organizational processes from a customer perspective with the goal of limiting the use of resources to those processes that create value for the end customer. Lean manufacturing emphasizes increasing efficiency, decreasing waste, and using methods to decide what matters rather than accepting preexisting practices. A rapid improvement team at Lehigh Valley Health Network, Allentown, Pennsylvania, implemented a plan, do, check, act cycle to determine problems in the central sterile processing department, test solutions, and document improved processes. By using A3 thinking, a consensus building process that graphically depicts the current state, the target state, and the gaps between the two, the team worked to improve efficiency and safety, and to decrease costs. Use of this methodology has increased teamwork, created user-friendly work areas and processes, changed management styles and expectations, increased staff empowerment and involvement, and streamlined the supply chain within the perioperative area.

  16. Methodologies for 2011 economic reports

    DEFF Research Database (Denmark)

    Nielsen, Rasmus

    STECF’s Expert Working Group 11-03 convened in Athens (28th March – 1st April, 2011) to discuss and seek agreement on the content, indicators, methodologies and format of the 2011 Annual Economic Reports (AER) on the EU fishing fleet, the fish processing and the aquaculture sectors. Proposals...... for improved contents and the overall structure were discussed. Templates for the national and EU overview chapters for the EU the fish processing and the aquaculture sectors were produced. Indicators for the EU fishing fleet and fish processing reports were reviewed; new indicators for the fish processing...... and the aquaculture sector reports were proposed. And topics of special interest were proposed for all three reports....

  17. Indirect Lightning Safety Assessment Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Ong, M M; Perkins, M P; Brown, C G; Crull, E W; Streit, R D

    2009-04-24

    Lightning is a safety hazard for high explosives (HE) and their detonators. The methodology for estimating the risk from indirect lightning effects will be presented. It has two parts: a method to determine the likelihood of a detonation given a lightning strike, and an approach for estimating the likelihood of a strike. The results of these two parts produce an overall probability of a detonation. The probability calculations are complex for five reasons: (1) lightning strikes are stochastic and relatively rare, (2) the quality of the Faraday cage varies from one facility to the next, (3) RF coupling is inherently a complex subject, (4) performance data for abnormally stressed detonators is scarce, and (5) the arc plasma physics is not well understood. Therefore, a rigorous mathematical analysis would be too complex. Instead, our methodology takes a more practical approach, combining rigorous mathematical calculations where possible with empirical data when necessary. Where there is uncertainty, we compensate with conservative approximations. The goal is to determine a conservative estimate of the odds of a detonation. In Section 2, the methodology will be explained. This report will discuss topics at a high level. The reasons for selecting an approach will be justified. For those interested in technical details, references will be provided. In Section 3, a simple hypothetical example will be given to reinforce the concepts. While the methodology will touch on all the items shown in Figure 1, the focus of this report is the indirect effect, i.e., determining the odds of a detonation from given EM fields. Professor Martin Uman from the University of Florida has been characterizing and defining extreme lightning strikes. Using Professor Uman's research, Dr. Kimball Merewether at Sandia National Laboratory in Albuquerque calculated the EM fields inside a Faraday-cage type

  18. Simulation Enabled Safeguards Assessment Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-09-01

    It is expected that nuclear energy will be a significant component of future energy supplies. New facilities, operating under a strengthened international nonproliferation regime, will be needed. There is good reason to believe that virtual engineering, applied to facility design as well as to safeguards system design, will reduce total project cost and improve efficiency in the design cycle. The Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag-and-drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.

  19. Neural Networks Methodology and Applications

    CERN Document Server

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically-oriented...

  20. Clustering Methodologies for Software Engineering

    Directory of Open Access Journals (Sweden)

    Mark Shtern

    2012-01-01

    Full Text Available The size and complexity of industrial strength software systems are constantly increasing. This means that the task of managing a large software project is becoming even more challenging, especially in light of high turnover of experienced personnel. Software clustering approaches can help with the task of understanding large, complex software systems by automatically decomposing them into smaller, easier-to-manage subsystems. The main objective of this paper is to identify important research directions in the area of software clustering that require further attention in order to develop more effective and efficient clustering methodologies for software engineering. To that end, we first present the state of the art in software clustering research. We discuss the clustering methods that have received the most attention from the research community and outline their strengths and weaknesses. Our paper describes each phase of a clustering algorithm separately. We also present the most important approaches for evaluating the effectiveness of software clustering.

  1. Nuclear weapon reliability evaluation methodology

    Energy Technology Data Exchange (ETDEWEB)

    Wright, D.L. [Sandia National Labs., Albuquerque, NM (United States)

    1993-06-01

    This document provides an overview of those activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of opportunities for review that occur throughout the entire process, which assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  2. Online Intelligent Controllers for an Enzyme Recovery Plant: Design Methodology and Performance

    Directory of Open Access Journals (Sweden)

    M. S. Leite

    2010-01-01

    Full Text Available This paper focuses on the development of intelligent controllers for use in a process of enzyme recovery from pineapple rind. The proteolytic enzyme bromelain (EC 3.4.22.4 is precipitated with alcohol at low temperature in a fed-batch jacketed tank. Temperature control is crucial to avoid irreversible protein denaturation. Fuzzy or neural controllers offer a way of implementing solutions that cover dynamic and nonlinear processes. The design methodology and a comparative study on the performance of fuzzy-PI, neurofuzzy, and neural network intelligent controllers are presented. To tune the fuzzy PI Mamdani controller, various universes of discourse, rule bases, and membership function support sets were tested. A neurofuzzy inference system (ANFIS, based on Takagi-Sugeno rules, and a model predictive controller, based on neural modeling, were developed and tested as well. Using a Fieldbus network architecture, a coolant variable speed pump was driven by the controllers. The experimental results show the effectiveness of fuzzy controllers in comparison to the neural predictive control. The fuzzy PI controller exhibited a reduced error parameter (ITAE, lower power consumption, and better recovery of enzyme activity.
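The ITAE criterion used above to rank the controllers weights each error by the time at which it occurs, so slow settling is penalized more heavily than an early transient. A minimal sketch with synthetic error trajectories (the decay rates are illustrative, not the paper's experimental data):

```python
# Sketch of the ITAE (integral of time-weighted absolute error) criterion;
# the two temperature-error trajectories below are synthetic examples.

def itae(errors, dt):
    """Discrete ITAE: sum over samples of t * |e(t)| * dt, with t = i * dt."""
    return sum(i * dt * abs(e) * dt for i, e in enumerate(errors))

dt = 0.1  # sampling interval in arbitrary time units
fast_controller = [1.0 * (0.7 ** i) for i in range(100)]  # settles quickly
slow_controller = [1.0 * (0.9 ** i) for i in range(100)]  # settles slowly

print(itae(fast_controller, dt) < itae(slow_controller, dt))
```

A lower ITAE indicates better tracking, which is the sense in which the fuzzy PI controller "exhibited a reduced error parameter" relative to the neural predictive controller.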

  3. RULE-BASED DATA-DRIVEN THEMATIC MAPPING TECHNIQUE AND ITS APPLICATION RESEARCH

    Institute of Scientific and Technical Information of China (English)

    曾洪云; 解智强; 王东峰

    2011-01-01

    Cartography automation is one of the important goals pursued by cartographers and one of the hot issues of current research; it is of great significance to geospatial information science. This paper takes the thematic cartography project for the rivers (pipelines) flowing into Dianchi Lake in Kunming as an example. Faced with the project's heavy map-editing workload and its requirements on mapping results, quality, and data, the authors use Representation technology and secondary development, based on the actual situation, to study rule-based, data-driven computer cartographic representation. This approach solves part of the mapping task that traditionally had to be completed through a large amount of manual editing. In particular, it achieves map drawing effects that traditionally could only be obtained at the cost of destroying the GIS attributes of the data. The results show that rule-driven computer cartographic representation can satisfy the different requirements that GIS and map production place on data, can complete map production quickly while achieving the effect of traditional cartography, and can save substantial manpower and material resources; it has broad promotion and engineering application value.

  4. Gas turbine fault diagnosis based on condition rules and the fault tree method

    Institute of Scientific and Technical Information of China (English)

    尚文; 王维民; 齐鹏逸; 崔津; 曾咏奎

    2013-01-01

    Aiming at the diagnosis of various gas turbine faults, a technique combining condition rules with the fault tree method was investigated for gas turbine fault diagnosis. On the basis of an established gas turbine fault tree, typical fault cases and maintenance experience were summarized and a condition-rule-based logical reasoning model was built. Using fault analysis principles based on signal processing, specific condition rules were added to the intermediate and bottom events of the fault tree, and physical and logical judgments were made to determine the diagnostic choice at each branch of the tree. Each step of the fault diagnosis analysis is thereby made explicit, and accurate fault causes and fault locations are obtained. For a rotor vibration fault of a gas turbine generating unit in an offshore oil production area, the root causes were analyzed rapidly and accurately with the condition-rule-based fault tree method on the basis of online monitoring. The results indicate that the method is easy for maintenance and technical staff to grasp and can be widely used in the field of reliability maintenance of gas turbine generating units.
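The idea of attaching condition rules to fault-tree branches can be sketched as a small rule chain for rotor vibration. The symptoms, thresholds, and fault labels below are hypothetical illustrations of the pattern, not the paper's actual rule base.

```python
# Hedged sketch: condition rules selecting a branch of a vibration fault tree.
# Amplitude threshold, frequency-ratio bands, and labels are all hypothetical.

def diagnose(vib_amplitude, dominant_freq_ratio):
    """dominant_freq_ratio = dominant frequency / rotating frequency (1x)."""
    if vib_amplitude < 50:  # micrometers; below alarm, no fault branch is entered
        return "normal"
    # Condition rules at intermediate events choose the branch to descend
    if abs(dominant_freq_ratio - 1.0) < 0.1:
        return "unbalance (1x dominant)"
    if abs(dominant_freq_ratio - 2.0) < 0.1:
        return "misalignment (2x dominant)"
    if dominant_freq_ratio < 0.5:
        return "oil whirl (sub-synchronous)"
    return "unclassified; escalate to second-level inspection"

print(diagnose(80, 1.02))
```

Each `if` plays the role of a condition rule attached to a fault-tree event: the signal-processing features decide which branch is physically plausible before the tree is descended further.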

  5. Decision support methodology to establish priorities on the inspection of structures

    Science.gov (United States)

    Cortes, V. Juliette; Sterlacchini, Simone; Bogaard, Thom; Frigerio, Simone; Schenato, Luca; Pasuto, Alessandro

    2014-05-01

    For hydro-meteorological hazards in mountain areas, the regular inspection of check dams and bridges is important due to the effect of their functional status on water-sediment processes. Moreover, the inspection of these structures is time-consuming for organizations due to their extensive number in many regions. However, trained citizen-volunteers can support civil protection and technical services in the frequency, timeliness and coverage of monitoring the functional status of hydraulic structures. Technicians should evaluate and validate these reports to derive an index for the status of the structure. Thus, preventive actions could be initiated, such as the cleaning of obstructions, or potential problems could be pre-screened for a second-level inspection. This study proposes a decision support methodology that technicians can use to assess an index for three parameters representing the functional status of the structure: a) condition of the structure at the opening of the stream flow, b) level of obstruction at the structure and c) level of erosion in the stream bank. The calculation of the index for each parameter is based upon fuzzy logic theory, to handle ranges in the precision of the reports and to convert the linguistic rating scales into numbers representing the structure's status. A weighting method and multi-criteria methods (Analytic Hierarchy Process, AHP, and TOPSIS) can be used by technicians to combine the different ratings according to the component elements of the structure and the completeness of the reports. Finally, technicians can set decision rules based on the worst rating and a threshold for the functional indexes. The methodology was implemented as a prototype web-based tool to be tested with technicians of the Civil Protection in the Fella basin, Northern Italy. Results at this stage comprise the design and implementation of the web-based tool with GIS interaction to evaluate available reports and to set priorities on the inspection of structures
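One way linguistic ratings from volunteer reports could be turned into a numeric functional-status index is with triangular fuzzy numbers, centroid defuzzification, and a weighted sum. The rating scale, weights, and decision threshold below are assumptions for illustration, not the study's calibrated values.

```python
# Hedged sketch: linguistic ratings -> fuzzy numbers -> weighted index -> rule.
# Scale values, weights, and the 0.5 threshold are hypothetical.

# Each linguistic rating maps to a triangular fuzzy number (low, mode, high) on [0, 1]
SCALE = {
    "good": (0.7, 0.9, 1.0),
    "fair": (0.4, 0.6, 0.8),
    "poor": (0.0, 0.2, 0.4),
}

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    return sum(tfn) / 3.0

# One report covering the three parameters named in the abstract
report = {"opening_condition": "fair", "obstruction": "poor", "bank_erosion": "good"}
weights = {"opening_condition": 0.5, "obstruction": 0.3, "bank_erosion": 0.2}

index = sum(weights[k] * defuzzify(SCALE[v]) for k, v in report.items())
needs_inspection = index < 0.5  # decision rule with a hypothetical threshold

print(f"functional index = {index:.2f}, second-level inspection: {needs_inspection}")
```

The fuzzy ranges absorb the varying precision of volunteer reports; a worst-rating rule, as mentioned in the text, could additionally force inspection whenever any single parameter is rated "poor".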

  6. CONSULTATION ON UPDATED METHODOLOGY FOR ...

    Science.gov (United States)

    The National Academy of Sciences (NAS) expects to publish the Biological Effects of Ionizing Radiation (BEIR) committee's report (BEIR VII) on risks from ionizing radiation exposures in calendar year 2005. The committee is expected to have analyzed the most recent epidemiology from the important exposed cohorts and to have factored in any changes resulting from the updated analysis of dosimetry for the Japanese atomic bomb survivors. To the extent practical, the Committee will also consider any relevant radiobiological data, including those from the Department of Energy's low-dose effects research program. Based on its evaluation of relevant information, the Committee is then expected to propose a set of models for estimating risks from low-dose ionizing radiation. ORIA will review the BEIR VII report and consider revisions to the Agency's methodology for estimating cancer risks from exposure to ionizing radiation in light of this report and other relevant information. This will be the subject of the Consultation. This project supports a major risk management initiative to improve the basis on which radiation risk decisions are made. This project, funded by several Federal Agencies, reflects an attempt to characterize risks where there are substantial uncertainties. The outcome will improve our ability to assess risks well into the future and will strengthen EPA's overall capability for assessing and managing radiation risks. the BEIR VII report is funde

  7. Waste Package Design Methodology Report

    Energy Technology Data Exchange (ETDEWEB)

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  8. Methodological Aspects of Architectural Documentation

    Directory of Open Access Journals (Sweden)

    Arivaldo Amorim

    2011-12-01

    Full Text Available This paper discusses the methodological approach that has been developed in the state of Bahia, Brazil, since 2003 for the documentation of architectural and urban sites using extensive digital technologies. Bahia has a vast territory with important architectural ensembles ranging from the sixteenth century to the present day. Because part of this heritage is constructed of raw earth and wood, it is very sensitive to various deleterious agents, and it is therefore critical to document this collection while it is under threat. To conduct these activities, diverse digital technologies that could be used in the documentation process are being tested. The task is being carried out as academic research, with few financial resources, by scholarship students and some volunteers. Several technologies are tested, ranging from the simplest to the most sophisticated, across the main stages of the documentation project: overall work planning; data acquisition; processing and management; and, finally, control and evaluation of the work. The activities that motivated this paper are being conducted in the cities of Rio de Contas and Lençóis in the Chapada Diamantina, located 420 km and 750 km from Salvador respectively; in the city of Cachoeira in the Recôncavo Baiano area, 120 km from Salvador, the capital of Bahia state; and in the Pelourinho neighbourhood, located in the historic capital. Part of the material produced can be consulted on the website www.lcad.ufba.br.

  9. Open Government Data Publication Methodology

    Directory of Open Access Journals (Sweden)

    Jan Kucera

    2015-04-01

    Full Text Available Public sector bodies hold a significant amount of data that might be of potential interest to citizens and businesses. However, the re-use potential of this data is still largely untapped because the data is not always published in a way that would allow its easy discovery, understanding, and re-use. Open Government Data (OGD) initiatives aim at increasing the availability of machine-readable data provided under an open license; these initiatives might therefore facilitate re-use of government data, which in turn might lead to increased transparency and economic growth. However, recent studies show that only a portion of the data provided by public sector bodies is truly open. Public sector bodies face a number of challenges when publishing OGD, and they need to address the relevant risks. There is therefore a need for best practices and methodologies for publication of OGD that would provide the responsible persons with clear guidance on how OGD initiatives should be implemented and how the known challenges and risks should be addressed.

  10. A Paradigm for Spreadsheet Engineering Methodologies

    CERN Document Server

    Grossman, Thomas A

    2008-01-01

    Spreadsheet engineering methodologies are diverse and sometimes contradictory. It is difficult for spreadsheet developers to identify a spreadsheet engineering methodology that is appropriate for their class of spreadsheet, with its unique combination of goals, type of problem, and available time and resources. There is a lack of well-organized, proven methodologies with known costs and benefits for well-defined spreadsheet classes. It is difficult to compare and critically evaluate methodologies. We present a paradigm for organizing and interpreting spreadsheet engineering recommendations. It systematically addresses the myriad choices made when developing a spreadsheet, and explicitly considers resource constraints and other development parameters. This paradigm provides a framework for evaluation, comparison, and selection of methodologies, and a list of essential elements for developers or codifiers of new methodologies. This paradigm identifies gaps in our knowledge that merit further research.

  11. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O' Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  12. Constructivism: a naturalistic methodology for nursing inquiry.

    Science.gov (United States)

    Appleton, J V; King, L

    1997-12-01

    This article will explore the philosophical underpinnings of the constructivist research paradigm. Despite its increasing popularity in evaluative health research studies there is limited recognition of constructivism in popular research texts. Lincoln and Guba's original approach to constructivist methodology is outlined and a detailed framework for nursing research is offered. Fundamental issues and concerns surrounding this methodology are debated and differences between method and methodology are highlighted.

  13. Methodology for assessing systems materials requirements

    Energy Technology Data Exchange (ETDEWEB)

    Culver, D.H.; Teeter, R.R.; Jamieson, W.M.

    1980-01-01

    A potential stumbling block to new system planning and design is imprecise, confusing, or contradictory data regarding materials - their availability and costs. A methodology is now available that removes this barrier by minimizing uncertainties regarding materials availability. Using this methodology, a planner can assess materials requirements more quickly, at lower cost, and with much greater confidence in the results. Developed specifically for energy systems, its potential application is much broader. This methodology and examples of its use are discussed.

  14. THE FUTURE OF LANGUAGE TEACHING METHODOLOGY

    OpenAIRE

    Ted Rodgers

    1998-01-01

    Abstract: This paper reviews the current state of ELT methodology, particularly with respect to a number of current views suggesting that the profession is now in a "post-methods" era in which previous attention to Methods (Total Physical Response, Silent Way, Natural Approach, etc.) has given way to a more generic approach to ELT methodology. Ten potential future courses of ELT methodology are outlined and three of these are considered in some detail. Particular consideration is given as to ho...

  15. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on

  16. Toward A Practical General Systems Methodological Theory

    OpenAIRE

    Nagib Callaos; Belkis Sánchez de Callaos

    2003-01-01

    Our main purpose in this paper is to describe the way in which we have been relating General System Theory (GST) to practice and to the design of a General Systems Methodology (GSM). Our first step was to apply GST to design a methodology for software development. Then, in a second step, by means of the experience/knowledge learned from applying the methodology to developing specific information systems, a continuous designing and re-designing process started, which simultaneously generalized...

  17. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager’s point of view. The main components of a software development methodology are identified. Thus, a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation oriented software development methodology is emphasized by highlighting shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component, a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.
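
    The indicator-plus-template scheme described above can be illustrated with a small sketch. The component names come from the abstract, but the scores, thresholds, and labels below are invented for illustration; the actual ISDF indicators and interpretation templates are defined in the paper itself.

```python
# Hedged sketch: per-component quality indicators scored in [0, 1] and
# mapped onto an interpretation template. Thresholds and labels are
# illustrative assumptions, not those defined by the ISDF methodology.

INTERPRETATION = [(0.9, "compliant"), (0.7, "needs review"), (0.0, "non-compliant")]

def interpret(score):
    """Map a 0..1 indicator score onto the interpretation template."""
    for threshold, label in INTERPRETATION:
        if score >= threshold:
            return label

def quality_report(indicators):
    """indicators: {component: score in 0..1}, one per quality component."""
    return {name: (score, interpret(score)) for name, score in indicators.items()}

report = quality_report({"compliance": 0.95, "usability": 0.74,
                         "reliability": 0.55, "security": 0.91})
for name, (score, label) in report.items():
    print(f"{name}: {score:.2f} -> {label}")
```

    A real methodology would of course define how each score is measured; the sketch only shows the indicator-to-template mapping step.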

  18. Grounded theory methodology--narrativity revisited.

    Science.gov (United States)

    Ruppel, Paul Sebastian; Mey, Günter

    2015-06-01

    This article aims to illuminate the role of narrativity in Grounded Theory Methodology and to explore an approach within Grounded Theory Methodology that is sensitized towards aspects of narrativity. The suggested approach takes into account narrativity as an aspect of the underlying data. It reflects how narrativity could be conceptually integrated and systematically used for shaping the way in which coding, category development and the presentation of results in a Grounded Theory Methodology study proceed.

  19. Research Methodologies in Science Education: Qualitative Data.

    Science.gov (United States)

    Libarkin, Julie C.; Kurdziel, Josepha P.

    2002-01-01

    Introduces the concepts and terminology of qualitative research methodologies in the context of science education. Discusses interviewing, observing, validity, reliability, and confirmability. (Author/MM)

  20. Language Teaching Methodology. ERIC Issue Paper.

    Science.gov (United States)

    Rodgers, Theodore S.

    This paper gives an overview of 10 directions language teachers might take in the future. After providing background on the history of language teaching, language teaching methodology is defined and a distinction is made between methodologies and approaches. Next, the 10 scenarios are briefly described. They include the following: teacher/learner…

  1. 10 CFR 766.102 - Calculation methodology.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Calculation methodology. 766.102 Section 766.102 Energy DEPARTMENT OF ENERGY URANIUM ENRICHMENT DECONTAMINATION AND DECOMMISSIONING FUND; PROCEDURES FOR SPECIAL ASSESSMENT OF DOMESTIC UTILITIES Procedures for Special Assessment § 766.102 Calculation methodology. (a...

  2. Research Methodologies and the Doctoral Process.

    Science.gov (United States)

    Creswell, John W.; Miller, Gary A.

    1997-01-01

    Doctoral students often select one of four common research methodologies that are popular in the social sciences and education today: positivist; interpretive; ideological; and pragmatic. But choice of methodology also influences the student's choice of course work, membership of dissertation committee, and the form and structure of the…

  3. Embodied Writing: Choreographic Composition as Methodology

    Science.gov (United States)

    Ulmer, Jasmine B.

    2015-01-01

    This paper seeks to examine how embodied methodological approaches might inform dance education practice and research. Through a series of examples, this paper explores how choreographic writing might function as an embodied writing methodology. Here, choreographic writing is envisioned as a form of visual word choreography in which words move,…

  4. Design Methodologies: Industrial and Educational Applications

    NARCIS (Netherlands)

    Tomiyama, T.; Gul, P.; Jin, Y.; Lutters, Diederick; Kind, Ch.; Kimura, F.

    2009-01-01

    The field of Design Theory and Methodology has a rich collection of research results that has been taught at educational institutions as well as applied to design practices. First, this keynote paper describes some methods to classify them. It then illustrates individual theories and methodologies

  5. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological) p

  6. Grounded Theory Methodology: Positivism, Hermeneutics, and Pragmatism

    Science.gov (United States)

    Age, Lars-Johan

    2011-01-01

    Glaserian grounded theory methodology, which has been widely adopted as a scientific methodology in recent decades, has been variously characterised as "hermeneutic" and "positivist." This commentary therefore takes a different approach to characterising grounded theory by undertaking a comprehensive analysis of: (a) the philosophical paradigms of…

  7. 21 CFR 114.90 - Methodology.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Methodology. 114.90 Section 114.90 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION ACIDIFIED FOODS Production and Process Controls § 114.90 Methodology. Methods that may be used...

  9. METHODOLOGICAL STRATEGIES FOR TEXTUAL DATA ANALYSIS:

    OpenAIRE

    Juan Carlos Rincón-Vásquez; Andrea Velandia-Morales; Idaly Barreto

    2011-01-01

    This paper presents a classification methodology for studies of textual data. This classification is based on the two predominant methodologies for social science research: qualitative and quantitative. The basic assumption is that the research process involves three main features: 1) Structure of Research, 2) Collection of Information, and 3) Analysis and Interpretation of Data. In each, there are general guidelines for textual studies.

  10. METHODOLOGICAL STRATEGIES FOR TEXTUAL DATA ANALYSIS:

    Directory of Open Access Journals (Sweden)

    Juan Carlos Rincón-Vásquez

    2011-12-01

    Full Text Available This paper presents a classification methodology for studies of textual data. This classification is based on the two predominant methodologies for social science research: qualitative and quantitative. The basic assumption is that the research process involves three main features: 1) Structure of Research, 2) Collection of Information, and 3) Analysis and Interpretation of Data. In each, there are general guidelines for textual studies.

  11. Improving Learning Outcome Using Six Sigma Methodology

    Science.gov (United States)

    Tetteh, Godson A.

    2015-01-01

    Purpose: The purpose of this research paper is to apply the Six Sigma methodology to identify the attributes of a lecturer that will help improve a student's prior knowledge of a discipline from an initial "x" per cent knowledge to a higher "y" per cent of knowledge. Design/methodology/approach: The data collection method…

  12. IMPROVEMENT METHODOLOGY FINANCIAL RISK-MANAGEMENT

    OpenAIRE

    E. Kachalova

    2016-01-01

    The article examines the vital issues of improvement methodology financial risk-management. The author reveals the economic essence of the concept of «financial risk-management». Methodological approaches for the efficient management of risks in the system of risk-management in Russia.

  13. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy...

  14. An Empirically Based Methodology for the Nineties.

    Science.gov (United States)

    Nunan, David

    A review of research bearing on second language teaching methodology looks at what it tells about language processing and production, classroom interaction and second language learning, and learning strategy preferences. The perspective taken is that methodology consists of classroom tasks and activities. Implications of the research for the…

  15. Generalized Response Surface Methodology : A New Metaheuristic

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2006-01-01

    Generalized Response Surface Methodology (GRSM) is a novel general-purpose metaheuristic based on Box and Wilson's Response Surface Methodology (RSM). Both GRSM and RSM estimate local gradients to search for the optimal solution. These gradients use local first-order polynomials. GRSM, however, uses th
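
    The gradient-estimation step shared by RSM and GRSM can be sketched as follows: sample the response around the current point, fit a local first-order polynomial by least squares, and step against the estimated gradient. The objective function, step size, and sample count below are illustrative assumptions, not Kleijnen's actual algorithm.

```python
import numpy as np

def local_gradient(f, x, delta=0.1, n_samples=8, seed=0):
    """Estimate the gradient of f near x by least-squares fitting a
    local first-order polynomial to sampled responses (the RSM step)."""
    rng = np.random.default_rng(seed)
    X = x + rng.uniform(-delta, delta, size=(n_samples, x.size))
    y = np.array([f(p) for p in X])
    # Fit y ~ b0 + b @ (X - x); b approximates the local gradient.
    A = np.hstack([np.ones((n_samples, 1)), X - x])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[1:]

def rsm_descent(f, x0, step=0.2, iters=40):
    """Repeatedly take a fixed-length step against the estimated gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = local_gradient(f, x)
        x = x - step * g / (np.linalg.norm(g) + 1e-12)
    return x

# Hypothetical smooth objective with its minimum at (1, 2).
f = lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2
x_opt = rsm_descent(f, [4.0, -3.0])
print(np.round(x_opt, 2))
```

    With a fixed step the search ends up oscillating near the optimum; a practical implementation would shrink the step over iterations, and GRSM layers further machinery on top of this basic scheme.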

  16. Discipline and Methodology in Higher Education Research

    Science.gov (United States)

    Tight, Malcolm

    2013-01-01

    Higher education research is a multidisciplinary field, engaging researchers from across the academy who make use of a wide range of methodological approaches. This article examines the relation between discipline and methodology in higher education research, analysing a database of 567 articles published in 15 leading higher education journals…

  19. Systematic Review Methodology in Higher Education

    Science.gov (United States)

    Bearman, Margaret; Smith, Calvin D.; Carbone, Angela; Slade, Susan; Baik, Chi; Hughes-Warrington, Marnie; Neumann, David L.

    2012-01-01

    Systematic review methodology can be distinguished from narrative reviews of the literature through its emphasis on transparent, structured and comprehensive approaches to searching the literature and its requirement for formal synthesis of research findings. There appears to be relatively little use of the systematic review methodology within the…

  20. IRST testing methodologies: Maritime Infrared Background Simulator

    NARCIS (Netherlands)

    Schwering, P.B.W.

    2006-01-01

    In this paper we discuss methodologies to incorporate the effects of environments and scenarios in the testing of IRST systems. The proposed methodology is based on experience with sea based IRST trials combining the possibilities of performance assessment in required scenarios to the real

  1. Q Methodology, Communication, and the Behavioral Text.

    Science.gov (United States)

    McKeown, Bruce

    1990-01-01

    Discusses Q methodology in light of modern philosophy of science and hermeneutics. Outlines and discusses the basic steps of conducting Q-method research. Suggests that Q methodology allows researchers to understand and interpret the subjective text of respondents without confounding them with external categories of theoretical reflection. (RS)

  2. Solid Waste Management Planning--A Methodology

    Science.gov (United States)

    Theisen, Hilary M.; And Others

    1975-01-01

    This article presents a twofold solid waste management plan consisting of a basic design methodology and a decision-making methodology. The former provides a framework for the developing plan while the latter builds flexibility into the design so that there is a model for use during the planning process. (MA)

  3. Active Methodology in the Audiovisual Communication Degree

    Science.gov (United States)

    Gimenez-Lopez, J. L.; Royo, T. Magal; Laborda, Jesus Garcia; Dunai, Larisa

    2010-01-01

    The paper describes how the active methodologies of the new European Higher Education Area are adapted in the new Audiovisual Communication degree, from the perspective of subjects related to the area of interactive communication in Europe. The proposed active methodologies have been experimentally implemented into the new academic…

  4. [Radiotherapy phase I trials' methodology: Features].

    Science.gov (United States)

    Rivoirard, R; Vallard, A; Langrand-Escure, J; Guy, J-B; Ben Mrad, M; Yaoxiong, X; Diao, P; Méry, B; Pigne, G; Rancoule, C; Magné, N

    2016-12-01

    In clinical research, biostatistical methods allow the rigorous analysis of data collection and should be defined from the trial design to obtain the appropriate experimental approach. Thus, if the main purpose of phase I is to determine the dose to use during phase II, methodology should be finely adjusted to experimental treatment(s). Today, the methodology for chemotherapy and targeted therapy is well known. For radiotherapy and chemoradiotherapy phase I trials, the primary endpoint must reflect both effectiveness and potential treatment toxicities. Methodology should probably be complex to limit failures in the following phases. However, there are very few data about methodology design in the literature. The present study focuses on these particular trials and their characteristics. It should help to expose the shortcomings of existing methodological patterns in order to propose new and better-suited designs. Copyright © 2016 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  5. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report presents a preliminary safety analysis methodology for the 330 MWt SMART (System-integrated Modular Advanced ReacTor), which has been developed by the Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, a further validated final safety analysis methodology will be developed. The current licensing safety analysis methodologies of the Westinghouse and KSNPP PWRs operating and under development in Korea, as well as the Russian licensing safety analysis methodology for integral reactors, have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or the KNGR (Korean Next Generation Reactor) under construction in Korea. A detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary-to-secondary pipe break, and the small-break loss-of-coolant accident. The SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. The validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended that the nuclear regulatory authority establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  6. Reliability based design optimization: Formulations and methodologies

    Science.gov (United States)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed
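
    The nested formulation that the unilevel and decoupled methods improve upon can be sketched in miniature: an outer design loop queries an inner reliability analysis for every candidate design. The limit state, the distributions, and the plain Monte Carlo inner loop below are illustrative assumptions, not the dissertation's formulations.

```python
import random

def failure_probability(design, n=20000, seed=0):
    """Inner reliability loop: estimate P[load > capacity] by Monte Carlo."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        load = rng.gauss(100.0, 15.0)                     # random demand
        capacity = 120.0 * design * rng.gauss(1.0, 0.05)  # random resistance
        if load > capacity:
            failures += 1
    return failures / n

def cheapest_reliable_design(candidates, p_target=0.01):
    """Outer design loop: smallest scale factor meeting the target."""
    for d in sorted(candidates):
        if failure_probability(d) <= p_target:
            return d
    return None

print(cheapest_reliable_design([0.8, 0.9, 1.0, 1.1, 1.2]))  # → 1.2
```

    The cost of re-running the inner reliability analysis for every candidate is exactly what the unilevel and decoupled formulations are designed to avoid.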

  7. Relational and Object-Oriented Methodology in Data Bases Systems

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2006-01-01

    Full Text Available Database programming languages integrate concepts of databases and programming languages to provide both implementation tools for data-intensive applications and high-level user interfaces to databases. Frequently, database programs contain a large amount of application knowledge which is hidden in the procedural code and thus difficult to maintain with changing data and user views. This paper presents a first attempt to improve the situation by supporting the integrated definition and management of data and rules based on a set-oriented and predicative approach. The use of database technology for integrated fact and rule base management is shown to have some important advantages in terms of fact and rule integrity, question-answering, and explanation of results.
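
    The set-oriented fact-and-rule idea can be sketched with a minimal example: a rule is evaluated against a whole relation in one step, and derived facts are recomputed so they stay consistent with the base facts. The schema and rule below are invented for illustration.

```python
# parent(X, Y) base facts; each tuple is (parent, child).
facts = {("alice", "bob"), ("bob", "carol")}

def derive_grandparents(parents):
    """Set-oriented rule: grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    The rule is applied to the whole relation at once, not row by row."""
    return {(x, z) for (x, y1) in parents for (y2, z) in parents if y1 == y2}

print(sorted(derive_grandparents(facts)))   # [('alice', 'carol')]

# Re-deriving after an update keeps derived facts consistent with base facts,
# one of the integrity advantages the abstract mentions.
facts.add(("carol", "dave"))
print(sorted(derive_grandparents(facts)))   # [('alice', 'carol'), ('bob', 'dave')]
```

    In an actual database setting the same rule would be a view or a deductive query, evaluated declaratively rather than in procedural code.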

  8. Methodology for Rewetting Drained Tropical Peatlands. Approved Verified Carbon Standard (VCS) Methodology VM0027

    NARCIS (Netherlands)

    Hoffer, S.; Laer, Y.; Navrátil, R.; Wosten, J.H.M.

    2014-01-01

    The first methodology to address the rewetting of drained peatlands "Methodology for rewetting Drained Tropical Peatlands" has been approved by the Verified Carbon Standard (VCS) Program. As the methodology is the first of its kind, it will provide unique guidance for other projects that aim at rewe

  9. Design New Robust Self Tuning Fuzzy Backstopping Methodology

    OpenAIRE

    Omid Avatefipour; Farzin Piltan; Mahmoud Reza Safaei Nasrabad; Ghasem Sahamijoo; Alireza Khalilian

    2014-01-01

    This research is focused on proposed Proportional-Integral (PI) like fuzzy adaptive backstopping fuzzy algorithms based on Proportional-Derivative (PD) fuzzy rule base with the adaptation laws derived in the Lyapunov sense. Adaptive SISO PI like fuzzy adaptive backstopping fuzzy method has two main objectives; the first objective is design a SISO fuzzy system to compensate for the model uncertainties of the system, and the second objective is focused on the design PI like fuzzy controller bas...
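
    The PD-type fuzzy rule base that such controllers build on can be sketched as follows: error and error-rate are fuzzified into three triangular sets each, a 3x3 rule table assigns a crisp consequent to every pair, and a weighted average defuzzifies the result. The membership parameters and rule table below are illustrative assumptions, not the authors' design.

```python
def tri(x, a, b, c):
    """Triangular membership on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Negative / Zero / Positive sets for both error and error-rate.
SETS = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}

# 3x3 PD rule table: (error set, error-rate set) -> crisp control action.
RULE_OUT = {("N", "N"): -1.0, ("N", "Z"): -0.7, ("N", "P"): 0.0,
            ("Z", "N"): -0.5, ("Z", "Z"):  0.0, ("Z", "P"): 0.5,
            ("P", "N"):  0.0, ("P", "Z"):  0.7, ("P", "P"): 1.0}

def fuzzy_pd(error, d_error):
    """Weighted-average (Sugeno-style) defuzzification over the rule table."""
    num = den = 0.0
    for (e_set, de_set), out in RULE_OUT.items():
        w = min(tri(error, *SETS[e_set]), tri(d_error, *SETS[de_set]))
        num += w * out
        den += w
    return num / den if den else 0.0

print(fuzzy_pd(0.5, 0.0))   # → 0.35
```

    An adaptive scheme like the one in the abstract would then tune such a rule base online, with the adaptation laws derived in the Lyapunov sense.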

  10. Using Q Methodology in Quality Improvement Projects.

    Science.gov (United States)

    Tiernon, Paige; Hensel, Desiree; Roy-Ehri, Leah

    Q methodology consists of a philosophical framework and procedures to identify subjective viewpoints that may not be well understood, but its use in nursing is still quite limited. We describe how Q methodology can be used in quality improvement projects to better understand local viewpoints that act as facilitators or barriers to the implementation of evidence-based practice. We describe the use of Q methodology to identify nurses' attitudes about the provision of skin-to-skin care after cesarean birth. Copyright © 2017 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses. Published by Elsevier Inc. All rights reserved.
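
    The computational core of a Q study can be sketched briefly: unlike R-type factor analysis, Q methodology correlates persons across their Q-sorts and factors that by-person matrix to reveal shared viewpoints. The toy sorts below are invented; a real study would also rotate factors and flag defining sorts.

```python
import numpy as np

# Rows are persons, columns are statements; values are Q-sort rankings.
sorts = np.array([
    [ 2,  1,  0, -1, -2],   # participant A
    [ 2,  0,  1, -1, -2],   # participant B (viewpoint similar to A)
    [-2, -1,  0,  1,  2],   # participant C (opposing viewpoint)
])

# By-person correlation matrix: Q methodology correlates people, not items.
R = np.corrcoef(sorts)

# Eigenvector of the largest eigenvalue: loadings grouping people into
# shared viewpoints (a stand-in for the usual factor extraction step).
eigvals, eigvecs = np.linalg.eigh(R)
loadings = eigvecs[:, -1]
print(np.round(R, 2))
print(np.round(loadings, 2))
```

    Here A and B load together with C on the opposite pole, which is exactly the kind of local-viewpoint structure a quality improvement project would interpret.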

  11. Tourism Methodologies - New Perspectives, Practices and Procedures

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different...... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt...

  12. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.
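    The core of such an energy-rating calculation, integrating a module power model over an hourly weather profile, can be sketched as follows. The power model, rated power, and temperature coefficient below are illustrative assumptions, not the consensus methodology's actual equations.

```python
def module_power(irradiance, cell_temp, p_stc=300.0, gamma=-0.004):
    """Simple PV power model (W): rated power scaled linearly by irradiance
    (relative to 1000 W/m^2 at standard test conditions), derated by a
    temperature coefficient gamma per deg C above 25 C. Values are illustrative."""
    return p_stc * (irradiance / 1000.0) * (1.0 + gamma * (cell_temp - 25.0))

def energy_rating(weather_hours):
    """Integrate hourly power (W) over a weather profile into energy (Wh)."""
    return sum(module_power(g, t) for g, t in weather_hours)

# Hypothetical two-hour profile: (irradiance W/m^2, cell temperature deg C)
profile = [(1000.0, 25.0), (500.0, 45.0)]
```

    Running different weather profiles (e.g. for different U.S. climate regions) through the same model is what lets the rating emphasize performance differences between module types.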

  13. Design of formulated products: a systematic methodology

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Ng, K.M.

    2011-01-01

    -based computer-aided methodology for design and verification of a class of chemical-based products (liquid formulations) is presented. This methodology is part of an integrated three-stage approach for design/verification of liquid formulations where stage-1 generates a list of feasible product candidates and....../or verifies a specified set through a sequence of predefined activities (work-flow). Stage-2 and stage-3 (not presented here) deal with the planning and execution of experiments, for product validation. Four case studies have been developed to test the methodology. The computer-aided design (stage-1...

  14. Covariance Evaluation Methodology for Neutron Cross Sections

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Arcilla, R.; Mattoon, C.M.; Mughabghab, S.F.; Oblozinsky, P.; Pigni, M.; Pritychenko, B.; Sonzogni, A.A.

    2008-09-01

    We present the NNDC-BNL methodology for estimating neutron cross section covariances in the thermal, resolved resonance, unresolved resonance and fast neutron regions. The three key elements of the methodology are the Atlas of Neutron Resonances, the nuclear reaction code EMPIRE, and the Bayesian code implementing the Kalman filter concept. The covariance data processing, visualization and distribution capabilities are integral components of the NNDC methodology. We illustrate its application on examples including relatively detailed evaluation of covariances for two individual nuclei and massive production of simple covariance estimates for 307 materials. Certain peculiarities regarding evaluation of covariances for resolved resonances and the consistency between resonance parameter uncertainties and thermal cross section uncertainties are also discussed.
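    The Bayesian step at the heart of such a methodology is the standard Kalman-filter update, which combines prior parameter covariances with measurements into a reduced posterior covariance. A minimal sketch, using an invented two-parameter example rather than actual nuclear data:

```python
import numpy as np

def kalman_update(x_prior, P_prior, y, H, R):
    """One Bayesian (Kalman-filter) update: combine prior parameters x_prior
    with covariance P_prior and measurements y (sensitivity matrix H,
    measurement covariance R) into a posterior mean and covariance."""
    S = H @ P_prior @ H.T + R               # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_post = x_prior + K @ (y - H @ x_prior)
    P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior
    return x_post, P_post

# Illustrative example: two parameters, one measurement of their sum
x0 = np.array([1.0, 2.0])
P0 = np.diag([0.04, 0.09])        # prior variances
H = np.array([[1.0, 1.0]])        # measurement sensitivity
y = np.array([3.2])
R = np.array([[0.01]])            # measurement variance
x1, P1 = kalman_update(x0, P0, y, H, R)
```

    Note that the posterior variances on the diagonal of P1 are smaller than the priors, and the off-diagonal terms become nonzero: the measurement correlates the parameters, which is exactly the origin of the evaluated cross-section covariances.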

  15. A new Methodology for Operations Strategy

    DEFF Research Database (Denmark)

    Koch, Christian; Rytter, Niels Gorm; Boer, Harry

    2005-01-01

    This paper proposes a new methodology for developing and implementing Operations Strategy (OS). It encompasses both content and process aspects of OS and differs thereby from many of the present OS methodologies. The paper outlines its paradigmatic foundation and presents aim, process, dimensions......, guidelines and required competencies for using it on OS in practice. A case-based action research strategy has been conducted to make a first test and evaluation of the OS methodology and the paper thus provides a case example illustrating its practical unfolding. Finally a discussion is made...

  17. resource allocation methodology for internet heterogeneous traffic

    African Journals Online (AJOL)

    Dr Obe

    RESOURCE ALLOCATION METHODOLOGY FOR INTERNET HETEROGENEOUS ... control, in this case, involves determining the optimum network resources - in terms ..... Oriented Network Simulator) Designer” a business unit of Cadence ...

  18. A Narrative in Search of a Methodology.

    Science.gov (United States)

    Treloar, Anna; Stone, Teresa Elizabeth; McMillan, Margaret; Flakus, Kirstin

    2015-07-01

    Research papers present us with the summaries of scholars' work; what we readers do not see are the struggles behind the decision to choose one methodology over another. A student's mental health portfolio contained a narrative that led to an exploration of the most appropriate methodology for a projected study of clinical anecdotes told by nurses who work in mental health settings to undergraduates and new recruits about mental health nursing. This paper describes the process of struggle, beginning with the student's account, before posing a number of questions needing answers before the choice of the most appropriate methodology. We argue, after discussing the case for the use of literary analysis, discourse analysis, symbolic interactionism, hermeneutics, and narrative research, that case study research is the methodology of choice. Case study is frequently used in educational research and is sufficiently flexible to allow for an exploration of the phenomenon. © 2014 Wiley Periodicals, Inc.

  19. Some Random Thoughts on English Methodology

    Institute of Scientific and Technical Information of China (English)

    1994-01-01

    The college I’m working for is a junior teachers’ college. Our aim of teaching is to train the students to meet all the standards of qualified middle school English teachers. Therefore, English Methodology should

  20. Rat sperm motility analysis: methodologic considerations

    Science.gov (United States)

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...
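    Commonly reported CASA kinematic parameters can be computed from a centroid track as sketched below. The frame rate, units, and the particular metrics chosen (VCL, VSL, LIN) are illustrative assumptions about a typical CASA analysis, not this study's protocol.

```python
import math

def casa_metrics(track, fps=60.0):
    """CASA-style kinematics from a 2-D sperm-head centroid track (um):
    VCL - curvilinear velocity: total point-to-point path length per second
    VSL - straight-line velocity: first-to-last displacement per second
    LIN - linearity: VSL / VCL (1.0 for a perfectly straight track)
    Frame rate and units are illustrative assumptions."""
    dist = lambda a, b: math.hypot(b[0] - a[0], b[1] - a[1])
    duration = (len(track) - 1) / fps
    path = sum(dist(track[i], track[i + 1]) for i in range(len(track) - 1))
    vcl = path / duration
    vsl = dist(track[0], track[-1]) / duration
    return {"VCL": vcl, "VSL": vsl, "LIN": vsl / vcl if vcl else 0.0}
```

    Methodologic choices like those listed in the record (diluent, sampling region, concentration) matter precisely because these derived velocities are sensitive to track quality and frame rate.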