WorldWideScience

Sample records for agent based classification

  1. An Agent Based Classification Model

    CERN Document Server

    Gu, Feng; Greensmith, Julie

    2009-01-01

The major function of this model is to access the UCI Wisconsin Breast Cancer data-set [1] and classify the data items into two categories: normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, principles and models which are applied to problem solving. The Dendritic Cell Algorithm (DCA) [2] is an AIS algorithm that is developed specifically for anomaly detection. It has been successfully applied to intrusion detection in computer security. It is believed that agent-based modelling is an ideal approach for implementing AIS, as intelligent agents could be the perfect representations of immune entities in AIS. This model evaluates the feasibility of re-implementing the DCA in an agent-based simulation environment ...
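The DCA's core signal-to-context mechanism can be sketched compactly. The following toy sketch is not the authors' agent-based implementation: the signal weights, migration threshold, and deterministic round-robin antigen sampling are all invented for illustration.

```python
# Toy sketch of the DCA's context computation. Each "cell" samples
# antigens (data items) round-robin, accumulating a migration value (csm)
# and a context value k; safe signals push k down, danger signals push it
# up. An antigen's anomaly score (MCAV) is the fraction of cells that
# matured (k > 0) while holding it. All constants here are illustrative.

def run_dca(antigens, signals, migration_threshold=3.0):
    n = len(antigens)
    counts = {a: [0, 0] for a in antigens}       # [mature, semi-mature]
    for start in range(n):                       # one cell per start offset
        csm = k = 0.0
        t, sampled = 0, []
        while csm < migration_threshold:
            a = antigens[(start + t) % n]
            t += 1
            danger, safe = signals[a]
            csm += danger + safe                 # drives migration
            k += danger - 2.0 * safe             # safe signal dominates
            if a not in sampled:
                sampled.append(a)
        for a in sampled:                        # present in final context
            counts[a][0 if k > 0 else 1] += 1
    return {a: (m / (m + s) if m + s else 0.0)
            for a, (m, s) in counts.items()}

antigens = ["n0", "n1", "n2", "n3", "n4", "x"]    # x is the anomalous item
signals = {a: (0.5, 1.0) for a in antigens[:-1]}  # (danger, safe) per item
signals["x"] = (3.0, 0.0)
mcav = run_dca(antigens, signals)
```

Only cells that matured in an inflammatory context raise an item's MCAV, so the anomalous item ends up with the highest score.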

  2. Odor Classification using Agent Technology

    Directory of Open Access Journals (Sweden)

    Sigeru OMATU

    2014-03-01

In order to measure and classify odors, a Quartz Crystal Microbalance (QCM) can be used. In the present study, seven QCM sensors and three different odors are used. The system has been developed as a virtual organization of agents using an agent platform called PANGEA (Platform for Automatic coNstruction of orGanizations of intElligent Agents), a platform for developing open multi-agent systems, specifically those including organizational aspects. The main reason for the use of agents is the scalability of the platform, i.e. the way in which it models the services. The system models functionalities as services inside the agents, or as Service Oriented Architecture (SOA) compliant services using Web Services. This allows the odor classification system to be adapted to new algorithms, tools and classification techniques.

  3. Mass classification in mammography with multi-agent based fusion of human and machine intelligence

    Science.gov (United States)

    Xi, Dongdong; Fan, Ming; Li, Lihua; Zhang, Juan; Shan, Yanna; Dai, Gang; Zheng, Bin

    2016-03-01

Although a computer-aided diagnosis (CAD) system can be applied to classify breast masses, the effect of this method on improving radiologists' accuracy in distinguishing malignant from benign lesions remains unclear. This study provides a novel method to classify breast masses by integrating human and machine intelligence. In this research, 224 breast masses were selected from mammography in the DDSM database with Breast Imaging Reporting and Data System (BI-RADS) categories. Three observers (a senior and a junior radiologist, as well as a radiology resident) independently read and classified these masses using the Positive Predictive Values (PPV) for each BI-RADS category. Meanwhile, a CAD system was also implemented to classify these breast masses as malignant or benign. To combine the decisions from the radiologists and CAD, a multi-agent fusion method was developed. Significant improvements are observed for the fusion system over either the radiologists or CAD alone. The area under the receiver operating characteristic curve (AUC) of the fusion system increased by 9.6%, 10.3% and 21% compared to that of the senior radiologist, junior radiologist and resident, respectively. In addition, the AUC of this method based on the fusion of each radiologist with CAD was 3.5%, 3.6% and 3.3% higher than that of CAD alone. Finally, the fusion of the three radiologists with CAD achieved an AUC of 0.957, 5.6% higher than CAD alone. Our results indicate that the proposed fusion method performs better than either radiologists or CAD alone.
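The benefit of fusing complementary decisions can be illustrated with a rank-based AUC computation. The fusion below is a plain average of scores, a deliberate simplification of the paper's multi-agent scheme, and the scores are made-up toy values.

```python
# Rank-based AUC (probability that a random positive outranks a random
# negative, ties counted as 0.5), used to compare CAD alone, reader alone,
# and a fused score. Scores are invented; fusion here is a plain average.

def auc(labels, scores):
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [0, 0, 1, 1]                 # 1 = malignant
cad    = [0.2, 0.6, 0.4, 0.9]         # CAD score per mass
reader = [0.6, 0.1, 0.8, 0.5]         # reader (PPV-based) score per mass
fused  = [(c + r) / 2 for c, r in zip(cad, reader)]
```

Because the two scorers err on different cases, the averaged score ranks every malignant mass above every benign one, so the fused AUC exceeds either individual AUC.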

  4. Using an object-based grid system to evaluate a newly developed EP approach to formulate SVMs as applied to the classification of organophosphate nerve agents

    Science.gov (United States)

    Land, Walker H., Jr.; Lewis, Michael; Sadik, Omowunmi; Wong, Lut; Wanekaya, Adam; Gonzalez, Richard J.; Balan, Arun

    2004-04-01

This paper extends the classification approaches described in reference [1] in the following ways: (1) developing and evaluating a new method for evolving organophosphate nerve agent Support Vector Machine (SVM) classifiers using Evolutionary Programming (EP), (2) conducting research experiments using a larger database of organophosphate nerve agents, and (3) upgrading the architecture to an object-based grid system for evaluating the classification of EP-derived SVMs. Due to the increased threat of chemical and biological weapons of mass destruction (WMD) from international terrorist organizations, a significant effort is underway to develop tools that can be used to detect and effectively combat biochemical warfare. This paper reports the integration of multi-array sensors with SVMs for the detection of organophosphate nerve agents using a grid computing system called Legion. Grid computing is the use of large collections of heterogeneous, distributed resources (including machines, databases, devices, and users) to support large-scale computations and wide-area data access. Finally, preliminary results using EP-derived support vector machines designed to operate on distributed systems have provided accurate classification results. In addition, the distributed training architecture is 50 times faster than standard iterative training methods.
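Evolving a classifier parameter with Evolutionary Programming can be sketched as a minimal (1+1) mutate-and-select loop. The fitness function below is a stand-in for SVM cross-validation error (the nerve-agent data and SVM internals are not given in the abstract), with an assumed optimum at C = 3.

```python
# Minimal (1+1) evolutionary-programming loop tuning one SVM-style
# hyperparameter C by log-normal mutation with an adaptive step size.
# The fitness function is a stand-in for validation error with an assumed
# optimum at C = 3; the real system evolves full SVM classifiers.
import math
import random

def fitness(c):                        # pretend validation-error surface
    return (math.log10(c) - math.log10(3.0)) ** 2

def evolve(generations=200, seed=1):
    rng = random.Random(seed)
    parent, sigma = 100.0, 1.0         # start far from the optimum
    for _ in range(generations):
        child = parent * 10 ** rng.gauss(0.0, sigma)
        child = min(max(child, 1e-3), 1e3)
        if fitness(child) <= fitness(parent):
            parent, sigma = child, sigma * 1.1   # success: widen the search
        else:
            sigma *= 0.95                        # failure: narrow it
    return parent

best_C = evolve()
```

The step-size adaptation is a simple success-rule heuristic; the elitist acceptance guarantees fitness never worsens from one generation to the next.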

  5. Pitch Based Sound Classification

    OpenAIRE

    Nielsen, Andreas Brinch; Hansen, Lars Kai; Kjems, U.

    2006-01-01

    A sound classification model is presented that can classify signals into music, noise and speech. The model extracts the pitch of the signal using the harmonic product spectrum. Based on the pitch estimate and a pitch error measure, features are created and used in a probabilistic model with soft-max output function. Both linear and quadratic inputs are used. The model is trained on 2 hours of sound and tested on publicly available data. A test classification error below 0.05 with 1 s classif...
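The harmonic product spectrum itself is simple to sketch: downsampled copies of the magnitude spectrum are multiplied together, which reinforces the fundamental and suppresses individual harmonics. The naive DFT and exact-bin test tone below are illustrative choices, not the paper's implementation.

```python
# Harmonic product spectrum (HPS) sketch: hps[k] = |X[k]| * |X[2k]| * |X[3k]|
# peaks at the fundamental bin. A naive DFT over just the bins we need
# keeps the sketch dependency-free; the test tone sits exactly on a bin.
import math

def hps_pitch(x, fs, n_harmonics=3, max_bin=100):
    n = len(x)
    mag = []
    for k in range(max_bin * n_harmonics):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag.append(math.hypot(re, im))
    hps = [math.prod(mag[k * r] for r in range(1, n_harmonics + 1))
           for k in range(1, max_bin)]
    k0 = hps.index(max(hps)) + 1                 # +1: hps[0] is bin 1
    return k0 * fs / n

fs = n = 512
f0 = 64.0                                        # falls exactly on a DFT bin
tone = [sum(math.sin(2 * math.pi * f0 * h * t / fs) for h in (1, 2, 3))
        for t in range(n)]
pitch = hps_pitch(tone, fs)
```

At bin 64 all three downsampled spectra align on a harmonic, so the product there dwarfs every other bin.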

  6. Pitch Based Sound Classification

    DEFF Research Database (Denmark)

    Nielsen, Andreas Brinch; Hansen, Lars Kai; Kjems, U

    2006-01-01

    A sound classification model is presented that can classify signals into music, noise and speech. The model extracts the pitch of the signal using the harmonic product spectrum. Based on the pitch estimate and a pitch error measure, features are created and used in a probabilistic model with soft...

  7. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases the… A classifier is trained on each cluster, having reduced dimensionality and fewer examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups datasets. Our model also outperforms A Decision Cluster Classification (ADCC) and the Decision Cluster Forest Classification (DCFC) models on the Reuters-21578 dataset.
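The cluster-then-classify idea can be sketched as k-means partitioning followed by a simple local classifier per cluster. The nearest-centroid rule below stands in for whatever classifier is trained per cluster; the seeding and 2-D toy data are assumptions for the sketch.

```python
# Cluster-then-classify sketch: k-means partitions the training set, then
# each cluster gets its own local classifier (here: nearest class
# centroid), so every local model sees fewer, more homogeneous examples.

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def centroid(pts):
    return tuple(sum(c) / len(pts) for c in zip(*pts))

def fit(X, y, k=2, iters=10):
    centers = [X[i * len(X) // k] for i in range(k)]  # deterministic seeds
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in X:
            groups[min(range(k), key=lambda j: dist2(p, centers[j]))].append(p)
        centers = [centroid(g) if g else centers[j] for j, g in enumerate(groups)]
    local = []                          # per-cluster class centroids
    for j in range(k):
        by_label = {}
        for p, label in zip(X, y):
            if min(range(k), key=lambda m: dist2(p, centers[m])) == j:
                by_label.setdefault(label, []).append(p)
        local.append({l: centroid(ps) for l, ps in by_label.items()})
    return centers, local

def predict(p, centers, local):
    j = min(range(len(centers)), key=lambda m: dist2(p, centers[m]))
    return min(local[j], key=lambda l: dist2(p, local[j][l]))

X = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10), (10, 11), (11, 10), (11, 11)]
y = [0, 0, 1, 1, 0, 0, 1, 1]
centers, local = fit(X, y, k=2)
```

Each query point is first routed to its cluster and then classified by that cluster's local model only.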

  8. An Agent Based Classification Model

    OpenAIRE

    Gu, Feng; Aickelin, Uwe; Greensmith, Julie

    2009-01-01

The major function of this model is to access the UCI Wisconsin Breast Cancer data-set [1] and classify the data items into two categories: normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, p...

  9. Agent-Based Optimization

    CERN Document Server

    Jędrzejowicz, Piotr; Kacprzyk, Janusz

    2013-01-01

This volume presents a collection of original research works by leading specialists focusing on novel and promising approaches in which the multi-agent system paradigm is used to support, enhance or replace traditional approaches to solving difficult optimization problems. The editors have invited several well-known specialists to present their solutions, tools, and models falling under the common denominator of agent-based optimization. The book consists of eight chapters covering examples of application of the multi-agent paradigm and respective customized tools to solve difficult optimization problems arising in different areas such as machine learning, scheduling, transportation and, more generally, distributed and cooperative problem solving.

  10. A new multi criteria classification approach in a multi agent system applied to SEEG analysis.

    OpenAIRE

Kinie, Abel; Ndiaye, Mamadou Lamine; Montois, Jean-Jacques; Jacquelet, Yann

    2007-01-01

This work focuses on the study of the organization of SEEG signals during epileptic seizures using a multi-agent system approach. This approach is based on cooperative mechanisms of self-organization at the micro level and the emergence of a global function at the macro level. In order to evaluate this approach we propose a distributed collaborative approach for the classification of the signals of interest. This new multi-criteria classification method is able to provide a relevant brain ...

  11. A New Classification Approach Based on Multiple Classification Rules

    OpenAIRE

    Zhongmei Zhou

    2014-01-01

A good classifier can correctly predict new data for which the class label is unknown, so it is important to construct a high-accuracy classifier. Hence, classification techniques are very useful in ubiquitous computing. Associative classification achieves higher classification accuracy than some traditional rule-based classification approaches. However, the approach also has two major deficiencies. First, it generates a very large number of association classification rules, especially when t...

  12. Biogeography based Satellite Image Classification

    CERN Document Server

    Panchal, V K; Kaur, Navdeep; Kundra, Harish

    2009-01-01

Biogeography is the study of the geographical distribution of biological organisms. The mindset of the engineer is that we can learn from nature. Biogeography Based Optimization (BBO) is a burgeoning nature-inspired technique for finding the optimal solution of a problem. Satellite image classification is an important task because it is the only way we can know about the land cover map of inaccessible areas. Though satellite images have been classified in the past using various techniques, researchers are always seeking alternative strategies for satellite image classification so that they may be prepared to select the most appropriate technique for the feature extraction task at hand. This paper focuses on classification of the satellite image of a particular land cover using the theory of Biogeography Based Optimization. The original BBO algorithm does not have the inbuilt property of clustering, which is required during image classification. Hence modifications have been proposed to the original algorithm and...

  13. A Novel Approach for Cardiac Disease Prediction and Classification Using Intelligent Agents

    CERN Document Server

    Kuttikrishnan, Murugesan

    2010-01-01

The goal is to develop a novel approach for cardiac disease prediction and diagnosis using intelligent agents. Initially the symptoms are preprocessed using filter and wrapper based agents. The filter removes the missing or irrelevant symptoms. The wrapper is used to extract the data in the data set according to the threshold limits. The dependency of each symptom is identified using a dependency checker agent. The classification is based on the prior and posterior probability of the symptoms with the evidence value. Finally the symptoms are classified into five classes, namely absence, starting, mild, moderate and serious. Using the cooperative approach the cardiac problem is solved and verified.

  14. Classification-based reasoning

    Science.gov (United States)

    Gomez, Fernando; Segami, Carlos

    1991-01-01

    A representation formalism for N-ary relations, quantification, and definition of concepts is described. Three types of conditions are associated with the concepts: (1) necessary and sufficient properties, (2) contingent properties, and (3) necessary properties. Also explained is how complex chains of inferences can be accomplished by representing existentially quantified sentences, and concepts denoted by restrictive relative clauses as classification hierarchies. The representation structures that make possible the inferences are explained first, followed by the reasoning algorithms that draw the inferences from the knowledge structures. All the ideas explained have been implemented and are part of the information retrieval component of a program called Snowy. An appendix contains a brief session with the program.

  15. Agent Based Modelling and Simulation of Social Processes

    OpenAIRE

    Armano Srbljinovic; Ognjen Skunca

    2003-01-01

The paper provides an introduction to agent-based modelling and simulation of social processes. The reader is introduced to the worldview underlying agent-based models, some basic terminology, basic properties of agent-based models, and what one can and cannot expect from such models, particularly when they are applied to social-scientific investigation. Special attention is given to the issues of validation. Classification-ACM-1998: J.4 [Computer Applications]; Social and behavior...

  16. Modulation classification based on spectrogram

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

The aim of modulation classification (MC) is to identify the modulation type of a communication signal. It plays an important role in many cooperative or noncooperative communication applications. Three spectrogram-based modulation classification methods are proposed. Their recognition scope and performance are investigated or evaluated by theoretical analysis and extensive simulation studies. The method taking moment-like features is robust to frequency offset while the other two, which make use of principal component analysis (PCA) with different transformation inputs, can achieve satisfactory accuracy even at low SNR (as low as 2 dB). Due to the properties of the spectrogram, the statistical pattern recognition techniques, and the image preprocessing steps, all of our methods are insensitive to unknown phase and frequency offsets, timing errors, and the arriving sequence of symbols.
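One spectrogram-derived cue is easy to sketch: the peak-frequency bin per frame is constant for an unmodulated carrier but varies for a frequency-hopping (2FSK-like) signal. This is far simpler than the paper's moment and PCA features, and the synthetic signals below are chosen to land on exact DFT bins.

```python
# Spectrogram cue sketch: compute the peak-magnitude frequency bin of each
# frame (a one-column-at-a-time spectrogram). A fixed carrier keeps the
# same peak bin in every frame; a 2FSK-like signal hops between two bins.
import math

def peak_bins(x, frame=64):
    bins = []
    for s in range(0, len(x) - frame + 1, frame):
        seg = x[s:s + frame]
        mags = []
        for k in range(frame // 2):
            re = sum(seg[t] * math.cos(2 * math.pi * k * t / frame)
                     for t in range(frame))
            im = sum(seg[t] * math.sin(2 * math.pi * k * t / frame)
                     for t in range(frame))
            mags.append(re * re + im * im)
        bins.append(mags.index(max(mags)))
    return bins

fs, frame = 640, 64
carrier = [math.sin(2 * math.pi * 100 * t / fs) for t in range(640)]
# 2FSK-like: alternate between 100 Hz and 200 Hz on every frame
fsk = [math.sin(2 * math.pi * (100 if (t // frame) % 2 == 0 else 200) * t / fs)
       for t in range(640)]
carrier_bins = peak_bins(carrier, frame)
fsk_bins = peak_bins(fsk, frame)
```

The number of distinct peak bins over time already separates the two signal types, which hints at why richer spectrogram statistics work so well.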

  17. Projection Classification Based Iterative Algorithm

    Science.gov (United States)

    Zhang, Ruiqiu; Li, Chen; Gao, Wenhua

    2015-05-01

Iterative algorithms perform well in 3D image reconstruction because they do not need complete projection data. They can be applied to the inspection of BGA-based solder joints, but converge slowly, especially with X-ray laminography, which yields poorer reconstructions than complete-projection setups. This paper applies a projection-classification method that separates the object into three parts, i.e. solute, solution and air, and assumes that the reconstruction speed decreases linearly from the solution to the two other parts on either side. The SART and CAV algorithms are then improved under the proposed idea. Simulation experiments with incomplete projection images indicate the fast convergence of the improved iterative algorithms and the effectiveness of the proposed method; the fewer the projection images, the greater the advantage.

  18. SAM : Semantic Agent Model for SWRL rule-based agents

    OpenAIRE

    Subercaze, Julien; Maret, Pierre

    2010-01-01

Semantic Web technologies are part of multi-agent engineering, especially regarding knowledge base support. Recent advances in the field of logic for the semantic web enable a new range of applications. Among them, programming agents based on semantic rules is a promising field. In this paper we present a semantic agent model that allows SWRL programming of agents. Our approach, based on the extended finite state machine concept, results in a three-layer architecture...

  19. From fault classification to fault tolerance for multi-agent systems

    CERN Document Server

    Potiron, Katia; Taillibert, Patrick

    2013-01-01

Faults are a concern for Multi-Agent Systems (MAS) designers, especially if the MAS are built for industrial or military use, because there must be some guarantee of dependability. Some fault classification exists for classical systems, and is used to define faults. When dependability is at stake, such fault classification may be used from the beginning of the system's conception to define fault classes and specify which types of faults are expected. Thus, one may want to use fault classification for MAS; however, From Fault Classification to Fault Tolerance for Multi-Agent Systems argues that...

  20. Malware Detection, Supportive Software Agents and Its Classification Schemes

    Directory of Open Access Journals (Sweden)

    Adebayo, Olawale Surajudeen

    2012-12-01

Over time, the task of curbing the emergence of malware and its dastardly activities has been addressed in terms of analysis, detection and containment. Malware is a general term used to describe the category of malicious software that is part of the security threats to computer and internet systems; it is a malignant program designed to hamper the effectiveness of a computer and internet system. This paper aims at identifying malware as one of the most dreaded threats to emerging computer and communication technology. The paper identifies the categories of malware, malware classification algorithms, malware activities, and ways of preventing and removing malware if it eventually infects a system. The research also describes tools that classify malware datasets using a rule-based classification scheme and machine learning algorithms to distinguish malicious programs from normal programs through pattern recognition.

  1. Review of therapeutic agents for burns pruritus and protocols for management in adult and paediatric patients using the GRADE classification

    Directory of Open Access Journals (Sweden)

    Goutos Ioannis

    2010-10-01

To review the current evidence on therapeutic agents for burns pruritus and use the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) classification to propose therapeutic protocols for adult and paediatric patients. All published interventions for burns pruritus were analysed by a multidisciplinary panel of burns specialists following the GRADE classification to rate individual agents. Following the collation of results and panel discussion, consensus protocols are presented. Twenty-three studies appraising therapeutic agents in the burns literature were identified. The majority of these studies (16 out of 23) are of an observational nature, making an evidence-based approach to defining optimal therapy infeasible. Our multidisciplinary approach employing the GRADE classification recommends the use of antihistamines (cetirizine and cimetidine) and gabapentin as the first-line pharmacological agents for both adult and paediatric patients. Ondansetron and loratadine are the second-line medications in our protocols. We additionally recommend a variety of non-pharmacological adjuncts for the perusal of clinicians in order to maximise symptomatic relief in patients troubled with postburn itch. Most studies in the subject area lack sufficient statistical power to dictate a 'gold standard' treatment agent for burns itch. We encourage clinicians to employ the GRADE system in order to delineate the most appropriate therapeutic approach for burns pruritus until further research elucidates the most efficacious interventions. This widely adopted classification empowers burns clinicians to tailor therapeutic regimens according to current evidence, patient values, risks and resource considerations in different medical environments.

  2. Arabic Text Mining Using Rule Based Classification

    OpenAIRE

    Fadi Thabtah; Omar Gharaibeh; Rashid Al-Zubaidy

    2012-01-01

A well-known problem in the domain of text mining is text classification, which concerns mapping textual documents into one or more predefined categories based on their content. The text classification arena has recently attracted many researchers because of the massive amounts of online documents and text archives which hold essential information for a decision-making process. In this field, most research focuses on classifying English documents while there are limited studi...

  3. Texture Classification based on Gabor Wavelet

    OpenAIRE

    Amandeep Kaur; Savita Gupta

    2012-01-01

This paper presents a comparison of texture classification algorithms based on Gabor wavelets. The focus of this paper is on the feature extraction scheme for texture classification. The texture features of an image can be classified using texture descriptors. In this paper we have used the Homogeneous Texture Descriptor, which uses the Gabor wavelet concept. For texture classification, we have used an online texture database, Brodatz's database, and three advanced well-known classifiers: Support Vec...

  4. Domain-Based Classification of CSCW Systems

    Directory of Open Access Journals (Sweden)

    M. Khan

    2011-11-01

CSCW systems are widely used for group activities in different organizations and setups. This study briefly describes the existing classifications of CSCW systems and their shortcomings. These existing classifications are helpful for categorizing systems based on a general set of CSCW characteristics but do not provide any guidance towards system design and evaluation. After a literature review of the ACM CSCW conference (1986-2010), a new classification is proposed to categorize CSCW systems on the basis of domains. This proposed classification may help researchers to come up with more effective design and evaluation methods for CSCW systems.

  5. Agent Assignment for Process Management: Pattern Based Agent Performance Evaluation

    Science.gov (United States)

    Jablonski, Stefan; Talib, Ramzan

In almost all workflow management systems, the role concept is determined once, at the introduction of the workflow application, and is never re-evaluated to observe how successfully certain processes are performed by the authorized agents. This paper describes an approach which evaluates how successfully agents are working and feeds this information back for future agent assignment to achieve maximum business benefit for the enterprise. The approach is called Pattern based Agent Performance Evaluation (PAPE) and is based on machine learning combined with post-processing techniques. We report the results of our experiments and discuss issues and improvements of our approach.

  6. Agent Based Individual Traffic Guidance

    DEFF Research Database (Denmark)

    Wanscher, Jørgen

This thesis investigates the possibilities of applying Operations Research (OR) to autonomous vehicular traffic. The explicit difference from most other research today is that we presume an agent is present in every vehicle - hence Agent Based Individual Traffic guidance (ABIT). The next… that the system can be divided into two separate constituents: the immediate dispersion, which is used for small areas and quick response, and the individual alleviation, which considers longer-distance decision support. Both of these require intricate models and cost functions which at the beginning of the project were not previously considered. We define a special inseparable cost function and develop a solution complex capable of using this cost function. In relation to calibration and estimation of statistical models used for dynamic route guidance we worked with generating random number...

  7. Texture Classification Based on Texton Features

    Directory of Open Access Journals (Sweden)

    U Ravi Babu

    2012-08-01

Texture analysis plays an important role in the interpretation, understanding and recognition of terrain, biomedical or microscopic images. Each texture analysis method depends upon how the selected texture features characterize the image; whenever a new texture feature is derived, it is tested as to whether it precisely classifies the textures. Not only the texture features themselves but also the way in which they are applied is significant for precise and accurate texture classification and analysis. To achieve high classification accuracy, the present paper proposes a new method based on textons for efficient rotationally invariant texture classification. The proposed Texton Features (TF) evaluate the relationship between the values of neighboring pixels. The proposed classification algorithm evaluates histogram-based techniques on TF for a precise classification. The experimental results on various stone textures indicate the efficacy of the proposed method when compared to other methods.
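A texton-style feature can be sketched by coding each 2x2 block according to which neighbors equal the top-left pixel and histogramming the codes; textures are then compared by histogram distance. This is a simplification of the paper's Texton Features, shown only to make the neighboring-pixel idea concrete.

```python
# Texton-style sketch: each 2x2 block gets a 3-bit code recording which of
# its three neighbors equal the top-left pixel; the normalized histogram
# of codes is the texture signature, compared here by L1 distance.

def texton_histogram(img):
    hist = [0] * 8
    for y in range(len(img) - 1):
        for x in range(len(img[0]) - 1):
            a = img[y][x]
            code = ((img[y][x + 1] == a)          # right neighbor
                    | (img[y + 1][x] == a) << 1   # lower neighbor
                    | (img[y + 1][x + 1] == a) << 2)  # diagonal neighbor
            hist[code] += 1
    total = sum(hist)
    return [h / total for h in hist]

def l1(p, q):
    return sum(abs(a - b) for a, b in zip(p, q))

stripes = [[x % 2 for x in range(8)] for _ in range(8)]        # vertical stripes
checker = [[(x + y) % 2 for x in range(8)] for y in range(8)]  # checkerboard
h_stripes = texton_histogram(stripes)
h_checker = texton_histogram(checker)
```

Each texture concentrates its mass on one code (stripes: only the lower neighbor matches; checkerboard: only the diagonal matches), so the histograms are maximally far apart.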

  8. Fingerprint Classification based on Orientation Estimation

    Directory of Open Access Journals (Sweden)

    Manish Mathuria

    2013-06-01

The geometric characteristics of an object make it distinguishable. Objects present in the environment are known by their features and properties. A fingerprint image, as an object, may be classified into subclasses based on its minutiae structure. The minutiae structure may be categorized as ridge curves generated by orientation estimation. The extracted curves are invariant to location, rotation and scaling. This classification approach helps to manage fingerprints by class. This research provides a better combination of data mining and classification.
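Orientation estimation, the step that generates the ridge curves, is commonly done by averaging the gradient structure over a block and halving the double angle. Below is a minimal sketch on a synthetic diagonal pattern, not the paper's fingerprint data.

```python
# Block orientation sketch: average the gradient tensor (gxx, gyy, gxy)
# over a block and recover the dominant gradient direction from the
# double angle; fingerprint ridges run perpendicular to this direction.
import math

def block_orientation(img):
    gxx = gyy = gxy = 0.0
    for y in range(len(img) - 1):
        for x in range(len(img[0]) - 1):
            gx = img[y][x + 1] - img[y][x]     # forward differences
            gy = img[y + 1][x] - img[y][x]
            gxx += gx * gx
            gyy += gy * gy
            gxy += gx * gy
    return 0.5 * math.atan2(2.0 * gxy, gxx - gyy)

# intensity varies along the x + y diagonal, so the gradient points at 45 degrees
img = [[math.sin(0.4 * (x + y)) for x in range(16)] for y in range(16)]
theta = block_orientation(img)
```

The double-angle averaging is what makes the estimate stable: gradients pointing in opposite directions along a ridge reinforce rather than cancel.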

  9. Agent Based Individual Traffic guidance

    DEFF Research Database (Denmark)

    Wanscher, Jørgen Bundgaard

    2004-01-01

When working with traffic planning or guidance it is common practice to view the vehicles as a combined mass. From this, models are employed to specify the vehicle supply and demand for each region. As the models are complex and the calculations are equally demanding, the regions and the detail of the road network are aggregated. As a result the calculations reveal only what the mass of vehicles is doing and not what a single vehicle is doing. This is the crucial difference to ABIT (Agent Based Individual Traffic guidance). ABIT is based on the fact that information on the destination of each vehicle can be obtained through cellular phone tracking or GPS systems. This information can then be used to provide individual traffic guidance, as opposed to the mass information systems of today - dynamic road signs and traffic radio. The goal is to achieve better usage of road and time. The main topic...

  10. An Authentication Technique Based on Classification

    Institute of Scientific and Technical Information of China (English)

    李钢; 杨杰

    2004-01-01

We present a novel watermarking approach based on classification for authentication, in which a watermark is embedded into the host image. When the marked image is modified, the extracted watermark differs from the original watermark, and different kinds of modification lead to different extracted watermarks. In this paper, different kinds of modification are treated as classes, and we use a classification algorithm to recognize the modifications with high probability. Simulation results show that the proposed method is promising and effective.

  11. Texture Classification based on Gabor Wavelet

    Directory of Open Access Journals (Sweden)

    Amandeep Kaur

    2012-07-01

This paper presents a comparison of texture classification algorithms based on Gabor wavelets. The focus of this paper is on the feature extraction scheme for texture classification. The texture features of an image can be classified using texture descriptors. In this paper we have used the Homogeneous Texture Descriptor, which uses the Gabor wavelet concept. For texture classification, we have used an online texture database, Brodatz's database, and three advanced well-known classifiers: Support Vector Machine, the K-nearest neighbor method and the decision tree induction method. The results show that classification using Support Vector Machines gives better results compared to the other classifiers. It can accurately discriminate between testing image data and training data.
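A Gabor-based texture feature can be sketched as the mean magnitude response of a small oriented filter bank; classification then reduces to comparing feature vectors. The kernel parameters and stripe textures below are illustrative assumptions, not the Homogeneous Texture Descriptor itself.

```python
# Gabor feature sketch: build two oriented Gabor kernels and use the mean
# |filter response| over the image as a 2-element feature vector. A texture
# matches the orientation whose filter responds most strongly.
import math

def gabor_kernel(theta, freq=0.25, size=7, sigma=2.0):
    half = size // 2
    kern = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)
            env = math.exp(-(x * x + y * y) / (2 * sigma * sigma))
            row.append(env * math.cos(2 * math.pi * freq * xr))
        kern.append(row)
    return kern

def mean_response(img, kern):
    half = len(kern) // 2
    total, count = 0.0, 0
    for y in range(half, len(img) - half):
        for x in range(half, len(img[0]) - half):
            acc = sum(kern[dy + half][dx + half] * img[y + dy][x + dx]
                      for dy in range(-half, half + 1)
                      for dx in range(-half, half + 1))
            total += abs(acc)
            count += 1
    return total / count

horiz = [[math.sin(2 * math.pi * 0.25 * y) for x in range(16)] for y in range(16)]
vert  = [[math.sin(2 * math.pi * 0.25 * x) for x in range(16)] for y in range(16)]
bank = [gabor_kernel(0.0), gabor_kernel(math.pi / 2)]  # x-tuned, y-tuned
feat_h = [mean_response(horiz, k) for k in bank]
feat_v = [mean_response(vert, k) for k in bank]
```

The filter whose oscillation axis matches the stripe direction dominates the feature vector, which is exactly the discriminative signal an SVM would exploit.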

  12. Normalization Benefits Microarray-Based Classification

    Directory of Open Access Journals (Sweden)

    Chen Yidong

    2006-01-01

When using cDNA microarrays, normalization to correct labeling bias is a common preliminary step before further data analysis is applied, its objective being to reduce the variation between arrays. To date, assessment of the effectiveness of normalization has mainly been confined to the ability to detect differentially expressed genes. Since a major use of microarrays is the expression-based phenotype classification, it is important to evaluate microarray normalization procedures relative to classification. Using a model-based approach, we model the systemic-error process to generate synthetic gene-expression values with known ground truth. These synthetic expression values are subjected to typical normalization methods and passed through a set of classification rules, the objective being to carry out a systematic study of the effect of normalization on classification. Three normalization methods are considered: offset, linear regression, and Lowess regression. Seven classification rules are considered: 3-nearest neighbor, linear support vector machine, linear discriminant analysis, regular histogram, Gaussian kernel, perceptron, and multiple perceptron with majority voting. The results of the first three are presented in the paper, with the full results being given on a complementary website. The conclusion from the different experiment models considered in the study is that normalization can have a significant benefit for classification under difficult experimental conditions, with linear and Lowess regression slightly outperforming the offset method.
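Two of the three normalizations compared (offset and linear regression) can be sketched directly; Lowess is omitted for brevity. The two-channel arrays below are toy values standing in for microarray intensities.

```python
# Normalization sketch: align one channel to a reference either by a plain
# mean shift (offset) or by least-squares linear regression. When the bias
# is multiplicative as well as additive, regression recovers the reference
# exactly while the offset method cannot.

def mean(v):
    return sum(v) / len(v)

def offset_normalize(x, ref):
    shift = mean(ref) - mean(x)
    return [xi + shift for xi in x]

def regression_normalize(x, ref):
    mx, mr = mean(x), mean(ref)
    slope = (sum((xi - mx) * (ri - mr) for xi, ri in zip(x, ref))
             / sum((xi - mx) ** 2 for xi in x))
    return [mr + slope * (xi - mx) for xi in x]

ref  = [1.0, 2.0, 3.0, 4.0]
skew = [2.5, 4.5, 6.5, 8.5]          # ref scaled by 2.0 and shifted by 0.5
reg = regression_normalize(skew, ref)
off = offset_normalize(skew, ref)
```

This is one simple reason regression-style methods edge out the offset method in the study's difficult conditions: an offset can only remove additive bias.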

  13. Normalization Benefits Microarray-Based Classification

    Directory of Open Access Journals (Sweden)

    Edward R. Dougherty

    2006-08-01

When using cDNA microarrays, normalization to correct labeling bias is a common preliminary step before further data analysis is applied, its objective being to reduce the variation between arrays. To date, assessment of the effectiveness of normalization has mainly been confined to the ability to detect differentially expressed genes. Since a major use of microarrays is the expression-based phenotype classification, it is important to evaluate microarray normalization procedures relative to classification. Using a model-based approach, we model the systemic-error process to generate synthetic gene-expression values with known ground truth. These synthetic expression values are subjected to typical normalization methods and passed through a set of classification rules, the objective being to carry out a systematic study of the effect of normalization on classification. Three normalization methods are considered: offset, linear regression, and Lowess regression. Seven classification rules are considered: 3-nearest neighbor, linear support vector machine, linear discriminant analysis, regular histogram, Gaussian kernel, perceptron, and multiple perceptron with majority voting. The results of the first three are presented in the paper, with the full results being given on a complementary website. The conclusion from the different experiment models considered in the study is that normalization can have a significant benefit for classification under difficult experimental conditions, with linear and Lowess regression slightly outperforming the offset method.

  14. Agent-based enterprise integration

    Energy Technology Data Exchange (ETDEWEB)

    N. M. Berry; C. M. Pancerella

    1998-12-01

    The authors are developing and deploying software agents in an enterprise information architecture such that the agents manage enterprise resources and facilitate user interaction with these resources. The enterprise agents are built on top of a robust software architecture for data exchange and tool integration across heterogeneous hardware and software. The resulting distributed multi-agent system serves as a method of enhancing enterprises in the following ways: providing users with knowledge about enterprise resources and applications; accessing the dynamically changing enterprise; locating enterprise applications and services; and improving search capabilities for applications and data. Furthermore, agents can access non-agents (i.e., databases and tools) through the enterprise framework. The ultimate target of the effort is the user; they are attempting to increase user productivity in the enterprise. This paper describes their design and early implementation and discusses the planned future work.

  15. CATS-based Agents That Err

    Science.gov (United States)

    Callantine, Todd J.

    2002-01-01

    This report describes preliminary research on intelligent agents that make errors. Such agents are crucial to the development of novel agent-based techniques for assessing system safety. The agents extend an agent architecture derived from the Crew Activity Tracking System that has been used as the basis for air traffic controller agents. The report first reviews several error taxonomies. Next, it presents an overview of the air traffic controller agents, then details several mechanisms for causing the agents to err in realistic ways. The report presents a performance assessment of the error-generating agents, and identifies directions for further research. The research was supported by the System-Wide Accident Prevention element of the FAA/NASA Aviation Safety Program.

  16. An Agent-Based Distributed Manufacturing System

    Institute of Scientific and Technical Information of China (English)

    J.Li; J.Y.H.Fuh; Y.F.Zhang; A.Y.C.Nee

    2006-01-01

Agent theories have shown promising capability in solving distributed complex systems ever since their development. In this paper, a multi-agent based distributed product design and manufacturing planning system is presented. The objective of the research is to develop a distributed collaborative design environment for supporting cooperation among the existing engineering functions. In the system, the functional agents for design, manufacturability evaluation, process planning and scheduling are efficiently integrated with a facilitator agent. This paper first gives an introduction to the system structure; the definitions of each executive agent are then described, and a prototype of the proposed system is included at the end.

  17. Malware Classification based on Call Graph Clustering

    OpenAIRE

    Kinable, Joris; Kostakis, Orestis

    2010-01-01

Each day, anti-virus companies receive tens of thousands of samples of potentially harmful executables. Many of the malicious samples are variations of previously encountered malware, created by their authors to evade pattern-based detection. Dealing with these large amounts of data requires robust, automatic detection approaches. This paper studies malware classification based on call graph clustering. By representing malware samples as call graphs, it is possible to abstract certain variations...
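A toy sketch of the idea (the similarity measure, threshold, and sample names here are stand-ins, not the paper's actual graph-matching metric): represent each sample's call graph as a set of caller-callee edges and cluster greedily by Jaccard distance.

```python
# Each hypothetical sample maps to a set of (caller, callee) edges.
def jaccard_distance(g1, g2):
    union = g1 | g2
    return 1.0 - len(g1 & g2) / len(union) if union else 0.0

def cluster(graphs, threshold=0.5):
    """Greedy single-pass clustering: join the first cluster within threshold."""
    clusters = []
    for name, edges in graphs.items():
        for members in clusters:
            if any(jaccard_distance(edges, graphs[m]) <= threshold for m in members):
                members.append(name)
                break
        else:
            clusters.append([name])
    return clusters

samples = {
    "a.exe": {("main", "crypt"), ("crypt", "send")},
    "a_variant.exe": {("main", "crypt"), ("crypt", "send"), ("main", "sleep")},
    "benign.exe": {("main", "print")},
}
print(cluster(samples))  # → [['a.exe', 'a_variant.exe'], ['benign.exe']]
```

The variant shares most of its edges with the original, so it lands in the same cluster, while the unrelated binary does not.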

  18. Image-based Vehicle Classification System

    CERN Document Server

    Ng, Jun Yee

    2012-01-01

Electronic toll collection (ETC) systems have become a common trend for toll collection on toll roads. The implementation of electronic toll collection allows vehicles to travel at low or full speed during toll payment, which helps to avoid traffic delays at the toll road. One of the major components of an electronic toll collection system is the automatic vehicle detection and classification (AVDC) system, which is important for classifying the vehicle so that the toll is charged according to the vehicle class. A vision-based vehicle classification system is one type of vehicle classification system which adopts a camera as the input sensing device. This type of system has an advantage over the rest in that it is cost efficient, as a low-cost camera is used. The implementation of a vision-based vehicle classification system requires a lower initial investment cost and is very suitable for the toll collection trend migration in Malaysia from a single ETC system to full-scale multi-lane free flow (MLFF). This project ...

  19. Movie Review Classification and Feature based Summarization of Movie Reviews

    Directory of Open Access Journals (Sweden)

Sabeeha Mohammed Basheer, Syed Farook

    2013-07-01

Full Text Available Sentiment classification and feature based summarization are essential steps in the classification and summarization of movie reviews. The movie review classification is based on sentiment classification, and condensed descriptions of movie reviews are generated from the feature based summarization. Experiments are conducted to identify the best machine learning based sentiment classification approach. Latent Semantic Analysis and Latent Dirichlet Allocation were compared to identify features, which in turn affect the summary size. The focus of the system design is on classification accuracy and system response time.

  20. Mechanism-based drug exposure classification in pharmacoepidemiological studies

    NARCIS (Netherlands)

    Verdel, B.M.

    2010-01-01

    Mechanism-based classification of drug exposure in pharmacoepidemiological studies In pharmacoepidemiology and pharmacovigilance, the relation between drug exposure and clinical outcomes is crucial. Exposure classification in pharmacoepidemiological studies is traditionally based on pharmacotherapeu

  1. Module-Based Breast Cancer Classification

    OpenAIRE

    Zhang, Yuji; Xuan, Jianhua; Clarke, Robert; Ressom, Habtom W

    2013-01-01

The reliability and reproducibility of gene biomarkers for classification of cancer patients have been challenged due to measurement noise and biological heterogeneity among patients. In this paper, we propose a novel module-based feature selection framework, which integrates biological network information and gene expression data to identify biomarkers not as individual genes but as functional modules. Results from four breast cancer studies demonstrate that the identified module biomarkers i...

  2. Contextual Deep CNN Based Hyperspectral Classification

    OpenAIRE

    Lee, Hyungtae; Kwon, Heesung

    2016-01-01

    In this paper, we describe a novel deep convolutional neural networks (CNN) based approach called contextual deep CNN that can jointly exploit spatial and spectral features for hyperspectral image classification. The contextual deep CNN first concurrently applies multiple 3-dimensional local convolutional filters with different sizes jointly exploiting spatial and spectral features of a hyperspectral image. The initial spatial and spectral feature maps obtained from applying the variable size...

  3. Web-Based Computing Resource Agent Publishing

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

Web-based Computing Resource Publishing is an efficient way to provide additional computing capacity for users who need more computing resources than they themselves can afford, by making use of idle computing resources in the Web. Extensibility and reliability are crucial for agent publishing. The parent-child agent framework and the primary-slave agent framework are proposed and discussed in detail.

  4. Detection and classification of organophosphate nerve agent simulants using support vector machines with multiarray sensors.

    Science.gov (United States)

    Sadik, Omowunmi; Land, Walker H; Wanekaya, Adam K; Uematsu, Michiko; Embrechts, Mark J; Wong, Lut; Leibensperger, Dale; Volykin, Alex

    2004-01-01

The need for rapid and accurate detection systems is expanding and the utilization of cross-reactive sensor arrays to detect chemical warfare agents in conjunction with novel computational techniques may prove to be a potential solution to this challenge. We have investigated the detection, prediction, and classification of various organophosphate (OP) nerve agent simulants using sensor arrays with a novel learning scheme known as support vector machines (SVMs). The OPs tested include parathion, malathion, dichlorvos, trichlorfon, paraoxon, and diazinon. A new data reduction software program was written in MATLAB V. 6.1 to extract steady-state and kinetic data from the sensor arrays. The program also creates training sets by mixing and randomly sorting any combination of data categories into both positive and negative cases. The resulting signals were fed into SVM software for "pairwise" and "one vs. all" classification. Experimental results for this new paradigm show a significant increase in classification accuracy when compared to artificial neural networks (ANNs). Three kernels, the S2000, the polynomial, and the Gaussian radial basis function (RBF), were tested and compared to the ANN. The following measures of performance were considered in the pairwise classification: receiver operating characteristic (ROC) Az indices, specificities, and positive predictive values (PPVs). The increases in ROC Az values, specificities, and PPVs ranged from 5% to 25%, 108% to 204%, and 13% to 54%, respectively, in all OP pairs studied when compared to the ANN baseline. Dichlorvos, trichlorfon, and paraoxon were perfectly predicted. Positive prediction for malathion was 95%. PMID:15032529
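The "pairwise" voting decomposition described above can be sketched as follows, with a nearest-centroid rule standing in for the SVM binary learner and fully synthetic sensor responses (the simulant names and the 7-element array size are taken from the abstract; everything else is invented):

```python
import numpy as np
from itertools import combinations

# Synthetic 7-element sensor-array responses for three of the OP simulants.
rng = np.random.default_rng(1)
data = {name: rng.normal(loc=i, scale=0.3, size=(20, 7))
        for i, name in enumerate(["parathion", "dichlorvos", "paraoxon"])}

def centroid_fit(pos, neg):
    """Stand-in binary classifier: +1 if nearer the positive class centroid."""
    cp, cn = pos.mean(axis=0), neg.mean(axis=0)
    return lambda x: 1 if np.linalg.norm(x - cp) < np.linalg.norm(x - cn) else -1

# "Pairwise" scheme: one binary classifier per pair of agents, majority vote.
pairwise = {(a, b): centroid_fit(data[a], data[b])
            for a, b in combinations(data, 2)}

x = data["dichlorvos"][0]                 # query sample
votes = {name: 0 for name in data}
for (a, b), clf in pairwise.items():
    votes[a if clf(x) == 1 else b] += 1
print(max(votes, key=votes.get))  # → dichlorvos
```

The "one vs. all" scheme replaces the pair classifiers with one classifier per agent against the pooled rest, but the voting structure is the same.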

  5. Naïve Bayesian Learning based Multi Agent Architecture for Telemedicine

    Directory of Open Access Journals (Sweden)

    Ei Ei Chaw

    2013-04-01

Full Text Available Agent-based systems are one of the most vibrant and important areas of research and development to have emerged in Information Technology in recent years. They are one of the most promising approaches for designing and implementing autonomous, intelligent and social software assistants capable of supporting human decision-making. These kinds of systems are believed to be appropriate for many aspects of the healthcare domain. As a result, there is growing interest among researchers in the application of agent-based techniques to problems in the healthcare domain. The adoption of agent technologies and multi-agent systems constitutes an emerging area in bioinformatics. Multi-agent based medical diagnosis systems may improve traditionally developed medical computational systems and may also support medical staff in decision-making. In this paper, we simulate a multi-agent system for cancer classification. The proposed architecture consists of service provider agents as the upper layer, a coordinator agent as the middle layer and an initial agent as the lowest layer. The coordinator agent serves as a matchmaker agent that uses the Naïve Bayesian learning method for obtaining general knowledge and selects the best service provider agent using a matchmaking mechanism. Therefore this system can reduce the communication overhead between agents for sending messages and transferring data and can avoid sending the problem to irrelevant agents.
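A minimal sketch of the matchmaking step (the task features, provider names, and routing history are all invented): the coordinator scores each service-provider agent with a categorical Naïve Bayes model over past routings and forwards the request to the highest-scoring one.

```python
from collections import Counter, defaultdict

# Invented routing history: (task features, provider that handled it well).
history = [
    ({"task": "cancer", "size": "large"}, "svm_agent"),
    ({"task": "cancer", "size": "small"}, "nb_agent"),
    ({"task": "cancer", "size": "large"}, "svm_agent"),
    ({"task": "text",   "size": "small"}, "nb_agent"),
]

priors = Counter(provider for _, provider in history)
likelihood = defaultdict(Counter)            # (feature, value) -> provider counts
for feats, provider in history:
    for f, v in feats.items():
        likelihood[(f, v)][provider] += 1

def best_provider(feats):
    """Score each provider with Naive Bayes and return the most probable one."""
    def score(p):
        s = priors[p] / len(history)
        for f, v in feats.items():
            # Crude add-one smoothing, enough for the sketch.
            s *= (likelihood[(f, v)][p] + 1) / (priors[p] + 2)
        return s
    return max(priors, key=score)

print(best_provider({"task": "cancer", "size": "large"}))  # → svm_agent
```

Because only the winning provider receives the request, the other agents never see the message, which is the communication saving the abstract refers to.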

  6. Agent-Based Modeling and Mapping of Manufacturing System

    Institute of Scientific and Technical Information of China (English)

Z. Zhang

    2002-01-01

Considering agent-based modeling and mapping in manufacturing systems, some system models are described in this paper, including: Domain Based Hierarchical Structure (DBHS), Cascading Agent Structure (CAS), Proximity Relation Structure (PRS), and Bus-based Network Structure (BNS). In DBHS, one sort of agents, called static agents, individually acts as Domain Agents, Resource Agents, UserInterface Agents and Gateway Agents. And the others, named mobile agents, are the brokers of task and ...

  7. Development of a rapid method for the automatic classification of biological agents' fluorescence spectral signatures

    Science.gov (United States)

    Carestia, Mariachiara; Pizzoferrato, Roberto; Gelfusa, Michela; Cenciarelli, Orlando; Ludovici, Gian Marco; Gabriele, Jessica; Malizia, Andrea; Murari, Andrea; Vega, Jesus; Gaudio, Pasquale

    2015-11-01

Biosecurity and biosafety are key concerns of modern society. Although nanomaterials are improving the capacities of point detectors, standoff detection still appears to be an open issue. Laser-induced fluorescence of biological agents (BAs) has proved to be one of the most promising optical techniques to achieve early standoff detection, but its strengths and weaknesses are still to be fully investigated. In particular, different BAs tend to have similar fluorescence spectra due to the ubiquity of biological endogenous fluorophores producing a signal in the UV range, making data analysis extremely challenging. The Universal Multi Event Locator (UMEL), a general method based on support vector regression, is commonly used to identify characteristic structures in arrays of data. In the first part of this work, we investigate the fluorescence emission spectra of different simulants of BAs and apply UMEL for their automatic classification. In the second part of this work, we elaborate a strategy for applying UMEL to the discrimination of the spectra of different BA simulants. Through this strategy, it has been possible to discriminate between these BA simulants despite the high similarity of their fluorescence spectra. These preliminary results support the use of SVR methods to classify BAs' spectral signatures.

  8. Collaborative Representation based Classification for Face Recognition

    CERN Document Server

    Zhang, Lei; Feng, Xiangchu; Ma, Yi; Zhang, David

    2012-01-01

By coding a query sample as a sparse linear combination of all training samples and then classifying it by evaluating which class leads to the minimal coding residual, sparse representation based classification (SRC) leads to interesting results for robust face recognition. It is widely believed that the l1-norm sparsity constraint on coding coefficients plays a key role in the success of SRC, while its use of all training samples to collaboratively represent the query sample is rather ignored. In this paper we discuss how SRC works, and show that the collaborative representation mechanism used in SRC is much more crucial to its success of face classification. The SRC is a special case of collaborative representation based classification (CRC), which has various instantiations by applying different norms to the coding residual and coding coefficient. More specifically, the l1 or l2 norm characterization of coding residual is related to the robustness of CRC to outlier facial pixels, while the l1 or l2 norm c...
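The l2-regularized instantiation of CRC can be sketched in a few lines on synthetic data: code the query over the whole training dictionary with ridge regression, then pick the class whose own coefficients leave the smallest reconstruction residual. The data here are invented, with each class concentrated along its own axis so the classes are separable.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_per = 30, 10
# Three synthetic classes, each concentrated along its own coordinate axis.
cols, labels = [], []
for c in range(3):
    mean = np.zeros(d)
    mean[c] = 5.0
    cols.append(mean[:, None] + rng.normal(0, 0.2, (d, n_per)))
    labels += [c] * n_per
X = np.hstack(cols)                       # dictionary of all training samples
labels = np.array(labels)
y = X[:, labels == 1].mean(axis=1)        # query resembling class 1

lam = 0.01                                # l2 regularization strength
alpha = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Class-wise residual: reconstruct the query with one class's coefficients only.
res = [np.linalg.norm(y - X @ np.where(labels == c, alpha, 0.0))
       for c in range(3)]
print(int(np.argmin(res)))  # → 1
```

Swapping the ridge solve for an l1-penalized solver recovers the sparse (SRC) variant; the classification-by-residual step is identical.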

  9. Texture feature based liver lesion classification

    Science.gov (United States)

    Doron, Yeela; Mayer-Wolf, Nitzan; Diamant, Idit; Greenspan, Hayit

    2014-03-01

Liver lesion classification is a difficult clinical task. Computerized analysis can support clinical workflow by enabling more objective and reproducible evaluation. In this paper, we evaluate the contribution of several types of texture features for a computer-aided diagnostic (CAD) system which automatically classifies liver lesions from CT images. Based on the assumption that liver lesions of various classes differ in their texture characteristics, a variety of texture features were examined as lesion descriptors. Although texture features are often used for this task, there is currently a lack of detailed research focusing on the comparison across different texture features, or their combinations, on a given dataset. In this work we investigated the performance of Gray Level Co-occurrence Matrix (GLCM), Local Binary Patterns (LBP), Gabor, gray level intensity values and Gabor-based LBP (GLBP), where the features are obtained from a given lesion's region of interest (ROI). For the classification module, SVM and KNN classifiers were examined. Using a single type of texture feature, best result of 91% accuracy, was obtained with Gabor filtering and SVM classification. Combination of Gabor, LBP and Intensity features improved the results to a final accuracy of 97%.
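As a flavour of one of the descriptors compared above, here is a bare-bones 8-neighbour Local Binary Pattern histogram on synthetic ROIs (production LBP implementations add circular interpolation and uniform-pattern binning, which this sketch omits):

```python
import numpy as np

def lbp_histogram(roi):
    """Basic 8-neighbour LBP: compare each interior pixel with its neighbours."""
    c = roi[1:-1, 1:-1]
    neighbours = [roi[:-2, :-2], roi[:-2, 1:-1], roi[:-2, 2:], roi[1:-1, 2:],
                  roi[2:, 2:], roi[2:, 1:-1], roi[2:, :-2], roi[1:-1, :-2]]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, n in enumerate(neighbours):
        codes |= (n >= c).astype(np.uint8) << bit   # one bit per neighbour
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()

rng = np.random.default_rng(0)
stripes = np.tile(np.array([0.0, 255.0]), (32, 16))   # periodic texture
noise = rng.normal(128, 40, (32, 32))                 # irregular texture
h_s, h_n = lbp_histogram(stripes), lbp_histogram(noise)
print(h_s.max() > h_n.max())  # → True
```

The periodic texture concentrates its mass on a couple of LBP codes while the irregular one spreads over many, which is exactly the kind of contrast the classifier exploits.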

  10. Ladar-based terrain cover classification

    Science.gov (United States)

    Macedo, Jose; Manduchi, Roberto; Matthies, Larry H.

    2001-09-01

    An autonomous vehicle driving in a densely vegetated environment needs to be able to discriminate between obstacles (such as rocks) and penetrable vegetation (such as tall grass). We propose a technique for terrain cover classification based on the statistical analysis of the range data produced by a single-axis laser rangefinder (ladar). We first present theoretical models for the range distribution in the presence of homogeneously distributed grass and of obstacles partially occluded by grass. We then validate our results with real-world cases, and propose a simple algorithm to robustly discriminate between vegetation and obstacles based on the local statistical analysis of the range data.
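The discrimination rule the abstract describes can be caricatured in a few lines: simulated ladar returns through grass spread widely in range, while returns from a solid surface cluster tightly, so a standard-deviation threshold on a local window separates the two (the range distributions and threshold below are invented for the sketch).

```python
import numpy as np

rng = np.random.default_rng(0)
grass = 5.0 + rng.exponential(0.5, 100)   # penetrable: returns scatter in depth
rock = 5.0 + rng.normal(0, 0.02, 100)     # hard surface: returns cluster tightly

def classify(ranges, sigma_thresh=0.1):
    """Label a window of range samples by its spread."""
    return "obstacle" if np.std(ranges) < sigma_thresh else "vegetation"

print(classify(grass), classify(rock))  # → vegetation obstacle
```

The theoretical models in the paper make this quantitative by deriving the expected range distribution for homogeneous grass and for a partially occluded obstacle.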

  11. Assurance in Agent-Based Systems

    International Nuclear Information System (INIS)

    Our vision of the future of information systems is one that includes engineered collectives of software agents which are situated in an environment over years and which increasingly improve the performance of the overall system of which they are a part. At a minimum, the movement of agent and multi-agent technology into National Security applications, including their use in information assurance, is apparent today. The use of deliberative, autonomous agents in high-consequence/high-security applications will require a commensurate level of protection and confidence in the predictability of system-level behavior. At Sandia National Laboratories, we have defined and are addressing a research agenda that integrates the surety (safety, security, and reliability) into agent-based systems at a deep level. Surety is addressed at multiple levels: The integrity of individual agents must be protected by addressing potential failure modes and vulnerabilities to malevolent threats. Providing for the surety of the collective requires attention to communications surety issues and mechanisms for identifying and working with trusted collaborators. At the highest level, using agent-based collectives within a large-scale distributed system requires the development of principled design methods to deliver the desired emergent performance or surety characteristics. This position paper will outline the research directions underway at Sandia, will discuss relevant work being performed elsewhere, and will report progress to date toward assurance in agent-based systems

  12. Econophysics of agent-based models

    CERN Document Server

    Aoyama, Hideaki; Chakrabarti, Bikas; Chakraborti, Anirban; Ghosh, Asim

    2014-01-01

    The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations. Most standard economic models assume the existence of the representative agent, who is “perfectly rational” and applies the utility maximization principle when taking action. One reason for this is the desire to keep models mathematically tractable: no tools are available to economists for solving non-linear models of heterogeneous adaptive agents without explicit optimization. In contrast, multi-agent models, which originated from statistical physics considerations, allow us to go beyond the prototype theories of traditional economics involving the representative agent. This book is based on the Econophys-Kolkata VII Workshop, at which many such modelling efforts were presented. In the book, leading researchers in the...

  13. Ecology Based Decentralized Agent Management System

    Science.gov (United States)

    Peysakhov, Maxim D.; Cicirello, Vincent A.; Regli, William C.

    2004-01-01

The problem of maintaining a desired number of mobile agents on a network is not trivial, especially if we want a completely decentralized solution. Decentralized control makes a system more robust and less susceptible to partial failures. The problem is exacerbated on wireless ad hoc networks where host mobility can result in significant changes in the network size and topology. In this paper we propose an ecology-inspired approach to the management of the number of agents. The approach associates agents with living organisms and tasks with food. Agents procreate or die based on the abundance of uncompleted tasks (food). We performed a series of experiments investigating properties of such systems and analyzed their stability under various conditions. We concluded that the ecology-based metaphor can be successfully applied to the management of agent populations on wireless ad hoc networks.
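The procreate/die rule can be illustrated with a deterministic toy loop (arrival rate, thresholds, and initial sizes are invented): agents consume queued tasks, replicate when food per agent is plentiful, and die off when it is scarce, so the population settles at the level the workload can sustain.

```python
# Deterministic toy of the food/agent dynamic: tasks are "food", agents
# replicate under abundance and die off under scarcity (all constants invented).
agents, tasks = 10, 100
for _ in range(50):
    completed = min(agents, tasks)      # each agent completes at most one task
    tasks = tasks - completed + 8       # 8 new tasks arrive per step
    per_agent = tasks / agents
    if per_agent > 2:                   # food abundant -> procreate
        agents += 1
    elif per_agent < 1 and agents > 1:  # food scarce -> die
        agents -= 1
print(agents)  # → 8: the population settles to match the arrival rate
```

No agent needs global knowledge: each decision depends only on locally observed task abundance, which is what makes the scheme decentralized.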

  14. Digital image-based classification of biodiesel.

    Science.gov (United States)

    Costa, Gean Bezerra; Fernandes, David Douglas Sousa; Almeida, Valber Elias; Araújo, Thomas Souto Policarpo; Melo, Jessica Priscila; Diniz, Paulo Henrique Gonçalves Dias; Véras, Germano

    2015-07-01

This work proposes a simple, rapid, inexpensive, and non-destructive methodology based on digital images and pattern recognition techniques for classification of biodiesel according to oil type (cottonseed, sunflower, corn, or soybean). For this, differing color histograms in RGB (extracted from digital images), HSI, Grayscale channels, and their combinations were used as analytical information, which was then statistically evaluated using Soft Independent Modeling by Class Analogy (SIMCA), Partial Least Squares Discriminant Analysis (PLS-DA), and variable selection using the Successive Projections Algorithm associated with Linear Discriminant Analysis (SPA-LDA). Despite good performances by the SIMCA and PLS-DA classification models, SPA-LDA provided better results (up to 95% for all approaches) in terms of accuracy, sensitivity, and specificity for both the training and test sets. The variables selected by the Successive Projections Algorithm clearly contained the information necessary for biodiesel type classification. This is important since a product may exhibit different properties, depending on the feedstock used. Such variations directly influence the quality, and consequently the price. Moreover, intrinsic advantages such as quick analysis, requiring no reagents, and a noteworthy reduction (the avoidance of chemical characterization) of waste generation, all contribute towards the primary objective of green chemistry. PMID:25882407
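The digital-image feature step can be sketched as concatenated per-channel RGB histograms (the "images" below are synthetic colour patches, not real biodiesel photographs):

```python
import numpy as np

def rgb_histogram(img, bins=16):
    """Concatenate per-channel histograms into one feature vector."""
    return np.concatenate([np.histogram(img[..., ch], bins=bins,
                                        range=(0, 256))[0] for ch in range(3)])

rng = np.random.default_rng(0)
soy = rng.normal([200, 180, 60], 10, (32, 32, 3)).clip(0, 255)   # yellowish patch
corn = rng.normal([150, 120, 40], 10, (32, 32, 3)).clip(0, 255)  # darker patch
f1, f2 = rgb_histogram(soy), rgb_histogram(corn)
print(f1.size, np.abs(f1 - f2).sum() > 0)  # → 48 True
```

These fixed-length vectors are what SIMCA, PLS-DA, or SPA-LDA would then consume; SPA's role is to pick the most informative of the histogram bins.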

  15. BROAD PHONEME CLASSIFICATION USING SIGNAL BASED FEATURES

    Directory of Open Access Journals (Sweden)

    Deekshitha G

    2014-12-01

Full Text Available Speech is the most efficient and popular means of human communication. Speech is produced as a sequence of phonemes. Phoneme recognition is the first step performed by an automatic speech recognition system. State-of-the-art recognizers use mel-frequency cepstral coefficient (MFCC) features derived through short-time analysis, for which the recognition accuracy is limited. Instead of this, here broad phoneme classification is achieved using features derived directly from the speech at the signal level itself. Broad phoneme classes include vowels, nasals, fricatives, stops, approximants and silence. The features identified as useful for broad phoneme classification are the voiced/unvoiced decision, zero crossing rate (ZCR), short-time energy, most dominant frequency, energy in the most dominant frequency, spectral flatness measure and the first three formants. Features derived from short-time frames of training speech are used to train a multilayer feedforward neural network based classifier with manually marked class labels as output, and classification accuracy is then tested. Later this broad phoneme classifier is used for broad syllable structure prediction, which is useful for applications such as automatic speech recognition and automatic language identification.
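Two of the listed signal-level features, zero-crossing rate and short-time energy, can be computed directly per frame; the synthetic "vowel" and "fricative" below are stand-ins (a 200 Hz sinusoid and weak white noise) chosen so the expected contrast shows up.

```python
import numpy as np

def frame_features(x, frame_len=160):
    """Per-frame zero-crossing rate and short-time energy."""
    frames = x[:len(x) // frame_len * frame_len].reshape(-1, frame_len)
    zcr = np.mean(np.abs(np.diff(np.sign(frames), axis=1)) > 0, axis=1)
    energy = np.mean(frames ** 2, axis=1)
    return zcr, energy

fs = 16000
t = np.arange(fs // 10) / fs
vowel_like = np.sin(2 * np.pi * 200 * t)            # strong low-frequency tone
rng = np.random.default_rng(0)
fricative_like = 0.1 * rng.standard_normal(t.size)  # weak wideband noise

zv, ev = frame_features(vowel_like)
zf, ef = frame_features(fricative_like)
print(zv.mean() < zf.mean(), ev.mean() > ef.mean())  # → True True
```

Vowel-like frames show low ZCR and high energy, fricative-like frames the opposite, which is why this pair of features already separates those two broad classes.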

  16. Nominated Texture Based Cervical Cancer Classification

    Directory of Open Access Journals (Sweden)

    Edwin Jayasingh Mariarputham

    2015-01-01

Full Text Available Accurate classification of Pap smear images is a challenging task in medical image processing. It can be improved in two ways: one is by selecting suitable, well-defined specific features, and the other is by selecting the best classifier. This paper presents a nominated texture based cervical cancer (NTCC) classification system which classifies Pap smear images into any one of seven classes. This is achieved by extracting well-defined texture features and selecting the best classifier. Seven sets of texture features (24 features) are extracted, which include the relative size of nucleus and cytoplasm, the dynamic range and first four moments of the intensities of nucleus and cytoplasm, the relative displacement of the nucleus within the cytoplasm, gray-level co-occurrence matrix, local binary pattern histogram, Tamura features, and edge orientation histogram. A few types of support vector machine (SVM) and neural network (NN) classifiers are used for the classification. The performance of the NTCC algorithm is tested and compared to other algorithms on the public image database of Herlev University Hospital, Denmark, with 917 Pap smear images. The output of the SVM is found to be the best for most of the classes, with better results for the remaining classes.

  17. Diagnosing Learning Disabilities in a Special Education By an Intelligent Agent Based System

    Directory of Open Access Journals (Sweden)

    Khaled Nasser elSayed

    2013-04-01

Full Text Available The presented paper provides an intelligent agent based classification system for the diagnosis and evaluation of learning disabilities in special education students. It provides pedagogy-psychology profiles for those students and offers solution strategies with the best educational activities. It provides tools that allow class teachers to discuss psycho functions and basic skills for learning, and then performs psycho-pedagogy evaluation by comprising a series of strategies in a semantic network knowledge base. The system's agent performs its classification of a student's disabilities based on the past experience it gained from exemplars that were classified by an expert and acquired into its knowledge base.

  18. Agent Based Patient Scheduling Using Heuristic Algorithm

    Directory of Open Access Journals (Sweden)

    Juliet A Murali

    2010-01-01

Full Text Available This paper describes an agent-based approach to patient scheduling using experience-based learning. A heuristic algorithm is also used in the proposed framework. The evaluation of different learning techniques shows that experience-based learning (EBL) gives a better solution. The processing time decreases as the experience increases. The heuristic algorithm makes use of EBL in calculating the processing time. The main objective of this patient scheduling system is to reduce the waiting time of patients in hospitals and to complete their treatment in the minimum required time. The framework is implemented in JADE. In this approach the patients and resources are represented as patient agents (PA) and resource agents (RA) respectively. Even though a mathematical model gives an optimal solution, the computational complexity increases for large problems. A heuristic solution gives better solutions for large problems. The comparison of the proposed framework with other scheduling rules shows that an agent-based approach to patient scheduling using EBL is better.

  19. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling’s segregation model and Axelrod’s spatial game. The essence of the foundation part is the network-based agent-based model, in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices to using small-world networks, scale-free networks, etc. The book also shows that modern network science, mainly driven by game-theorists and sociophysicists, has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...

  20. Agent-oriented commonsense knowledge base

    Institute of Scientific and Technical Information of China (English)

    陆汝钤; 石纯一; 张松懋; 毛希平; 徐晋晖; 杨萍; 范路

    2000-01-01

Common sense processing has been a key difficulty in the AI community. Through analyzing various research methods on common sense, a large-scale agent-oriented commonsense knowledge base is described in this paper. We propose a new type of agent, the CBS agent; specify a common-sense-oriented semantic network descriptive language, Csnet; augment Prolog for common sense; analyze the ontology structure; and give the execution mechanism of the knowledge base.

  1. Agent-Based Data Integration Framework

    OpenAIRE

    Łukasz Faber

    2014-01-01

    Combining data from diverse, heterogeneous sources while facilitating a unified access to it is an important (albeit difficult) task. There are various possibilities of performing it. In this publication, we propose and describe an agent-based framework dedicated to acquiring and processing distributed, heterogeneous data collected from diverse sources (e.g., the Internet, external software, relational, and document databases). Using this multi-agent-based approach in the aspects of the gener...

  2. On Agent-Based Software Engineering

    OpenAIRE

    Jennings, N. R.

    2000-01-01

    Agent-based computing represents an exciting new synthesis both for Artificial Intelligence (AI) and, more generally, Computer Science. It has the potential to significantly improve the theory and the practice of modeling, designing, and implementing computer systems. Yet, to date, there has been little systematic analysis of what makes the agent-based approach such an appealing and powerful computational model. Moreover, even less effort has been devoted to discussing the inherent disadvanta...

  3. A MapReduce based Parallel SVM for Email Classification

    OpenAIRE

    Ke Xu; Cui Wen; Qiong Yuan; Xiangzhu He; Jun Tie

    2014-01-01

    Support Vector Machine (SVM) is a powerful classification and regression tool. Varying approaches including SVM based techniques are proposed for email classification. Automated email classification according to messages or user-specific folders and information extraction from chronologically ordered email streams have become interesting areas in text machine learning research. This paper presents a parallel SVM based on MapReduce (PSMR) algorithm for email classification. We discuss the chal...
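The overall pattern (not the paper's exact PSMR algorithm) can be sketched with a map phase that trains one model per data shard and a reduce phase that combines them by majority vote; a plain perceptron stands in for the SVM, and the email vectors are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))              # synthetic email feature vectors
w_true = rng.normal(size=10)
y = np.sign(X @ w_true)                     # synthetic spam/ham labels (+1/-1)

def train_perceptron(Xs, ys, epochs=25):
    """Stand-in for the per-shard SVM: a plain perceptron."""
    w = np.zeros(Xs.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xs, ys):
            if yi * (xi @ w) <= 0:
                w += yi * xi
    return w

# "Map" phase: train one model per shard of the training data.
shards = np.array_split(np.arange(300), 3)
models = [train_perceptron(X[idx], y[idx]) for idx in shards]

# "Reduce" phase: combine the shard models by majority vote.
votes = np.sign(np.stack([np.sign(X @ w) for w in models]).sum(axis=0))
accuracy = float(np.mean(votes == y))
print(round(accuracy, 2))
```

In a real MapReduce job each shard would be trained on a separate mapper, and the reducer would merge support vectors or models rather than in-process arrays.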

  4. Integration of multi-array sensors and support vector machines for the detection and classification of organophosphate nerve agents

    Science.gov (United States)

    Land, Walker H., Jr.; Sadik, Omowunmi A.; Embrechts, Mark J.; Leibensperger, Dale; Wong, Lut; Wanekaya, Adam; Uematsu, Michiko

    2003-08-01

    Due to the increased threats of chemical and biological weapons of mass destruction (WMD) by international terrorist organizations, a significant effort is underway to develop tools that can be used to detect and effectively combat biochemical warfare. Furthermore, recent events have highlighted awareness that chemical and biological agents (CBAs) may become the preferred, cheap alternative WMD, because these agents can effectively attack large populations while leaving infrastructures intact. Despite the availability of numerous sensing devices, intelligent hybrid sensors that can detect and degrade CBAs are virtually nonexistent. This paper reports the integration of multi-array sensors with Support Vector Machines (SVMs) for the detection of organophosphate nerve agents, using parathion and dichlorvos as model simulant compounds. SVMs were used for the design and evaluation of new and more accurate data extraction, preprocessing, and classification. Experimental results for the paradigms developed using Structural Risk Minimization show a significant increase in classification accuracy when compared to the existing AromaScan baseline system. Specifically, this research has demonstrated that, for the parathion versus dichlorvos pair, when compared to the AromaScan baseline system: (1) the overall ROC Az index improved by 23% using the S2000 kernel, with similar improvements for the Gaussian and polynomial (degree 2) kernels; (2) specificity improved by a significant 173% with the S2000 kernel, meaning the number of false negative errors was reduced by 173% while making no false positive errors; (3) the Gaussian and polynomial kernels demonstrated similar specificity at 100% sensitivity. All SVM classifiers provided essentially perfect classification performance for the dichlorvos versus trichlorfon pair. For the most difficult classification task, the Parathion versus

  5. Intelligent agent based control of TL-1

    International Nuclear Information System (INIS)

    Agent-based control of complex systems is becoming popular due to its ability to identify critical situations and to dynamically search for the best available solution to the problem with constrained optimization of the inputs. In this paper we present the architecture of an intelligent agent for automatic control of the power supplies of TL-1 (Transfer Line 1) to maximise the injection process against changes in the input beam obtained from the Microtron. The paper discusses the results obtained by applying this agent architecture to an accelerator model comprising the Microtron output, TL-1, and the booster. (author)

  6. Spatial interactions in agent-based modeling

    CERN Document Server

    Ausloos, Marcel; Merlone, Ugo

    2014-01-01

    Agent Based Modeling (ABM) has become a widespread approach to model complex interactions. In this chapter, after briefly summarizing some features of ABM, the different approaches to modeling spatial interactions are discussed. It is stressed that agents can interact either indirectly through a shared environment and/or directly with each other. In such an approach, higher-order variables such as commodity prices, population dynamics or even institutions are not exogenously specified but instead are seen as the results of interactions. It is highlighted in the chapter that understanding the patterns emerging from such spatial interaction between agents is a key problem, as much as their description through analytical or simulation means. The chapter reviews different approaches for modeling agents' behavior, taking into account either explicit spatial (lattice based) structures or networks. Some emphasis is placed on recent ABM as applied to the description of the dynamics of the geographical distribution o...

  7. Cirrhosis classification based on texture classification of random features.

    Science.gov (United States)

    Liu, Hui; Shao, Ying; Guo, Dongmei; Zheng, Yuanjie; Zhao, Zuowei; Qiu, Tianshuang

    2014-01-01

    Accurate staging of hepatic cirrhosis is important in investigating the cause and slowing down the effects of cirrhosis. Computer-aided diagnosis (CAD) can provide doctors with an alternative second opinion and assist them in choosing a specific treatment based on an accurate cirrhosis stage. MRI has many advantages, including high resolution for soft tissue, no radiation, and multiparameter imaging modalities. So in this paper, multisequence MRIs, including T1-weighted, T2-weighted, arterial, portal venous, and equilibrium phase, are applied. However, CAD does not yet meet the clinical needs of cirrhosis, and few researchers are concerned with it at present. Cirrhosis is characterized by the presence of widespread fibrosis and regenerative nodules in the liver, leading to different texture patterns at different stages. So, extracting texture features is the primary task. Compared with typical gray level cooccurrence matrix (GLCM) features, texture classification from random features provides an effective way, and we adopt it and propose CCTCRF for triple classification (normal, early, and middle and advanced stage). CCTCRF does not need strong assumptions except the sparse character of the image, contains sufficient texture information, follows a concise and effective process, and makes case decisions with high accuracy. Experimental results also illustrate the satisfying performance, and they are compared with a typical NN with GLCM. PMID:24707317

  8. Fuzzy Rule Base System for Software Classification

    Directory of Open Access Journals (Sweden)

    Adnan Shaout

    2013-07-01

    Full Text Available Given the central role that software development plays in the delivery and application of information technology, managers have been focusing on process improvement in the software development area. This improvement has increased the demand for software measures, or metrics, to manage the process. These metrics provide a quantitative basis for the development and validation of models during the software development process. In this paper a fuzzy rule-based system is developed to classify Java applications using object-oriented metrics. The system contains the following features: an automated method to extract the OO metrics from the source code; a default/base set of rules that can be easily configured via an XML file so companies, developers, team leaders, etc., can modify the set of rules according to their needs; implementation of a framework so new metrics, fuzzy sets and fuzzy rules can be added or removed depending on the needs of the end user; general classification of the software application and fine-grained classification of the Java classes based on OO metrics; and two interfaces provided for the system: GUI and command line.
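
    The rule-firing logic such a system relies on can be sketched briefly. The snippet below is a hypothetical illustration, not the paper's actual rule base: the metric names (WMC, DIT), the thresholds, and the class labels are all assumptions made for the example.

```python
def low(x, a, b):
    """Left-shoulder membership: 1 below a, falling linearly to 0 at b."""
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)

def high(x, a, b):
    """Right-shoulder membership, the complement of low()."""
    return 1.0 - low(x, a, b)

def classify_class(wmc, dit):
    """Fire two toy rules over WMC (weighted methods per class) and
    DIT (depth of inheritance tree); AND = min, OR = max, winner takes all.
    The thresholds 10/30 and 2/6 are illustrative assumptions."""
    strengths = {
        "simple":  min(low(wmc, 10, 30), low(dit, 2, 6)),
        "complex": max(high(wmc, 10, 30), high(dit, 2, 6)),
    }
    label = max(strengths, key=strengths.get)
    return label, strengths

small = classify_class(wmc=5, dit=1)    # low-complexity metrics
big = classify_class(wmc=50, dit=8)     # high-complexity metrics
```

    A real rule base would carry many more rules and metrics; the point here is only the membership-function/rule-strength mechanics.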

  9. Agent-based simulation of animal behaviour

    OpenAIRE

    Jonker, C.M.; Treur, J.

    1998-01-01

    In this paper it is shown how animal behaviour can be simulated in an agent-based manner. Different models are shown for different types of behaviour, varying from purely reactive behaviour to pro-active, social and adaptive behaviour. The compositional development method for multi-agent systems DESIRE and its software environment supports the conceptual and detailed design, and execution of these models. Experiments reported in the literature on animal behaviour have been simulated for a num...

  10. PSG-Based Classification of Sleep Phases

    OpenAIRE

    Králík, M.

    2015-01-01

    This work is focused on the classification of sleep phases using an artificial neural network. An unconventional approach was used to calculate classification features from polysomnographic (PSG) data of real patients. This approach makes it possible to increase the time resolution of the analysis and, thus, to achieve more accurate classification results.

  11. Patterns of Use of an Agent-Based Model and a System Dynamics Model: The Application of Patterns of Use and the Impacts on Learning Outcomes

    Science.gov (United States)

    Thompson, Kate; Reimann, Peter

    2010-01-01

    A classification system that was developed for the use of agent-based models was applied to strategies used by school-aged students to interrogate an agent-based model and a system dynamics model. These were compared, and relationships between learning outcomes and the strategies used were also analysed. It was found that the classification system…

  12. Malware Classification based on Call Graph Clustering

    CERN Document Server

    Kinable, Joris

    2010-01-01

    Each day, anti-virus companies receive tens of thousands of samples of potentially harmful executables. Many of the malicious samples are variations of previously encountered malware, created by their authors to evade pattern-based detection. Dealing with these large amounts of data requires robust, automatic detection approaches. This paper studies malware classification based on call graph clustering. By representing malware samples as call graphs, it is possible to abstract certain variations away and enable the detection of structural similarities between samples. The ability to cluster similar samples together will make more generic detection techniques possible, thereby targeting the commonalities of the samples within a cluster. To compare call graphs mutually, we compute pairwise graph similarity scores via graph matchings which approximately minimize the graph edit distance. Next, to facilitate the discovery of similar malware samples, we employ several clustering algorithms, including k-medoids and DB...
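
    The clustering step described above can be sketched compactly. The following is an illustrative stand-in, not the paper's implementation: it substitutes Jaccard distance on edge sets for the approximate graph edit distance, and uses a naive k-medoids loop with deterministic initialization.

```python
def graph_distance(g1, g2):
    """Cheap stand-in for graph edit distance: Jaccard distance on the
    call graphs' edge sets (each graph is a set of (caller, callee) edges)."""
    union = g1 | g2
    if not union:
        return 0.0
    return 1.0 - len(g1 & g2) / len(union)

def k_medoids(graphs, k, iters=20):
    """Naive k-medoids over a precomputed pairwise distance matrix."""
    n = len(graphs)
    dist = [[graph_distance(graphs[i], graphs[j]) for j in range(n)]
            for i in range(n)]
    medoids = list(range(k))  # deterministic init: first k samples
    for _ in range(iters):
        # assign each graph to its nearest medoid
        clusters = {m: [] for m in medoids}
        for i in range(n):
            nearest = min(medoids, key=lambda m: dist[i][m])
            clusters[nearest].append(i)
        # move each medoid to the member minimizing intra-cluster distance
        new_medoids = []
        for members in clusters.values():
            best = min(members, key=lambda c: sum(dist[c][o] for o in members))
            new_medoids.append(best)
        if sorted(new_medoids) == sorted(medoids):
            break
        medoids = new_medoids
    return medoids, clusters

# toy corpus: two malware "families", two variants each (invented edges)
a1 = {("main", "parse"), ("parse", "read")}
a2 = {("main", "parse"), ("parse", "read"), ("parse", "log")}
b1 = {("boot", "net"), ("net", "send")}
b2 = {("boot", "net"), ("net", "send"), ("send", "enc")}
medoids, clusters = k_medoids([a1, a2, b1, b2], k=2)
groups = sorted(sorted(members) for members in clusters.values())
```

    Variants within a family share most edges, so they end up in the same cluster even though they are not byte-identical, which is exactly the property the paper exploits.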

  13. Automatic web services classification based on rough set theory

    Institute of Scientific and Technical Information of China (English)

    陈立; 张英; 宋自林; 苗壮

    2013-01-01

    With the development of web services technology, the number of services existing on the internet is growing day by day. To achieve automatic and accurate services classification, which can benefit service-related tasks, a rough set theory based method for services classification was proposed. First, the service descriptions were preprocessed and represented as vectors. Inspired by the discernibility-matrix-based attribute reduction in rough set theory, and taking into account the characteristics of the decision table for services classification, a method based on continuous discernibility matrices was proposed for dimensionality reduction. Finally, services classification was carried out automatically. In the experiment, the proposed method achieves satisfactory classification results in all five test categories. The experimental results show that the proposed method is accurate and could be used in practical web services classification.
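
    The discernibility-matrix idea behind this kind of attribute reduction can be illustrated with a toy decision table. The sketch below uses a greedy set-cover heuristic to pick a reduct; this is an assumption for brevity and does not reproduce the paper's continuous-discernibility-matrix method.

```python
def discernibility_matrix(table, decision):
    """For each pair of objects with different decision values, record the
    set of condition attributes on which the two objects differ."""
    cells = []
    n = len(table)
    for i in range(n):
        for j in range(i + 1, n):
            if decision[i] != decision[j]:
                diff = {a for a in range(len(table[i]))
                        if table[i][a] != table[j][a]}
                if diff:
                    cells.append(diff)
    return cells

def greedy_reduct(cells):
    """Cover every discernibility cell with a small attribute set
    (greedy set cover -- a common heuristic, not guaranteed minimal)."""
    reduct = set()
    remaining = list(cells)
    while remaining:
        counts = {}
        for cell in remaining:
            for a in cell:
                counts[a] = counts.get(a, 0) + 1
        # attribute covering the most uncovered cells; ties -> lowest index
        best = max(counts, key=lambda a: (counts[a], -a))
        reduct.add(best)
        remaining = [cell for cell in remaining if best not in cell]
    return reduct

# toy decision table: 3 condition attributes, binary decision
table = [(0, 0, 1), (1, 0, 1), (0, 1, 0), (1, 1, 0)]
decision = [0, 1, 0, 1]
reduct = greedy_reduct(discernibility_matrix(table, decision))
```

    Here attribute 0 alone discerns every pair of objects with different decisions, so the reduct collapses to a single attribute, which is the dimensionality-reduction effect the abstract refers to.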

  14. Agent Based Image Segmentation Method : A Review

    OpenAIRE

    Pooja Mishra; Navita Srivastava; Shukla, K. K.; Achintya Singlal

    2011-01-01

    Image segmentation is an important research area in computer vision and many segmentation methods have been proposed. This paper attempts to provide a brief overview of elemental segmentation techniques based on boundary or regional approaches. It focuses mainly on agent-based image segmentation techniques.

  15. Behavior-based dual dynamic agent architecture

    Institute of Scientific and Technical Information of China (English)

    仵博; 吴敏; 曹卫华

    2003-01-01

    The objective of the architecture is to enable an agent to promptly and adaptively accomplish tasks in a real-time, dynamic environment. The architecture is composed of an elementary-level behavior layer and a high-level behavior layer. In the elementary-level behavior layer, a reactive architecture is introduced to make the agent react promptly to events; in the high-level behavior layer, a deliberative architecture is used to enhance the intelligence of the agent. A confidence degree concept is proposed to combine the two layers of the architecture. An agent decision-making process based on the architecture is also presented. The results of experiments in a RoboSoccer simulation team show that the proposed architecture and decision process are successful.

  16. Agent-based Modeling and Mapping of Manufacturing System

    Institute of Scientific and Technical Information of China (English)

    Z; Zhang

    2002-01-01

    Considering agent-based modeling and mapping in manufacturing systems, this paper describes several system models, including: Domain Based Hierarchical Structure (DBHS), Cascading Agent Structure (CAS), Proximity Relation Structure (PRS), and Bus-based Network Structure (BNS). In DBHS, one sort of agent individually delegates Domain Agents, Resources Agents, UserInterface Agents and Gateway Agents, and the other one is a broker of tasks and process flow. Static agents representing...

  17. Graph-based Methods for Orbit Classification

    Energy Technology Data Exchange (ETDEWEB)

    Bagherjeiran, A; Kamath, C

    2005-09-29

    An important step in the quest for low-cost fusion power is the ability to perform and analyze experiments in prototype fusion reactors. One of the tasks in the analysis of experimental data is the classification of orbits in Poincare plots. These plots are generated by the particles in a fusion reactor as they move within the toroidal device. In this paper, we describe the use of graph-based methods to extract features from orbits. These features are then used to classify the orbits into several categories. Our results show that existing machine learning algorithms are successful in classifying orbits with few points, a situation which can arise in data from experiments.

  18. A MapReduce based Parallel SVM for Email Classification

    Directory of Open Access Journals (Sweden)

    Ke Xu

    2014-06-01

    Full Text Available Support Vector Machine (SVM) is a powerful classification and regression tool. Various approaches, including SVM-based techniques, have been proposed for email classification. Automated email classification according to messages or user-specific folders and information extraction from chronologically ordered email streams have become interesting areas in text machine learning research. This paper presents a parallel SVM based on MapReduce (PSMR) algorithm for email classification. We discuss the challenges that arise from differences between email foldering and traditional document classification. We show experimental results from an array of automated classification methods and evaluation methodologies, including Naive Bayes, SVM and the PSMR method, on foldering results for the Enron datasets based on the timeline. By distributing, processing and optimizing the subsets of the training data across multiple participating nodes, the parallel SVM based on the MapReduce algorithm reduces the training time significantly.
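
    The partition-train-combine pattern the abstract describes can be sketched as follows. This is not the PSMR algorithm itself: it substitutes a Pegasos-style linear SVM trained per partition ("map") and simple weight averaging ("reduce"), purely to illustrate how training is distributed over data shards.

```python
import random

def train_linear_svm(data, lam=0.01, epochs=50, seed=0):
    """Pegasos-style sub-gradient training of a linear SVM (hinge loss).
    data: list of (features, label) pairs with label in {-1, +1}."""
    rng = random.Random(seed)
    dim = len(data[0][0])
    w = [0.0] * dim
    t = 0
    for _ in range(epochs):
        for x, y in rng.sample(data, len(data)):  # shuffled pass
            t += 1
            eta = 1.0 / (lam * t)
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            w = [(1 - eta * lam) * wi for wi in w]  # regularization decay
            if margin < 1:                          # hinge sub-gradient step
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

def map_reduce_svm(partitions, lam=0.01):
    """'Map': train one SVM per data partition; 'reduce': average weights."""
    models = [train_linear_svm(p, lam=lam) for p in partitions]
    dim = len(models[0])
    return [sum(m[d] for m in models) / len(models) for d in range(dim)]

def predict(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1

# toy, linearly separable data; last feature acts as a bias input
pos = [((2.0, 1.0, 1.0), 1), ((1.0, 2.0, 1.0), 1),
       ((2.0, 2.0, 1.0), 1), ((3.0, 1.0, 1.0), 1)]
neg = [((-2.0, -1.0, 1.0), -1), ((-1.0, -2.0, 1.0), -1),
       ((-2.0, -2.0, 1.0), -1), ((-3.0, -1.0, 1.0), -1)]
partitions = [pos[:2] + neg[:2], pos[2:] + neg[2:]]  # two "mapper" shards
w = map_reduce_svm(partitions)
correct = sum(predict(w, x) == y for x, y in pos + neg)
```

    Averaging per-partition weights is the simplest possible reduce step; the actual PSMR algorithm optimizes across nodes rather than averaging independent models.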

  19. Classification techniques based on AI application to defect classification in cast aluminum

    Science.gov (United States)

    Platero, Carlos; Fernandez, Carlos; Campoy, Pascual; Aracil, Rafael

    1994-11-01

    This paper describes the Artificial Intelligence techniques applied to the interpretation of images from cast aluminum surfaces presenting different defects. The whole process includes on-line defect detection, feature extraction and defect classification. These topics are discussed in depth throughout the paper. The data preprocessing process, as well as segmentation and feature extraction, are described. At this point, the algorithms employed along with the descriptors used are shown. A syntactic filter has been developed to model the information and to generate the input vector to the classification system. Classification of defects is achieved by means of rule-based systems, fuzzy models and neural nets. Different classification subsystems work together to solve a pattern recognition problem (hybrid systems). Firstly, syntactic methods are used to obtain the filter that reduces the dimension of the input vector to the classification process. Rule-based classification is achieved by associating a grammar with each defect type; the knowledge base is formed by the information derived from the syntactic filter along with the inferred rules. The fuzzy classification subsystem uses production rules with fuzzy antecedents whose consequents are membership degrees for each defect type. Different architectures of neural nets have been implemented with different results, as shown throughout the paper. At the higher classification level, the information given by the heterogeneous systems as well as the history of the process is supplied to an Expert System in order to drive the casting process.

  20. Small Sample Issues for Microarray-Based Classification

    OpenAIRE

    Dougherty, Edward R

    2006-01-01

    In order to study the molecular biological differences between normal and diseased tissues, it is desirable to perform classification among diseases and stages of disease using microarray-based gene-expression values. Owing to the limited number of microarrays typically used in these studies, serious issues arise with respect to the design, performance and analysis of classifiers based on microarray data. This paper reviews some fundamental issues facing small-sample classification: classific...

  1. Gender Classification Based on Geometry Features of Palm Image

    OpenAIRE

    Ming Wu; Yubo Yuan

    2014-01-01

    This paper presents a novel gender classification method based on geometry features of palm image which is simple, fast, and easy to handle. This gender classification method based on geometry features comprises two main attributes. The first one is feature extraction by image processing. The other one is classification system with polynomial smooth support vector machine (PSSVM). A total of 180 palm images were collected from 30 persons to verify the validity of the proposed gender classi...

  2. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  3. DNA sequence analysis using hierarchical ART-based classification networks

    Energy Technology Data Exchange (ETDEWEB)

    LeBlanc, C.; Hruska, S.I. [Florida State Univ., Tallahassee, FL (United States); Katholi, C.R.; Unnasch, T.R. [Univ. of Alabama, Birmingham, AL (United States)

    1994-12-31

    Adaptive resonance theory (ART) describes a class of artificial neural network architectures that act as classification tools which self-organize, work in real-time, and require no retraining to classify novel sequences. We have adapted ART networks to provide support to scientists attempting to categorize tandem repeat DNA fragments from Onchocerca volvulus. In this approach, sequences of DNA fragments are presented to multiple ART-based networks which are linked together into two (or more) tiers; the first provides coarse sequence classification while the subsequent tiers refine the classifications as needed. The overall rating of the resulting classification of fragments is measured using statistical techniques based on those introduced to validate results from traditional phylogenetic analysis. Tests of the Hierarchical ART-based Classification Network, or HABclass network, indicate its value as a fast, easy-to-use classification tool which adapts to new data without retraining on previously classified data.

  4. FIPA agent based network distributed control system

    International Nuclear Information System (INIS)

    A control system with the capabilities to combine heterogeneous control systems or processes into a uniform homogeneous environment is discussed. This dynamically extensible system is an example of the software system at the agent level of abstraction. This level of abstraction considers agents as atomic entities that communicate to implement the functionality of the control system. Agents' engineering aspects are addressed by adopting the domain independent software standard, formulated by FIPA. Jade core Java classes are used as a FIPA specification implementation. A special, lightweight, XML RDFS based, control oriented, ontology markup language is developed to standardize the description of the arbitrary control system data processor. Control processes, described in this language, are integrated into the global system at runtime, without actual programming. Fault tolerance and recovery issues are also addressed

  5. FIPA agent based network distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    D. Abbott; V. Gyurjyan; G. Heyes; E. Jastrzembski; C. Timmer; E. Wolin

    2003-03-01

    A control system with the capabilities to combine heterogeneous control systems or processes into a uniform homogeneous environment is discussed. This dynamically extensible system is an example of the software system at the agent level of abstraction. This level of abstraction considers agents as atomic entities that communicate to implement the functionality of the control system. Agents' engineering aspects are addressed by adopting the domain independent software standard, formulated by FIPA. Jade core Java classes are used as a FIPA specification implementation. A special, lightweight, XML RDFS based, control oriented, ontology markup language is developed to standardize the description of the arbitrary control system data processor. Control processes, described in this language, are integrated into the global system at runtime, without actual programming. Fault tolerance and recovery issues are also addressed.

  6. Risk-based classification system of nanomaterials

    International Nuclear Information System (INIS)

    Various stakeholders are increasingly interested in the potential toxicity and other risks associated with nanomaterials throughout the different stages of a product's life cycle (e.g., development, production, use, disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material due to variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as to promote the safe handling and use of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different ecological risk categories based on our current knowledge of nanomaterial physico-chemical characteristics, variation in produced material, and best professional judgments. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.
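
    The Monte Carlo exploration underlying SMAA can be illustrated with a small rank-acceptability sketch. Note the simplification: SMAA-TRI proper sorts alternatives into predefined categories using profile thresholds, while this version only counts how often each alternative takes each rank under randomly sampled criterion weights. All scores below are invented.

```python
import math
import random

def rank_acceptability(scores, n_samples=2000, seed=1):
    """Monte Carlo rank-acceptability sketch: sample criterion weights
    uniformly from the simplex (Dirichlet(1,...,1) via normalized
    exponentials), rank alternatives by weighted score, and record how
    often each alternative occupies each rank.
    scores[i][c] = performance of alternative i on criterion c."""
    rng = random.Random(seed)
    n, m = len(scores), len(scores[0])
    counts = [[0] * n for _ in range(n)]  # counts[alternative][rank]
    for _ in range(n_samples):
        raw = [-math.log(1.0 - rng.random()) for _ in range(m)]
        total = sum(raw)
        w = [r / total for r in raw]  # random point on the weight simplex
        overall = [sum(wc * sc for wc, sc in zip(w, row)) for row in scores]
        order = sorted(range(n), key=lambda i: -overall[i])
        for rank, alt in enumerate(order):
            counts[alt][rank] += 1
    return [[c / n_samples for c in row] for row in counts]

# alternative 0 strictly dominates 1, which dominates 2 (higher = riskier),
# so the ranking is certain regardless of the sampled weights
scores = [[0.9, 0.8, 0.9], [0.5, 0.4, 0.6], [0.1, 0.2, 0.1]]
acc = rank_acceptability(scores, n_samples=500)
```

    For non-dominated alternatives the acceptability indices spread across ranks, and that spread is what measures the robustness of a risk grouping to weight uncertainty.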

  7. Classification of CMEs Based on Their Dynamics

    Science.gov (United States)

    Nicewicz, J.; Michalek, G.

    2016-05-01

    A large set of coronal mass ejections (CMEs; 6621 events) has been selected to study their dynamics as seen in the field of view (LFOV) of the Large Angle and Spectroscopic Coronagraph (LASCO) onboard the Solar and Heliospheric Observatory (SOHO). These events were selected for having at least six height-time measurements, so that their dynamic properties in the LFOV can be evaluated with reasonable accuracy. Height-time measurements (in the SOHO/LASCO catalog) were used to determine the velocities and accelerations of individual CMEs at successive distances from the Sun. Linear and quadratic functions were fitted to these data points. On the basis of the best fits to the velocity data points, we were able to classify CMEs into four groups. These types of CMEs not only have different dynamic behaviors but also different masses, widths, velocities, and accelerations. We also show that these groups of events are initiated by different onset mechanisms. The results of our study allow us to present a consistent classification of CMEs based on their dynamics.
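
    The fit-and-bin procedure can be sketched as follows. The three sign-based bins and the tolerance are illustrative assumptions, not the paper's actual four groups, but the mechanics (finite-difference speeds from height-time points, then a least-squares velocity fit) follow the description above.

```python
def classify_cme(times, heights, tol=1e-3):
    """Classify a CME track from height-time measurements: estimate speeds
    by finite differences, fit v(t) = v0 + a*t by ordinary least squares,
    and bin by the sign of the fitted acceleration a."""
    # midpoint times and finite-difference speeds
    ts = [(times[i] + times[i + 1]) / 2 for i in range(len(times) - 1)]
    vs = [(heights[i + 1] - heights[i]) / (times[i + 1] - times[i])
          for i in range(len(times) - 1)]
    # least-squares slope of v against t (the fitted acceleration)
    mt, mv = sum(ts) / len(ts), sum(vs) / len(vs)
    accel = (sum((t - mt) * (v - mv) for t, v in zip(ts, vs))
             / sum((t - mt) ** 2 for t in ts))
    if accel > tol:
        return "accelerating"
    if accel < -tol:
        return "decelerating"
    return "constant-speed"

# synthetic tracks with six height-time points each
t = [0, 1, 2, 3, 4, 5]
fast = classify_cme(t, [0.5 * x * x for x in t])            # h ~ t^2/2
steady = classify_cme(t, [2.0 * x for x in t])              # h ~ 2t
slowing = classify_cme(t, [10 * x - 0.5 * x * x for x in t])
```

    The six-point minimum in the event selection is what makes such a velocity fit meaningful: with fewer points the fitted acceleration is dominated by measurement noise.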

  8. Structure-Based Algorithms for Microvessel Classification

    KAUST Repository

    Smith, Amy F.

    2015-02-01

    © 2014 The Authors. Microcirculation published by John Wiley & Sons Ltd. Objective: Recent developments in high-resolution imaging techniques have enabled digital reconstruction of three-dimensional sections of microvascular networks down to the capillary scale. To better interpret these large data sets, our goal is to distinguish branching trees of arterioles and venules from capillaries. Methods: Two novel algorithms are presented for classifying vessels in microvascular anatomical data sets without requiring flow information. The algorithms are compared with a classification based on observed flow directions (considered the gold standard), and with an existing resistance-based method that relies only on structural data. Results: The first algorithm, developed for networks with one arteriolar and one venular tree, performs well in identifying arterioles and venules and is robust to parameter changes, but incorrectly labels a significant number of capillaries as arterioles or venules. The second algorithm, developed for networks with multiple inlets and outlets, correctly identifies more arterioles and venules, but is more sensitive to parameter changes. Conclusions: The algorithms presented here can be used to classify microvessels in large microvascular data sets lacking flow information. This provides a basis for analyzing the distinct geometrical properties and modelling the functional behavior of arterioles, capillaries, and venules.

  9. Interaction Protocols in Multi-Agent Systems based on Agent Petri Nets Model

    Directory of Open Access Journals (Sweden)

    Kamel Barkaoui

    2013-08-01

    Full Text Available This paper deals with the modeling of interaction between agents in a Multi-Agent System (MAS) based on Agent Petri Nets (APN). Our models are created based on communicating agents. Indeed, an agent initiating a conversation with another can specify the interaction protocol it wishes to follow. The combination of APN and FIPA protocol schemes leads to a set of formal deployment rules for points where model interaction can be successfully implemented. We introduce models of some FIPA standard protocols.

  10. Classification problems in object-based representation systems

    OpenAIRE

    Napoli, Amedeo

    1999-01-01

    Classification is a process that consists in two dual operations: generating a set of classes and then classifying given objects into the created classes. The class generation may be understood as a learning process and object classification as a problem-solving process. The goal of this position paper is to introduce and to make precise the notion of a classification problem in object-based representation systems, e.g. a query against a class hierarchy, to define a subsumption relation betwe...

  11. Fuzzy Inference System & Fuzzy Cognitive Maps based Classification

    OpenAIRE

    Kanika Bhutani; Gaurav; Megha Kumar

    2015-01-01

    Fuzzy classification is very necessary because it has the ability to use interpretable rules. It overcomes the limitations of crisp rule-based classifiers. This paper mainly deals with classification on the basis of the soft computing techniques fuzzy cognitive maps and fuzzy inference systems on the lenses dataset. The results obtained with FIS show 100% accuracy. Sometimes the data available for classification contain missing or ambiguous data, so Neutrosophic logic is used for cla...

  12. A new classification algorithm based on RGH-tree search

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    In this paper, we put forward a new classification algorithm based on RGH-Tree search and perform a classification analysis and comparison study. This algorithm can save computing resources and increase classification efficiency. The experiment shows that this algorithm achieves good results in dealing with three-dimensional multi-class data. We find that the algorithm has better generalization ability for small training sets and large test sets.

  13. Agent-Based Mobile Event Notification System

    Directory of Open Access Journals (Sweden)

    Rania Fahim El-Gazzar

    2010-10-01

    Full Text Available In recent years, the noticeable move towards using mobile devices (mobile phones and PDAs) and wireless technologies has made information available in the context of an "anytime, anywhere, using any mobile device" experience. Delivering information to mobile devices needs some sort of communication means such as Push, Pull, or mixed (Push and Pull) technologies to deliver any chunk of information (events, ads, advisory tips, learning materials, etc.). Events are the most important pieces of information and should be delivered in a timely manner wherever the user is. Agent-based technology offers an autonomous, flexible, adaptable, and reliable way of delivering events to any device, anywhere, on time. The publish/subscribe communication model is the basic infrastructure for event-based communication. In this paper, we define the need to mobilize the event notification process in an educational environment and the possible categories of event notifications that students can receive from their educational institution. This paper also proposes a framework for an agent-based mobile event notification system. The proposed framework is derived from the concept of the push-based publish/subscribe communication model, taking advantage of software agents to serve in the mobile environment. Finally, the paper provides a detailed analysis of the proposed system.
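
    The push-based publish/subscribe core of such a system can be sketched in a few lines. The topic names and events below are invented examples, and real agents would of course deliver over a network rather than via in-process callbacks.

```python
class EventBroker:
    """Minimal push-based publish/subscribe broker: subscribers register a
    callback per topic; publishing pushes the event to every subscriber."""

    def __init__(self):
        self.subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, event):
        # push the event to each current subscriber of the topic
        for callback in self.subscribers.get(topic, []):
            callback(event)

inbox = []                                  # stands in for a student's device
broker = EventBroker()
broker.subscribe("exams", inbox.append)
broker.publish("exams", "Math exam moved to Friday")
broker.publish("library", "New books")      # no subscribers: silently dropped
```

    In the agent-based framework, the broker role would sit in a notification agent and each callback would be a delivery agent bound to a mobile device.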

  14. Fuzzy classification rules based on similarity

    Czech Academy of Sciences Publication Activity Database

    Holeňa, Martin; Štefka, D.

    Seňa : PONT s.r.o., 2012 - (Horváth, T.), s. 25-31 ISBN 978-80-971144-0-4. [ITAT 2012. Conference on Theory and Practice of Information Technologies. Ždiar (SK), 17.09.2012-21.09.2012] R&D Projects: GA ČR GA201/08/0802 Institutional support: RVO:67985807 Keywords : classification rules * fuzzy classification * fuzzy integral * fuzzy measure * similarity Subject RIV: IN - Informatics, Computer Science

  15. Agent Based Modeling in Public Administration

    Directory of Open Access Journals (Sweden)

    Osman SEYHAN

    2013-06-01

    Full Text Available This study aims to explore the role of agent based modeling (ABM) as a simulation method in analyzing and formulating the policy-making processes of modern public management, which is under the pressure of the information age and the socio-political demands of open societies. ABM is a simulative research method for understanding complex adaptive systems (cas) from the perspective of their constituent entities. In this study, by employing agent-based computing and the NetLogo language, two case studies about organizational design and organizational risk analyses have been examined. Results revealed that ABM is an efficient platform for determining the optimum results from various scenarios in order to understand structures and processes of policy making in both organizational design and risk management. In the future, more research is needed on the role of ABM in understanding and making decisions about the future of cas, especially in conjunction with developments in computer technologies.

  16. Agent-Based Data Integration Framework

    Directory of Open Access Journals (Sweden)

    Łukasz Faber

    2014-01-01

    Full Text Available Combining data from diverse, heterogeneous sources while facilitating a unified access to it is an important (albeit difficult) task. There are various possibilities of performing it. In this publication, we propose and describe an agent-based framework dedicated to acquiring and processing distributed, heterogeneous data collected from diverse sources (e.g., the Internet, external software, relational, and document databases). Using this multi-agent-based approach in the aspects of the general architecture (the organization and management of the framework), we create a proof-of-concept implementation. The approach is presented using a sample scenario in which the system is used to search for personal and professional profiles of scientists.

  17. Intelligent Agent-Based System for Digital Library Information Retrieval

    Institute of Scientific and Technical Information of China (English)

    师雪霖; 牛振东; 宋瀚涛; 宋丽哲

    2003-01-01

    A new information search model is reported and the design and implementation of a system based on intelligent agents is presented. The system is an assistant information retrieval system which helps users search for what they need. The system consists of four main components: interface agent, information retrieval agent, broker agent and learning agent. They collaborate to implement system functions. The agents apply learning mechanisms based on an improved ID3 algorithm.
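The record mentions an improved ID3 algorithm but gives no details; the sketch below implements only the classic ID3 induction step (entropy-based information gain) on toy data, with function names of my own choosing.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_attribute(rows, labels, attributes):
    """Pick the attribute with the highest information gain (core ID3 step)."""
    def gain(attr):
        split = {}
        for row, label in zip(rows, labels):
            split.setdefault(row[attr], []).append(label)
        remainder = sum(len(part) / len(labels) * entropy(part)
                        for part in split.values())
        return entropy(labels) - remainder
    return max(attributes, key=gain)

def id3(rows, labels, attributes):
    """Grow a decision tree as nested dicts; leaves are class labels."""
    if len(set(labels)) == 1:
        return labels[0]                         # pure node
    if not attributes:
        return Counter(labels).most_common(1)[0][0]  # majority vote
    attr = best_attribute(rows, labels, attributes)
    tree = {attr: {}}
    for value in {row[attr] for row in rows}:
        subset = [(r, l) for r, l in zip(rows, labels) if r[attr] == value]
        sub_rows, sub_labels = zip(*subset)
        rest = [a for a in attributes if a != attr]
        tree[attr][value] = id3(list(sub_rows), list(sub_labels), rest)
    return tree

# Toy data: whether a document is "relevant" given two attributes.
rows = [{"topic": "ai", "long": "y"}, {"topic": "ai", "long": "n"},
        {"topic": "art", "long": "y"}, {"topic": "art", "long": "n"}]
labels = ["yes", "yes", "no", "no"]
tree = id3(rows, labels, ["topic", "long"])
print(tree)  # splits on "topic": it alone separates the classes
```

Since "topic" has information gain 1 bit and "long" has gain 0, ID3 splits on "topic" and both branches are immediately pure.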

  18. Agent Based Intelligence in a Tetrahedral Rover

    Science.gov (United States)

    Phelps, Peter; Truszkowski, Walt

    2007-01-01

    A tetrahedron is a 4-node 6-strut pyramid structure which is being used by the NASA - Goddard Space Flight Center as the basic building block for a new approach to robotic motion. The struts are extendable; it is by the sequence of activities: strut-extension, changing the center of gravity and falling that the tetrahedron "moves". Currently, strut-extension is handled by human remote control. There is an effort underway to make the movement of the tetrahedron autonomous, driven by an attempt to achieve a goal. The approach being taken is to associate an intelligent agent with each node. Thus, the autonomous tetrahedron is realized as a constrained multi-agent system, where the constraints arise from the fact that between any two agents there is an extendable strut. The hypothesis of this work is that, by proper composition of such automated tetrahedra, robotic structures of various levels of complexity can be developed which will support more complex dynamic motions. This is the basis of the new approach to robotic motion which is under investigation. A Java-based simulator for the single tetrahedron, realized as a constrained multi-agent system, has been developed and evaluated. This paper reports on this project and presents a discussion of the structure and dynamics of the simulator.

  19. Agent Based Process Management Environment – Mercury

    OpenAIRE

    Jeong Ah Kim; Seung Young Choi; Rhan Jung

    2007-01-01

    In this article, an agent-based process management model is proposed for the process management of knowledge workers and service workers, in order to establish the basis for a new knowledge management system. We applied several methods from Six Sigma and the Personal Software Process for personal process definition, process execution and process measurement. This study attempts to improve process execution accuracy through process visualisation and standardisation and to...

  20. Enhancing the Combat ID Agent Based Model

    OpenAIRE

    Spaans, M.; Petiet, P.J.; Dean, D; Jackson, J.; Bradley, W.; Shan, L. Y.; Ka-Yoon, W.; Yongwei, D.W.; Kai, C.W.

    2007-01-01

    During previous Project Albert and International Data Farming Workshops (IDFW), and during discussions between Dstl and TNO, the suitability and feasibility of Agent Based Models (ABMs) to support research on Combat Identification (Combat ID) was examined. The objective of this research is to investigate the effect of (a large number of) different variations in Situation Awareness (SA), Target Identification (Target ID), Human Factors, and Tactics, Techniques, and Proce...

  1. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogeneous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling. PMID:26783498

  2. Agent-based Models of Financial Markets

    OpenAIRE

    Samanidou, E.; E. Zschischang; Stauffer, D.; Lux, T.

    2007-01-01

    This review deals with several microscopic ("agent-based") models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our sel...

  3. Surveillance software based on agents system

    OpenAIRE

    José M. Molina

    2008-01-01

    The Applied Artificial Intelligence group has developed a surveillance system based on agent theory and multi-agent systems in Distributed Artificial Intelligence. The system allows each data acquisition source to operate independently, while coordinated by a central host. The technology improves the surveillance process, reducing the need for human attention and introducing automatic alarms. They are looking for technical cooperation of partners interested in the technology.

  4. CATS-based Air Traffic Controller Agents

    Science.gov (United States)

    Callantine, Todd J.

    2002-01-01

    This report describes intelligent agents that function as air traffic controllers. Each agent controls traffic in a single sector in real time; agents controlling traffic in adjoining sectors can coordinate to manage an arrival flow across a given meter fix. The purpose of this research is threefold. First, it seeks to study the design of agents for controlling complex systems. In particular, it investigates agent planning and reactive control functionality in a dynamic environment in which a variety of perceptual and decision-making skills play a central role. It examines how heuristic rules can be applied to model planning and decision-making skills, rather than attempting to apply optimization methods. Thus, the research attempts to develop intelligent agents that provide an approximation of human air traffic controller behavior that, while not based on an explicit cognitive model, does produce task performance consistent with the way human air traffic controllers operate. Second, this research seeks to extend previous research on using the Crew Activity Tracking System (CATS) as the basis for intelligent agents. The agents use a high-level model of air traffic controller activities to structure the control task. To execute an activity in the CATS model, according to the current task context, the agents reference a 'skill library' and 'control rules' that in turn execute the pattern recognition, planning, and decision-making required to perform the activity. Applying the skills enables the agents to modify their representation of the current control situation (i.e., the 'flick' or 'picture'). The updated representation supports the next activity in a cycle of action that, taken as a whole, simulates air traffic controller behavior. A third, practical motivation for this research is to use intelligent agents to support the evaluation of new air traffic control (ATC) methods for new Air Traffic Management (ATM) concepts. 
Current approaches that use large, human

  5. Preliminary Research on Grassland Fine-classification Based on MODIS

    International Nuclear Information System (INIS)

    Grassland ecosystems are important for climatic regulation and for maintaining soil and water. Research on grassland monitoring methods could provide an effective reference for grassland resource investigation. In this study, we used the vegetation index method for grassland classification. Because there are several types of climate in China, we used China's Main Climate Zone Maps and divided the study region into four climate zones. Based on the grassland classification system of the first nation-wide grass resource survey in China, we established a new grassland classification system suitable only for this research. We used MODIS images as the basic data resource and applied the expert classifier method to perform grassland classification. Based on the 1:1,000,000 Grassland Resource Map of China, we obtained the basic distribution of all the grassland types and selected 20 samples evenly distributed in each type, then used the NDVI/EVI product to summarize the different spectral features of the different grassland types. Finally, we introduced other classification auxiliary data, such as elevation, accumulated temperature (AT), humidity index (HI) and rainfall. The nation-wide grassland classification map of China was obtained by merging the grassland classifications of the different climate zones. The overall classification accuracy is 60.4%. The results indicated that the expert classifier is suitable for nation-wide grassland classification, but the classification accuracy needs to be improved.

  6. Classification of Product Requirements Based on Product Environment

    OpenAIRE

    Chen, Zhen Yu; Zeng, Yong

    2006-01-01

    Abstract Effective management of product requirements is critical for designers to deliver a quality design solution in a reasonable range of cost and time. The management depends on a well-defined classification and a flexible representation of product requirements. This article proposes two classification criteria in terms of different partitions of product environment based on a formal structure of produ...

  7. Transportation Mode Choice Analysis Based on Classification Methods

    OpenAIRE

    Zeņina, N; Borisovs, A

    2011-01-01

    Mode choice analysis has received the most attention among discrete choice problems in travel behavior literature. Most traditional mode choice models are based on the principle of random utility maximization derived from econometric theory. This paper investigates performance of mode choice analysis with classification methods - decision trees, discriminant analysis and multinomial logit. Experimental results have demonstrated satisfactory quality of classification.

  8. A Curriculum-Based Classification System for Community Colleges.

    Science.gov (United States)

    Schuyler, Gwyer

    2003-01-01

    Proposes and tests a community college classification system based on curricular characteristics and their association with institutional characteristics. Seeks readily available data correlates to represent percentage of a college's course offerings that are in the liberal arts. A simple two-category classification system using total enrollment…

  9. An Object-Based Method for Chinese Landform Types Classification

    Science.gov (United States)

    Ding, Hu; Tao, Fei; Zhao, Wufan; Na, Jiaming; Tang, Guo'an

    2016-06-01

    Landform classification is a necessary task for various fields of landscape and regional planning, for example landscape evaluation, erosion studies, and hazard prediction. This study proposes an improved object-based classification for Chinese landform types using the factor importance analysis of random forest and the gray-level co-occurrence matrix (GLCM). In this research, based on the 1 km DEM of China, the combination of terrain factors extracted from the DEM is selected by correlation analysis and Sheffield's entropy method. A random forest classification tree is applied to evaluate the importance of the terrain factors, which are used as multi-scale segmentation thresholds. Then the GLCM is computed to build the classification knowledge base. The classification result was checked against the 1:4,000,000 Chinese Geomorphological Map as reference. The overall classification accuracy of the proposed method is 5.7% higher than ISODATA unsupervised classification, and 15.7% higher than the traditional object-based classification method.
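The GLCM texture step mentioned in the abstract can be illustrated with a minimal sketch. The function names and the choice of the contrast feature are my own; the paper's random-forest and segmentation stages are not reproduced here.

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=4):
    """Gray-level co-occurrence matrix: counts how often gray level i occurs
    at offset (dy, dx) from gray level j, normalized to probabilities."""
    m = np.zeros((levels, levels))
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[image[y, x], image[y + dy, x + dx]] += 1
    return m / m.sum()

def glcm_contrast(p):
    """One common GLCM texture feature: contrast = sum over (i,j) of (i-j)^2 * p(i,j)."""
    i, j = np.indices(p.shape)
    return float(((i - j) ** 2 * p).sum())

# Tiny 4-level "terrain" image made of smooth blocks.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
p = glcm(img, dx=1, dy=0, levels=4)
print(glcm_contrast(p))  # low contrast: horizontal neighbors mostly share gray levels
```

In an object-based pipeline such features would be computed per segment and fed to the classifier alongside the terrain factors.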

  10. Knowledge-Based Classification in Automated Soil Mapping

    Institute of Scientific and Technical Information of China (English)

    ZHOU BIN; WANG RENCHAO

    2003-01-01

    A machine-learning approach was developed for automated building of knowledge bases for soil resources mapping by using a classification tree to generate knowledge from training data. With this method, building a knowledge base for automated soil mapping was easier than using the conventional knowledge acquisition approach. The knowledge base built by the classification tree was used by the knowledge classifier to perform the soil type classification of Longyou County, Zhejiang Province, China, using Landsat TM bi-temporal images and GIS data. To evaluate the performance of the resultant knowledge bases, the classification results were compared to an existing soil map based on a field survey. The accuracy assessment and analysis of the resultant soil maps suggested that the knowledge base built by the machine-learning method was of good quality for mapping the distribution model of soil classes over the study area.

  11. Shape classification based on singular value decomposition transform

    Institute of Scientific and Technical Information of China (English)

    SHAABAN Zyad; ARIF Thawar; BABA Sami; KREKOR Lala

    2009-01-01

    In this paper, a new shape classification system based on singular value decomposition (SVD) transform using a nearest neighbour classifier is proposed. The gray scale image of the shape object was converted into a black and white image. The squared Euclidean distance transform on the binary image was applied to extract the boundary image of the shape. SVD transform features were extracted from the boundary of the object shapes. In this paper, the proposed classification system based on the SVD transform feature extraction method was compared with a classifier based on moment invariants using the nearest neighbour classifier. The experimental results showed the advantage of the proposed classification system.
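A minimal sketch of the idea of SVD-based shape features with a nearest-neighbour classifier. The details below (centering the boundary points, normalizing the singular values, the toy shapes) are assumptions for illustration; the paper's exact feature extraction may differ.

```python
import numpy as np

def svd_features(boundary, k=2):
    """Singular values of the centered boundary point matrix: a compact,
    rotation-invariant shape descriptor (illustrative, not the paper's exact method)."""
    pts = boundary - boundary.mean(axis=0)
    s = np.linalg.svd(pts, compute_uv=False)
    return s[:k] / s.sum()  # scale-normalize

def nearest_neighbour(query, gallery, labels):
    """1-NN classification in feature space."""
    d = [np.linalg.norm(query - f) for f in gallery]
    return labels[int(np.argmin(d))]

# Two toy shapes sampled as boundary points: an elongated ellipse and a circle.
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
ellipse = np.c_[3 * np.cos(t), np.sin(t)]        # elongated outline
circle = np.c_[np.cos(t), np.sin(t)]             # round outline
gallery = [svd_features(ellipse), svd_features(circle)]

query = np.c_[2.9 * np.cos(t), 1.1 * np.sin(t)]  # slightly perturbed ellipse
label = nearest_neighbour(svd_features(query), gallery, ["elongated", "round"])
print(label)
```

The normalized singular values capture the elongation of the boundary point cloud, so the perturbed ellipse lands near the elongated prototype.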

  12. Behavior Based Social Dimensions Extraction for Multi-Label Classification.

    Directory of Open Access Journals (Sweden)

    Le Li

    Full Text Available Classification based on social dimensions is commonly used to handle the multi-label classification task in heterogeneous networks. However, traditional methods, which mostly rely on the community detection algorithms to extract the latent social dimensions, produce unsatisfactory performance when community detection algorithms fail. In this paper, we propose a novel behavior based social dimensions extraction method to improve the classification performance in multi-label heterogeneous networks. In our method, nodes' behavior features, instead of community memberships, are used to extract social dimensions. By introducing Latent Dirichlet Allocation (LDA) to model the network generation process, nodes' connection behaviors with different communities can be extracted accurately, which are applied as latent social dimensions for classification. Experiments on various public datasets reveal that the proposed method can obtain satisfactory classification results in comparison to other state-of-the-art methods on smaller social dimensions.

  13. Behavior Based Social Dimensions Extraction for Multi-Label Classification.

    Science.gov (United States)

    Li, Le; Xu, Junyi; Xiao, Weidong; Ge, Bin

    2016-01-01

    Classification based on social dimensions is commonly used to handle the multi-label classification task in heterogeneous networks. However, traditional methods, which mostly rely on the community detection algorithms to extract the latent social dimensions, produce unsatisfactory performance when community detection algorithms fail. In this paper, we propose a novel behavior based social dimensions extraction method to improve the classification performance in multi-label heterogeneous networks. In our method, nodes' behavior features, instead of community memberships, are used to extract social dimensions. By introducing Latent Dirichlet Allocation (LDA) to model the network generation process, nodes' connection behaviors with different communities can be extracted accurately, which are applied as latent social dimensions for classification. Experiments on various public datasets reveal that the proposed method can obtain satisfactory classification results in comparison to other state-of-the-art methods on smaller social dimensions. PMID:27049849

  14. Multiclass Classification Based on the Analytical Center of Version Space

    Institute of Scientific and Technical Information of China (English)

    ZENGFanzi; QIUZhengding; YUEJianhai; LIXiangqian

    2005-01-01

    The analytical center machine, based on the analytical center of version space, outperforms the support vector machine, especially when the version space is elongated or asymmetric. While the analytical center machine for binary classification is well understood, little is known about the corresponding multiclass classification. Moreover, the current multiclass classification method, "one versus all", needs to repeatedly construct classifiers to separate a single class from all the others, which leads to daunting computation and low classification efficiency; and though the multiclass support vector machine corresponds to a simple quadratic optimization, it is not very effective when the version space is asymmetric or elongated. Thus, a multiclass classification approach based on the analytical center of version space is proposed to address the above problems. Experiments on wine recognition and glass identification datasets demonstrate the validity of the proposed approach.

  16. Multiscale agent-based consumer market modeling.

    Energy Technology Data Exchange (ETDEWEB)

    North, M. J.; Macal, C. M.; St. Aubin, J.; Thimmapuram, P.; Bragen, M.; Hahn, J.; Karr, J.; Brigham, N.; Lacy, M. E.; Hampton, D.; Decision and Information Sciences; Procter & Gamble Co.

    2010-05-01

    Consumer markets have been studied in great depth, and many techniques have been used to represent them. These have included regression-based models, logit models, and theoretical market-level models, such as the NBD-Dirichlet approach. Although many important contributions and insights have resulted from studies that relied on these models, there is still a need for a model that could more holistically represent the interdependencies of the decisions made by consumers, retailers, and manufacturers. This need is particularly critical when the model must be used repeatedly over time to support decisions in an industrial setting. Although some existing methods can, in principle, represent such complex interdependencies, their capabilities might be outstripped if they had to be used for industrial applications, because of the details this type of modeling requires. However, a complementary method - agent-based modeling - shows promise for addressing these issues. Agent-based models use business-driven rules for individuals (e.g., individual consumer rules for buying items, individual retailer rules for stocking items, or individual firm rules for advertising items) to determine holistic, system-level outcomes (e.g., to determine if brand X's market share is increasing). We applied agent-based modeling to develop a multi-scale consumer market model. We then conducted calibration, verification, and validation tests of this model. The model was successfully applied by Procter & Gamble to several challenging business problems. In these situations, it directly influenced managerial decision making and produced substantial cost savings.
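To give a flavor of what "business-driven rules for individuals" means in an agent-based market model, here is a deliberately minimal sketch. The rules and parameters are invented for illustration and are unrelated to the Procter & Gamble model described in the record.

```python
import random

def simulate_market(n_consumers=1000, steps=50, loyalty=0.9, seed=1):
    """Minimal agent-based market sketch: each consumer repeatedly buys brand
    'X' or 'Y'; with probability `loyalty` they repeat their last purchase,
    otherwise they imitate a randomly chosen other consumer. The system-level
    outcome (brand X's market share) emerges from these individual rules."""
    rng = random.Random(seed)
    choices = [rng.choice("XY") for _ in range(n_consumers)]
    for _ in range(steps):
        for i in range(n_consumers):
            if rng.random() > loyalty:  # occasionally copy a peer's choice
                choices[i] = choices[rng.randrange(n_consumers)]
    return choices.count("X") / n_consumers

share = simulate_market()
print(f"brand X market share: {share:.2f}")
```

Even this toy version shows the characteristic ABM workflow: specify per-agent rules, simulate, then read off the aggregate outcome rather than modeling it directly.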

  17. Agent fabrication and its implementation for agent-based electronic commerce

    OpenAIRE

    Guan, Su; Zhu, F.

    2002-01-01

    In the last decade, agent-based e-commerce has emerged as a potential role for the next generation of e-commerce. How to create agents for e-commerce applications has become a serious consideration in this field. This paper proposes a new scheme named agent fabrication and elaborates its implementation in multi-agent systems based on the SAFER (Secure Agent Fabrication, Evolution & Roaming) architecture. First, a conceptual structure is proposed for software agents carrying out e-commerce act...

  18. Implementation of Agent Based Dynamic Distributed Service

    Directory of Open Access Journals (Sweden)

    A.Damodaram

    2010-01-01

    Full Text Available The concept of distributed computing implies a network / internet-work of independent nodes which are logically configured in such a manner as to be seen as one machine by an application. They have been implemented in many varying forms and configurations, for the optimal processing of data. Agents and multi-agent systems are useful in modeling complex distributed processes. They focus on support for the development of large-scale, secure, and heterogeneous distributed systems. They are expected to abstract both hardware and software vis-à-vis distributed systems. For optimizing the use of the tremendous increase in processing power, bandwidth, and memory that technology is placing in the hands of the designer, a Dynamically Distributed Service (to be positioned as a service to a network / internet-work) is proposed. The service will conceptually migrate an application onto different nodes. In this paper, we present the design and implementation of an inter-mobility (migration) mechanism for agents. This migration is based on FIPA ACL messages. We also evaluate the performance of this implementation.

  19. Program Classification for Performance-Based Budgeting

    OpenAIRE

    Robinson, Marc

    2013-01-01

    This guide provides practical guidance on program classification, that is, on how to define programs and their constituent elements under a program budgeting system. Program budgeting is the most widespread form of performance budgeting as applied to the government budget as a whole. The defining characteristics of program budgeting are: (1) funds are allocated in the budget to results-bas...

  20. A Fuzzy Logic Based Sentiment Classification

    Directory of Open Access Journals (Sweden)

    J.I.Sheeba

    2014-07-01

    Full Text Available Sentiment classification aims to detect information such as opinions and explicit or implicit feelings expressed in text. Most existing approaches are able to detect either explicit or implicit expressions of sentiment in the text, but not both. The proposed framework detects both implicit and explicit expressions available in meeting transcripts. It classifies positive, negative, and neutral words and also identifies the topic of the particular meeting transcript by using fuzzy logic. This paper aims to add some additional features to improve the classification method. The quality of the sentiment classification is improved using the proposed fuzzy logic framework, which includes features such as fuzzy rules and the Fuzzy C-means algorithm. The quality of the output is evaluated using parameters such as precision, recall, and F-measure; the Fuzzy C-means clustering is measured in terms of purity and entropy. The data set was validated using 10-fold cross-validation, and a 95% confidence interval was observed for the accuracy values. Finally, the proposed fuzzy logic method produced more than 85% accurate results, with a very low error rate compared to existing sentiment classification techniques.
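The Fuzzy C-means step mentioned above can be sketched generically. These are the standard FCM updates on toy 2-D data; the paper's rule base, features, and evaluation are not reproduced, and all names are illustrative.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Plain Fuzzy C-means: alternate between updating the soft membership
    matrix U (rows sum to 1) and the fuzzy cluster centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m                               # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2 / (m - 1))             # standard membership update
        U /= U.sum(axis=1, keepdims=True)
    return U, centers

# Two well-separated toy groups; FCM assigns soft memberships to each.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
U, centers = fuzzy_c_means(X)
print(U.round(2))  # first two points lean to one cluster, last two to the other
```

Unlike hard clustering, each point keeps a graded membership in every cluster, which is what makes FCM a natural fit inside a fuzzy-logic pipeline.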

  1. Agent-based modeling and simulation

    CERN Document Server

    Taylor, Simon

    2014-01-01

    Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with pa

  2. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  3. Multiclass cancer classification based on gene expression comparison

    OpenAIRE

    Yang Sitan; Naiman Daniel Q.

    2014-01-01

    As the complexity and heterogeneity of cancer is being increasingly appreciated through genomic analyses, microarray-based cancer classification comprising multiple discriminatory molecular markers is an emerging trend. Such multiclass classification problems pose new methodological and computational challenges for developing novel and effective statistical approaches. In this paper, we introduce a new approach for classifying multiple disease states associated with cancer based on gene expre...

  4. Network planning tool based on network classification and load prediction

    OpenAIRE

    Hammami, Seif eddine; Afifi, Hossam; Marot, Michel; Gauthier, Vincent

    2016-01-01

    Real Call Detail Records (CDRs) are analyzed and classified based on the Support Vector Machine (SVM) algorithm. The daily classification results in three traffic classes. We use two different algorithms, K-means and SVM, to check the classification efficiency. A second support vector regression (SVR) based algorithm is built to make an online prediction of traffic load using the history of CDRs. These algorithms will then be integrated into a network planning tool which will help cellular operators...
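As a minimal stand-in for the daily-profile classification step, the sketch below clusters synthetic 24-hour load vectors with plain k-means. The record combines SVM and K-means on real CDRs; everything here (the data, the deterministic initialization, the two profile shapes) is invented for illustration.

```python
import numpy as np

def kmeans(X, k=2, iters=50):
    """Plain k-means with simple deterministic initialization (evenly spaced
    rows of X as starting centers), clustering daily load profiles."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(iters):
        # Assign each day to its nearest center, then recompute the centers.
        labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

hours = np.arange(24)
workday = np.exp(-((hours - 12) ** 2) / 20.0)   # synthetic midday-peak profile
weekend = np.exp(-((hours - 20) ** 2) / 20.0)   # synthetic evening-peak profile
days = np.array([workday + 0.01 * i for i in range(5)] +
                [weekend + 0.01 * i for i in range(2)])
labels, _ = kmeans(days, k=2)
print(labels)  # weekday profiles fall in one class, weekend profiles in the other
```

A planning tool would then attach capacity or pricing policies to each discovered traffic class.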

  5. Iris Image Classification Based on Hierarchical Visual Codebook.

    Science.gov (United States)

    Zhenan Sun; Hui Zhang; Tieniu Tan; Jianyu Wang

    2014-06-01

    Iris recognition as a reliable method for personal identification has been well-studied, with the objective of assigning the class label of each iris image to a unique subject. In contrast, iris image classification aims to classify an iris image into an application-specific category, e.g., iris liveness detection (classification of genuine and fake iris images), race classification (e.g., classification of iris images of Asian and non-Asian subjects), or coarse-to-fine iris identification (classification of all iris images in the central database into multiple categories). This paper proposes a general framework for iris image classification based on texture analysis. A novel texture pattern representation method called Hierarchical Visual Codebook (HVC) is proposed to encode the texture primitives of iris images. The proposed HVC method is an integration of two existing Bag-of-Words models, namely the Vocabulary Tree (VT) and Locality-constrained Linear Coding (LLC). The HVC adopts a coarse-to-fine visual coding strategy and takes advantage of both VT and LLC for accurate and sparse representation of iris texture. Extensive experimental results demonstrate that the proposed iris image classification method achieves state-of-the-art performance for iris liveness detection, race classification, and coarse-to-fine iris identification. A comprehensive fake iris image database simulating four types of iris spoof attacks is developed as the benchmark for research of iris liveness detection. PMID:26353275

  6. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Full Text Available An agent can play a key role in bringing suitable cloud services to the customer based on their requirements. In agent-based cloud computing, the agent performs negotiation, coordination, cooperation and collaboration on behalf of the customer to make decisions efficiently. However, agent-based cloud computing has some security issues: (a) addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service by creating flooding attacks on other involved agents; (c) some of the exceptions in the agent interaction protocol, such as Not-Understood and Cancel_Meta, can be misused and may lead to terminating the connection of all the other agents participating in the negotiating services. This paper proposes algorithms to solve these issues and to ensure that there will be no intervention of malicious activities during agent interaction.

  7. A Classification-based Review Recommender

    Science.gov (United States)

    O'Mahony, Michael P.; Smyth, Barry

    Many online stores encourage their users to submit product/service reviews in order to guide future purchasing decisions. These reviews are often listed alongside product recommendations but, to date, limited attention has been paid as to how best to present these reviews to the end-user. In this paper, we describe a supervised classification approach that is designed to identify and recommend the most helpful product reviews. Using the TripAdvisor service as a case study, we compare the performance of several classification techniques using a range of features derived from hotel reviews. We then describe how these classifiers can be used as the basis for a practical recommender that automatically suggests the most helpful contrasting reviews to end-users. We present an empirical evaluation which shows that our approach achieves a statistically significant improvement over alternative review ranking schemes.

  8. Text document classification based on mixture models

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana; Malík, Antonín

    2004-01-01

    Roč. 40, č. 3 (2004), s. 293-304. ISSN 0023-5954 R&D Projects: GA AV ČR IAA2075302; GA ČR GA102/03/0049; GA AV ČR KSK1019101 Institutional research plan: CEZ:AV0Z1075907 Keywords : text classification * text categorization * multinomial mixture model Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.224, year: 2004

  9. Fast Wavelet-Based Visual Classification

    OpenAIRE

    Yu, Guoshen; Slotine, Jean-Jacques

    2008-01-01

    We investigate a biologically motivated approach to fast visual classification, directly inspired by the recent work of Serre et al. Specifically, trading-off biological accuracy for computational efficiency, we explore using wavelet and grouplet-like transforms to parallel the tuning of visual cortex V1 and V2 cells, alternated with max operations to achieve scale and translation invariance. A feature selection procedure is applied during learning to accelerate recognition. We introduce a si...

  10. Blurred Image Classification based on Adaptive Dictionary

    OpenAIRE

    Xiaofei Zhou; Guangling Sun; Jie Yin

    2012-01-01

    Two frameworks for blurred image classification based on adaptive dictionary are proposed. Given a blurred image, instead of image deblurring, the semantic category of the image is determined by blur insensitive sparse coefficients calculated depending on an adaptive dictionary. The dictionary is adaptive to an assumed space invariant Point Spread Function (PSF) estimated from the input blurred image. In one of th...

  11. A classification-based review recommender

    OpenAIRE

    O'Mahony, Michael P.; Smyth, Barry

    2010-01-01

    Many online stores encourage their users to submit product or service reviews in order to guide future purchasing decisions. These reviews are often listed alongside product recommendations but, to date, limited attention has been paid as to how best to present these reviews to the end-user. In this paper, we describe a supervised classification approach that is designed to identify and recommend the most helpful product reviews. Using the TripAdvisor service as a case study, we compare...

  12. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  13. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. (Figure captions: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph on top of a terrain value map.)

  14. Hybrid Support Vector Machines-Based Multi-fault Classification

    Institute of Scientific and Technical Information of China (English)

    GAO Guo-hua; ZHANG Yong-zhong; ZHU Yu; DUAN Guang-huang

    2007-01-01

    Support Vector Machines (SVM) is a new general machine-learning tool based on the structural risk minimization principle. This characteristic is very significant for fault diagnostics when the number of fault samples is limited. Considering that SVM theory is originally designed for two-class classification, a hybrid SVM scheme is proposed for multi-fault classification of rotating machinery in our paper. Two SVM strategies, 1-v-1 (one versus one) and 1-v-r (one versus rest), are respectively adopted at different classification levels. At the parallel classification level, using the 1-v-1 strategy, the fault features extracted by various signal analysis methods are transferred into the multiple parallel SVMs and the local classification results are obtained. At the serial classification level, these local result values are fused by one serial SVM based on the 1-v-r strategy. The hybrid SVM scheme introduced in our paper not only generalizes the performance of single binary SVMs but also improves the precision and reliability of the fault classification results. The actual testing results show the availability and suitability of this new method.
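As an illustrative sketch only (not the paper's implementation), the two-level idea can be approximated with scikit-learn: parallel 1-v-1 classifiers trained on separate feature views, whose decision values are fused by a serial 1-v-r classifier. The synthetic dataset and the split into two views stand in for the paper's various signal-analysis features and are assumptions.

```python
# Sketch of a hybrid 1-v-1 / 1-v-r SVM scheme (illustrative assumptions:
# synthetic data, two feature "views" standing in for signal-analysis methods).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Parallel level: one 1-v-1 SVM per feature view.
views = [slice(0, 10), slice(10, 20)]
local = [OneVsOneClassifier(LinearSVC(max_iter=5000)).fit(X_tr[:, v], y_tr)
         for v in views]

def fuse_features(X):
    # Serial level input: concatenated local decision values.
    return np.hstack([clf.decision_function(X[:, v])
                      for clf, v in zip(local, views)])

# Serial level: a 1-v-r SVM fuses the local results.
fuser = OneVsRestClassifier(LinearSVC(max_iter=5000)).fit(fuse_features(X_tr), y_tr)
pred = fuser.predict(fuse_features(X_te))
print("fused accuracy: %.2f" % (pred == y_te).mean())
```

The fusion step mirrors the paper's serial classification level: the serial SVM sees only the local classifiers' outputs, not the raw features.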

  15. Agent Communication Channel Based on BACnet

    Institute of Scientific and Technical Information of China (English)

    Jiang Wen-bin; Zhou Man-li

    2004-01-01

    We analyze the common shortcoming of existing agent MTPs (message transport protocols). Employing the File object and the related AtomicWriteFile service of BACnet (a data communication protocol for building automation and control networks), a new method of agent message transport is proposed and implemented. Every agent platform (AP) has one specified File object, and agents in another AP can communicate with agents in that AP by using the AtomicWriteFile service. Agent messages can be in a variety of formats. In the implementation, BACnet/IP and Ethernet are applied as the BACnet data link layers respectively. The experimental results show that BACnet can provide perfect support for agent communication like other conventional protocols such as hypertext transfer protocol (HTTP) and remote method invocation (RMI), and has broken through the restriction of TCP/IP. By this approach, agent technology is introduced into the building automation control network system.

  16. Support vector classification algorithm based on variable parameter linear programming

    Institute of Scientific and Technical Information of China (English)

    Xiao Jianhua; Lin Jian

    2007-01-01

    To solve the problems of SVM in dealing with large sample sizes and asymmetrically distributed samples, a support vector classification algorithm based on variable parameter linear programming is proposed. In the proposed algorithm, linear programming is employed to solve the optimization problem of classification to decrease the computation time and to reduce its complexity compared with the original model. The adjusted punishment parameter greatly reduces the classification error resulting from asymmetrically distributed samples, and the detailed procedure of the proposed algorithm is given. An experiment is conducted to verify whether the proposed algorithm is suitable for asymmetrically distributed samples.
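A minimal sketch of a linear-programming SVM (the classic L1-norm soft-margin formulation, which may differ in detail from the paper's variable-parameter variant) can be solved with `scipy.optimize.linprog`. The toy asymmetric dataset and the single punishment parameter `C` are assumptions for illustration.

```python
# L1-norm soft-margin SVM as a linear program (illustrative sketch):
# min sum|w| + C*sum(xi)  s.t.  y_i (w.x_i + b) >= 1 - xi_i,  xi >= 0.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
# Toy asymmetric two-class data (many -1 samples, few +1 samples).
X = np.vstack([rng.normal(-1.0, 0.5, (40, 2)), rng.normal(1.5, 0.5, (10, 2))])
y = np.hstack([-np.ones(40), np.ones(10)])
n, d = X.shape
C = 1.0  # punishment parameter; the paper adjusts it to handle asymmetry

# Variables: w+ (d), w- (d), b+, b-, xi (n), all >= 0; w = w+ - w-, b = b+ - b-.
c = np.hstack([np.ones(2 * d), [0.0, 0.0], C * np.ones(n)])
A = np.zeros((n, 2 * d + 2 + n))
A[:, :d] = -y[:, None] * X            # coefficient of w+
A[:, d:2 * d] = y[:, None] * X        # coefficient of w-
A[:, 2 * d] = -y                      # coefficient of b+
A[:, 2 * d + 1] = y                   # coefficient of b-
A[np.arange(n), 2 * d + 2 + np.arange(n)] = -1.0  # coefficient of xi
b_ub = -np.ones(n)                    # -(y_i (w.x_i + b)) - xi_i <= -1

res = linprog(c, A_ub=A, b_ub=b_ub, bounds=[(0, None)] * len(c), method="highs")
w = res.x[:d] - res.x[d:2 * d]
b = res.x[2 * d] - res.x[2 * d + 1]
acc = ((X @ w + b > 0).astype(int) * 2 - 1 == y).mean()
print("training accuracy:", acc)
```

Splitting `w` and `b` into non-negative parts is the standard trick for expressing free variables and the absolute-value objective in an LP.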

  17. Words semantic orientation classification based on HowNet

    Institute of Scientific and Technical Information of China (English)

    LI Dun; MA Yong-tao; GUO Jian-li

    2009-01-01

    Based on text orientation classification, a new measurement approach to the semantic orientation of words was proposed. According to the integrated and detailed definitions of words in HowNet, seed sets including words with intense orientations were built up. The orientation similarity between the seed words and a given word was then calculated using the sentiment weight priority to recognize the semantic orientation of common words. Finally, the word's semantic orientation and the context were combined to recognize the given word's orientation. The experiments show that the measurement approach achieves better results for common words' orientation classification and contributes particularly to the text orientation classification of large granularities.
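Since HowNet itself is not reproduced here, the seed-set idea can be sketched with hypothetical word vectors: a word's orientation is decided by whichever seed set it is most similar to. The vectors, words and similarity measure below are all illustrative assumptions, not HowNet data.

```python
# Toy sketch (not HowNet): score a word's orientation by its maximum
# similarity to positive vs. negative seed sets.
import numpy as np

# Hypothetical word vectors standing in for HowNet-based definitions.
vecs = {
    "excellent": np.array([0.9, 0.1]), "good": np.array([0.8, 0.2]),
    "terrible":  np.array([0.1, 0.9]), "bad":  np.array([0.2, 0.8]),
    "decent":    np.array([0.7, 0.3]),
}
pos_seeds, neg_seeds = ["excellent", "good"], ["terrible", "bad"]

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def orientation(word):
    # Compare the word's best similarity to each seed set.
    p = max(cos(vecs[word], vecs[s]) for s in pos_seeds)
    n = max(cos(vecs[word], vecs[s]) for s in neg_seeds)
    return "positive" if p > n else "negative"

print(orientation("decent"))  # → positive
```

The paper additionally combines this word-level score with context; that step is omitted here.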

  18. Agent-based models of financial markets

    International Nuclear Information System (INIS)

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont

  19. Agent-based models of financial markets

    Science.gov (United States)

    Samanidou, E.; Zschischang, E.; Stauffer, D.; Lux, T.

    2007-03-01

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont

  20. Agent-based models of financial markets

    Energy Technology Data Exchange (ETDEWEB)

    Samanidou, E [Department of Economics, University of Kiel, Olshausenstrasse 40, D-24118 Kiel (Germany); Zschischang, E [HSH Nord Bank, Portfolio Mngmt. and Inv., Martensdamm 6, D-24103 Kiel (Germany); Stauffer, D [Institute for Theoretical Physics, Cologne University, D-50923 Koeln (Germany); Lux, T [Department of Economics, University of Kiel, Olshausenstrasse 40, D-24118 Kiel (Germany)

    2007-03-15

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we

  1. Agent Based Model of Livestock Movements

    Science.gov (United States)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent based model for the forecasting of livestock movements is presented. This models livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors such as climate forecast, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors. These two factors are the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model farms choose either to buy, sell or abstain from the market thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard where livestock are auctioned using a second price sealed bid. The price time series output by the model exhibits properties similar to those found in real livestock markets.
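The second-price sealed-bid auction used at the saleyard is simple to state in code; the sketch below is illustrative only, with hypothetical farm names and bids: the highest bidder wins the lot but pays the second-highest bid.

```python
# Minimal second-price sealed-bid (Vickrey) auction sketch.
def second_price_auction(bids):
    """bids: dict of buyer -> sealed bid. Returns (winner, price_paid)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    # Winner pays the second-highest bid (or their own bid if unopposed).
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

winner, price = second_price_auction({"farm_a": 120.0, "farm_b": 150.0, "farm_c": 90.0})
print(winner, price)  # → farm_b 120.0
```

In the model this clearing step runs each iteration after the farm agents' fuzzy decision trees have chosen who buys, sells or abstains.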

  2. Population Control for Multi-agent Based Topical Crawlers

    OpenAIRE

    Mouton, Alban; Marteau, Pierre-François

    2008-01-01

    International audience The use of multi-agent topical Web crawlers based on the endogenous fitness model raises the problem of controlling the population of agents. We tackle this question through an energy-based model to balance the reproduction/life expectancy of agents. Our goal is to simplify the tuning of parameters and to optimize the use of resources available for crawling. We introduce an energy-based model designed to control the number of agents according to the precision of ...

  3. Construct validity of agent-based simulation of normative behaviour

    OpenAIRE

    Xenitidou, M; Elsenbroich, C

    2011-01-01

    In this paper we assess the construct validity and theoretical embeddedness of agent-based models of normative behaviour, drawing on experimental social psychology. We contend that social psychology and agent-based modelling share the focus of 'observing' the processes and outcomes of the interaction of individual agents. The paper focuses on two models from a taxonomy of agent-based models of normative behaviour. This enables the identification of the assumptions the models are built on and in turn,...

  4. Feature Extraction based Face Recognition, Gender and Age Classification

    Directory of Open Access Journals (Sweden)

    Venugopal K R

    2010-01-01

    Full Text Available A face recognition system with large training sets for personal identification normally attains good accuracy. In this paper, we propose the Feature Extraction based Face Recognition, Gender and Age Classification (FEBFRGAC algorithm, which requires only small training sets and yields good results even with one image per person. This process involves three stages: Pre-processing, Feature Extraction and Classification. The geometric features of facial images such as eyes, nose and mouth are located using the Canny edge operator and face recognition is performed. Based on the texture and shape information, gender and age classification is done using Posteriori Class Probability and an Artificial Neural Network respectively. It is observed that the face recognition accuracy is 100%, while the gender and age classification accuracies are around 98% and 94% respectively.

  5. A Human Gait Classification Method Based on Radar Doppler Spectrograms

    Directory of Open Access Journals (Sweden)

    Fok Hing Chi Tivive

    2010-01-01

    Full Text Available An image classification technique, which has recently been introduced for visual pattern recognition, is successfully applied for human gait classification based on radar Doppler signatures depicted in the time-frequency domain. The proposed method has three processing stages. The first two stages are designed to extract Doppler features that can effectively characterize human motion based on the nature of arm swings, and the third stage performs classification. Three types of arm motion are considered: free-arm swings, one-arm confined swings, and no-arm swings. The last two arm motions can be indicative of a human carrying objects or a person in stressed situations. The paper discusses the different steps of the proposed method for extracting distinctive Doppler features and demonstrates their contributions to the final and desirable classification rates.

  6. A NOVEL RULE-BASED FINGERPRINT CLASSIFICATION APPROACH

    Directory of Open Access Journals (Sweden)

    Faezeh Mirzaei

    2014-03-01

    Full Text Available Fingerprint classification is an important phase in increasing the speed of a fingerprint verification system and narrowing down the search of the fingerprint database. Fingerprint verification is still a challenging problem due to the difficulty of poor quality images and the need for faster response. Classification gets even harder when just one core has been detected in the input image. This paper proposes a new classification approach which includes images with one core. The algorithm extracts singular points (cores and deltas from the input image and performs classification based on the number, locations and surrounding areas of the detected singular points. The classifier is rule-based, where the rules are generated independently of a given data set. Moreover, shortcomings of a related paper have been reported in detail. The experimental results and comparisons on the FVC2002 database have shown the effectiveness and efficiency of the proposed method.
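For illustration only, the singular-point rule idea can be sketched with standard Henry-style heuristics. These rules are an assumption, not the paper's exact rule set, which also uses the locations and surrounding areas of the detected points.

```python
# Illustrative rule-based fingerprint classification from singular-point
# counts (standard Henry-style heuristics, not the cited paper's rules).
def classify(n_cores, n_deltas):
    if n_cores == 0 and n_deltas == 0:
        return "arch"
    if n_cores == 1 and n_deltas <= 1:
        return "loop"   # the paper refines loop vs. tented arch by location
    if n_cores == 2:
        return "whorl"
    return "unknown"

print(classify(1, 1))  # → loop
```

Note the rules are generated independently of any data set, which is the property the abstract highlights.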

  7. Analysis of Kernel Approach in Fuzzy-Based Image Classifications

    Directory of Open Access Journals (Sweden)

    Mragank Singhal

    2013-03-01

    Full Text Available This paper presents a framework for the kernel approach in the field of fuzzy-based image classification in remote sensing. The goal of image classification is to separate images according to their visual content into two or more disjoint classes. Fuzzy logic is a relatively young theory. A major advantage of this theory is that it allows the natural description, in linguistic terms, of problems that should be solved, rather than in terms of relationships between precise numerical values. This paper describes how remote sensing data with uncertainty are handled with fuzzy-based classification using the kernel approach for land use/land cover map generation. The introduction of fuzzification using the kernel approach provides the basis for the development of more robust approaches to the remote sensing classification problem. The kernel explicitly defines a similarity measure between two samples and implicitly represents the mapping of the input space to the feature space.

  8. Bazhenov Fm Classification Based on Wireline Logs

    Science.gov (United States)

    Simonov, D. A.; Baranov, V.; Bukhanov, N.

    2016-03-01

    This paper considers the main aspects of Bazhenov Formation interpretation and the application of machine learning algorithms to the Kolpashev type section of the Bazhenov Formation, namely automatic classification algorithms that would change the scale of research from small to large. Machine learning algorithms help interpret the Bazhenov Formation in a reference well and in other wells. During this study, unsupervised and supervised machine learning algorithms were applied to interpret lithology and reservoir properties. This greatly simplifies the routine problem of manual interpretation and has a positive economic effect on the cost of laboratory analysis.

  9. PLANNING BASED ON CLASSIFICATION BY INDUCTION GRAPH

    Directory of Open Access Journals (Sweden)

    Sofia Benbelkacem

    2013-11-01

    Full Text Available In Artificial Intelligence, planning refers to an area of research that proposes to develop systems that can automatically generate a result set, in the form of an integrated decision-making system, through a formal procedure known as a plan. Instead of resorting to scheduling algorithms to generate plans, it is proposed to apply automatic learning by decision tree to optimize time. In this paper, we propose to build a classification model by induction graph from a learning sample containing plans, each with an associated set of descriptors whose values change from plan to plan. This model is then used to classify new cases by assigning the appropriate plan.
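A minimal sketch of this idea, assuming scikit-learn and hypothetical plan descriptors (urgency, resource level, task count), trains a decision tree on past plans and assigns a plan to a new case:

```python
# Sketch: learn a decision-tree classifier from past plans' descriptors,
# then assign a plan to a new case (descriptors and plan labels are
# hypothetical, for illustration only).
from sklearn.tree import DecisionTreeClassifier

# Each row: (urgency, resource_level, task_count) -> associated plan.
X = [[2, 5, 1], [9, 1, 4], [8, 2, 5], [1, 6, 1], [9, 2, 6], [2, 7, 2]]
plans = ["routine", "emergency", "emergency", "routine", "emergency", "routine"]

model = DecisionTreeClassifier(random_state=0).fit(X, plans)
print(model.predict([[8, 1, 5]])[0])  # → emergency
```

The induction graph of the paper generalizes the tree by allowing node merging; a plain decision tree is used here as the nearest off-the-shelf stand-in.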

  10. A Novel Fault Classification Scheme Based on Least Square SVM

    OpenAIRE

    Dubey, Harishchandra; Tiwari, A. K.; Nandita; Ray, P. K.; Mohanty, S. R.; Kishor, Nand

    2016-01-01

    This paper presents a novel approach for fault classification and section identification in a series-compensated transmission line based on a least square support vector machine. The current signal corresponding to one-fourth of the post-fault cycle is used as input to the proposed modular LS-SVM classifier. The proposed scheme uses four binary classifiers: three for the selection of the three phases and a fourth for ground detection. The proposed classification scheme is found to be accurate and reliable in ...

  11. Feature Extraction based Face Recognition, Gender and Age Classification

    OpenAIRE

    Venugopal K R; L M Patnaik; Ramesha K; K B Raja

    2010-01-01

    A face recognition system with large training sets for personal identification normally attains good accuracy. In this paper, we propose the Feature Extraction based Face Recognition, Gender and Age Classification (FEBFRGAC) algorithm, which requires only small training sets and yields good results even with one image per person. This process involves three stages: Pre-processing, Feature Extraction and Classification. The geometric features of facial images such as eyes, nose and mouth are loc...

  12. From Agents to Continuous Change via Aesthetics: Learning Mechanics with Visual Agent-Based Computational Modeling

    Science.gov (United States)

    Sengupta, Pratim; Farris, Amy Voss; Wright, Mason

    2012-01-01

    Novice learners find motion as a continuous process of change challenging to understand. In this paper, we present a pedagogical approach based on agent-based, visual programming to address this issue. Integrating agent-based programming, in particular, Logo programming, with curricular science has been shown to be challenging in previous research…

  13. Knowledge Management in Role Based Agents

    Science.gov (United States)

    Kır, Hüseyin; Ekinci, Erdem Eser; Dikenelli, Oguz

    In the multi-agent system literature, the role concept is increasingly researched as an abstraction to scope the beliefs, norms and goals of agents and to shape the relationships of agents in the organization. In this research, we propose a knowledgebase architecture to increase the applicability of roles in the MAS domain, drawing inspiration from the self concept in the role theory of sociology. The proposed knowledgebase architecture has a granulated structure that is dynamically organized according to the agent's identification in a social environment. Thanks to this dynamic structure, agents are enabled to work on consistent knowledge in spite of inevitable conflicts between roles and the agent. The knowledgebase architecture has also been implemented and incorporated into the SEAGENT multi-agent system development framework.

  14. Agents-based distributed processes control systems

    Directory of Open Access Journals (Sweden)

    Adrian Gligor

    2011-12-01

    Full Text Available Large industrial distributed systems have seen remarkable development in recent years. We may note an increase in their structural and functional complexity, together with growing requirements. These are some of the reasons why numerous research efforts, energy and resources are devoted to solving problems related to these types of systems. The paper addresses the issue of industrial distributed systems, with special attention given to distributed industrial process control systems. A solution for a distributed process control system based on mobile intelligent agents is presented. The main objective of the proposed system is to provide an optimal solution in terms of costs, maintenance, reliability and flexibility. The paper focuses on the requirements, architecture, functionality and advantages brought by the proposed solution.

  15. AGENT BASED INTRUSION DETECTION SYSTEM IN MANET

    Directory of Open Access Journals (Sweden)

    J. K. Mandal

    2013-02-01

    Full Text Available In this paper a technique for intrusion detection in MANET has been proposed, where agents are fired from a node, traverse each node randomly and detect malicious nodes. Detection is based on the triangular encryption technique (TE), where AODV is taken as the routing protocol. For simulation we have used NS2 (2.33), where two types of parameters are considered, namely the number of nodes and the percentage of node mobility. For analysis purposes 20, 30, 40, 50 and 60 nodes are taken, with a variable percentage of malicious nodes of 0% (no malicious), 10%, 20%, 30% and 40%. Analyses have been done taking generated packets, forwarded packets, delay and average delay as parameters.

  16. An Active Learning Exercise for Introducing Agent-Based Modeling

    Science.gov (United States)

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  17. Classification approach based on association rules mining for unbalanced data

    CERN Document Server

    Ndour, Cheikh

    2012-01-01

    This paper deals with supervised classification when the response variable is binary and its class distribution is unbalanced. In such a situation, it is not possible to build a powerful classifier by using standard methods such as logistic regression, classification trees, discriminant analysis, etc. To overcome this shortcoming of these methods, which provide classifiers with low sensitivity, we tackled the classification problem here through an approach based on association rules learning, because this approach has the advantage of allowing the identification of the patterns that are well correlated with the target class. Association rules learning is a well known method in the area of data-mining. It is used when dealing with large databases for the unsupervised discovery of local patterns that express hidden relationships between variables. In considering association rules from a supervised learning point of view, a relevant set of weak classifiers is obtained, from which one derives a classification rule...
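As a toy illustration of the rule-learning step (the feature names and data are hypothetical, and real association-rule mining also scores item combinations, not just single items), rules targeting the minority class can be ranked by confidence:

```python
# Toy sketch: mine single-item rules "feature -> minority class" by
# confidence, the core idea behind the association-rules approach.

# Hypothetical transactions: (set of present features, binary label);
# label 1 is the rare, targeted class.
data = [
    ({"fever", "cough"}, 1), ({"fever"}, 0), ({"cough"}, 0),
    ({"fever", "cough"}, 1), ({"rash"}, 0), ({"fever", "cough", "rash"}, 1),
    ({"cough"}, 0), ({"rash"}, 0), ({"fever"}, 0), ({"cough", "rash"}, 0),
]

def rules_for_class(data, target, min_conf=0.6):
    items = set().union(*(feats for feats, _ in data))
    out = {}
    for item in items:
        cover = [lbl for feats, lbl in data if item in feats]
        conf = cover.count(target) / len(cover)  # confidence of item -> target
        if conf >= min_conf:
            out[item] = conf
    return out

print(rules_for_class(data, target=1))  # → {'fever': 0.6}
```

Each retained rule acts as a weak classifier for the minority class; the paper's method then combines such rules into a final classification rule.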

  18. Ensemble polarimetric SAR image classification based on contextual sparse representation

    Science.gov (United States)

    Zhang, Lamei; Wang, Xiao; Zou, Bin; Qiao, Zhijun

    2016-05-01

    Polarimetric SAR image interpretation has become one of the most interesting topics, in which the construction of a reasonable and effective image classification technique is of key importance. Sparse representation represents the data using the most succinct sparse atoms of an over-complete dictionary, and its advantages have also been confirmed in the field of PolSAR classification. However, like any ordinary classifier, it is not perfect in all respects. Ensemble learning is therefore introduced to address this issue: a plurality of different learners is trained, and integrated results are obtained by combining the individual learners to achieve more accurate and ideal learning results. Therefore, this paper presents a polarimetric SAR image classification method based on ensemble learning of sparse representations to achieve optimal classification.

  19. Blurred Image Classification Based on Adaptive Dictionary

    Directory of Open Access Journals (Sweden)

    Guangling Sun

    2013-02-01

    Full Text Available Two frameworks for blurred image classification based on adaptive dictionary are proposed. Given a blurred image, instead of image deblurring, the semantic category of the image is determined by blur insensitive sparse coefficients calculated depending on an adaptive dictionary. The dictionary is adaptive to an assumed space invariant Point Spread Function (PSF) estimated from the input blurred image. In one of the proposed two frameworks, the PSF is inferred separately, and in the other, the PSF is updated combined with sparse coefficient calculation in an alternative and iterative manner. The experiments have evaluated three types of blur, namely defocus blur, simple motion blur and camera shake blur. The experimental results confirm the effectiveness of the proposed frameworks.

  20. Classification of LiDAR Data with Point Based Classification Methods

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2016-06-01

    LiDAR is one of the most effective systems for 3-dimensional (3D) data collection over wide areas. Nowadays, airborne LiDAR data are used frequently in various applications, such as object extraction, 3D modelling, change detection and map revision, with increasing point density and accuracy. Classification of the LiDAR points is the first step of the LiDAR data processing chain and should be handled properly, since applications such as 3D city modelling, building extraction and DEM generation directly use the classified point clouds. Different classification methods can be seen in recent research, most of which works with a gridded LiDAR point cloud. In grid-based processing of LiDAR data, the loss of characteristic points, especially on vegetation and buildings, or the loss of height accuracy during the interpolation stage is inevitable. A possible solution is to use the raw point cloud data for classification, avoiding the data and accuracy loss of the gridding process. In this study, the point-based classification possibilities of the LiDAR point cloud are investigated to obtain more accurate classes. Automatic point-based approaches, based on hierarchical rules, have been proposed to achieve ground, building and vegetation classes using the raw LiDAR point cloud data. In the proposed approaches, every single LiDAR point is analyzed according to features such as height and multi-return, and is then automatically assigned to the class to which it belongs. The use of the un-gridded point cloud in the proposed point-based classification process helped in the determination of more realistic rule sets. Detailed parameter analyses have been performed to obtain the most appropriate parameters in the rule sets to achieve accurate classes. The hierarchical rule sets were created for the proposed Approach 1 (using selected spatial-based and echo-based features) and Approach 2 (using only selected spatial-based features
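    A hierarchical rule set of the kind described, operating on raw (un-gridded) points with height and echo-count features, might look like the following sketch; the thresholds and the `classify_points` helper are illustrative assumptions, not the paper's actual rules or parameters.

```python
import numpy as np

def classify_points(points):
    """Hierarchical rules on raw LiDAR points.

    points: structured array with 'height' above local ground (m) and
    'n_returns' (echoes per pulse).  Thresholds are illustrative only.
    """
    labels = np.empty(len(points), dtype="U10")
    for i, p in enumerate(points):
        if p["height"] < 0.5:            # rule 1: near ground level -> ground
            labels[i] = "ground"
        elif p["n_returns"] > 1:         # rule 2: multiple echoes -> canopy
            labels[i] = "vegetation"
        else:                            # rule 3: elevated single return
            labels[i] = "building"
    return labels

pts = np.array([(0.1, 1), (12.0, 3), (9.0, 1)],
               dtype=[("height", "f8"), ("n_returns", "i4")])
print(classify_points(pts))   # ['ground' 'vegetation' 'building']
```

    The real approaches apply such rules hierarchically per point, which is why working on the raw cloud (rather than a grid) keeps the echo and height features intact.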

  1. An Agent-Based Analysis of Tax Compliance for Turkey

    OpenAIRE

    M. Oguz ARSLAN; Ozgur Ican

    2013-01-01

    An agent-based tax compliance model for Turkey is developed in this paper. In this model, four kinds of agent archetypes, honest, strategic, defiant, and random, are employed. The model is used for simulating evolutionary changes in the tax compliance behavior of a population of 10,000 taxpayer agents. The implementation of the model via four simulation scenarios points out that an agent-based evolutionary strategy simulation for the Turkish case is valid. Also, the neighbourhood effect is not found...
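    A one-round, non-evolutionary simplification of the four archetypes might look like this; the audit probability, penalty multiplier and tax rate are invented for illustration, and the paper's neighbourhood effect and evolutionary dynamics are omitted entirely.

```python
import random

random.seed(42)

# Illustrative parameters (not from the paper): audit probability,
# penalty multiplier on evaded tax, and the tax rate itself.
AUDIT_P, PENALTY, TAX = 0.05, 2.0, 0.2

def complies(archetype):
    """One-shot compliance decision per archetype (the paper's model is
    evolutionary and neighbourhood-based; this is a static simplification)."""
    if archetype == "honest":
        return True
    if archetype == "defiant":
        return False
    if archetype == "strategic":        # evades unless the expected penalty
        return AUDIT_P * PENALTY * TAX >= TAX   # covers the evaded tax
    return random.random() < 0.5        # "random" archetype flips a coin

population = (["honest"] * 4000 + ["strategic"] * 3000 +
              ["defiant"] * 1000 + ["random"] * 2000)
rate = sum(complies(a) for a in population) / len(population)
```

    With these toy parameters, strategic agents evade, so the aggregate compliance rate is driven by the honest and random shares of the population.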

  2. Pathological Bases for a Robust Application of Cancer Molecular Classification

    Directory of Open Access Journals (Sweden)

    Salvador J. Diaz-Cano

    2015-04-01

    Full Text Available Any robust classification system depends on its purpose and must refer to accepted standards, its strength relying on predictive values and a careful consideration of known factors that can affect its reliability. In this context, a molecular classification of human cancer must refer to the current gold standard (histological classification) and try to improve it with key prognosticators for metastatic potential, staging and grading. Although organ-specific examples have been published based on proteomics, transcriptomics and genomics evaluations, the most popular approach uses gene expression analysis as a direct correlate of cellular differentiation, which represents the key feature of the histological classification. RNA is a labile molecule that varies significantly according to the preservation protocol, its transcription reflects the adaptation of the tumor cells to the microenvironment, it can be passed on through mechanisms of intercellular transference of genetic information (exosomes), and it is exposed to epigenetic modifications. More robust classifications should be based on stable molecules, represented at the genetic level by DNA, to improve reliability, and their analysis must deal with the concept of intratumoral heterogeneity, which is at the origin of tumor progression and is the byproduct of the selection process during the clonal expansion and progression of neoplasms. The simultaneous analysis of multiple DNA targets and next generation sequencing offer the best practical approach for an analytical genomic classification of tumors.

  3. ELABORATION OF A VECTOR BASED SEMANTIC CLASSIFICATION OVER THE WORDS AND NOTIONS OF THE NATURAL LANGUAGE

    OpenAIRE

    Safonov, K.; Lichargin, D.

    2009-01-01

    The problem of vector-based semantic classification over the words and notions of the natural language is discussed. A set of generative grammar rules is offered for generating the semantic classification vector. Examples of the classification application and a theorem of optional formal classification incompleteness are presented. The principles of assigning the meaningful phrases functions over the classification word groups are analyzed.

  4. Agent-based argumentation for ontology alignments

    OpenAIRE

    Laera, Loredana; Tamma, Valentina; Bench-Capon, Trevor; Euzenat, Jérôme

    2006-01-01

    When agents communicate they do not necessarily use the same vocabulary or ontology. For them to interact successfully they must find correspondences between the terms used in their ontologies. While many proposals for matching two agent ontologies have been presented in the literature, the resulting alignment may not be satisfactory to both agents and can become the object of further negotiation between them. This paper describes our work constructing ...

  5. Decentralized network management based on mobile agent

    Institute of Scientific and Technical Information of China (English)

    李锋; 冯珊

    2004-01-01

    The mobile agent technology can be employed effectively for the decentralized management of complex networks. We show how the integration of mobile agents with a legacy management protocol, such as the Simple Network Management Protocol (SNMP), leads to a decentralized management architecture. HostWatcher is a framework that allows mobile agents to roam the network, collect and process data, and perform certain adaptive actions. A prototype system is built and a quantitative analysis underlines the benefits with respect to reducing network load.

  6. Agent-Based Health Monitoring System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose combination of software intelligent agents to achieve decentralized reasoning, with fault detection and diagnosis using PCA, neural nets, and maximum...

  7. Recent advances in agent-based complex automated negotiation

    CERN Document Server

    Ito, Takayuki; Zhang, Minjie; Fujita, Katsuhide; Robu, Valentin

    2016-01-01

    This book covers recent advances in Complex Automated Negotiations as a widely studied emerging area in the field of Autonomous Agents and Multi-Agent Systems. The book includes selected revised and extended papers from the 7th International Workshop on Agent-Based Complex Automated Negotiation (ACAN2014), which was held in Paris, France, in May 2014. The book also includes brief introductions about Agent-based Complex Automated Negotiation which are based on tutorials provided in the workshop, and brief summaries and descriptions about the ANAC'14 (Automated Negotiating Agents Competition) competition, where authors of selected finalist agents explain the strategies and the ideas used by them. The book is targeted to academic and industrial researchers in various communities of autonomous agents and multi-agent systems, such as agreement technology, mechanism design, electronic commerce, related areas, as well as graduate, undergraduate, and PhD students working in those areas or having interest in them.

  8. An Extensible Agent Architecture for a Competitive Market-Based Allocation of Consumer Attention Space

    OpenAIRE

    Hoen, 't, Pieter Jan; Bohte, Sander; Gerding, Enrico; La Poutré, Han

    2002-01-01

    A competitive distributed recommendation mechanism is introduced based on adaptive software agents for efficiently allocating the ``customer attention space'', or banners. In the example of an electronic shopping mall, the task of correctly profiling and analyzing the customers is delegated to the individual shops that operate in a distributed, remote fashion. The evaluation and classification of customers for the bidding on banners is not handled by a central agency as is customary, but is a...

  9. A new circulation type classification based upon Lagrangian air trajectories

    Science.gov (United States)

    Ramos, Alexandre; Sprenger, Michael; Wernli, Heini; Durán-Quesada, Ana María; Lorenzo, Maria Nieves; Gimeno, Luis

    2014-10-01

    A new classification method of the large-scale circulation characteristic for a specific target area (NW Iberian Peninsula) is presented, based on the analysis of 90-h backward trajectories arriving in this area calculated with the 3-D Lagrangian particle dispersion model FLEXPART. A cluster analysis is applied to separate the backward trajectories in up to five representative air streams for each day. Specific measures are then used to characterise the distinct air streams (e.g., curvature of the trajectories, cyclonic or anticyclonic flow, moisture evolution, origin and length of the trajectories). The robustness of the presented method is demonstrated in comparison with the Eulerian Lamb weather type classification. A case study of the 2003 heatwave is discussed in terms of the new Lagrangian circulation and the Lamb weather type classifications. It is shown that the new classification method adds valuable information about the pertinent meteorological conditions, which are missing in an Eulerian approach. The new method is climatologically evaluated for the five-year time period from December 1999 to November 2004. The ability of the method to capture the inter-seasonal circulation variability in the target region is shown. Furthermore, the multi-dimensional character of the classification is shortly discussed, in particular with respect to inter-seasonal differences. Finally, the relationship between the new Lagrangian classification and the precipitation in the target area is studied.
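    The trajectory-clustering step (separating backward trajectories into representative air streams) can be sketched with plain k-means on flattened longitude/latitude sequences. The synthetic westerly and northerly streams, the farthest-point initialisation and the `cluster_trajectories` helper are illustrative inventions, not the paper's FLEXPART data or clustering configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def cluster_trajectories(X, k, iters=20):
    """k-means on flattened (lon, lat) backward trajectories, with a simple
    farthest-point initialisation so this demo converges deterministically."""
    centers = [X[0]]
    for _ in range(1, k):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Two synthetic air streams arriving at the target area (NW Iberia):
# a westerly (Atlantic) stream and a northerly stream, 10 positions each.
t = np.linspace(0.0, 1.0, 10)
westerly = np.stack([np.column_stack([-20 + 12 * t,
                                      42 + rng.normal(0, 0.3, 10)]).ravel()
                     for _ in range(20)])
northerly = np.stack([np.column_stack([-8 + rng.normal(0, 0.3, 10),
                                       55 - 13 * t]).ravel()
                      for _ in range(20)])
labels = cluster_trajectories(np.vstack([westerly, northerly]), k=2)
```

    After clustering, each cluster mean can be characterised with the measures named in the abstract (curvature, cyclonic/anticyclonic turning, moisture evolution along the path).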

  10. A new circulation type classification based upon Lagrangian air trajectories

    Directory of Open Access Journals (Sweden)

    Alexandre M. Ramos

    2014-10-01

    Full Text Available A new classification method of the large-scale circulation characteristic for a specific target area (NW Iberian Peninsula) is presented, based on the analysis of 90-h backward trajectories arriving in this area calculated with the 3-D Lagrangian particle dispersion model FLEXPART. A cluster analysis is applied to separate the backward trajectories in up to five representative air streams for each day. Specific measures are then used to characterise the distinct air streams (e.g., curvature of the trajectories, cyclonic or anticyclonic flow, moisture evolution, origin and length of the trajectories). The robustness of the presented method is demonstrated in comparison with the Eulerian Lamb weather type classification. A case study of the 2003 heatwave is discussed in terms of the new Lagrangian circulation and the Lamb weather type classifications. It is shown that the new classification method adds valuable information about the pertinent meteorological conditions, which are missing in an Eulerian approach. The new method is climatologically evaluated for the five-year time period from December 1999 to November 2004. The ability of the method to capture the inter-seasonal circulation variability in the target region is shown. Furthermore, the multi-dimensional character of the classification is shortly discussed, in particular with respect to inter-seasonal differences. Finally, the relationship between the new Lagrangian classification and the precipitation in the target area is studied.

  11. 3D Land Cover Classification Based on Multispectral LIDAR Point Clouds

    Science.gov (United States)

    Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong

    2016-06-01

    The Multispectral Lidar System can emit simultaneous laser pulses at different wavelengths. The reflected multispectral energy is captured through the receiver of the sensor, and the return signal, together with the position and orientation information of the sensor, is recorded. These recorded data are processed together with GNSS/IMU data in post-processing, forming high-density multispectral 3D point clouds. As the first commercial multispectral airborne Lidar sensor, the Optech Titan system is capable of collecting point cloud data in all three channels: at 532 nm visible (green), at 1064 nm near infrared (NIR) and at 1550 nm intermediate infrared (IR). It has become a new source of data for 3D land cover classification. The paper presents an Object Based Image Analysis (OBIA) approach that uses only multispectral Lidar point cloud datasets for 3D land cover classification. The approach consists of three steps. Firstly, multispectral intensity images are segmented into image objects on the basis of multi-resolution segmentation integrating different scale parameters. Secondly, intensity objects are classified into nine categories using customized classification-index features and a combination of the multispectral reflectance with the vertical distribution of object features. Finally, accuracy assessment is conducted by comparing random reference sample points from Google imagery tiles with the classification results. The classification results show high overall accuracy for most of the land cover types; over 90% overall accuracy is achieved using multispectral Lidar point clouds for 3D land cover classification.

  12. Super pixel density based clustering automatic image classification method

    Science.gov (United States)

    Xu, Mingxing; Zhang, Chuan; Zhang, Tianxu

    2015-12-01

    Image classification is an important means of image segmentation and data mining, and achieving rapid automated image classification has been a focus of research. This paper presents a superpixel- and density-based cluster-centre algorithm for automatic image classification and outlier identification. Density and distance are computed from the pixel location coordinates and grey values to achieve automatic classification and outlier extraction. Because the large number of pixels dramatically increases the computational complexity, the image is pre-processed into a small number of superpixel sub-blocks before the density and distance calculations. A normalised density-distance decision rule is designed to select cluster centres automatically, whereby the image is classified and outliers are identified. Extensive experiments show that the method requires no human intervention, computes faster than the density clustering algorithm, and performs automated classification and outlier extraction effectively.
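    The density-and-distance decision rule described above follows the density-peaks idea: points with many neighbours within a cutoff radius and a large distance to any denser point become cluster centres, while isolated points with zero density are outliers. A minimal sketch; the cutoff value, the simplified nearest-centre assignment and the toy pixel data are assumptions of this illustration, not the paper's superpixel pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def density_peaks(X, dc, n_centers):
    """Density-peak clustering: density = neighbours within cutoff dc,
    delta = distance to the nearest denser point, centres = largest
    density*delta scores; assignment here is simplified to nearest centre."""
    d = np.linalg.norm(X[:, None] - X[None], axis=2)
    rho = (d < dc).sum(axis=1) - 1           # neighbours within dc (excl. self)
    delta = np.empty(len(X))
    for i in range(len(X)):
        higher = rho > rho[i]
        delta[i] = d[i, higher].min() if higher.any() else d[i].max()
    score = rho * delta
    centers = np.argsort(score)[-n_centers:]
    labels = d[:, centers].argmin(axis=1)
    return labels, centers, rho

# Pixels as (x, y, grey) samples: two blobs plus one isolated outlier.
blob_a = rng.normal([10, 10, 50], 0.8, size=(30, 3))
blob_b = rng.normal([40, 40, 200], 0.8, size=(30, 3))
X = np.vstack([blob_a, blob_b, [[80.0, 5.0, 120.0]]])
labels, centers, rho = density_peaks(X, dc=3.0, n_centers=2)
```

    Points with `rho == 0` can be reported as outliers, mirroring the paper's outlier extraction.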

  13. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists.
This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  14. An Agent-Based Data Mining System for Ontology Evolution

    Science.gov (United States)

    Hadzic, Maja; Dillon, Darshan

    We have developed an evidence-based mental health ontological model that represents mental health in multiple dimensions. The ongoing addition of new mental health knowledge requires a continual update of the Mental Health Ontology. In this paper, we describe how the ontology evolution can be realized using a multi-agent system in combination with data mining algorithms. We use the TICSA methodology to design this multi-agent system, which is composed of four different types of agents: Information agent, Data Warehouse agent, Data Mining agents and Ontology agent. We use UML 2.1 sequence diagrams to model the collaborative nature of the agents and a UML 2.1 composite structure diagram to model the structure of individual agents. The Mental Health Ontology has the potential to underpin various mental health research experiments of a collaborative nature, which are greatly needed in times of increasing mental distress and illness.

  15. Agent-based models and individualism: is the world agent-based?

    OpenAIRE

    O'Sullivan, D.; Haklay, M.

    2000-01-01

    Agent-based models (ABMs) are an increasingly popular tool in the social sciences. This trend seems likely to continue, so that they will become widely used in geography and in urban and regional planning. We present an overview of examples of these models in the life sciences, economics, planning, sociology, and archaeology. We conclude that ABMs strongly tend towards an individualist view of the social world. This point is reinforced by closer consideration of particular examples. This disc...

  16. Failure diagnosis using deep belief learning based health state classification

    International Nuclear Information System (INIS)

    Effective health diagnosis provides multifarious benefits such as improved safety, improved reliability and reduced costs for operation and maintenance of complex engineered systems. This paper presents a novel multi-sensor health diagnosis method using deep belief network (DBN). DBN has recently become a popular approach in machine learning for its promised advantages such as fast inference and the ability to encode richer and higher order network structures. The DBN employs a hierarchical structure with multiple stacked restricted Boltzmann machines and works through a layer by layer successive learning process. The proposed multi-sensor health diagnosis methodology using DBN based state classification can be structured in three consecutive stages: first, defining health states and preprocessing sensory data for DBN training and testing; second, developing DBN based classification models for diagnosis of predefined health states; third, validating DBN classification models with testing sensory dataset. Health diagnosis using DBN based health state classification technique is compared with four existing diagnosis techniques. Benchmark classification problems and two engineering health diagnosis applications: aircraft engine health diagnosis and electric power transformer health diagnosis are employed to demonstrate the efficacy of the proposed approach
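    The three consecutive stages (defining health states and preparing sensory data, greedy layer-by-layer training of stacked RBMs, validating the classifier) can be sketched with a minimal NumPy implementation. The CD-1 RBM below, the toy "stuck-high sensor" fault data and the nearest-centroid readout are simplifications for illustration, not the paper's networks, fine-tuning procedure or benchmark data.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

class RBM:
    """Restricted Boltzmann machine trained with 1-step contrastive divergence."""
    def __init__(self, n_vis, n_hid, lr=0.1):
        self.W = rng.normal(0.0, 0.1, (n_vis, n_hid))
        self.b, self.c, self.lr = np.zeros(n_vis), np.zeros(n_hid), lr

    def hidden(self, V):
        return sigmoid(V @ self.W + self.c)

    def reconstruct(self, V):
        return sigmoid(self.hidden(V) @ self.W.T + self.b)

    def train(self, V, epochs=200):
        for _ in range(epochs):
            h = self.hidden(V)
            h_s = (rng.random(h.shape) < h).astype(float)  # sample hidden states
            v1 = sigmoid(h_s @ self.W.T + self.b)          # one Gibbs half-step back
            h1 = self.hidden(v1)
            self.W += self.lr * (V.T @ h - v1.T @ h1) / len(V)
            self.b += self.lr * (V - v1).mean(axis=0)
            self.c += self.lr * (h - h1).mean(axis=0)

# Stage 1: define health states and build binary sensor snapshots
# (fault state: the first six of twelve sensors read stuck-high).
healthy = (rng.random((80, 12)) < 0.1).astype(float)
faulty = (rng.random((80, 12)) < 0.1).astype(float)
faulty[:, :6] = 1.0
X, y = np.vstack([healthy, faulty]), np.array([0] * 80 + [1] * 80)

# Stage 2: greedy layer-by-layer pre-training of two stacked RBMs.
rbm1 = RBM(12, 8)
err_before = np.abs(X - rbm1.reconstruct(X)).mean()
rbm1.train(X)
err_after = np.abs(X - rbm1.reconstruct(X)).mean()
H1 = rbm1.hidden(X)
rbm2 = RBM(8, 4)
rbm2.train(H1)
H2 = rbm2.hidden(H1)

# Stage 3: a nearest-centroid readout on the top-layer features stands in
# for the validation stage with a fine-tuned classifier.
centroids = np.stack([H2[y == 0].mean(axis=0), H2[y == 1].mean(axis=0)])
pred = np.linalg.norm(H2[:, None] - centroids[None], axis=2).argmin(axis=1)
acc = (pred == y).mean()
```

    Layer-wise training lowers each RBM's reconstruction error on its input, which is the property the greedy pre-training relies on.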

  17. Mobile Agent Based on Internet%基于Internet的移动Agent

    Institute of Scientific and Technical Information of China (English)

    徐练; 周龙骧; 王翰虎

    2001-01-01

    Mobile agents are a hybrid of Internet technology and artificial intelligence. Tremendous amounts of information resources are distributed across the Internet today, but finding what one wants is very difficult. The Internet has increasingly become a vital computing platform for electronic commerce, which is highly popular throughout the world. Developing new Internet-based application programs such as online shopping, e-business and search engines poses new tasks, for which mobile agents offer new ideas and technology. With the Internet in mind, this paper studies the architecture and migration mechanism of mobile agent systems. Building on agent theory research and engineering, it focuses on mobile agents, which have the ability to roam through the network. Drawing on OMG's Mobile Agent Facility Specification, a model architecture of a mobile agent system is designed. Based on this architecture, the key technologies are analyzed and methods for realizing them are given, with emphasis on the agent mobility mechanism and its implementation. Finally, a Java-based mobile agent system model is presented.

  18. Classification of Gait Types Based on the Duty-factor

    DEFF Research Database (Denmark)

    Fihl, Preben; Moeslund, Thomas B.

    2007-01-01

    This paper deals with classification of human gait types based on the notion that different gait types are in fact different types of locomotion, i.e., running is not simply walking done faster. We present the duty-factor, which is a descriptor based on this notion. The duty-factor is independent...... with known ground support. Silhouettes are extracted using the Codebook method and represented using Shape Contexts. The matching with database silhouettes is done using the Hungarian method. While manually estimated duty-factors show a clear classification, the presented system contains...

  19. Cement industry control system based on multi agent

    Institute of Scientific and Technical Information of China (English)

    王海东; 邱冠周; 黄圣生

    2004-01-01

    Cement production is characterized by its great capacity, long time delays, multiple variables, difficult measurement and multiple disturbances. Following the distributed intelligent control strategy based on multiple agents, a multi-agent control system for cement production is built, which includes integrated optimal control and diagnosis control. The distributed, multi-level structure of the multi-agent system for cement control is studied. The optimal agents are distributed, each targeting a partial process of the cement production, and together they form the optimization layer. The diagnosis agent, located on the diagnosis layer, is the diagnosis unit for the whole process of the cement production and the central management unit of the system. System cooperation is realized through communication among the optimal agents and the diagnosis agent. The architectures of the optimal agent and the diagnosis agent are designed, and their detailed functions are analyzed. Finally, the realization methods of the agents are given and an application of the multi-agent control system is presented. The multi-agent system has been successfully applied to the off-line control of a cement plant with a capacity of 5 000 t/d. The results show that the average yield of clinker increases by 9.3% and coal consumption decreases by 7.5 kg/t.

  20. An Agent Operationalization Approach for Context Specific Agent-Based Modeling

    OpenAIRE

    Christof Knoeri; Binder, Claudia R.; Hans-Joerg Althaus

    2011-01-01

    The potential of agent-based modeling (ABM) has been demonstrated in various research fields. However, three major concerns limit the full exploitation of ABM: (i) agents are too simple and behave unrealistically, without any empirical basis; (ii) 'proof of concept' applications are too theoretical; and (iii) too much value is placed on operational validity instead of conceptual validity. This paper presents an operationalization approach to determine the key system agents, their interaction, deci...

  1. Agent Based Processing of Global Evaluation Function

    CERN Document Server

    Hossain, M Shahriar; Joarder, Md Mahbubul Alam

    2011-01-01

    Load balancing across a networked environment is a monotonous job. Moreover, if the job to be distributed is a constraint-satisfying one, the distribution of load demands core intelligence. This paper proposes parallel processing through a Global Evaluation Function by means of randomly initialized agents for solving Constraint Satisfaction Problems. A potential issue, the number of agents per machine under the invocation of distribution, is discussed here for securing the maximum benefit from Global Evaluation and parallel processing. The proposed system is compared with a typical solution and shows a markedly better outcome, supporting the merit of a parallel implementation of the Global Evaluation Function with a certain number of agents on each invoked machine.
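    One way to picture the scheme is min-conflicts search on n-queens: each randomly initialized agent repeatedly repairs its assignment guided by a global evaluation function, and the best result across agents is kept. The sequential loop below stands in for distribution across machines, and all details (the puzzle, step limits, agent count) are illustrative assumptions, not the paper's system.

```python
import random

random.seed(7)

def global_eval(board):
    """Global evaluation function: number of attacking queen pairs (0 = solved)."""
    n = len(board)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if board[i] == board[j] or abs(board[i] - board[j]) == j - i)

def agent(n, steps=3000):
    """One randomly initialised agent doing min-conflicts descent on the
    global evaluation; several such agents could run on separate machines."""
    board = [random.randrange(n) for _ in range(n)]
    for _ in range(steps):
        if global_eval(board) == 0:
            break
        col = random.randrange(n)
        scores = [global_eval(board[:col] + [row] + board[col + 1:])
                  for row in range(n)]
        best = min(scores)
        board[col] = random.choice([r for r, s in enumerate(scores) if s == best])
    return board

# Invoke several agents (sequentially here) and keep the best global evaluation.
boards = [agent(8) for _ in range(5)]
best = min(boards, key=global_eval)
```

    Random tie-breaking lets each agent escape plateaus, so even a small pool of agents usually reaches a conflict-free assignment.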

  2. Directional wavelet based features for colonic polyp classification.

    Science.gov (United States)

    Wimmer, Georg; Tamaki, Toru; Tischendorf, J J W; Häfner, Michael; Yoshida, Shigeto; Tanaka, Shinji; Uhl, Andreas

    2016-07-01

    In this work, various wavelet based methods like the discrete wavelet transform, the dual-tree complex wavelet transform, the Gabor wavelet transform, curvelets, contourlets and shearlets are applied for the automated classification of colonic polyps. The methods are tested on 8 HD-endoscopic image databases, where each database is acquired using different imaging modalities (Pentax's i-Scan technology combined with or without staining the mucosa), 2 NBI high-magnification databases and one database with chromoscopy high-magnification images. To evaluate the suitability of the wavelet based methods with respect to the classification of colonic polyps, the classification performances of 3 wavelet transforms and the more recent curvelets, contourlets and shearlets are compared using a common framework. Wavelet transforms were already often and successfully applied to the classification of colonic polyps, whereas curvelets, contourlets and shearlets have not been used for this purpose so far. We apply different feature extraction techniques to extract the information of the subbands of the wavelet based methods. Most of the in total 25 approaches were already published in different texture classification contexts. Thus, the aim is also to assess and compare their classification performance using a common framework. Three of the 25 approaches are novel. These three approaches extract Weibull features from the subbands of curvelets, contourlets and shearlets. Additionally, 5 state-of-the-art non wavelet based methods are applied to our databases so that we can compare their results with those of the wavelet based methods. It turned out that extracting Weibull distribution parameters from the subband coefficients generally leads to high classification results, especially for the dual-tree complex wavelet transform, the Gabor wavelet transform and the Shearlet transform. These three wavelet based transforms in combination with Weibull features even outperform the state
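    The Weibull-feature idea (fitting a Weibull distribution to the magnitudes of subband coefficients) can be sketched with a one-level Haar transform and a moment-matching fit. The paper uses richer transforms (dual-tree complex wavelets, Gabor wavelets, shearlets) and does not specify this estimator, so treat the Haar decomposition and the bisection fit below as assumed simplifications.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def haar_subbands(img):
    """One-level 2-D Haar transform -> (LL, LH, HL, HH) subbands."""
    a = (img[0::2] + img[1::2]) / 2      # row average    -> low-pass
    d = (img[0::2] - img[1::2]) / 2      # row difference -> high-pass
    LL, LH = (a[:, 0::2] + a[:, 1::2]) / 2, (a[:, 0::2] - a[:, 1::2]) / 2
    HL, HH = (d[:, 0::2] + d[:, 1::2]) / 2, (d[:, 0::2] - d[:, 1::2]) / 2
    return LL, LH, HL, HH

def weibull_features(coeffs):
    """Fit Weibull shape/scale to |coefficients| by matching mean and variance.
    Uses CV^2 = Gamma(1+2/k)/Gamma(1+1/k)^2 - 1, which is decreasing in k."""
    x = np.abs(coeffs).ravel() + 1e-12
    mean, var = x.mean(), x.var()
    cv2 = var / mean**2
    lo, hi = 0.05, 20.0                  # bisect on the shape parameter k
    for _ in range(80):
        k = (lo + hi) / 2
        g1, g2 = math.gamma(1 + 1 / k), math.gamma(1 + 2 / k)
        if g2 / g1**2 - 1 > cv2:         # model too variable -> k too small
            lo = k
        else:
            hi = k
    scale = mean / math.gamma(1 + 1 / k)
    return k, scale

img = rng.normal(0, 1, (64, 64)).cumsum(axis=1)   # toy textured "image"
LL, LH, HL, HH = haar_subbands(img)
features = [weibull_features(sb) for sb in (LH, HL, HH)]
```

    Concatenating the (shape, scale) pairs across subbands gives a small feature vector that a standard classifier can consume.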

  3. A generic testing framework for agent-based simulation models

    OpenAIRE

    Gürcan, Önder; Dikenelli, Oguz; Bernon, Carole

    2013-01-01

    Agent-based modelling and simulation (ABMS) has received increasing attention during the last decade. However, the weak validation and verification of agent-based simulation models makes ABMS hard to trust. There is no comprehensive tool set for verification and validation of agent-based simulation models which demonstrates that inaccuracies exist and/or reveals the existing errors in the model. Moreover, on the practical side, many ABMS frameworks are in use. In this sen...

  4. An Efficient Semantic Model For Concept Based Clustering And Classification

    Directory of Open Access Journals (Sweden)

    SaiSindhu Bandaru

    2012-03-01

    Full Text Available Usually in text mining techniques, basic measures like the term frequency of a term (word or phrase) are computed to determine the importance of the term in the document. But with statistical analysis alone, the original semantics of the term may not carry its exact meaning. To overcome this problem, a new framework has been introduced which relies on a concept-based model and a synonym-based approach. The proposed model can efficiently find significant matching and related concepts between documents according to the concept-based and synonym-based approaches. Large sets of experiments using the proposed model on different data sets in clustering and classification are conducted. Experimental results demonstrate the substantial enhancement of the clustering quality using sentence-based, document-based, corpus-based and combined-approach concept analysis. A new similarity measure has been proposed to find the similarity between a document and the existing clusters, which can be used in classification of the document against existing clusters.
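    The concept- and synonym-based matching can be sketched as mapping each term to a concept head before measuring document/cluster overlap. The tiny synonym table and the Jaccard overlap below are placeholders for the paper's concept analysis and its proposed similarity measure.

```python
# Hypothetical synonym table mapping surface terms to concept heads.
SYNONYMS = {"car": "vehicle", "automobile": "vehicle", "physician": "doctor"}

def concepts(text):
    """Map each term to its concept (synonym head) before comparison."""
    return {SYNONYMS.get(w, w) for w in text.lower().split()}

def concept_similarity(doc, cluster_docs):
    """Jaccard overlap between a document's concepts and a cluster's concepts."""
    c_doc = concepts(doc)
    c_cluster = set().union(*(concepts(d) for d in cluster_docs))
    return len(c_doc & c_cluster) / len(c_doc | c_cluster)

cluster = ["the physician examined the patient"]
print(concept_similarity("the doctor examined a patient", cluster))
```

    Mapping "physician" and "doctor" to one concept raises the similarity above what raw term overlap would give, which is the point of the synonym-based approach.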

  5. Agent-Based Medical Diagnosis Systems

    OpenAIRE

    Barna László Iantovics

    2012-01-01

    Medical diagnostics elaboration many times is a distributed and cooperative work, which involves more medical human specialists and different medical systems. Recent results described in the literature prove that medical diagnosis problems can be solved efficiently by large-scale medical multi-agent systems. Cooperative diagnosing of medical diagnosis problems by large-scale multi-agent systems makes the diagnoses elaborations easier and may increase the accuracy of elaborated diagnostics. Th...

  6. An Interactive Tool for Creating Multi-Agent Systems and Interactive Agent-based Games

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Pagliarini, Luigi

    2011-01-01

    Utilizing principles from parallel and distributed processing combined with inspiration from modular robotics, we developed the modular interactive tiles. As an educational tool, the modular interactive tiles facilitate the learning of multi-agent systems and interactive agent-based games. The...... modular and physical property of the tiles provides students with hands-on experience in exploring the theoretical aspects underlying multi-agent systems which often appear as challenging to students. By changing the representation of the cognitive challenging aspects of multi-agent systems education to a...

  8. Agent Community based Peer-to-Peer Information Retrieval

    Science.gov (United States)

    Mine, Tsunenori; Matsuno, Daisuke; Amamiya, Makoto

    This paper proposes an agent community based information retrieval method, which uses agent communities to manage and look up information related to users. An agent works as a delegate of its user and searches for information that the user wants by communicating with other agents. The communication between agents is carried out in a peer-to-peer computing architecture. In order to retrieve information related to a user query, an agent uses two histories: a query/retrieved document history (Q/RDH) and a query/sender agent history (Q/SAH). The former is a list of pairs of a query and the retrieved documents, where the queries were sent by the agent itself. The latter is a list of pairs of a query and its sender agent, and shows ``who sent what query to the agent''. This is useful for finding a new information source. Making use of the Q/SAH is expected to produce a collaborative filtering effect, which gradually creates virtual agent communities, where agents with the same interests stay together. Our hypothesis is that a virtual agent community reduces the communication load needed to perform a search. As an agent receives more queries, more links to new knowledge are acquired. From this behavior, a ``give and take'' (or positive feedback) effect seems to emerge for agents. We implemented this method with Multi-Agents Kodama, which has been developed in our laboratory, and conducted preliminary experiments to test the hypothesis. The empirical results showed that the method was much more efficient than a naive method employing 'broadcast' techniques only to look up a target agent.
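The two per-agent histories described above can be sketched as a small data structure. Everything here (the class name, the keyword-overlap scoring, the example data) is an illustrative assumption, not the paper's actual Kodama implementation:

```python
class Agent:
    def __init__(self, name):
        self.name = name
        self.qrdh = []  # Q/RDH: (query_terms, retrieved_docs) pairs
        self.qsah = []  # Q/SAH: (query_terms, sender_agent) pairs

    def record_retrieval(self, query_terms, docs):
        self.qrdh.append((frozenset(query_terms), docs))

    def record_sender(self, query_terms, sender):
        self.qsah.append((frozenset(query_terms), sender))

    def local_answer(self, query_terms):
        """Documents retrieved for past queries overlapping the new one (Q/RDH)."""
        q = set(query_terms)
        hits = []
        for past_q, docs in self.qrdh:
            if q & past_q:
                hits.extend(docs)
        return hits

    def candidate_agents(self, query_terms):
        """Agents that previously sent similar queries (Q/SAH) --
        likely new information sources for this topic."""
        q = set(query_terms)
        scored = {}
        for past_q, sender in self.qsah:
            overlap = len(q & past_q)
            if overlap:
                scored[sender] = max(scored.get(sender, 0), overlap)
        return sorted(scored, key=scored.get, reverse=True)

a = Agent("a1")
a.record_retrieval({"jazz", "history"}, ["doc1"])
a.record_sender({"jazz", "piano"}, "a2")
a.record_sender({"weather"}, "a3")
print(a.local_answer({"jazz"}))      # ['doc1']
print(a.candidate_agents({"jazz"}))  # ['a2']
```

Routing a query first through `candidate_agents` rather than broadcasting is what the abstract's "give and take" community effect amounts to in this toy form.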

  9. Classification and Target Group Selection Based Upon Frequent Patterns

    NARCIS (Netherlands)

    W.H.L.M. Pijls (Wim); R. Potharst (Rob)

    2000-01-01

    In this technical report, two new algorithms based upon frequent patterns are proposed. One algorithm is a classification method. The other one is an algorithm for target group selection. In both algorithms, first of all, the collection of frequent patterns in the training set is constr

  10. Time Series Classification by Class-Based Mahalanobis Distances

    CERN Document Server

    Prekopcsák, Zoltán

    2010-01-01

    To classify time series by nearest neighbor, we need to specify or learn a distance. We consider several variations of the Mahalanobis distance and the related Large Margin Nearest Neighbor Classification (LMNN). We find that the conventional Mahalanobis distance is counterproductive. However, both LMNN and the class-based diagonal Mahalanobis distance are competitive.
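A minimal numpy sketch of the class-based diagonal Mahalanobis idea in a 1-NN setting: estimate per-class feature variances and weight the distance to each training series by the inverse variances of that series' class. This is an illustration under those assumptions, not the authors' exact formulation:

```python
import numpy as np

def class_diag_mahalanobis_1nn(X_train, y_train, x):
    """1-NN where the distance to a training series uses the diagonal
    Mahalanobis metric of that series' class."""
    classes = np.unique(y_train)
    # Per-class inverse variances = diagonal Mahalanobis weights.
    inv_var = {c: 1.0 / (X_train[y_train == c].var(axis=0) + 1e-9)
               for c in classes}
    best, best_d = None, np.inf
    for xi, yi in zip(X_train, y_train):
        d = np.sum((x - xi) ** 2 * inv_var[yi])
        if d < best_d:
            best, best_d = yi, d
    return int(best)

X_train = np.array([[0.0, 0.0, 0.0, 0.0],
                    [0.2, 0.1, 0.0, 0.1],
                    [5.0, 5.0, 5.0, 5.0],
                    [5.1, 4.9, 5.0, 5.2]])
y_train = np.array([0, 0, 1, 1])
print(class_diag_mahalanobis_1nn(X_train, y_train,
                                 np.array([0.1, 0.0, 0.0, 0.1])))  # 0
```

The conventional (full, pooled) Mahalanobis distance the authors find counterproductive would instead invert one shared covariance matrix for all classes.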

  11. Classification-Based Method of Linear Multicriteria Optimization

    OpenAIRE

    Vassilev, Vassil; Genova, Krassimira; Vassileva, Mariyana; Narula, Subhash

    2003-01-01

    The paper describes a classification-based learning-oriented interactive method for solving linear multicriteria optimization problems. The method allows the decision makers to describe their preferences with greater flexibility, accuracy and reliability. The method is realized in an experimental software system supporting the solution of multicriteria optimization problems.

  12. Hierarchical Real-time Network Traffic Classification Based on ECOC

    Directory of Open Access Journals (Sweden)

    Yaou Zhao

    2013-09-01

    Full Text Available Classification of network traffic is basic and essential for many network research and management tasks. With the rapid development of peer-to-peer (P2P) applications using dynamic port disguising techniques and encryption to avoid detection, port-based and simple payload-based network traffic classification methods have diminished in effectiveness. Alternative methods based on statistics and machine learning have attracted researchers' attention in recent years. However, most of the proposed algorithms are off-line and usually use a single classifier. In this paper a new hierarchical real-time model is proposed, comprising a three-tuple (source IP, destination IP and destination port) look-up table (TT-LUT) part and a layered milestone part. The TT-LUT is used to quickly classify short flows, which need not pass through the layered milestone part, while the milestones in the layered milestone part classify the other flows in real time with real-time feature selection and statistics. Every milestone is an ECOC (Error-Correcting Output Codes) based model, used to improve classification performance. Experiments showed that the proposed model can improve real-time efficiency to 80%, and multi-class classification accuracy encouragingly to 91.4%, on datasets captured from the backbone router of our campus over a week.
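The ECOC idea behind each milestone can be illustrated with a toy multi-class decoder: each column of a code matrix defines a binary split of the classes, and a test point is decigned to the class whose code word is closest in Hamming distance. The 3-bit code matrix and the nearest-centroid base learner below are assumptions for illustration, not the paper's actual flow classifiers:

```python
import numpy as np

CODE = np.array([[0, 0, 1],    # code word for class 0
                 [0, 1, 0],    # code word for class 1
                 [1, 0, 0]])   # code word for class 2

def train_ecoc(X, y):
    """One binary learner (here: a pair of centroids) per code column."""
    learners = []
    for col in range(CODE.shape[1]):
        relabel = CODE[y, col]               # binary labels for this dichotomy
        c0 = X[relabel == 0].mean(axis=0)    # centroid of the "0" side
        c1 = X[relabel == 1].mean(axis=0)    # centroid of the "1" side
        learners.append((c0, c1))
    return learners

def predict_ecoc(learners, x):
    bits = np.array([int(np.linalg.norm(x - c1) < np.linalg.norm(x - c0))
                     for c0, c1 in learners])
    hamming = np.sum(CODE != bits, axis=1)   # decode against the code words
    return int(np.argmin(hamming))

X = np.array([[0, 0], [0.2, 0], [5, 0], [5, 0.2], [0, 5], [0.2, 5]], float)
y = np.array([0, 0, 1, 1, 2, 2])
learners = train_ecoc(X, y)
print([predict_ecoc(learners, x) for x in X])  # [0, 0, 1, 1, 2, 2]
```

With a longer, well-separated code matrix, a wrong bit from one noisy binary learner can still be corrected at decoding time, which is the performance benefit the abstract alludes to.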

  13. Optimizing Mining Association Rules for Artificial Immune System based Classification

    Directory of Open Access Journals (Sweden)

    SAMEER DIXIT

    2011-08-01

    Full Text Available The primary function of a biological immune system is to protect the body from foreign molecules known as antigens. It has great pattern recognition capability that may be used to distinguish between foreign cells entering the body (non-self or antigen) and the body's own cells (self). Immune systems have many characteristics such as uniqueness, autonomy, recognition of foreigners, distributed detection, and noise tolerance. Inspired by biological immune systems, Artificial Immune Systems have emerged during the last decade. Many researchers have taken them up to design and build immune-based models for a variety of application domains. Artificial Immune Systems can be defined as a computational paradigm that is inspired by theoretical immunology, observed immune functions, principles and mechanisms. Association rule mining is one of the most important and well researched techniques of data mining. The goal of association rules is to extract interesting correlations, frequent patterns, associations or causal structures among sets of items in transaction databases or other data repositories. Association rules are widely used in various areas such as inventory control, telecommunication networks, intelligent decision making, market analysis and risk management. Apriori is the most widely used algorithm for mining association rules. Other popular association rule mining algorithms are frequent pattern (FP) growth, Eclat, and dynamic itemset counting (DIC). Associative classification uses association rule mining in the rule discovery process to predict the class labels of the data. This technique has shown great promise over many other classification techniques. Associative classification also integrates the process of rule discovery and classification to build the classifier for the purpose of prediction. The main problem with the associative classification approach is the discovery of high-quality association rules in a very large space of

  14. TENSOR MODELING BASED FOR AIRBORNE LiDAR DATA CLASSIFICATION

    OpenAIRE

    Li, N.; Liu, C; Pfeifer, N; Yin, J. F.; Liao, Z.Y.; Zhou, Y.

    2016-01-01

    Feature selection and description is a key factor in classification of Earth observation data. In this paper a classification method based on tensor decomposition is proposed. First, multiple features are extracted from raw LiDAR point cloud, and raster LiDAR images are derived by accumulating features or the “raw” data attributes. Then, the feature rasters of LiDAR data are stored as a tensor, and tensor decomposition is used to select component features. This tensor representation could kee...

  15. Interaction profile-based protein classification of death domain

    Directory of Open Access Journals (Sweden)

    Pio Frederic

    2004-06-01

    Full Text Available Abstract Background The increasing number of protein sequences and 3D structures obtained from genomic initiatives is leading many of us to focus on proteomics, and to dedicate our experimental and computational efforts to the creation and analysis of information derived from 3D structure. In particular, the high-throughput generation of protein-protein interaction data from a few organisms makes such an approach very important towards understanding the molecular recognition that makes up the entire protein-protein interaction network. Since the generation of sequences and experimental protein-protein interactions increases faster than the 3D structure determination of protein complexes, there is tremendous interest in developing in silico methods that generate such structures for prediction and classification purposes. In this study we focused on classifying protein family members based on their protein-protein interaction distinctiveness. Structure-based classification of protein-protein interfaces has been described initially by Ponstingl et al. [1] and more recently by Valdar et al. [2] and Mintseris et al. [3], from complex structures that have been solved experimentally. However, little has been done on protein classification based on the prediction of protein-protein complexes obtained from homology modeling and docking simulation. Results We have developed an in silico classification system entitled HODOCO (Homology modeling, Docking and Classification Oracle), in which protein Residue Potential Interaction Profiles (RPIPS) are used to summarize protein-protein interaction characteristics. This system, applied to a dataset of 64 proteins of the death domain superfamily, was used to classify each member into its proper subfamily. Two classification methods were attempted, heuristic and support vector machine learning. Both methods were tested with 5-fold cross-validation. The heuristic approach yielded a 61% average accuracy, while the machine

  16. Pulse frequency classification based on BP neural network

    Institute of Scientific and Technical Information of China (English)

    WANG Rui; WANG Xu; YANG Dan; FU Rong

    2006-01-01

    In Traditional Chinese Medicine (TCM), analysis of the pulse frequency is an important parameter in clinical disease diagnosis. This article uses the eight major essentials of the pulse to identify pulse types through pulse frequency classification based on back-propagation neural networks (BPNN). The pulse frequency classes include slow pulse, moderate pulse, rapid pulse, etc. Feature parameters of the pulse frequency are analyzed in order to establish an identification system of pulse frequency features. From the pulse signal produced by the detecting system, period, frequency and other feature parameters are extracted and compared with the standard feature values of each pulse type. The results show that the identification rate exceeds 92.5%.

  17. Classification of CT-brain slices based on local histograms

    Science.gov (United States)

    Avrunin, Oleg G.; Tymkovych, Maksym Y.; Pavlov, Sergii V.; Timchik, Sergii V.; Kisała, Piotr; Orakbaev, Yerbol

    2015-12-01

    Neurosurgical intervention is a very complicated process. Modern operating procedures are based on data such as CT, MRI, etc. Automated analysis of these data is an important task for researchers. Some modern methods of brain-slice segmentation use additional data to process these images; classification can be used to obtain this information. To classify CT images of the brain, we suggest using local histograms and features extracted from them. The paper shows the process of feature extraction and classification of CT slices of the brain. The process of feature extraction is specialized for axial cross-sections of the brain. The work can be applied to medical neurosurgical systems.
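A generic sketch of local-histogram feature extraction of the kind described: split a 2-D slice into a grid of blocks and concatenate the per-block intensity histograms into one feature vector. The grid size, bin count and normalization are illustrative assumptions, not the authors' exact pipeline:

```python
import numpy as np

def local_histogram_features(img, grid=(4, 4), bins=8):
    """Blockwise intensity histograms of a 2-D slice, concatenated."""
    h, w = img.shape
    bh, bw = h // grid[0], w // grid[1]
    feats = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = img[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            hist, _ = np.histogram(block, bins=bins, range=(0, 256))
            feats.append(hist / hist.sum())  # normalize each block's histogram
    return np.concatenate(feats)

# Stand-in for one axial CT slice (8-bit intensities).
slice_ = np.random.default_rng(1).integers(0, 256, (64, 64))
f = local_histogram_features(slice_)
print(f.shape)  # (128,)
```

The resulting fixed-length vector (here 4 x 4 blocks x 8 bins = 128 values) can then be fed to any standard classifier to label the slice.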

  18. AN EFFICIENT CLASSIFICATION OF GENOMES BASED ON CLASSES AND SUBCLASSES

    Directory of Open Access Journals (Sweden)

    B.V. DHANDRA,

    2010-08-01

    Full Text Available The grass family has been the subject of intense research in the past. Reliable and fast classification and sub-classification of large sequences is rapidly gaining importance, as genome sequencing projects all over the world contribute large numbers of genome sequences to public gene banks. Sequence classification has therefore gained importance for predicting genome function, structure and evolutionary relationships, and also gives insight into the features associated with the biological role of the class. Thus, classification of functional genomes is an important and challenging task for both computer scientists and biologists. The presence of motifs in grass genome chains predicts the functional behaviour of the grass genome. The correlation between grass genome properties and their motifs is not always obvious, since more than one motif may exist within a genome chain. Due to the complexity of this association, most data mining algorithms are either inefficient or time consuming. Hence, in this paper we propose an efficient method based on classes and subclasses to reduce the time complexity of classifying large sequences from grass genome datasets. The proposed approach classifies the given dataset into classes with a conserved threshold, and then reclassifies them with a relaxed threshold into major classes. Experimental results indicate that the proposed method reduces the time complexity while keeping the classification accuracy at a level comparable with the general NNC algorithm.

  19. Online Network Traffic Classification Algorithm Based on RVM

    Directory of Open Access Journals (Sweden)

    Zhang Qunhui

    2013-06-01

    Full Text Available Compared with the Support Vector Machine (SVM), the Relevance Vector Machine (RVM) not only avoids the over-learning characteristic of the SVM, but also greatly reduces the amount of kernel function computation, and avoids SVM drawbacks such as weak sparsity, a large amount of calculation, the requirement that the kernel function satisfy Mercer's condition, and empirically determined parameters. We therefore propose a new online traffic classification algorithm based on the RVM. Through analysis of the basic principles of the RVM and its modeling steps, we use the trained RVM traffic classification model to identify network traffic in real time, together with "port number + DPI". When the RVM predicted probability falls in the query interval, we jointly use the "port number" and "DPI". Finally, detailed experimental validation shows that, compared with an SVM-based network traffic classification algorithm, this algorithm achieves online network traffic classification, and the classification prediction probability is greatly improved.

  20. Modeling collective emotions: a stochastic approach based on Brownian agents

    International Nuclear Information System (INIS)

    We develop an agent-based framework to model the emergence of collective emotions, which is applied to online communities. Agents' individual emotions are described by their valence and arousal. Using the concept of Brownian agents, these variables change according to a stochastic dynamics, which also considers the feedback from online communication. Agents generate emotional information, which is stored and distributed in a field modeling the online medium. This field affects the emotional states of agents in a non-linear manner. We derive conditions for the emergence of collective emotions, observable in a bimodal valence distribution. Depending on a saturated or a superlinear feedback between the information field and the agents' arousal, we further identify scenarios where collective emotions appear only once or in a repeated manner. The analytical results are illustrated by agent-based computer simulations. Our framework provides testable hypotheses about the emergence of collective emotions, which can be verified by data from online communities. (author)
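A heavily simplified caricature of such a dynamics can be simulated. The sketch below keeps only the valence variable and replaces the stored information field with an instantaneous mean-field term h; all parameter names and values are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

N, steps, dt = 200, 2000, 0.01
gamma, b, D = 1.0, 2.0, 0.1   # valence relaxation, field coupling, noise strength
v = rng.normal(0.0, 0.1, N)   # agent valences start near neutral

for _ in range(steps):
    h = np.tanh(v.mean())     # crude stand-in for the communication field
    # Euler-Maruyama step: relaxation + field feedback + noise
    v += dt * (-gamma * v + b * h) + np.sqrt(2 * D * dt) * rng.normal(size=N)
```

With the feedback coupling b larger than the relaxation rate gamma, the neutral state v = 0 is unstable: a single run polarizes toward a common positive or negative valence, and across many independent runs the final valences populate both modes, which is one way a bimodal valence distribution can arise.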

  1. Novel insights in agent-based complex automated negotiation

    CERN Document Server

    Lopez-Carmona, Miguel; Ito, Takayuki; Zhang, Minjie; Bai, Quan; Fujita, Katsuhide

    2014-01-01

    This book focuses on all aspects of complex automated negotiations, which are studied in the field of autonomous agents and multi-agent systems. The book consists of two parts: I: Agent-Based Complex Automated Negotiations, and II: Automated Negotiation Agents Competition. The chapters in Part I are extended versions of papers presented at the 2012 international workshop on Agent-Based Complex Automated Negotiation (ACAN), after peer review by three Program Committee members. Part II examines in detail ANAC 2012 (The Third Automated Negotiating Agents Competition), in which automated agents with different negotiation strategies, implemented by different developers, negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of universities and research institutes across the world, are evaluated in tournament style. The purpose of the competition is to steer the research in the area of bilate...

  2. The Gap of Current Agent Based Simulation Modeling Practices and Feasibility of a Generic Agent Based Simulation Model

    OpenAIRE

    Yim Ling Loo; Alicia Y.C. Tang; Azhana Ahmad

    2015-01-01

    Agent-based modeling has evolved into an established approach for modeling simulation systems that are used to understand and predict certain real-life scenarios in specific domains. Past research, being domain-specific, has led to repetitive building of new models from scratch and restricts replication and reuse because of the limitations of the models' descriptions. This paper presents a review of the gaps between domain-specific agent-based simulation modeling and the recent practices of agent-based...

  3. Torrent classification - Base of rational management of erosive regions

    Energy Technology Data Exchange (ETDEWEB)

    Gavrilovic, Zoran; Stefanovic, Milutin; Milovanovic, Irina; Cotric, Jelena; Milojevic, Mileta [Institute for the Development of Water Resources ' Jaroslav Cerni' , 11226 Beograd (Pinosava), Jaroslava Cernog 80 (Serbia)], E-mail: gavrilovicz@sbb.rs

    2008-11-01

    A complex methodology for torrents and erosion and the associated calculations was developed during the second half of the twentieth century in Serbia. It was the 'Erosion Potential Method'. One of the modules of that complex method was focused on torrent classification. The module enables the identification of hydrographic, climate and erosion characteristics. The method makes it possible for each torrent, regardless of its magnitude, to be simply and recognizably described by the 'Formula of torrentiality'. The above torrent classification is the base on which a set of optimisation calculations is developed for the required scope of erosion-control works and measures, the application of which enables the management of significantly larger erosion and torrential regions compared to the previous period. This paper will present the procedure and the method of torrent classification.

  4. Torrent classification - Base of rational management of erosive regions

    International Nuclear Information System (INIS)

    A complex methodology for torrents and erosion and the associated calculations was developed during the second half of the twentieth century in Serbia. It was the 'Erosion Potential Method'. One of the modules of that complex method was focused on torrent classification. The module enables the identification of hydrographic, climate and erosion characteristics. The method makes it possible for each torrent, regardless of its magnitude, to be simply and recognizably described by the 'Formula of torrentiality'. The above torrent classification is the base on which a set of optimisation calculations is developed for the required scope of erosion-control works and measures, the application of which enables the management of significantly larger erosion and torrential regions compared to the previous period. This paper will present the procedure and the method of torrent classification.

  5. Volatility clustering in agent based market models

    Science.gov (United States)

    Giardina, Irene; Bouchaud, Jean-Philippe

    2003-06-01

    We define and study a market model where agents have different strategies among which they can choose according to their relative profitability, with the possibility of not participating in the market. The price is updated according to the excess demand, and the wealth of the agents is properly accounted for. Only two parameters play a significant role: one describes the impact of trading on the price, and the other describes the propensity of agents to be trend following or contrarian. We observe three different regimes, depending on the value of these two parameters: an oscillating phase with bubbles and crashes, an intermittent phase and a stable ‘rational’ market phase. The statistics of price changes in the intermittent phase resembles that of real price changes, with small linear correlations, fat tails and long-range volatility clustering. We discuss how the time dependence of these two parameters spontaneously drives the system into the intermittent region.
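The two central ingredients (price updated by excess demand, trend-following versus contrarian agents) can be caricatured in a few lines. This sketch omits the wealth accounting, strategy switching and non-participation of the actual model, so it is an illustrative assumption, not the authors' simulation:

```python
import numpy as np

rng = np.random.default_rng(2)

N, T = 100, 500
lam = 0.01                       # impact of trading on the price
g = rng.choice([1.0, -1.0], N)   # +1 trend followers, -1 contrarians
log_p = np.zeros(T)
r_prev = 0.0

for t in range(1, T):
    noise = rng.normal(0, 1, N)
    # Each agent buys or sells one unit, biased by the last return
    # according to its trend-following/contrarian disposition.
    demand = np.sum(np.sign(g * r_prev + noise))
    r = lam * demand             # return proportional to excess demand
    log_p[t] = log_p[t - 1] + r
    r_prev = r
```

Varying lam and the mix of trend followers and contrarians is the toy analogue of moving through the oscillating, intermittent and stable regimes the abstract describes.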

  6. Multi-agent based cooperative search in combinatorial optimisation

    OpenAIRE

    Martin, Simon

    2013-01-01

    Cooperative search provides a class of strategies to design more effective search methodologies by combining (meta-) heuristics for solving combinatorial optimisation problems. This area has been little explored in operational research. This thesis proposes a general agent-based distributed framework where each agent implements a (meta-) heuristic. An agent continuously adapts itself during the search process using a cooperation protocol based on reinforcement learning and pattern matching. G...

  7. Multi-Agent Reinforcement Learning Algorithm Based on Action Prediction

    Institute of Scientific and Technical Information of China (English)

    TONG Liang; LU Ji-lian

    2006-01-01

    Multi-agent reinforcement learning algorithms are studied. A prediction-based multi-agent reinforcement learning algorithm is presented for a multi-robot cooperation task. A multi-robot cooperation experiment based on a multi-agent inverted pendulum was conducted to test the efficiency of the new algorithm, and the experimental results show that the new algorithm can achieve the cooperation strategy much faster than the primitive multi-agent reinforcement learning algorithm.

  8. Social scientists, qualitative data, and agent-based modeling

    OpenAIRE

    Seidl, Roman

    2014-01-01

    Empirical data obtained with social science methods can be useful for informing agent-based models, for instance, to fix the profile of heterogeneous agents or to specify behavioral rules. For the latter in particular, qualitative methods that investigate the details of individual decision processes are an option. In this paper, I highlight the challenges for social scientists who investigate social/psychological phenomena but at the same time have to consider the properties of agent-based si...

  9. Ad Hoc Protocols Via Multi Agent Based Tools

    OpenAIRE

    Ali Bazghandi; Mehdi Bazghandi

    2011-01-01

    The purpose of this paper is to investigate the behavior of Ad Hoc protocols in agent-based simulation environments. First we give a brief introduction to agents and Ad Hoc networks, and introduce some agent-based simulation tools such as NS-2. Then we focus on two protocols: Ad Hoc On-demand Multipath Distance Vector (AODV) and Destination Sequenced Distance Vector (DSDV). At the end, we present simulation results and discuss the reasons behind them.

  10. Container Terminal Operations Modeling through Multi agent based Simulation

    OpenAIRE

    Ayub, Yasir; Faruki, Usman

    2009-01-01

    This thesis aims to propose a multi-agent based hierarchical model for the operations of container terminals. We have divided our model into four key agents that are involved in each sub-process. The proposed agent allocation policies are recommended for different situations that may occur at a container terminal. A software prototype has been developed which implements the hierarchical model. This web-based application is used to simulate the various processes involved in the following ...

  11. Fast rule-based bioactivity prediction using associative classification mining

    Directory of Open Access Journals (Sweden)

    Yu Pulan

    2012-11-01

    Full Text Available Abstract Relating chemical features to bioactivities is critical in molecular design and is used extensively in the lead discovery and optimization process. A variety of techniques from statistics, data mining and machine learning have been applied to this process. In this study, we utilize a collection of methods, called associative classification mining (ACM), which are popular in the data mining community but so far have not been applied widely in cheminformatics. More specifically, classification based on predictive association rules (CPAR), classification based on multiple association rules (CMAR) and classification based on association rules (CBA) are employed on three datasets using various descriptor sets. Experimental evaluations on anti-tuberculosis (antiTB), mutagenicity and hERG (the human Ether-a-go-go-Related Gene) blocker datasets show that these three methods are computationally scalable and appropriate for high speed mining. Additionally, they provide comparable accuracy and efficiency to the commonly used Bayesian and support vector machine (SVM) methods, and produce highly interpretable models.
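The core of associative classification can be shown in miniature: mine class association rules (itemset implies class) above support and confidence thresholds, rank them, and predict with the first matching rule. This is a generic CBA-style sketch; the thresholds, ranking and toy "molecular feature" data are assumptions for illustration, not the paper's setup:

```python
from itertools import combinations

def mine_car_rules(transactions, labels, min_sup=2, min_conf=0.7):
    """Mine class association rules (itemset -> class), CBA-style."""
    rules = []
    items = sorted({i for t in transactions for i in t})
    for k in (1, 2):  # itemsets of size 1 and 2 for this toy example
        for itemset in combinations(items, k):
            s = set(itemset)
            covered = [lab for t, lab in zip(transactions, labels) if s <= t]
            if len(covered) < min_sup:
                continue
            for cls in set(covered):
                conf = covered.count(cls) / len(covered)
                if conf >= min_conf:
                    rules.append((s, cls, conf, len(covered)))
    # Rank rules: higher confidence first, then higher support.
    rules.sort(key=lambda r: (-r[2], -r[3]))
    return rules

def classify(rules, transaction, default):
    """Predict with the first (highest-ranked) rule that matches."""
    for itemset, cls, _, _ in rules:
        if itemset <= transaction:
            return cls
    return default

# Toy "molecules" described by feature sets, with activity labels.
transactions = [{"ring", "halogen"}, {"ring", "halogen", "amine"},
                {"ring"}, {"amine"}]
labels = ["active", "active", "inactive", "inactive"]
rules = mine_car_rules(transactions, labels)
print(classify(rules, {"halogen", "nitro"}, "inactive"))  # active
```

The mined rules are directly readable ("halogen implies active" here), which is the interpretability advantage the abstract claims over SVM-style models.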

  12. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for achieving intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it handles simple emotions efficiently.

  13. MOBILE BUSINESS APPROACH BASED ON MOBILE AGENT

    Directory of Open Access Journals (Sweden)

    Ahmed Aloui

    2011-01-01

    Full Text Available Users today want the opportunity to run (or manage) a business at any time and anywhere via their mobile devices. This paper proposes an architecture with mobile agents for mobile business (m-business). M-business has appeared as a promising approach to drive the wave following that of electronic business (e-business). Most e-business [9] applications use the traditional client/server model, in which a commercial operation generally requires a stable communication link to be established between the customer and the server, and this traditional client/server approach [8] constitutes an obstacle to the development of m-business applications. The proposed architecture introduces several advantages. First, it allows consumers to manage their commercial business from various types of mobile devices (phones, PDAs, etc. ....) at any time and anywhere. Second, it minimizes the customer's waiting time and the quantity of transferred information. Third, it addresses the problem of time-limited and expensive connections for mobile users. Mobile agents are used at a single level: the research agent. Every research mobile agent visits the target server site of the application to collect information for its client, which allows it to interact locally with a server and so reduce the traffic on the network by transmitting only the useful data.

  14. Agent Types and Structures based on Analysis of Building Design

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    Based on an analysis of building design, an initial division of design agents into five classes: information collectors, generators, modifiers and evaluators is presented.

  15. Collective Machine Learning: Team Learning and Classification in Multi-Agent Systems

    Science.gov (United States)

    Gifford, Christopher M.

    2009-01-01

    This dissertation focuses on the collaboration of multiple heterogeneous, intelligent agents (hardware or software) which collaborate to learn a task and are capable of sharing knowledge. The concept of collaborative learning in multi-agent and multi-robot systems is largely understudied, and represents an area where further research is needed to…

  16. Detection/classification/quantification of chemical agents using an array of surface acoustic wave (SAW) devices

    Science.gov (United States)

    Milner, G. Martin

    2005-05-01

    ChemSentry is a portable system used to detect, identify, and quantify chemical warfare (CW) agents. Electro chemical (EC) cell sensor technology is used for blood agents and an array of surface acoustic wave (SAW) sensors is used for nerve and blister agents. The combination of the EC cell and the SAW array provides sufficient sensor information to detect, classify and quantify all CW agents of concern using smaller, lighter, lower cost units. Initial development of the SAW array and processing was a key challenge for ChemSentry requiring several years of fundamental testing of polymers and coating methods to finalize the sensor array design in 2001. Following the finalization of the SAW array, nearly three (3) years of intensive testing in both laboratory and field environments were required in order to gather sufficient data to fully understand the response characteristics. Virtually unbounded permutations of agent characteristics and environmental characteristics must be considered in order to operate against all agents and all environments of interest to the U.S. military and other potential users of ChemSentry. The resulting signal processing design matched to this extensive body of measured data (over 8,000 agent challenges and 10,000 hours of ambient data) is considered to be a significant advance in state-of-the-art for CW agent detection.

  17. Classification of Regional Ionospheric Disturbances Based on Support Vector Machines

    Science.gov (United States)

    Begüm Terzi, Merve; Arikan, Feza; Arikan, Orhan; Karatay, Secil

    2016-07-01

    The ionosphere is an anisotropic, inhomogeneous, time-varying and spatio-temporally dispersive medium whose parameters can almost always be estimated only by indirect measurements. Geomagnetic, gravitational, solar or seismic activities cause variations of the ionosphere at various spatial and temporal scales. This complex spatio-temporal variability is challenging to identify due to the extensive range of period, duration, amplitude and frequency of disturbances. Since geomagnetic and solar indices such as Disturbance storm time (Dst), F10.7 solar flux, Sun Spot Number (SSN), Auroral Electrojet (AE), Kp and W-index provide information about variability on a global scale, identification and classification of regional disturbances poses a challenge. The main aim of this study is to classify the regional effects of global geomagnetic storms and rank them according to their risk levels. For this purpose, Total Electron Content (TEC) estimated from GPS receivers, which is one of the major parameters of the ionosphere, is used to model the regional and local variability that differs from global activity, along with solar and geomagnetic indices. In this work, for the automated classification of regional disturbances, a classification technique based on a robust machine learning method that has found widespread use, the Support Vector Machine (SVM), is proposed. SVM is a supervised learning model used for classification, with an associated learning algorithm that analyzes the data and recognizes patterns. In addition to performing linear classification, SVM can efficiently perform nonlinear classification by embedding data into higher dimensional feature spaces. Performance of the developed classification technique is demonstrated for the midlatitude ionosphere over Anatolia using TEC estimates generated from GPS data provided by the Turkish National Permanent GPS Network (TNPGN-Active) for the solar maximum year of 2011. As a result of implementing the developed classification

  18. Upper limit for context based crop classification

    DEFF Research Database (Denmark)

    Midtiby, Henrik; Åstrand, Björn; Jørgensen, Rasmus Nyholm;

    2012-01-01

    Mechanical in-row weed control of crops like sugar beet requires precise knowledge of where individual crop plants are located. If crop plants are placed in a known pattern, information about plant locations can be used to discriminate between crop and weed plants. The success rate of such a classifier...... depends on the weed pressure, the position uncertainty of the crop plants and the crop upgrowth percentage. The first two measures can be combined into a normalized weed pressure, λ. Given the normalized weed pressure, an upper bound on the positive predictive value is shown to be 1/(1+λ). If the...... weed pressure is ρ = 400/m² and the crop position uncertainty is σ_x = 0.0148 m along the row and σ_y = 0.0108 m perpendicular to the row, the normalized weed pressure is λ ≈ 0.40; the upper bound on the positive predictive value is then 0.71. This means that when a position based...
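
The quoted numbers can be reproduced with a few lines of arithmetic. The normalization λ = 2π·σ_x·σ_y·ρ is an assumption inferred from the values in the abstract (it matches them to two decimals), not something stated in the truncated text itself.

```python
import math

rho = 400.0        # weed pressure, weeds per m^2 (from the abstract)
sigma_x = 0.0148   # m, crop position uncertainty along the row
sigma_y = 0.0108   # m, crop position uncertainty perpendicular to the row

# Assumed normalization: area of the crop uncertainty ellipse times weed density.
lam = 2 * math.pi * sigma_x * sigma_y * rho
ppv_bound = 1.0 / (1.0 + lam)   # upper bound on positive predictive value

print(round(lam, 2), round(ppv_bound, 2))  # 0.4 0.71
```

This matches the abstract: λ ≈ 0.40 gives an upper bound of about 0.71, i.e. at best roughly 71% of plants classified as crop really are crop under that weed pressure.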

  19. Object-Based Classification and Change Detection of Hokkaido, Japan

    Science.gov (United States)

    Park, J. G.; Harada, I.; Kwak, Y.

    2016-06-01

    Topography and geology are factors that characterize the distribution of natural vegetation. Topographic contour particularly influences the living conditions of plants, such as soil moisture, sunlight, and windiness. Vegetation associations having similar characteristics are present in locations having similar topographic conditions, unless natural disturbances such as landslides and forest fires or artificial disturbances such as deforestation and man-made plantation bring about changes in such conditions. We developed a vegetation map of Japan using an object-based segmentation approach with topographic information (elevation, slope, slope direction) that is closely related to the distribution of vegetation. The results showed that object-based classification is more effective than pixel-based classification for producing a vegetation map.

  20. Agent-Based Decentralized Control Method for Islanded Microgrids

    DEFF Research Database (Denmark)

    Li, Qiang; Chen, Feixiong; Chen, Minyou;

    2016-01-01

    In this paper, an agent-based decentralized control model for islanded microgrids is proposed, which consists of a two-layer control structure. The bottom layer is the electrical distribution microgrid, while the top layer is the communication network composed of agents. An agent is regarded...... is processed according to control laws, agents adjust the production of distributed generators to which they connect. The main contributions of this paper are (i) an agent-based model for decentralized secondary control is introduced and the rules to establish the communication network are given; (ii...... agents use the proposed control laws. Finally, the simulation results show that frequency and voltage fluctuations are small and meet the requirements....

  1. Multi-Agent Based PGP Architecture

    Directory of Open Access Journals (Sweden)

    Babak Nouri-Moghaddam

    2014-03-01

    Full Text Available Pretty Good Privacy (PGP) is a package for securing email and file communications. It is an open-source package, which is available online for users. PGP provides some of the most important security services, namely authentication, confidentiality, and integrity. PGP also applies compression techniques for compressing messages and reducing their size, and it uses the Radix-64 encoding/decoding scheme for email compatibility. The classic PGP is formed by independent components and uses a hierarchical structure in which each component is responsible for providing one of the services or features in PGP. This hierarchical structure forces all the components, even the independent ones, to be executed in a linear way. Because of this structure, each component waits idle for a long time. As a result, the classic PGP has low performance and high execution time. By studying this structure, we find that we can redesign the architecture using multi-agent systems to eliminate bottlenecks. With this new design, we can achieve higher performance and faster execution than the classic PGP. In the proposed scheme, each agent handles one of PGP's components, and in the implementation semaphores are used to coordinate the agents. With this technique, the agents run concurrently; as a result the idle time decreases and the proposed scheme achieves higher performance and lower execution time than the classic PGP. The experimental results show that our scheme runs 30% faster than the classic PGP with different configurations of computer hardware.
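
The semaphore-coordinated concurrency the record describes can be sketched as a tiny two-agent pipeline. The agent names and the choice of stages (compression, then Radix-64 encoding, two real PGP components) are illustrative; actual PGP encryption is omitted entirely.

```python
import base64
import threading
import zlib

buf = []                           # hand-off buffer between the two agents
filled = threading.Semaphore(0)    # counts items ready for the encoding agent
done = []

def compression_agent(messages):
    # One agent per PGP component: this one compresses each message.
    for m in messages:
        buf.append(zlib.compress(m))
        filled.release()           # signal: one item is ready downstream

def encoding_agent(n):
    # Runs concurrently with the compressor instead of waiting for it to finish.
    for _ in range(n):
        filled.acquire()           # block until the compressor produced an item
        done.append(base64.b64encode(buf.pop(0)))  # PGP's Radix-64 step

msgs = [b"hello world" * 10, b"agent based pgp"]
t1 = threading.Thread(target=compression_agent, args=(msgs,))
t2 = threading.Thread(target=encoding_agent, args=(len(msgs),))
t1.start(); t2.start(); t1.join(); t2.join()

roundtrip = [zlib.decompress(base64.b64decode(d)) for d in done]
print(roundtrip == msgs)  # True
```

The semaphore is what removes the idle waiting the abstract complains about: the encoder starts on message 1 while the compressor is already working on message 2.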

  2. Networks based on collisions among mobile agents

    CERN Document Server

    González, Marta C.; Lind, Pedro G.; Herrmann, Hans J.

    2006-01-01

    We investigate in detail a recent model of colliding mobile agents [Phys. Rev. Lett. 96, 088702], used as an alternative approach to construct evolving networks of interactions formed by the collisions governed by suitable dynamical rules. The system of mobile agents evolves towards a quasi-stationary state which is, apart from small fluctuations, well characterized by the density of the system and the residence time of the agents. The residence time defines a collision rate, and by varying the collision rate the system percolates at a critical value, with the emergence of a giant cluster whose critical exponents are the ones of two-dimensional percolation. Further, the degree and clustering coefficient distributions and the average path length show that the network associated with such a system presents non-trivial features which, depending on the collision rule, enables one not only to recover the main properties of standard networks, such as exponential, random and scale-free networks, but also to obtain other ...

  3. A multi-agent based Tourism Kiosk on Internet

    OpenAIRE

    Yeung, CSK; Tung, PF; Yen, JCH

    1998-01-01

    We discuss the implementation of a multi-agent based Tourism Kiosk for the Hong Kong tourism industry on the Internet. This system allows users to retrieve the most up-to-date information about Hong Kong through any Java-enabled Web browser. The complete system consists of a set of software agents that handle various information categories, such as hotels, shopping centres, and cinemas. The Knowledge Query and Manipulation Language (KQML) was selected as the agent communication language to d...

  4. Multi Agent System Based Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Liu, Leo;

    2012-01-01

    In this paper, a multi-agent system based wide area protection scheme is proposed in order to prevent long term voltage instability induced cascading events. The distributed relays and controllers work as a device agent which not only executes the normal function automatically but also can be mod...... effectiveness of proposed protection strategy. The simulation results indicate that the proposed multi agent control system can effectively coordinate the distributed relays and controllers to prevent the long term voltage instability induced cascading events....

  5. Agent-based transportation planning compared with scheduling heuristics

    OpenAIRE

    Mes, Martijn R.K.; van der Heijden, T.G.C.; van Harten, W.H.

    2004-01-01

    Here we consider the problem of dynamically assigning vehicles to transportation orders that have different time windows and should be handled in real time. We introduce a new agent-based system for the planning and scheduling of these transportation networks. Intelligent vehicle agents schedule their own routes. They interact with job agents, who strive for minimum transportation costs, using a Vickrey auction for each incoming order. We use simulation to compare the on-time delivery percenta...
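
The per-order Vickrey auction mentioned in this record can be sketched in a few lines. The vehicle names and bid values are invented; bids here stand for each vehicle agent's marginal cost of inserting the order into its route, which the real system would compute from the schedule.

```python
def vickrey_award(bids):
    """Second-price (Vickrey) auction for one incoming transportation order.

    bids: dict mapping vehicle_id -> marginal insertion cost (the bid).
    Returns (winner, price): the lowest bidder wins, but the price is the
    second-lowest bid, which makes truthful bidding the dominant strategy.
    """
    ranked = sorted(bids, key=bids.get)
    winner = ranked[0]
    price = bids[ranked[1]] if len(ranked) > 1 else bids[winner]
    return winner, price

# Hypothetical bids from three vehicle agents for one order.
bids = {"truck_A": 120.0, "truck_B": 95.0, "truck_C": 140.0}
winner, price = vickrey_award(bids)
print(winner, price)  # truck_B 120.0
```

Because the winner's payment does not depend on its own bid, each vehicle agent can simply bid its true marginal cost, which is what makes this auction attractive for decentralized planning.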

  6. Metagenome fragment classification based on multiple motif-occurrence profiles

    Directory of Open Access Journals (Sweden)

    Naoki Matsushita

    2014-09-01

    Full Text Available A vast amount of metagenomic data has been obtained by extracting multiple genomes simultaneously from microbial communities, including genomes from uncultivable microbes. By analyzing these metagenomic data, novel microbes are discovered and new microbial functions are elucidated. The first step in analyzing these data is sequenced-read classification into reference genomes from which each read can be derived. The Naïve Bayes Classifier is a method for this classification. To identify the derivation of the reads, this method calculates a score based on the occurrence of a DNA sequence motif in each reference genome. However, large differences in the sizes of the reference genomes can bias the scoring of the reads. This bias might cause erroneous classification and decrease the classification accuracy. To address this issue, we have updated the Naïve Bayes Classifier method using multiple sets of occurrence profiles for each reference genome by normalizing the genome sizes, dividing each genome sequence into a set of subsequences of similar length and generating profiles for each subsequence. This multiple profile strategy improves the accuracy of the results generated by the Naïve Bayes Classifier method for simulated and Sargasso Sea datasets.
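
The multiple-profile idea in this record, normalizing for genome size by scoring a read against one Naïve Bayes profile per fixed-length subsequence, can be sketched on toy data. The 2-mer "motifs", chunk length, Laplace smoothing, and the tiny genomes are all assumptions for illustration; real work uses longer motifs and real reference genomes.

```python
from collections import Counter
from math import log

K = 2  # motif (k-mer) length; real analyses use longer DNA motifs

def profile(seq):
    # Laplace-smoothed log-probabilities of each k-mer in one subsequence.
    counts = Counter(seq[i:i + K] for i in range(len(seq) - K + 1))
    total = sum(counts.values())
    kmers = [a + b for a in "ACGT" for b in "ACGT"]
    return {k: log((counts[k] + 1) / (total + len(kmers))) for k in kmers}

def profiles_for_genome(genome, chunk=40):
    # Key step: split the genome into equal-length subsequences and build one
    # occurrence profile per chunk, so genome size no longer biases the score.
    return [profile(genome[i:i + chunk])
            for i in range(0, len(genome) - chunk + 1, chunk)]

def score(read, profs):
    # Naive Bayes log-score of the read under each profile; best chunk wins.
    return max(sum(p[read[i:i + K]] for i in range(len(read) - K + 1))
               for p in profs)

genomes = {"G1": "AT" * 40, "G2": "GC" * 40 + "AT" * 200}  # G2 is much larger
profs = {name: profiles_for_genome(g) for name, g in genomes.items()}
read = "GCGCGCGC"
best = max(profs, key=lambda name: score(read, profs[name]))
print(best)  # G2
```

A single whole-genome profile would dilute G2's GC-rich region inside its AT-rich bulk; the per-chunk profiles let the GC read still match G2 strongly, which is exactly the bias the multiple-profile strategy addresses.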

  7. Comparison Of Power Quality Disturbances Classification Based On Neural Network

    Directory of Open Access Journals (Sweden)

    Nway Nway Kyaw Win

    2015-07-01

    Full Text Available Power quality disturbances (PQDs) cause serious problems for the reliability, safety and economy of power system networks. In order to improve electric power quality, the detection and classification of PQDs must be made by the type of transient fault. A methodology based on software analysis of the wavelet transform with the multiresolution analysis (MRA) algorithm and two feedforward neural networks, a probabilistic neural network (PNN) and a multilayer feedforward neural network (MLFF), for the automatic classification of eight types of PQ signals (flicker, harmonics, sag, swell, impulse, fluctuation, notch and oscillatory) will be presented. The wavelet family Db4 is chosen in this system to calculate the values of detailed energy distributions as input features for classification, because it can perform well in detecting and localizing various types of PQ disturbances. The classifiers classify and identify the disturbance type according to the energy distribution. The results show that the PNN can analyze different power disturbance types efficiently. Therefore, it can be seen that the PNN has better classification accuracy than the MLFF.

  8. Tutorial on agent-based modeling and simulation. Part 2 : how to model with agents.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2006-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of interacting autonomous agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to do research. Some have gone so far as to contend that ABMS is a new way of doing science. Computational advances make possible a growing number of agent-based applications across many fields. Applications range from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, from modeling the growth and decline of ancient civilizations to modeling the complexities of the human immune system, and many more. This tutorial describes the foundations of ABMS, identifies ABMS toolkits and development methods illustrated through a supply chain example, and provides thoughts on the appropriate contexts for ABMS versus conventional modeling techniques.

  9. An AERONET-based aerosol classification using the Mahalanobis distance

    Science.gov (United States)

    Hamill, Patrick; Giordano, Marco; Ward, Carolyne; Giles, David; Holben, Brent

    2016-09-01

    We present an aerosol classification based on AERONET aerosol data from 1993 to 2012. We used the AERONET Level 2.0 almucantar aerosol retrieval products to define several reference aerosol clusters which are characteristic of the following general aerosol types: Urban-Industrial, Biomass Burning, Mixed Aerosol, Dust, and Maritime. The classification of a particular aerosol observation as one of these aerosol types is determined by its five-dimensional Mahalanobis distance to each reference cluster. We have calculated the fractional aerosol type distribution at 190 AERONET sites, as well as the monthly variation in aerosol type at those locations. The results are presented on a global map and individually in the supplementary material. Our aerosol typing is based on recognizing that different geographic regions exhibit characteristic aerosol types. To generate reference clusters we only keep data points that lie within a Mahalanobis distance of 2 from the centroid. Our aerosol characterization is based on the AERONET retrieved quantities; therefore, it does not include low optical depth values. The analysis is based on "point sources" (the AERONET sites) rather than globally distributed values. The classifications obtained will be useful in interpreting aerosol retrievals from satellite-borne instruments.
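
The two ingredients of this scheme, pruning each reference cluster to points within Mahalanobis distance 2 of its centroid and then classifying an observation by its smallest Mahalanobis distance to a cluster, can be sketched with NumPy. Two synthetic 2-D clusters stand in for the paper's five-dimensional AERONET clusters; the cluster names and data are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2-D stand-ins for two of the paper's aerosol reference clusters.
clusters = {
    "Dust":    rng.normal([1.0, 0.2], 0.1, size=(200, 2)),
    "Biomass": rng.normal([0.2, 1.5], 0.1, size=(200, 2)),
}

stats = {}
for name, pts in clusters.items():
    mu = pts.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(pts, rowvar=False))
    # As in the paper: drop points farther than Mahalanobis distance 2
    # from the centroid before fixing the reference cluster statistics.
    d = np.sqrt(np.einsum("ij,jk,ik->i", pts - mu, inv_cov, pts - mu))
    kept = pts[d <= 2]
    stats[name] = (kept.mean(axis=0),
                   np.linalg.inv(np.cov(kept, rowvar=False)))

def classify(x):
    # Assign the observation to the cluster with the smallest Mahalanobis distance.
    def mdist(name):
        mu, icov = stats[name]
        diff = np.asarray(x, dtype=float) - mu
        return float(np.sqrt(diff @ icov @ diff))
    return min(stats, key=mdist)

print(classify([0.95, 0.25]))  # Dust
print(classify([0.25, 1.40]))  # Biomass
```

Unlike Euclidean distance, the Mahalanobis distance accounts for each cluster's own covariance, so elongated or correlated clusters are handled correctly.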

  10. Medical Processes Agent-Based Critiquing System

    Czech Academy of Sciences Publication Activity Database

    Bošanský, Branislav

    Praha : Ústav informatiky AV ČR, v. v. i. & MATFYZPRESS, 2009 - (Kuželová, D.), 5-11 ISBN 978-80-7378-087-6. [Doktorandské dny 2009 Ústavu informatiky AV ČR, v. v. i.. Jizerka (CZ), 21.09.2009-23.09.2009] R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : medical processes * multi-agent systems * decision support systems Subject RIV: IN - Informatics, Computer Science

  11. Towards an agent-oriented programming language based on Scala

    Science.gov (United States)

    Mitrović, Dejan; Ivanović, Mirjana; Budimac, Zoran

    2012-09-01

    Scala and its multi-threaded model based on actors represent an excellent framework for developing purely reactive agents. This paper presents early research on extending Scala with declarative programming constructs, which would result in a new agent-oriented programming language suitable for developing more advanced BDI agent architectures. The main advantage of the new language over many other existing solutions for programming BDI agents is a natural and straightforward integration of imperative and declarative programming constructs, fitted under a single development framework.

  12. Migration control for mobile agents based on passport and visa

    OpenAIRE

    Guan, Su; T. Wang; Ong, SH

    2003-01-01

    Research on mobile agents has attracted much attention as this paradigm has demonstrated great potential for the next-generation e-commerce. Proper solutions to security-related problems become key factors in the successful deployment of mobile agents in e-commerce systems. We propose the use of passport and visa (P/V) for securing mobile agent migration across communities based on the SAFER e-commerce framework. P/V not only serves as up-to-date digital credentials for agent-host authentica...

  13. Active Dictionary Learning in Sparse Representation Based Classification

    OpenAIRE

    Xu, Jin; He, Haibo; Man, Hong

    2014-01-01

    Sparse representation, which uses dictionary atoms to reconstruct input vectors, has been studied intensively in recent years. A proper dictionary is key to the success of sparse representation. In this paper, an active dictionary learning (ADL) method is introduced, in which classification error and reconstruction error are considered as the active learning criteria in the selection of atoms for dictionary construction. The learned dictionaries are calculated in sparse representation based...

  14. Understanding Acupuncture Based on ZHENG Classification from System Perspective

    OpenAIRE

    Junwei Fang; Ningning Zheng; Yang Wang; Huijuan Cao; Shujun Sun; Jianye Dai; Qianhua Li; Yongyu Zhang

    2013-01-01

    Acupuncture is an efficient therapy method that originated in ancient China; its study based on ZHENG classification is a systematic approach to understanding its complexity. The system perspective contributes to understanding the essence of phenomena and, with the coming of the systems biology era, broader technology platforms such as omics technologies were established for the objective study of Traditional Chinese Medicine (TCM). Omics technologies could dynamically determine molecular c...

  15. BCI Signal Classification using a Riemannian-based kernel

    OpenAIRE

    Barachant, Alexandre; Bonnet, Stéphane; Congedo, Marco; Jutten, Christian

    2012-01-01

    The use of spatial covariance matrix as feature is investigated for motor imagery EEG-based classification. A new kernel is derived by establishing a connection with the Riemannian geometry of symmetric positive definite matrices. Different kernels are tested, in combination with support vector machines, on a past BCI competition dataset. We demonstrate that this new approach outperforms significantly state of the art results without the need for spatial filtering.

  16. DATA MINING BASED TECHNIQUE FOR IDS ALERT CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    Hany Nashat Gabra

    2015-06-01

    Full Text Available Intrusion detection systems (IDSs have become a widely used measure for security systems. The main problem for such systems is the irrelevant alerts. We propose a data mining based method for classification to distinguish serious and irrelevant alerts with a performance of 99.9%, which is better in comparison with the other recent data mining methods that achieved 97%. A ranked alerts list is also created according to the alert’s importance to minimize human interventions.

  17. DATA MINING BASED TECHNIQUE FOR IDS ALERT CLASSIFICATION

    OpenAIRE

    Hany Nashat Gabra; Bahaa-Eldin, Ayman M.; Hoda Korashy Mohammed

    2015-01-01

    Intrusion detection systems (IDSs) have become a widely used measure for security systems. The main problem for such systems is the irrelevant alerts. We propose a data mining based method for classification to distinguish serious and irrelevant alerts with a performance of 99.9%, which is better in comparison with the other recent data mining methods that achieved 97%. A ranked alerts list is also created according to the alert’s importance to minimize human interventions.

  18. Data Mining Based Technique for IDS Alerts Classification

    OpenAIRE

    Gabra, Hany N.; Bahaa-Eldin, Ayman M.; Mohamed, Hoda K.

    2012-01-01

    Intrusion detection systems (IDSs) have become a widely used measure for security systems. The main problem for those systems is the irrelevant alerts in their results. We propose a data mining based method for classification to distinguish serious alerts from irrelevant ones, with a performance of 99.9%, which is better in comparison with other recent data mining methods that have reached a performance of 97%. A ranked alert list is also created according to alert importance to...

  19. Classification of objects in images based on various object representations

    OpenAIRE

    Cichocki, Radoslaw

    2006-01-01

    Object recognition is a hugely researched domain that employs methods derived from mathematics, physics and biology. This thesis combines approaches for object classification based on two features, color and shape. Color is represented by color histograms and shape by skeletal graphs. Four hybrids are proposed which combine those approaches in different manners, and the hybrids are then tested to find out which of them gives the best results.

  20. A Cluster Based Approach for Classification of Web Results

    OpenAIRE

    Apeksha Khabia; M. B. Chandak

    2014-01-01

    Nowadays a significant amount of information from the web is present in the form of text, e.g., reviews, forum postings, blogs, news articles, email messages, and web pages. It becomes difficult to classify documents into predefined categories as the number of documents grows. Clustering is the classification of data into clusters, so that the data in each cluster share some common trait, often proximity according to some defined measure. The underlying distribution of a data set can somewhat be depicted base...

  1. Complexity in Simplicity: Flexible Agent-based State Space Exploration

    DEFF Research Database (Denmark)

    Rasmussen, Jacob Illum; Larsen, Kim Guldstrand

    2007-01-01

    In this paper, we describe a new flexible framework for state space exploration based on cooperating agents. The idea is to let various agents with different search patterns explore the state space individually and communicate information about fruitful subpaths of the search tree to each other...

  2. Agent-Based Modeling: A Powerful Tool for Tourism Researchers

    NARCIS (Netherlands)

    Nicholls, Sarah; Amelung, B.; Student, Jillian

    2016-01-01

    Agent-based modeling (ABM) is a way of representing complex systems of autonomous agents or actors, and of simulating the multiple potential outcomes of these agents’ behaviors and interactions in the form of a range of alternatives or futures. Despite the complexity of the tourism system, and the p

  3. Complex between lignin and a Ti-based coupling agent

    DEFF Research Database (Denmark)

    Rasmussen, Jonas Stensgaard; Barsberg, Søren Talbro; Felby, Claus

    2014-01-01

    coating formulations would have a better performance if the adhesion to wood could be improved. In the present work, the chemical interaction between a titanium-based coupling agent, isopropyl triisostearoyl titanate (titanium agent, TA) and lignin has been studied by means of attenuated total reflectance...... between TA and lignin....

  4. Expected energy-based restricted Boltzmann machine for classification.

    Science.gov (United States)

    Elfwing, S; Uchibe, E; Doya, K

    2015-04-01

    In classification tasks, restricted Boltzmann machines (RBMs) have predominantly been used in the first stage, either as feature extractors or to provide initialization of neural networks. In this study, we propose a discriminative learning approach to provide a self-contained RBM method for classification, inspired by free-energy based function approximation (FE-RBM), originally proposed for reinforcement learning. For classification, the FE-RBM method computes the output for an input vector and a class vector by the negative free energy of an RBM. Learning is achieved by stochastic gradient-descent using a mean-squared error training objective. In an earlier study, we demonstrated that the performance and the robustness of FE-RBM function approximation can be improved by scaling the free energy by a constant that is related to the size of network. In this study, we propose that the learning performance of RBM function approximation can be further improved by computing the output by the negative expected energy (EE-RBM), instead of the negative free energy. To create a deep learning architecture, we stack several RBMs on top of each other. We also connect the class nodes to all hidden layers to try to improve the performance even further. We validate the classification performance of EE-RBM using the MNIST data set and the NORB data set, achieving competitive performance compared with other classifiers such as standard neural networks, deep belief networks, classification RBMs, and support vector machines. The purpose of using the NORB data set is to demonstrate that EE-RBM with binary input nodes can achieve high performance in the continuous input domain. PMID:25318375
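
The core change the abstract describes, scoring a (input, class) vector by the negative expected energy instead of the negative free energy of an RBM, can be written down directly. The weights below are random and untrained, so this only illustrates the two output formulas, not a working classifier; the network size is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(2)

# Tiny RBM with random weights; x is the concatenated (input, one-hot class)
# visible vector, as in the FE-RBM/EE-RBM setup described in the abstract.
n_vis, n_hid = 6, 4
W = rng.normal(0.0, 0.1, (n_hid, n_vis))
b_vis = np.zeros(n_vis)
b_hid = np.zeros(n_hid)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neg_free_energy(x):
    # FE-RBM output: -F(x) = b_vis.x + sum_j softplus(b_hid_j + W_j.x)
    return b_vis @ x + np.sum(np.logaddexp(0.0, b_hid + W @ x))

def neg_expected_energy(x):
    # EE-RBM output: replace each softplus term by its expectation under
    # p(h_j = 1 | x), i.e. sigmoid(a_j) * a_j with a_j = b_hid_j + W_j.x
    act = b_hid + W @ x
    return b_vis @ x + np.sum(sigmoid(act) * act)

x = np.array([1, 0, 1, 0, 1, 0], dtype=float)
print(neg_free_energy(x) >= neg_expected_energy(x))  # True
```

The two differ by the hidden units' entropy (softplus(a) = a·sigmoid(a) + H(sigmoid(a))), so the negative free energy always upper-bounds the negative expected energy; EE-RBM simply drops that entropy term when computing the class score.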

  5. An agent oriented information system: an MDA based development

    Directory of Open Access Journals (Sweden)

    Mohamed Sadgal

    2012-09-01

    Full Text Available Information systems (IS) development should accomplish not only functional models but also conceptual models to represent the organizational environment in which the system will have to evolve, and must be aligned with strategic objectives. Generally, a significant innovation in the enterprise is to organize its IS around its business processes. Moreover, business models must be enriched by the agent paradigm to reduce the complexity involved in solving a problem, through the structuring of knowledge on a set of intelligent agents, the association between agents and activities, and collaboration among agents. To do this, we propose an agent-oriented approach based on the model-driven architecture (MDA) for information system development. This approach uses, in its different phases, the BPMN language for business process modeling, the AML language for agent modeling, and the JADEX platform for the implementation. The IS development is realized by different automated mappings from source models to target models.

  6. Design and implementation based on the classification protection vulnerability scanning system

    International Nuclear Information System (INIS)

    With the application and spread of classification protection, network security vulnerability scanning should consider efficiency and function expansion. This paper proposes a vulnerability scanning approach oriented to classification protection, and elaborates the design and implementation of a vulnerability scanning system based on vulnerability-classification plug-in technology. According to the experiments, the system has good adaptability and scalability for the application of classification protection, and its scanning efficiency is also verified. (authors)

  7. MODEL-BASED PERFORMANCE EVALUATION APPROACH FOR MOBILE AGENT SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Li Xin; Mi Zhengkun; Meng Xudong

    2004-01-01

    Claimed as the next generation programming paradigm, mobile agent technology has attracted extensive interest in recent years. However, up to now, limited research effort has been devoted to the performance study of mobile agent systems, and most of this research focuses on agent behavior analysis, with the result that the models are hard to apply to mobile agent systems. To bridge the gap, a new performance evaluation model derived from the operation mechanisms of mobile agent platforms is proposed. Details are discussed for the design of companion simulation software, which can provide system performance measures such as the response time of the platform to a mobile agent. Further investigation follows on the determination of model parameters. Finally, a comparison is made between the model-based simulation results and the measurement-based real performance of mobile agent systems. The results show that the proposed model and the designed software are effective in evaluating the performance characteristics of mobile agent systems. The proposed approach can also be considered as the basis of performance analysis for large systems composed of multiple mobile agent platforms.

  8. Content Based Image Retrieval with Mobile Agents and Steganography

    OpenAIRE

    Thampi, Sabu M.; Sekaran, K. Chandra

    2004-01-01

    In this paper we present an image retrieval system based on Gabor texture features, steganography, and mobile agents. By employing this information hiding technique, the image attributes can be hidden in an image without degrading the image quality, so the image retrieval process becomes simple. Java based mobile agents manage the query phase of the system. Based on the simulation results, the proposed system not only shows efficiency in hiding the attributes but also provides other adv...

  9. A new gammagraphic and functional-based classification for hyperthyroidism

    International Nuclear Information System (INIS)

    The absence of a universal classification for hyperthyroidism (HT) gives rise to inadequate interpretation of series and trials, and hinders decision making. We offer a tentative classification based on gammagraphic and functional findings. Clinical records from patients who underwent thyroidectomy in our Department from 1967 to 1997 were reviewed. Those with functional measurements of hyperthyroidism were considered. All were managed according to the same pre-established guidelines. HT was the surgical indication in 694 (27.1%) of the 2559 thyroidectomies. Based on gammagraphic studies, we classified HTs into: parenchymatous increased uptake, which could be diffuse, diffuse with cold nodules or diffuse with at least one nodule; and nodular increased uptake (Autonomously Functioning Thyroid Nodules, AFTN), divided into solitary AFTN or toxic adenoma and multiple AFTN or toxic multinodular goiter. This gammagraphic-based classification is useful and has high sensitivity to detect these nodules and assess their activity, supporting therapeutic decision making and, in some cases, the choice of surgical technique. (authors)

  10. Multi-agent Based Charges subsystem for Supply Chain Logistics

    Directory of Open Access Journals (Sweden)

    Pankaj Rani

    2012-05-01

    Full Text Available The main objective of this paper is to design a charges subsystem using multi-agent technology, which deals with the calculation, accrual and collection of various charges levied on goods in supply chain logistics. Accrual of various charges such as freight, demurrage, and wharfage takes place implicitly in the SC system at various events of different subsystems, and these charges are collected and calculated by software agents. Agent-based modeling is an approach based on the idea that a system is composed of decentralized individual 'agents' and that each agent interacts with other agents according to its localized knowledge. Our aim is to design a flexible architecture that can deal with next generation supply chain problems based on a multi-agent architecture. In this article, a multi-agent system has been developed to calculate charges levied at various stages on goods sheds. Each entity is modeled as one agent, and their coordination leads to controlled inventories and minimizes the total cost of the SC by sharing information and forecasting knowledge and using a negotiation mechanism.

  11. Resource-efficient wireless monitoring based on mobile agent migration

    Science.gov (United States)

    Smarsly, Kay; Law, Kincho H.; König, Markus

    2011-04-01

    Wireless sensor networks are increasingly adopted in many engineering applications such as environmental and structural monitoring. Having proven to be low-cost, easy to install and accurate, wireless sensor networks serve as a powerful alternative to traditional tethered monitoring systems. However, due to the limited resources of a wireless sensor node, critical problems are the power-consuming transmission of the collected sensor data and the usage of the onboard memory of the sensor nodes. This paper presents a new approach towards resource-efficient wireless sensor networks based on a multi-agent paradigm. In order to efficiently use the restricted computing resources, software agents are embedded in the wireless sensor nodes. On-board agents are designed to autonomously collect, analyze and condense the data sets using relatively simple yet resource-efficient algorithms. Upon detecting (potential) anomalies in the observed structural system, the on-board agents explicitly request specialized software agents. These specialized agents physically migrate from connected computer systems, or adjacent nodes, to the respective sensor node in order to perform more complex damage detection analyses based on their inherent expert knowledge. A prototype system is designed and implemented, deploying multi-agent technology and dynamic code migration, in a wireless sensor network for structural health monitoring. Laboratory tests are conducted to validate the performance of the agent-based wireless structural health monitoring system and to verify its autonomous damage detection capabilities.

  12. Agent-based Simulation of the Maritime Domain

    Directory of Open Access Journals (Sweden)

    O. Vaněk

    2010-01-01

    Full Text Available In this paper, a multi-agent based simulation platform is introduced that focuses on legitimate and illegitimate aspects of maritime traffic, mainly on intercontinental transport through piracy afflicted areas. The extensible architecture presented here comprises several modules controlling the simulation and the life-cycle of the agents, analyzing the simulation output and visualizing the entire simulated domain. The simulation control module is initialized by various configuration scenarios to simulate various real-world situations, such as a pirate ambush, coordinated transit through a transport corridor, or coastal fishing and local traffic. The environmental model provides a rich set of inputs for agents that use the geo-spatial data and the vessel operational characteristics for their reasoning. The agent behavior model based on finite state machines together with planning algorithms allows complex expression of agent behavior, so the resulting simulation output can serve as a substitution for real world data from the maritime domain.

  13. Changing Histopathological Diagnostics by Genome-Based Tumor Classification

    Directory of Open Access Journals (Sweden)

    Michael Kloth

    2014-05-01

    Full Text Available Traditionally, tumors are classified by histopathological criteria, i.e., based on their specific morphological appearances. Consequently, current therapeutic decisions in oncology are strongly influenced by histology rather than by underlying molecular or genomic aberrations. However, the growing body of information on molecular changes, enabled by the Human Genome Project and the International Cancer Genome Consortium as well as manifold advances in molecular biology and high-throughput sequencing techniques, inaugurated the integration of genomic information into disease classification. Furthermore, in some cases it became evident that former classifications needed major revision and adaptation. Such adaptations are often required by understanding the pathogenesis of a disease from a specific molecular alteration, and by using this molecular driver for targeted and highly effective therapies. Altogether, reclassifications should lead to a higher information content of the underlying diagnoses, reflecting their molecular pathogenesis and resulting in optimized and individualized therapeutic decisions. The objective of this article is to summarize some particularly important examples of genome-based classification approaches and associated therapeutic concepts. In addition to reviewing disease-specific markers, we focus on potentially therapeutic or predictive markers and the relevance of molecular diagnostics in disease monitoring.

  14. A Fuzzy Similarity Based Concept Mining Model for Text Classification

    CERN Document Server

    Puri, Shalini

    2012-01-01

    Text classification is a challenging and highly active research field, of great importance in text categorization applications. Much research has been done in this field, but there remains a need to categorize a collection of text documents into mutually exclusive categories by extracting the concepts or features using a supervised learning paradigm and different classification algorithms. In this paper, a new Fuzzy Similarity Based Concept Mining Model (FSCMM) is proposed to classify a set of text documents into pre-defined Category Groups (CG) by training and preparing them at the sentence, document and integrated-corpora levels, along with feature reduction and ambiguity removal at each level to achieve high system performance. A Fuzzy Feature Category Similarity Analyzer (FFCSA) is used to analyze each extracted feature of the Integrated Corpora Feature Vector (ICFV) against the corresponding categories or classes. This model uses a Support Vector Machine Classifier (SVMC) to classify correct...

  15. SPEECH/MUSIC CLASSIFICATION USING WAVELET BASED FEATURE EXTRACTION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Thiruvengatanadhan Ramalingam

    2014-01-01

    Full Text Available With the rapid growth in audio data volume, audio classification serves as a fundamental step in multimedia information retrieval, and speech/music classification is one of its most important problems. In this work a speech/music discrimination system is developed which utilizes the Discrete Wavelet Transform (DWT) as the acoustic feature. Multi-resolution analysis is a significant statistical way to extract features from the input signal, and in this study a method is deployed to model the extracted wavelet features. Support Vector Machines (SVM) are based on the principle of structural risk minimization; SVM is applied to classify audio into the classes speech and music by learning from training data. The proposed method then extends the application of Gaussian Mixture Models (GMM) to estimate the probability density function using maximum likelihood decision methods. The system shows significant results with an accuracy of 94.5%.
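    The pipeline this abstract describes (wavelet subband features followed by a trained classifier) can be sketched with a one-level Haar DWT and, for brevity, a nearest-centroid classifier standing in for the SVM/GMM stages; the signal parameters and decomposition depth below are illustrative assumptions, not the paper's settings.

```python
import math
import random

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient lists."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))
        detail.append((a - b) / math.sqrt(2))
    return approx, detail

def wavelet_energy_features(signal, levels=3):
    """Mean energy of the detail subband at each decomposition
    level -- a crude stand-in for the paper's DWT feature vector."""
    feats, current = [], signal
    for _ in range(levels):
        current, detail = haar_dwt(current)
        feats.append(sum(d * d for d in detail) / len(detail))
    return feats

def nearest_centroid(features, centroids):
    """Assign the label whose centroid is closest in feature space
    (used here in place of the SVM trained in the paper)."""
    def dist2(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(centroids, key=lambda label: dist2(features, centroids[label]))
```

A smooth tone concentrates energy in the approximation band while noisy, speech-like signals leave large detail energies, so even this toy feature separates the two classes.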

  16. Agent-based method for distributed clustering of textual information

    Science.gov (United States)

    Potok, Thomas E. [Oak Ridge, TN; Reed, Joel W. [Knoxville, TN; Elmore, Mark T. [Oak Ridge, TN; Treadwell, Jim N. [Louisville, TN

    2010-09-28

    A computer method and system for storing, retrieving and displaying information has a multiplexing agent (20) that calculates a new document vector (25) for a new document (21) to be added to the system and transmits the new document vector (25) to master cluster agents (22) and cluster agents (23) for evaluation. These agents (22, 23) perform the evaluation and return values upstream to the multiplexing agent (20) based on the similarity of the document to documents stored under their control. The multiplexing agent (20) then sends the document (21) and the document vector (25) to the master cluster agent (22), which then forwards it to a cluster agent (23) or creates a new cluster agent (23) to manage the document (21). The system also searches for stored documents according to a search query having at least one term and identifying the documents found in the search, and displays the documents in a clustering display (80) of similarity so as to indicate similarity of the documents to each other.
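    The routing logic described in this patent abstract -- a multiplexing agent computing a document vector, cluster agents scoring similarity and returning values upstream, and a new cluster agent created when no existing one matches -- can be sketched roughly as follows. The vector representation (raw term counts), cosine scoring and the threshold value are assumptions for illustration, not the patent's actual method.

```python
import math
from collections import Counter

def doc_vector(text):
    """Bag-of-words term-count vector for a document."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[t] * v.get(t, 0) for t in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

class ClusterAgent:
    def __init__(self, vec):
        self.centroid = dict(vec)
        self.docs = [vec]

    def evaluate(self, vec):
        """Similarity value reported upstream to the multiplexing agent."""
        return cosine(self.centroid, vec)

    def manage(self, vec):
        self.docs.append(vec)

class MultiplexingAgent:
    def __init__(self, threshold=0.3):
        self.clusters = []
        self.threshold = threshold

    def add_document(self, text):
        """Route a new document to the most similar cluster agent,
        or spawn a new cluster agent if none is similar enough."""
        vec = doc_vector(text)
        scores = [(c.evaluate(vec), c) for c in self.clusters]
        score, agent = max(scores, key=lambda sc: sc[0], default=(0.0, None))
        if agent is not None and score >= self.threshold:
            agent.manage(vec)
            return agent
        agent = ClusterAgent(vec)          # spawn a new cluster agent
        self.clusters.append(agent)
        return agent
```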

  17. Information Fusion Using Ontology-Based Communication between Agents

    Directory of Open Access Journals (Sweden)

    Tarek Sobh

    2009-06-01

    Full Text Available The distribution of on-line applications among network nodes may require obtaining acceptable results from the data analysis of multiple sensors. Such sensor data is often heterogeneous, inconsistent, and of different types; therefore, multiple-sensor data fusion is required. There are many levels of information fusion (from low-level signals to high-level knowledge). Agents for monitoring application-field events can be used to react dynamically to those events and to take appropriate actions. In a dynamic environment even a single agent may have varying capabilities to sense that environment, and the situation becomes more complex when various heterogeneous agents need to communicate with each other. Ontologies offer significant benefits to multi-agent systems: interoperability, reusability, and support for development activities such as system analysis and agent knowledge modeling. Ontologies also support multi-agent system operations such as agent communication and reasoning. The agent-based model proposed in this paper can afford a promising model for obtaining acceptable information in the case of multiple sensors.

  18. Histological image classification using biologically interpretable shape-based features

    International Nuclear Information System (INIS)

    Automatic cancer diagnostic systems based on histological image classification are important for improving therapeutic decisions. Previous studies propose textural and morphological features for such systems. These features capture patterns in histological images that are useful for both cancer grading and subtyping. However, because many of these features lack a clear biological interpretation, pathologists may be reluctant to adopt these features for clinical diagnosis. We examine the utility of biologically interpretable shape-based features for classification of histological renal tumor images. Using Fourier shape descriptors, we extract shape-based features that capture the distribution of stain-enhanced cellular and tissue structures in each image and evaluate these features using a multi-class prediction model. We compare the predictive performance of the shape-based diagnostic model to that of traditional models, i.e., using textural, morphological and topological features. The shape-based model, with an average accuracy of 77%, outperforms or complements traditional models. We identify the most informative shapes for each renal tumor subtype from the top-selected features. Results suggest that these shapes are not only accurate diagnostic features, but also correlate with known biological characteristics of renal tumors. Shape-based analysis of histological renal tumor images accurately classifies disease subtypes and reveals biologically insightful discriminatory features. This method for shape-based analysis can be extended to other histological datasets to aid pathologists in diagnostic and therapeutic decisions

  19. Chitosan-based formulations of drugs, imaging agents and biotherapeutics

    NARCIS (Netherlands)

    Amidi, M.; Hennink, W.E.

    2010-01-01

    This preface is part of the Advanced Drug Delivery Reviews theme issue on “Chitosan-Based Formulations of Drugs, Imaging Agents and Biotherapeutics”. This special Advanced Drug Delivery Reviews issue summarizes recent progress and different applications of chitosan-based formulations.

  20. Agent-Based Modeling of Growth Processes

    Science.gov (United States)

    Abraham, Ralph

    2014-01-01

    Growth processes abound in nature, and are frequently the target of modeling exercises in the sciences. In this article we illustrate an agent-based approach to modeling, in the case of a single example from the social sciences: bullying.

  1. Agent-Based Collaborative Traffic Flow Management Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose agent-based game-theoretic approaches for simulation of strategies involved in multi-objective collaborative traffic flow management (CTFM). Intelligent...

  2. SICS MarketSpace: an agent-based market infrastructure

    OpenAIRE

    Eriksson, Joakim; Finne, Niclas; Janson, Sverker

    1998-01-01

    We present a simple and uniform communication framework for an agent-based market infrastructure, the goal of which is to enable automation of markets with self-interested participants distributed over the Internet.

  3. Rule based fuzzy logic approach for classification of fibromyalgia syndrome.

    Science.gov (United States)

    Arslan, Evren; Yildiz, Sedat; Albayrak, Yalcin; Koklukaya, Etem

    2016-06-01

    Fibromyalgia syndrome (FMS) is a chronic muscle and skeletal system disease observed generally in women, manifesting itself with widespread pain and impairing the individual's quality of life. FMS diagnosis is made based on the American College of Rheumatology (ACR) criteria. However, the employability and sufficiency of the ACR criteria have recently been under debate. In this context, several evaluation methods, including clinical evaluation methods, were proposed by researchers. Accordingly, the ACR had to update its criteria, announced back in 1990, in 2010 and 2011. The proposed rule-based fuzzy logic method aims to evaluate FMS from a different angle as well. This method contains a rule base derived from the 1990 ACR criteria and the individual experiences of specialists. The study was conducted using data collected from 60 inpatients and 30 healthy volunteers. Several tests and a physical examination were administered to the participants. The fuzzy logic rule base was structured using the parameters of tender point count, chronic widespread pain period, pain severity, fatigue severity and sleep disturbance level, which were deemed important in FMS diagnosis. It was observed that the fuzzy predictor was generally 95.56 % consistent with at least one of the specialists who were not creators of the fuzzy rule base. Thus, in a diagnosis classification that also grades the severity of FMS, consistent findings were obtained from the comparison of the interpretations and experiences of specialists with the fuzzy logic approach. The study proposes a rule base which could eliminate the shortcomings of the 1990 ACR criteria during the FMS evaluation process. Furthermore, the proposed method presents a classification of the severity of the disease, which was not available with the ACR criteria. The study was not limited to disease classification alone; the probability of occurrence and the severity were classified at the same time. In addition, those who were not suffering from FMS were
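    The kind of fuzzy inference described (membership functions over tender point count, pain severity, etc., combined through specialist-derived rules) might look like the following toy two-rule Mamdani-style sketch. The membership-function shapes and breakpoints are invented for illustration and are not the paper's actual rule base; only the >= 11 tender points figure echoes the 1990 ACR criteria.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical membership functions (the real ones come from the ACR
# criteria and specialist experience, per the paper).
tender_low  = lambda n: tri(n, -1, 0, 11)    # below the ACR 11-point threshold
tender_high = lambda n: tri(n, 7, 18, 19)
pain_mild   = lambda p: tri(p, -1, 0, 6)     # pain score on a 0-10 scale
pain_severe = lambda p: tri(p, 4, 10, 11)

def classify_fms(tender_points, pain_score):
    """Mamdani-style max-min inference over a toy two-rule base:
    each rule's firing strength is the min of its antecedents, and the
    output class is the rule with the strongest firing."""
    rules = {
        "no_fms":     min(tender_low(tender_points), pain_mild(pain_score)),
        "severe_fms": min(tender_high(tender_points), pain_severe(pain_score)),
    }
    return max(rules, key=rules.get)
```

The paper's system would add the remaining inputs (pain period, fatigue, sleep disturbance) and intermediate severity grades as further rules.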

  4. Pivotal Technology Research of Grid Based on Mobile Agent

    Institute of Scientific and Technical Information of China (English)

    CHEN Hong-wei; WANG Ru-chuan

    2004-01-01

    Grid Based on Mobile Agent is a new grid scheme. The purpose of this paper is to solve the pivotal technology problems of Grid Based on Mobile Agent (GBMA) combined with the concept of the Virtual Organization (VO). In GBMA, the virtual organization is viewed as the basic management unit of the grid, and the mobile agent is regarded as an important means of interaction. Grid architecture, grid resource management and grid task management are the core technology problems of GBMA. The simulation results show that the Inter-VO pattern has an obvious advantage because it can make full use of resources from other virtual organizations in a GBMA environment.

  5. The fractional volatility model: An agent-based interpretation

    Science.gov (United States)

    Vilela Mendes, R.

    2008-06-01

    Based on the criteria of mathematical simplicity and consistency with empirical market data, a model with volatility driven by fractional noise has been constructed which provides a fairly accurate mathematical parametrization of the data. Here, some features of the model are reviewed and extended to account for leverage effects. Using agent-based models, one tries to find which agent strategies and (or) properties of the financial institutions might be responsible for the features of the fractional volatility model.

  6. Scalable, distributed data mining using an agent based architecture

    Energy Technology Data Exchange (ETDEWEB)

    Kargupta, H.; Hamzaoglu, I.; Stafford, B.

    1997-05-01

    Algorithm scalability and the distributed nature of both data and computation deserve serious attention in the context of data mining. This paper presents PADMA (PArallel Data Mining Agents), a parallel agent based system, that makes an effort to address these issues. PADMA contains modules for (1) parallel data accessing operations, (2) parallel hierarchical clustering, and (3) web-based data visualization. This paper describes the general architecture of PADMA and experimental results.

  7. Agent-based Models for Economic Policy Design

    OpenAIRE

    Dawid, Herbert; Neugart, Michael

    2010-01-01

    Agent-based simulation models are used by an increasing number of scholars as a tool for providing evaluations of economic policy measures and policy recommendations in complex environments. On the basis of recent work in this area we discuss the advantages of agent-based modeling for economic policy design and identify further needs to be addressed for strengthening this methodological approach as a basis for sound policy advice.

  8. The Promises and Perils of Agent-Based Computational Economics

    OpenAIRE

    Matteo Richiardi

    2004-01-01

    In this paper I analyse the main strengths and weaknesses of agent-based computational models. I first describe how agent-based simulations can complement more traditional modelling techniques. Then, I rationalise the main theoretical critiques against the use of simulation, which point to the following problematic areas: (i) interpretation of the simulation dynamics, (ii) estimation of the simulation model, and (iii) generalisation of the results. I show that there exist solutions for all th...

  9. Network Traffic Anomalies Identification Based on Classification Methods

    Directory of Open Access Journals (Sweden)

    Donatas Račys

    2015-07-01

    Full Text Available A problem of network traffic anomaly detection in computer networks is analyzed. An overview of anomaly detection methods is given, and the advantages and disadvantages of the different methods are analyzed. A model for traffic anomaly detection was developed based on IBM SPSS Modeler and is used to analyze SNMP data from a router. The investigation of traffic anomalies was done using three classification methods and different sets of learning data. Based on the results of the investigation, it was determined that the C5.1 decision tree method has the highest accuracy and performance and can be successfully used for the identification of network traffic anomalies.
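    At its core, a C5-family decision tree like the one used above grows a tree of threshold splits on traffic features. A single-node (decision stump) version of that split search, over hypothetical SNMP-style features, can be sketched as follows; real C5.x additionally uses information gain, pruning and multi-level trees.

```python
def best_split(samples):
    """Find the (feature, threshold) pair that best separates 'normal'
    from 'anomaly' samples -- i.e. a single decision-tree node.
    Each sample is ({feature_name: value, ...}, label)."""
    best = (None, None, -1.0)            # feature, threshold, accuracy
    features = samples[0][0].keys()
    for f in features:
        for t in sorted(s[0][f] for s in samples):
            # Predict 'anomaly' for values strictly above the threshold
            preds = ["anomaly" if s[0][f] > t else "normal" for s in samples]
            acc = sum(p == s[1] for p, s in zip(preds, samples)) / len(samples)
            if acc > best[2]:
                best = (f, t, acc)
    return best
```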

  10. ACO Agent Based Routing in AOMDV Environment

    Directory of Open Access Journals (Sweden)

    Kaur Amanpreet

    2016-01-01

    Full Text Available A Mobile Ad-hoc Network (MANET) is a group of moving nodes which can communicate with each other without the help of any central stationary node. All the nodes in a MANET act as routers for forwarding data packets. The nodes in the network also move randomly and there exists no fixed infrastructure, so path breaks are a frequent problem in MANETs, and routing protocols face many problems due to these path breaks. Therefore, a routing protocol which is multipath in nature is more reliable than a unipath routing protocol. Ant colony optimization is a relatively new technique which is suitable for optimization problems. AOMDV is a multipath routing protocol; thus, if a path break happens, the packets can start following a new path which has already been selected. In this paper, we add ant agents into AOMDV's behavior. In this way, the new protocol benefits from dual properties: the ants' nature and the multipath nature of AOMDV. The modified concept is simulated and the outcomes are compared with the AOMDV, AODV and DSR routing protocols for a few performance parameters. The results obtained are encouraging; the new algorithm performs better than traditional unipath and multipath routing protocols.
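    The ant-agent idea -- maintaining pheromone weights over the multiple paths AOMDV already discovers, reinforcing good paths via backward ants and letting stale paths evaporate -- can be sketched as follows. The evaporation rate and the quality measure are illustrative assumptions, and this omits the route-discovery side of the protocol entirely.

```python
import random

class AntRoutingTable:
    """Pheromone-weighted next-hop selection over the multiple paths
    AOMDV maintains (a sketch of the ant layer, not the full protocol)."""

    def __init__(self, paths, evaporation=0.1):
        self.pheromone = {p: 1.0 for p in paths}
        self.evaporation = evaporation

    def choose_path(self, rng=random):
        """Roulette-wheel selection proportional to pheromone, so better
        paths are favoured while alternatives stay alive as backups."""
        total = sum(self.pheromone.values())
        r = rng.uniform(0, total)
        for path, tau in self.pheromone.items():
            r -= tau
            if r <= 0:
                return path
        return path  # floating-point edge case: return the last path

    def reinforce(self, path, quality):
        """A backward ant deposits pheromone proportional to path quality
        (e.g. inverse hop count or delay); all paths evaporate slightly."""
        for p in self.pheromone:
            self.pheromone[p] *= (1 - self.evaporation)
        self.pheromone[path] += quality
```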

  11. Spectral classification of stars based on LAMOST spectra

    CERN Document Server

    Liu, Chao; Zhang, Bo; Wan, Jun-Chen; Deng, Li-Cai; Hou, Yonghui; Wang, Yuefei; Yang, Ming; Zhang, Yong

    2015-01-01

    In this work, we select high signal-to-noise-ratio spectra of stars from the LAMOST data and map their MK classes to the spectral features. The equivalent widths of the prominent spectral lines, playing a similar role to multi-color photometry, form a clean stellar locus well ordered by MK class. The advantage of the stellar locus in line indices is that it gives a natural and continuous classification of stars consistent with either the broadly used MK classes or the stellar astrophysical parameters. We also employ an SVM-based classification algorithm to assign MK classes to the LAMOST stellar spectra. We find that the completeness of the classification is up to 90% for A and G type stars, while it is down to about 50% for OB and K type stars. About 40% of the OB and K type stars are mis-classified as A and G type stars, respectively. This is likely because the differences in spectral features between late B and early A type stars, or between late G and early K type stars, are very we...

  12. Risk Classification and Risk-based Safety and Mission Assurance

    Science.gov (United States)

    Leitner, Jesse A.

    2014-01-01

    Recent activities to revamp and emphasize the need to streamline processes and activities for Class D missions across the agency have led to various interpretations of Class D, including the lumping of a variety of low-cost projects into Class D. Sometimes terms such as "Class D minus" are used. In this presentation, mission risk classifications will be traced to official requirements and definitions as a measure to ensure that projects and programs align with the guidance and requirements that are commensurate with their defined risk posture. As part of this, the full suite of risk classifications, formal and informal, will be defined, followed by an introduction to the new GPR 8705.4 that is currently under review. GPR 8705.4 lays out guidance for the mission success activities performed at Classes A-D for NPR 7120.5 projects as well as for projects not under NPR 7120.5. Furthermore, the trends in stepping from Class A into higher risk posture classifications will be discussed. The talk will conclude with a discussion of risk-based safety and mission assurance at GSFC.

  13. MODEL-BASED CLUSTERING FOR CLASSIFICATION OF AQUATIC SYSTEMS AND DIAGNOSIS OF ECOLOGICAL STRESS

    Science.gov (United States)

    Clustering approaches were developed using the classification likelihood, the mixture likelihood, and also using a randomization approach with a model index. Using a clustering approach based on the mixture and classification likelihoods, we have developed an algorithm that...

  14. Content-based image retrieval applied to BI-RADS tissue classification in screening mammography

    OpenAIRE

    2011-01-01

    AIM: To present a content-based image retrieval (CBIR) system that supports the classification of breast tissue density and can be used in the processing chain to adapt parameters for lesion segmentation and classification.

  15. Autonomous Traffic Control System Using Agent Based Technology

    CERN Document Server

    M, Venkatesh; V, Srinivas

    2011-01-01

    The way of analyzing, designing and building real-time projects has changed due to the rapid growth of the internet, mobile technologies and intelligent applications. Most of these applications are built from intelligent, tiny, distributed components called agents. An agent takes input from numerous real-time sources and gives back a real-time response. This paper examines how these agents can be implemented in vehicle traffic management, especially in large cities, and identifies the various challenges that arise with rapid growth of population and vehicles. Our proposal gives a solution based on autonomous, agent-based technology. These autonomous or intelligent agents have the capability to observe, act and learn from their past experience. The system uses the knowledge flow of the preceding signal's data to identify the incoming flow of the forthcoming signal. Our architecture involves video analysis and exploration using an intelligent learning algorithm to estimate and identify the...

  16. UML MODELING AND SYSTEM ARCHITECTURE FOR AGENT BASED INFORMATION RETRIEVAL

    Directory of Open Access Journals (Sweden)

    D. Muhammad Noorul Mubarak

    2015-12-01

    Full Text Available In this current technological era, there is an enormous increase in the information available on web and also in the online databases. This information abundance increases the complexity of finding relevant information. To solve such challenges, there is a need for improved and intelligent systems for efficient search and retrieval. Intelligent Agents can be used for better search and information retrieval in a document collection. The information required by a user is scattered in a large number of databases. In this paper, the object oriented modeling for agent based information retrieval system is presented. The paper also discusses the framework of agent architecture for obtaining the best combination terms that serve as an input query to the information retrieval system. The communication and cooperation among the agents are also explained. Each agent has a task to perform in information retrieval.

  17. A Hybrid Classification Approach based on FCA and Emerging Patterns - An application for the classification of biological inhibitors

    OpenAIRE

    Asses, Yasmine; Buzmakov, Aleksey; Bourquard, Thomas; Kuznetsov, Sergei O.; Napoli, Amedeo

    2012-01-01

    Classification is an important task in data analysis and learning. Classification can be performed using supervised or unsupervised methods. From the unsupervised point of view, Formal Concept Analysis (FCA) can be used for such a task in an efficient and well-founded way. From the supervised point of view, emerging patterns rely on pattern mining and can be used to characterize classes of objects w.r.t. a priori labels. In this paper, we present a hybrid classification method which is based ...

  18. Agent-Based Approaches for Behavioural Modelling in Military Simulations

    Directory of Open Access Journals (Sweden)

    Gaurav Chaudhary

    2015-12-01

    Full Text Available Behavioral modeling of combat entities in military simulations, by creating synthetic agents to satisfy various battle scenarios, is an important problem. Conventional modeling tools are not always sufficient to handle complex situations requiring adaptation. To deal with this, Agent-Based Modeling (ABM) is employed, as agents exhibit autonomous behavior by adapting and varying their behavior during the course of the simulation whilst achieving their goals. Synthetic agents created by means of Computer Generated Forces (CGF) are a relatively recent approach to modeling the behavior of combat entities for more realistic training and effective military planning. CGFs, also sometimes referred to as Semi-Automated Forces (SAF), enable the creation of high-fidelity simulations. Agents are used to control and augment the behavior of CGF entities, hence converting them into Intelligent CGFs (ICGF). The intelligent agents can be modeled to exhibit cognitive abilities. For this review paper, extensive papers on the state of the art in agent-based modeling approaches and applications were surveyed. The paper assimilates the issues involved in ABM, with CGF as an important component of it. It reviews modeling aspects with respect to the interrelationship between ABM and CGF, which is required to carry out behavioral modeling. Important CGFs have been examined and a list with their significant features is given. Another issue reviewed is how synthetic agents having different capabilities are implemented at different battle levels. A brief mention of state-of-the-art integrated cognitive architectures and a list of significant cognitive applications based on them, with their features, is given. At the same time, the maturity of ABM in agent-based applications has also been considered.

  19. Performance verification of a LIF-LIDAR technique for stand-off detection and classification of biological agents

    Science.gov (United States)

    Wojtanowski, Jacek; Zygmunt, Marek; Muzal, Michał; Knysak, Piotr; Młodzianko, Andrzej; Gawlikowski, Andrzej; Drozd, Tadeusz; Kopczyński, Krzysztof; Mierczyk, Zygmunt; Kaszczuk, Mirosława; Traczyk, Maciej; Gietka, Andrzej; Piotrowski, Wiesław; Jakubaszek, Marcin; Ostrowski, Roman

    2015-04-01

    LIF (laser-induced fluorescence) LIDAR (light detection and ranging) is one of the very few promising methods for long-range stand-off detection of airborne biological particles. A limited classification of the detected material also appears feasible. We present the design details and hardware setup of the developed range-resolved multichannel LIF-LIDAR system. The device is based on two pulsed UV laser sources operating at 355 nm and 266 nm (the 3rd and 4th harmonics, respectively, of a Q-switched solid-state Nd:YAG laser). Range-resolved fluorescence signals are collected in the 28 channels of a compound PMT sensor coupled with a Czerny-Turner spectrograph. The calculated theoretical sensitivities are confronted with the results obtained during a measurement field campaign. Classification efforts based on linear processing of the 28-element fluorescence spectral signatures are also presented.

  20. A Chemistry-Based Classification for Peridotite Xenoliths

    Science.gov (United States)

    Block, K. A.; Ducea, M.; Raye, U.; Stern, R. J.; Anthony, E. Y.; Lehnert, K. A.

    2007-12-01

    The development of a petrological and geochemical database for mantle xenoliths is important for interpreting EarthScope geophysical results. Interpretation of compositional characteristics of xenoliths requires a sound basis for comparing geochemical results, even when no petrographic modes are available. Peridotite xenoliths are generally classified on the basis of mineralogy (Streckeisen, 1973) derived from point-counting methods. Modal estimates, particularly on heterogeneous samples, are conducted using various methodologies and are therefore subject to large statistical error. Also, many studies simply do not report the modes. Other classifications for peridotite xenoliths based on host matrix or tectonic setting (cratonic vs. non-cratonic) are poorly defined and provide little information on where samples from transitional settings fit within a classification scheme (e.g., xenoliths from circum-cratonic locations). We present here a classification for peridotite xenoliths based on bulk rock major element chemistry, which is one of the most common types of data reported in the literature. A chemical dataset of over 1150 peridotite xenoliths is compiled from two online geochemistry databases, the EarthChem Deep Lithosphere Dataset and from GEOROC (http://www.earthchem.org), and is downloaded with the rock names reported in the original publications. Ternary plots of combinations of the SiO2- CaO-Al2O3-MgO (SCAM) components display sharp boundaries that define the dunite, harzburgite, lherzolite, or wehrlite-pyroxenite fields and provide a graphical basis for classification. In addition, for the CaO-Al2O3-MgO (CAM) diagram, a boundary between harzburgite and lherzolite at approximately 19% CaO is defined by a plot of over 160 abyssal peridotite compositions calculated from observed modes using the methods of Asimow (1999) and Baker and Beckett (1999). We anticipate that our SCAM classification is a first step in the development of a uniform basis for
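    The classification idea above -- projecting a bulk analysis onto a ternary diagram and reading off the rock name from boundary lines -- can be sketched as below. Only the roughly 19% CaO harzburgite/lherzolite boundary comes from the text; the dunite cutoff is a placeholder, and the real SCAM scheme uses additional components and boundaries.

```python
def cam_classify(cao, al2o3, mgo):
    """Project a bulk analysis (wt%) onto the CaO-Al2O3-MgO (CAM)
    ternary and apply simple boundary lines. The ~19% CaO
    harzburgite/lherzolite boundary follows the abstract; the 90% MgO
    dunite cutoff is an illustrative placeholder."""
    total = cao + al2o3 + mgo
    cao_pct = 100.0 * cao / total      # ternary (renormalized) percentages
    mgo_pct = 100.0 * mgo / total
    if mgo_pct > 90:                   # placeholder dunite field
        return "dunite"
    return "lherzolite" if cao_pct >= 19 else "harzburgite"
```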

  1. Content Based Image Retrieval : Classification Using Neural Networks

    Directory of Open Access Journals (Sweden)

    Shereena V.B

    2014-11-01

    Full Text Available In a content-based image retrieval (CBIR) system, the main issue is to extract the image features that effectively represent the image contents in a database. Such extraction requires a detailed evaluation of the retrieval performance of image features. This paper presents a review of fundamental aspects of content-based image retrieval, including the extraction of color and texture features. Commonly used color features, including color moments, color histograms and color correlograms, and the Gabor texture feature are compared. The paper reviews the increase in the efficiency of image retrieval when the color and texture features are combined. The similarity measures, based on which matches are made and images are retrieved, are also discussed. For effective indexing and fast searching of images based on visual features, neural-network-based pattern learning can be used to achieve effective classification.
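    The color features and similarity measures reviewed above can be illustrated with a minimal joint RGB histogram and histogram-intersection similarity, a common CBIR baseline; the bin count and the choice of intersection as the similarity measure are illustrative, not specifically the paper's.

```python
def color_histogram(pixels, bins=4):
    """Quantise each RGB channel into `bins` levels and build a
    normalised joint colour histogram (bins**3 cells)."""
    hist = [0.0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    n = len(pixels)
    return [h / n for h in hist]

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical colour distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))
```

In a retrieval loop, the query image's histogram would be compared against every stored histogram and the top-scoring images returned.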

  3. Intrusion Awareness Based on Data Fusion and SVM Classification

    Directory of Open Access Journals (Sweden)

    Ramnaresh Sharma

    2012-06-01

    Full Text Available Network intrusion awareness is an important factor in risk analysis of network security. In the current decade, various methods and frameworks are available for intrusion detection and security awareness, some based on the knowledge discovery process and some on neural networks. These models take rule-based decisions for the generation of security alerts. In this paper we propose a novel method for intrusion awareness using data fusion and SVM classification. Data fusion works on the basis of feature gathering of events. The Support Vector Machine is a super classifier of data. Here we use SVM for the detection of closed items of the rule-based technique. Our proposed method is simulated on the KDD1999 DARPA data set and gives better empirical results in comparison with the rule-based technique and a neural network model.
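Record-level SVM classification of the kind described can be sketched with a minimal linear SVM trained by hinge-loss subgradient descent. This is an illustration only, not the authors' system: the toy two-feature "connection records" and the learning-rate and regularization settings are assumptions, and a real experiment on KDD1999 would use a full SVM library:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1, seed=0):
    """Minimal linear SVM via hinge-loss subgradient descent (labels in {-1, +1})."""
    rng = random.Random(seed)
    d = len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        idx = list(range(len(X)))
        rng.shuffle(idx)
        for i in idx:
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            # Regularization shrinkage always applies; hinge term only if margin < 1
            w = [wj - lr * lam * wj for wj in w]
            if margin < 1:
                w = [wj + lr * y[i] * xj for wj, xj in zip(w, X[i])]
                b += lr * y[i]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy "connection records": two features, e.g. normalized duration and error rate
X = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y = [-1, -1, 1, 1]  # -1 = normal, +1 = intrusion
w, b = train_linear_svm(X, y)
print([predict(w, b, x) for x in X])  # [-1, -1, 1, 1]
```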

  5. A Rough Sets-based Agent Trust Management Framework

    Directory of Open Access Journals (Sweden)

    Sadra Abedinzadeh

    2013-03-01

    Full Text Available In a virtual society, which consists of several autonomous agents, trust helps agents to deal with the openness of the system by identifying the best agents capable of performing a specific task, or achieving a special goal. In this paper, we introduce ROSTAM, a new approach for agent trust management based on the theory of Rough Sets. ROSTAM is a generic trust management framework that can be applied to any type of multi-agent system. However, the features of the application domain must be provided to ROSTAM. These features form the trust attributes. By collecting the values for these attributes, ROSTAM is able to generate a set of trust rules by employing the theory of Rough Sets. ROSTAM then uses the trust rules to extract the set of the most trusted agents and forwards the user’s request to those agents only. After getting the results, the user must rate the interaction with each trusted agent. The rating values are subsequently utilized for updating the trust rules. We applied ROSTAM to the domain of cross-language Web search. The resulting Web search system recommends to the user the set of the most trusted pairs of translator and search engine, in terms of the pairs that return the results with the highest precision of retrieval.

  6. Texton Based Shape Features on Local Binary Pattern for Age Classification

    OpenAIRE

    V. Vijaya Kumar; B. Eswara Reddy; P. Chandra Sekhar Reddy

    2012-01-01

    Classification and recognition of objects is interest of many researchers. Shape is a significant feature of objects and it plays a crucial role in image classification and recognition. The present paper assumes that the features that drastically affect the adulthood classification system are the Shape features (SF) of face. Based on this, the present paper proposes a new technique of adulthood classification by extracting feature parameters of face on Integrated Texton based LBP (IT-LBP) ima...

  7. The agent-based spatial information semantic grid

    Science.gov (United States)

    Cui, Wei; Zhu, YaQiong; Zhou, Yong; Li, Deren

    2006-10-01

    Analyzing the characteristics of multi-agent systems and geographic ontology, the concept of the Agent-based Spatial Information Semantic Grid (ASISG) is defined and its architecture is advanced. ASISG is composed of multi-agents and geographic ontology. The multi-agent system comprises User Agents, a General Ontology Agent, Geo-Agents, Broker Agents, Resource Agents, Spatial Data Analysis Agents, Spatial Data Access Agents, a Task Execution Agent and a Monitor Agent. The architecture of ASISG has three layers: the fabric layer, the grid management layer and the application layer. The fabric layer, which is composed of the Data Access Agent, Resource Agent and Geo-Agent, encapsulates the data of spatial information systems so as to exhibit a conceptual interface to the grid management layer. The grid management layer, which is composed of the General Ontology Agent, Task Execution Agent, Monitor Agent and Data Analysis Agent, uses a hybrid method to manage all resources registered in the General Ontology Agent, which is described by a general ontology system. The hybrid method combines resource dissemination and resource discovery: dissemination pushes resources from Local Ontology Agents to the General Ontology Agent, and discovery pulls resources from the General Ontology Agent to Local Ontology Agents. A Local Ontology Agent is derived from a special domain and describes the semantic information of a local GIS. The Local Ontology Agents can be filtered to construct a virtual organization that provides a global scheme. The virtual organization lightens the burden on users because they need not search information site by site manually. The application layer, which is composed of the User Agent, Geo-Agent and Task Execution Agent, provides a corresponding interface to a domain user. The functions that ASISG should provide include the integration of different spatial information systems at the semantic level.

  8. S1 gene-based phylogeny of infectious bronchitis virus: An attempt to harmonize virus classification.

    Science.gov (United States)

    Valastro, Viviana; Holmes, Edward C; Britton, Paul; Fusaro, Alice; Jackwood, Mark W; Cattoli, Giovanni; Monne, Isabella

    2016-04-01

    Infectious bronchitis virus (IBV) is the causative agent of a highly contagious disease that results in severe economic losses to the global poultry industry. The virus exists in a wide variety of genetically distinct viral types, and both phylogenetic analysis and measures of pairwise similarity among nucleotide or amino acid sequences have been used to classify IBV strains. However, there is currently no consensus on the method by which IBV sequences should be compared, and heterogeneous genetic group designations that are inconsistent with phylogenetic history have been adopted, leading to the confusing coexistence of multiple genotyping schemes. Herein, we propose a simple and repeatable phylogeny-based classification system combined with an unambiguous and rational lineage nomenclature for the assignment of IBV strains. By using complete nucleotide sequences of the S1 gene we determined the phylogenetic structure of IBV, which in turn allowed us to define 6 genotypes that together comprise 32 distinct viral lineages and a number of inter-lineage recombinants. Because of extensive rate variation among IBVs, we suggest that the inference of phylogenetic relationships alone represents a more appropriate criterion for sequence classification than pairwise sequence comparisons. The adoption of an internationally accepted viral nomenclature is crucial for future studies of IBV epidemiology and evolution, and the classification scheme presented here can be updated and revised as novel S1 sequences become available. PMID:26883378

  9. Agent-Based Urban Land Markets: Agent's Pricing Behavior, Land Prices and Urban Land Use Change

    NARCIS (Netherlands)

    Filatova, Tatiana; Parker, Dawn; Veen, van der Anne

    2009-01-01

    We present a new bilateral agent-based land market model, which moves beyond previous work by explicitly modeling behavioral drivers of land-market transactions on both the buyer and seller sides; formation of bid prices (of buyers) and ask prices (of sellers); and the relative division of the gains

  10. The Development of Sugar-Based Anti-Melanogenic Agents.

    Science.gov (United States)

    Bin, Bum-Ho; Kim, Sung Tae; Bhin, Jinhyuk; Lee, Tae Ryong; Cho, Eun-Gyung

    2016-01-01

    The regulation of melanin production is important for managing skin darkness and hyperpigmentary disorders. Numerous anti-melanogenic agents that target tyrosinase activity/stability, melanosome maturation/transfer, or melanogenesis-related signaling pathways have been developed. As a rate-limiting enzyme in melanogenesis, tyrosinase has been the most attractive target, but tyrosinase-targeted treatments still pose serious potential risks, indicating the necessity of developing lower-risk anti-melanogenic agents. Sugars are ubiquitous natural compounds found in humans and other organisms. Here, we review the recent advances in research on the roles of sugars and sugar-related agents in melanogenesis and in the development of sugar-based anti-melanogenic agents. The proposed mechanisms of action of these agents include: (a) (natural sugars) disturbing proper melanosome maturation by inducing osmotic stress and inhibiting the PI3 kinase pathway and (b) (sugar derivatives) inhibiting tyrosinase maturation by blocking N-glycosylation. Finally, we propose an alternative strategy for developing anti-melanogenic sugars that theoretically reduce melanosomal pH by inhibiting a sucrose transporter and reduce tyrosinase activity by inhibiting copper incorporation into an active site. These studies provide evidence of the utility of sugar-based anti-melanogenic agents in managing skin darkness and curing pigmentary disorders and suggest a future direction for the development of physiologically favorable anti-melanogenic agents. PMID:27092497

  11. Agent-based services for B2B electronic commerce

    Science.gov (United States)

    Fong, Elizabeth; Ivezic, Nenad; Rhodes, Tom; Peng, Yun

    2000-12-01

    The potential of agent-based systems has not been realized yet, in part, because of the lack of understanding of how the agent technology supports industrial needs and emerging standards. The area of business-to-business electronic commerce (b2b e-commerce) is one of the most rapidly developing sectors of industry with huge impact on manufacturing practices. In this paper, we investigate the current state of agent technology and the feasibility of applying agent-based computing to b2b e-commerce in the circuit board manufacturing sector. We identify critical tasks and opportunities in the b2b e-commerce area where agent-based services can best be deployed. We describe an implemented agent-based prototype system to facilitate the bidding process for printed circuit board manufacturing and assembly. These activities are taking place within the Internet Commerce for Manufacturing (ICM) project, the NIST- sponsored project working with industry to create an environment where small manufacturers of mechanical and electronic components may participate competitively in virtual enterprises that manufacture printed circuit assemblies.

  12. Reliability of Service-Based and Agent-Based Systems

    OpenAIRE

    Huhns, Michael N.

    2010-01-01

    A description of the current problems of service-oriented architectures and service-oriented computing and how the solutions will come from using agent technology. That is, services will have to become more agent-like in order to succeed fully in the marketplace.

  13. Classification des signaux EGC avec un système-multi-agent neuronale

    OpenAIRE

    BELGACEM, Amar

    2012-01-01

    The ECG signal represents the electrical activity of the heart and reflects the state of health of the cardiovascular system. It also contains information that makes it possible to distinguish cardiovascular diseases. The high worldwide mortality rate due to problems linked to cardiac dysfunction has pushed researchers to develop techniques for the automatic classification of cardiovascular diseases in support of accurate diagnosis. The work in this thesis presents a...

  14. Fuzzy Motivations in a Multiple Agent Behaviour-Based Architecture

    Directory of Open Access Journals (Sweden)

    Tomás V. Arredondo

    2013-08-01

    Full Text Available In this article we introduce a blackboard-based multiple agent system framework that considers biologically-based motivations as a means to develop a user friendly interface. The framework includes a population-based heuristic as well as a fuzzy logic-based inference system used for scoring system behaviours. The heuristic provides an optimization environment and the fuzzy scoring mechanism is used to give a fitness score to possible system outputs (i.e. solutions). This framework results in the generation of complex behaviours which respond to previously specified motivations. Our multiple agent blackboard and motivation-based framework is validated in a low cost mobile robot specifically built for this task. The robot was used in several navigation experiments and the motivation profile that was considered included "curiosity", "homing", "energy" and "missions". Our results show that this motivation-based approach permits a low cost multiple agent-based autonomous mobile robot to acquire a diverse set of fit behaviours that respond well to user and performance expectations. These results also validate our multiple agent framework as an incremental, flexible and practical method for the development of robust multiple agent systems.
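The fuzzy scoring of candidate behaviours can be sketched as follows. This is a hedged illustration, not the authors' inference system: the triangular membership functions, the two motivations ("energy" and "homing") and the weighted aggregation are all assumptions:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def behaviour_fitness(energy, distance_home, weights=(0.5, 0.5)):
    """Score a candidate behaviour from two motivations: 'energy' favours a
    high battery level, 'homing' favours being close to home."""
    high_energy = tri(energy, 0.5, 1.0, 1.01)
    near_home = tri(distance_home, -0.01, 0.0, 0.5)
    # Simple weighted aggregation of rule activations
    return weights[0] * high_energy + weights[1] * near_home

# A charged robot near home should outscore a drained robot far from home
print(behaviour_fitness(0.9, 0.1) > behaviour_fitness(0.2, 0.8))  # True
```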

  15. QoS Negotiation and Renegotiation Based on Mobile Agents

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shi-bing; ZHANG Deng-yin

    2006-01-01

    The Quality of Service (QoS) has received more and more attention since QoS becomes increasingly important in the Internet development. Mobile software agents represent a valid alternative to the implementation of strategies for the negotiation. In this paper, a QoS negotiation and renegotiation system architecture based on mobile agents is proposed. The agents perform the task in the whole process. Therefore, such a system can reduce the network load, overcome latency, and avoid frequent exchange information between clients and server. The simulation results show that the proposed system could improve the network resource utility about 10%.

  16. Emergent Macroeconomics An Agent-Based Approach to Business Fluctuations

    CERN Document Server

    Delli Gatti, Domenico; Gallegati, Mauro; Giulioni, Gianfranco; Palestrini, Antonio

    2008-01-01

    This book contributes substantively to the current state-of-the-art of macroeconomics by providing a method for building models in which business cycles and economic growth emerge from the interactions of a large number of heterogeneous agents. Drawing from recent advances in agent-based computational modeling, the authors show how insights from dispersed fields like the microeconomics of capital market imperfections, industrial dynamics and the theory of stochastic processes can be fruitfully combined to improve our understanding of macroeconomic dynamics. This book should be a valuable resource for all researchers interested in analyzing macroeconomic issues without resorting to a fictitious representative agent.

  17. Agent-based computational economics using NetLogo

    CERN Document Server

    Damaceanu, Romulus-Catalin

    2013-01-01

    Agent-based Computational Economics using NetLogo explores how researchers can create, use and implement multi-agent computational models in Economics by using NetLogo software platform. Problems of economic science can be solved using multi-agent modelling (MAM). This technique uses a computer model to simulate the actions and interactions of autonomous entities in a network, in order to analyze the effects on the entire economic system. MAM combines elements of game theory, complex systems, emergence and evolutionary programming. The Monte Carlo method is also used in this e-book to introduc

  18. Tutorial on agent-based modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2005-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS is a third way of doing science besides deductive and inductive reasoning. Computational advances have made possible a growing number of agent-based applications in a variety of fields. Applications range from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, from modeling consumer behavior to understanding the fall of ancient civilizations, to name a few. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing ABMS models, and provides some thoughts on the relationship between ABMS and traditional modeling techniques.

  19. A Multiagent Recommender System with Task-Based Agent Specialization

    Science.gov (United States)

    Lorenzi, Fabiana; Correa, Fabio Arreguy Camargo; Bazzan, Ana L. C.; Abel, Mara; Ricci, Francesco

    This paper describes a multiagent recommender system where agents maintain local knowledge bases and, when requested to support a travel planning task, collaborate by exchanging information stored in their local bases. A request for a travel recommendation is decomposed by the system into subtasks, corresponding to travel services. Agents select tasks autonomously, and accomplish them with the help of the knowledge derived from previous solutions. In the proposed architecture, agents become experts in some task types, and this makes the recommendation generation more efficient. In this paper, we validate the model via simulations where agents collaborate to recommend a travel package to the user. The experiments show that specialization is useful, hence providing a validation of the proposed model.

  20. An Agent-Based Modeling for Pandemic Influenza in Egypt

    CERN Document Server

    Khalil, Khaled M; Nazmy, Taymour T; Salem, Abdel-Badeeh M

    2010-01-01

    Pandemic influenza has great potential to cause large and rapid increases in deaths and serious illness. The objective of this paper is to develop an agent-based model to simulate the spread of pandemic influenza (novel H1N1) in Egypt. The proposed multi-agent model is based on the modeling of individuals' interactions in a space time context. The proposed model involves different types of parameters such as: social agent attributes, distribution of Egypt population, and patterns of agents' interactions. Analysis of modeling results leads to understanding the characteristics of the modeled pandemic, transmission patterns, and the conditions under which an outbreak might occur. In addition, the proposed model is used to measure the effectiveness of different control strategies to intervene the pandemic spread.
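A minimal agent-based epidemic of this general flavor can be sketched in a few lines. This is not the Egypt model: the population size, contact rate, and transmission and recovery probabilities are illustrative assumptions, and agents here mix uniformly rather than in a space-time context:

```python
import random

def run_abm(n_agents=200, n_infected=5, days=60, contacts=8, p_transmit=0.05,
            p_recover=0.1, seed=42):
    """Toy agent-based epidemic: each agent is S, I, or R; infected agents
    contact a few random agents per day and may transmit or recover."""
    rng = random.Random(seed)
    state = ['I'] * n_infected + ['S'] * (n_agents - n_infected)
    history = []
    for _ in range(days):
        infected = [i for i, s in enumerate(state) if s == 'I']
        for i in infected:
            for _ in range(contacts):
                j = rng.randrange(n_agents)
                if state[j] == 'S' and rng.random() < p_transmit:
                    state[j] = 'I'
            if rng.random() < p_recover:
                state[i] = 'R'
        history.append(state.count('I'))
    return history

peak = max(run_abm())
print(peak > 5)  # True for these parameters: the outbreak grows beyond the seed cases
```

Control strategies of the kind the abstract mentions can be compared by lowering `contacts` or `p_transmit` and observing the change in the epidemic curve.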

  1. Next frontier in agent-based complex automated negotiation

    CERN Document Server

    Ito, Takayuki; Zhang, Minjie; Robu, Valentin

    2015-01-01

    This book focuses on automated negotiations based on multi-agent systems. It is intended for researchers and students in various fields involving autonomous agents and multi-agent systems, such as e-commerce tools, decision-making and negotiation support systems, and collaboration tools. The contents will help them to understand the concept of automated negotiations, negotiation protocols, negotiating agents’ strategies, and the applications of those strategies. In this book, some negotiation protocols focusing on the multiple interdependent issues in negotiations are presented, making it possible to find high-quality solutions for the complex agents’ utility functions. This book is a compilation of the extended versions of the very best papers selected from the many that were presented at the International Workshop on Agent-Based Complex Automated Negotiations.

  2. Generalization performance of graph-based semisupervised classification

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Semi-supervised learning has been of growing interest over the past few years and many methods have been proposed. Although various algorithms are provided to implement semi-supervised learning, there are still gaps in our understanding of the dependence of generalization error on the numbers of labeled and unlabeled data. In this paper, we consider a graph-based semi-supervised classification algorithm and establish its generalization error bounds. Our results show the close relations between the generalization performance and the structural invariants of the data graph.

  3. Hydrophobicity classification of polymeric materials based on fractal dimension

    Directory of Open Access Journals (Sweden)

    Daniel Thomazini

    2008-12-01

    Full Text Available This study proposes a new method to obtain the hydrophobicity classification (HC) in high voltage polymer insulators. In the method mentioned, the HC was analyzed by fractal dimension (fd) and its processing time was evaluated with a goal of application in mobile devices. Texture images were created by spraying solutions produced from mixtures of isopropyl alcohol and distilled water in proportions ranging from 0 to 100% volume of alcohol (%AIA). Based on these solutions, the contact angles of the drops were measured and the textures were used as patterns for fractal dimension calculations.
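The fractal-dimension analysis of texture images can be illustrated with the standard box-counting estimator (the scales and the synthetic point set are assumptions; the paper's exact fd computation may differ):

```python
import math

def box_counting_dimension(points, scales=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a 2-D point set by box counting:
    the slope of log(box count) vs. log(1/box size), via least squares."""
    xs, ys = [], []
    for s in scales:
        size = 1.0 / s
        boxes = {(int(x / size), int(y / size)) for x, y in points}
        xs.append(math.log(s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / \
           sum((a - mx) ** 2 for a in xs)

# A filled unit square of points should have dimension close to 2
grid = [(i / 100, j / 100) for i in range(100) for j in range(100)]
print(round(box_counting_dimension(grid), 1))  # 2.0
```

A texture image would be thresholded to a point set (e.g. droplet edges) before the same estimator is applied.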

  4. An AIS-Based E-mail Classification Method

    Science.gov (United States)

    Qing, Jinjian; Mao, Ruilong; Bie, Rongfang; Gao, Xiao-Zhi

    This paper proposes a new e-mail classification method based on the Artificial Immune System (AIS), which is endowed with good diversity and self-adaptive ability by using the immune learning, immune memory, and immune recognition. In our method, the features of spam and non-spam extracted from the training sets are combined together, and the number of false positives (non-spam messages that are incorrectly classified as spam) can be reduced. The experimental results demonstrate that this method is effective in reducing the false rate.

  5. Commercial Shot Classification Based on Multiple Features Combination

    Science.gov (United States)

    Liu, Nan; Zhao, Yao; Zhu, Zhenfeng; Ni, Rongrong

    This paper presents a commercial shot classification scheme combining well-designed visual and textual features to automatically detect TV commercials. To identify the inherent difference between commercials and general programs, a special mid-level textual descriptor is proposed, aiming to capture the spatio-temporal properties of the video texts typical of commercials. In addition, we introduce an ensemble-learning based combination method, named Co-AdaBoost, to interactively exploit the intrinsic relations between the visual and textual features employed.

  6. A Method for Data Classification Based on Discernibility Matrix and Discernibility Function

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A method for data classification will influence the efficiency of classification. Attribute reduction based on the discernibility matrix and discernibility function in rough sets can be used in data classification, so we put forward such a method. Namely, firstly, we use the discernibility matrix and discernibility function to delete superfluous attributes in an information system and get a necessary attribute set. Secondly, we delete superfluous attribute values and get decision rules. Finally, we classify data by means of the decision rules. The experiments show that data classification using this method is simpler in structure and can improve the efficiency of classification.
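The attribute-reduction step described (build the discernibility matrix, then delete superfluous attributes) can be sketched on a toy decision table. The table values and the brute-force reduct search are illustrative assumptions, not the paper's algorithm:

```python
from itertools import combinations

def discernibility_matrix(table, decision):
    """Entry (i, j): the attributes whose values differ, for each pair of
    objects with different decision values."""
    entries = []
    for i, j in combinations(range(len(table)), 2):
        if decision[i] != decision[j]:
            diff = {a for a in table[i] if table[i][a] != table[j][a]}
            if diff:
                entries.append(diff)
    return entries

def hits_all(attrs, entries):
    """An attribute set suffices if it intersects every discernibility entry."""
    return all(attrs & e for e in entries)

def minimal_reduct(attributes, entries):
    """Smallest attribute subset that still discerns all decision classes."""
    for r in range(1, len(attributes) + 1):
        for attrs in combinations(sorted(attributes), r):
            if hits_all(set(attrs), entries):
                return set(attrs)
    return set(attributes)

# Toy decision table: condition attributes a, b, c and a decision class
table = [{'a': 1, 'b': 0, 'c': 1},
         {'a': 1, 'b': 1, 'c': 0},
         {'a': 0, 'b': 0, 'c': 1},
         {'a': 0, 'b': 1, 'c': 0}]
decision = [1, 1, 0, 0]
entries = discernibility_matrix(table, decision)
print(minimal_reduct({'a', 'b', 'c'}, entries))  # {'a'}
```

Here attribute `a` alone discerns the two decision classes, so `b` and `c` are superfluous and the decision rules reduce to conditions on `a`.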

  7. Semi-Supervised Classification based on Gaussian Mixture Model for remote imagery

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Semi-Supervised Classification (SSC), which makes use of both labeled and unlabeled data to determine classification borders in feature space, has great advantages in extracting classification information from mass data. In this paper, a novel SSC method based on the Gaussian Mixture Model (GMM) is proposed, in which each class’s feature space is described by one GMM. Experiments show the proposed method can achieve high classification accuracy with a small amount of labeled data. However, for the same accuracy, supervised classification methods such as Support Vector Machine, Object Oriented Classification, etc. should be provided with much more labeled data.
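The idea of describing each class by a Gaussian model while exploiting unlabeled data can be sketched with a deliberately simplified version: one-dimensional features, a single Gaussian per class, and a hard-EM style self-training loop. The pixel values and class names are assumptions, and the paper's multi-component GMM with full EM would be richer:

```python
import math

def gaussian_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit(values):
    mu = sum(values) / len(values)
    var = sum((v - mu) ** 2 for v in values) / len(values) or 1e-6
    return mu, var

def semi_supervised_fit(labeled, unlabeled, rounds=5):
    """One Gaussian per class; unlabeled points are assigned to their most
    likely class and the class models are refit (hard-EM style)."""
    models = {c: fit(vals) for c, vals in labeled.items()}
    for _ in range(rounds):
        assigned = {c: list(vals) for c, vals in labeled.items()}
        for x in unlabeled:
            best = max(models, key=lambda c: gaussian_pdf(x, *models[c]))
            assigned[best].append(x)
        models = {c: fit(vals) for c, vals in assigned.items()}
    return models

labeled = {'water': [0.1, 0.2], 'forest': [0.8, 0.9]}   # few labeled pixels
unlabeled = [0.15, 0.25, 0.3, 0.7, 0.75, 0.85]          # many unlabeled pixels
models = semi_supervised_fit(labeled, unlabeled)
classify = lambda x: max(models, key=lambda c: gaussian_pdf(x, *models[c]))
print(classify(0.35), classify(0.65))  # water forest
```

The unlabeled points widen each class model beyond what the two labeled samples alone would support, which is the advantage the abstract claims for SSC.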

  8. Applications of Neural-Based Agents in Computer Game Design

    OpenAIRE

    Qualls, Joseph; Russomanno, David J.

    2009-01-01

    It is clear from the implementation and analysis of the performance of the game Defend and Gather and the many other examples discussed in this chapter that neural-based agents have the ability to overcome some of the shortcomings associated with implementing classical AI techniques in computer game design. Neural networks can be used in many diverse ways in computer games, ranging from agent control and environmental evolution to content generation. As outlined in Section 3 of this chapter, by ...

  9. Design of distance teaching platform based on Agent technology

    Institute of Scientific and Technical Information of China (English)

    LI Xiaoming; SUN Hongmin; WU Wansheng

    2007-01-01

    Computer network technology and multi-media technology offer a new teaching mode for distance education. However, there are still many problems in modern distance education, such as weak generality, flexibility and intelligence. This paper proposes a design model for a distance teaching platform based on the Agent mechanism and presents its concrete implementation method through an analysis of the characteristics and structure of Agent technology.

  10. Agent-based Model Construction in Financial Economic System

    OpenAIRE

    Hokky Situngkir; Yohanes Surya

    2004-01-01

    The paper gives a picture of the enrichment that agent-based models bring to economic and financial system analysis, as a form of advanced study beyond statistical data analysis and micro-simulation analysis of financial economic data. Theoretical exploration is carried out by comparing some financial economy system models frequently and popularly used in econophysics and computational finance. The primitive model, which consists of agent microsimulation with a fundamentalist strategy, chartis...

  11. Agent-based decision making through intelligent knowledge discovery

    OpenAIRE

    Fernández Caballero, Antonio; Sokolova, Marina

    2008-01-01

    Monitoring the negative effects of urban pollution, combined with real-time decision making, makes it possible to clarify the consequences for human health. Large amounts of raw data describe this situation, and to extract knowledge from them we apply intelligent agents. Further modeling and simulation give new knowledge about the tendencies of the situation's development and about its structure. An agent-based decision support system can help to foresee possible ways the situation may develop and contribute to effect...

  12. A cooperative agent-based security framework

    OpenAIRE

    Cunha, Carlos R.; Gomes, João Pedro; Morais, Elisabete Paulo

    2013-01-01

    The current economic paradigm is based on a strongly cooperative model that tries to support a more competitive and global organizational response. With cooperation comes an intrinsic need: the interconnection and interoperability of information systems among business partners. This represents, in many areas, a huge organizational challenge, with information and communication security emerging as a key issue and a natural enabler for cooperative behavior and for the proper establishme...

  13. Feature selection gait-based gender classification under different circumstances

    Science.gov (United States)

    Sabir, Azhin; Al-Jawad, Naseer; Jassim, Sabah

    2014-05-01

    This paper proposes a gender classification based on human gait features and investigates the problem of two variations: clothing (wearing coats) and a carrying-bag condition, in addition to the normal gait sequence. The feature vectors in the proposed system are constructed after applying the wavelet transform. Three different sets of features are proposed in this method. The first, spatio-temporal distance, deals with the distance between different parts of the human body (such as the feet, knees, hands, height and shoulders) during one gait cycle. The second and third feature sets are constructed from the approximation and non-approximation coefficients of the human body, respectively. To extract these two sets of features we divided the human body into two parts, the upper and lower body, based on the golden ratio proportion. In this paper, we have adopted a statistical method for constructing the feature vector from the above sets. The dimension of the constructed feature vector is reduced based on the Fisher score as a feature selection method to optimize its discriminating significance. Finally, k-Nearest Neighbor is applied as the classification method. Experimental results demonstrate that our approach provides a more realistic scenario and relatively better performance compared with existing approaches.
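The Fisher-score feature selection and k-Nearest Neighbor steps can be sketched as follows. The toy gait features and labels are assumptions, not the paper's data:

```python
def fisher_score(values, labels):
    """Between-class scatter over within-class scatter for a single feature."""
    overall = sum(values) / len(values)
    num = den = 0.0
    for c in set(labels):
        vc = [v for v, lab in zip(values, labels) if lab == c]
        mu = sum(vc) / len(vc)
        num += len(vc) * (mu - overall) ** 2
        den += sum((v - mu) ** 2 for v in vc)
    return num / den if den else float('inf')

def knn_predict(X, y, x, k=3):
    """Plain k-Nearest Neighbor vote using squared Euclidean distance."""
    dists = sorted((sum((a - b) ** 2 for a, b in zip(row, x)), lab)
                   for row, lab in zip(X, y))
    top = [lab for _, lab in dists[:k]]
    return max(set(top), key=top.count)

# Toy gait vectors [stride_ratio, noisy_feature]; labels 0/1 are the two genders
X = [[0.60, 0.31], [0.62, 0.90], [0.61, 0.55],
     [0.80, 0.40], [0.82, 0.88], [0.81, 0.12]]
y = [0, 0, 0, 1, 1, 1]
scores = [fisher_score([row[j] for row in X], y) for j in range(2)]
best = max(range(2), key=lambda j: scores[j])      # keep the discriminative feature
print(best, knn_predict([[row[best]] for row in X], y, [0.79]))  # 0 1
```

The Fisher score correctly ranks the class-separating first feature above the noisy second one, so the kNN classifier runs on the reduced vector.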

  14. Forest Classification Based on Forest texture in Northwest Yunnan Province

    International Nuclear Information System (INIS)

    Forest texture is an intrinsic characteristic and an important visual feature of a forest ecological system. Full utilization of forest texture will be a great help in increasing the accuracy of forest classification based on remotely sensed data. Taking Shangri-La as a study area, forest classification has been based on the texture. The results show that: (1) in terms of texture abundance, texture boundary, entropy, and visual interpretation, the combination of the Grayscale-gradient co-occurrence matrix and wavelet transformation is much better than either way of forest texture information extraction alone; (2) during forest texture information extraction, the size of the suitable texture window determined by the semi-variogram method depends on the forest type (evergreen broadleaf forest is 3×3, deciduous broadleaf forest is 5×5, etc.); (3) when classifying forest based on forest texture information, the texture factor assembly differs among forests: Variance, Heterogeneity and Correlation should be selected when the window is between 3×3 and 5×5; Mean, Correlation and Entropy should be used when the window is in the range of 7×7 to 19×19; and Correlation, Second Moment and Variance should be used when the range is larger than 21×21

  15. Classification Based on Hierarchical Linear Models: The Need for Incorporation of Social Contexts in Classification Analysis

    Science.gov (United States)

    Vaughn, Brandon K.; Wang, Qui

    2009-01-01

    Many areas in educational and psychological research involve the use of classification statistical analysis. For example, school districts might be interested in attaining variables that provide optimal prediction of school dropouts. In psychology, a researcher might be interested in the classification of a subject into a particular psychological…

  16. Agent-based modeling and simulation Part 3 : desktop ABMS.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2007-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS 'is a third way of doing science,' in addition to traditional deductive and inductive reasoning (Axelrod 1997b). Computational advances have made possible a growing number of agent-based models across a variety of application domains. Applications range from modeling agent behavior in the stock market, supply chains, and consumer markets, to predicting the spread of epidemics, the threat of bio-warfare, and the factors responsible for the fall of ancient civilizations. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing agent models, and illustrates the development of a simple agent-based model of shopper behavior using spreadsheets.
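The tutorial's closing example, a simple agent-based model of shopper behavior, can be sketched in a few lines of code rather than spreadsheets. The store names, utility function, and loyalty update below are all invented for illustration; the point is only the ABMS pattern of autonomous agents making repeated local decisions:

```python
import random

class Shopper:
    """One autonomous agent with its own price sensitivity and store loyalty."""
    def __init__(self, price_sensitivity):
        self.price_sensitivity = price_sensitivity
        self.loyalty = {"A": 0.5, "B": 0.5}  # assumes exactly two stores, A and B

    def choose(self, prices):
        # utility trades off accumulated loyalty against price
        def utility(store):
            return self.loyalty[store] - self.price_sensitivity * prices[store]
        store = max(prices, key=utility)
        self.loyalty[store] += 0.1  # reinforce loyalty to the chosen store
        return store

def simulate(n_shoppers, prices, steps, seed=0):
    """Run the agent population for a number of steps; return visit counts."""
    rng = random.Random(seed)
    shoppers = [Shopper(rng.uniform(0.5, 1.5)) for _ in range(n_shoppers)]
    counts = {store: 0 for store in prices}
    for _ in range(steps):
        for shopper in shoppers:
            counts[shopper.choose(prices)] += 1
    return counts
```

Aggregate behavior (here, market share) emerges from the individual agent rules, which is the core claim of the ABMS approach.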

  17. Joint Probability-Based Neuronal Spike Train Classification

    Directory of Open Access Journals (Sweden)

    Yan Chen

    2009-01-01

    Full Text Available Neuronal spike trains are used by the nervous system to encode and transmit information. Euclidean distance-based methods (EDBMs) have been applied to quantify the similarity between temporally-discretized spike trains and model responses. In this study, using the same discretization procedure, we developed and applied a joint probability-based method (JPBM) to classify individual spike trains of slowly adapting pulmonary stretch receptors (SARs). The activity of individual SARs was recorded in anaesthetized, paralysed adult male rabbits, which were artificially ventilated at a constant rate and one of three different volumes. Two-thirds of the responses to the 600 stimuli presented at each volume were used to construct three response models (one for each stimulus volume) consisting of a series of time bins, each with spike probabilities. The remaining one-third of the responses were used as test responses to be classified into one of the three model responses. This was done by computing the joint probability of observing the same series of events (spikes or no spikes, dictated by the test response) in a given model and determining which probability of the three was highest. The JPBM generally produced better classification accuracy than the EDBM, and both performed well above chance. Both methods were similarly affected by variations in discretization parameters, response epoch duration, and two different response alignment strategies. Increasing bin widths increased classification accuracy, which also improved with increased observation time, but primarily during periods of increasing lung inflation. Thus, the JPBM is a simple and effective method for performing spike train classification.
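The core of the JPBM can be sketched directly from this description: per-bin spike probabilities form each model, and a test response is assigned to the model under which its exact spike/no-spike sequence is most probable. The binary toy data and the clipping constant (used to avoid zero probabilities) are assumptions of this sketch, not details from the paper:

```python
import numpy as np

def train_models(responses_by_class):
    """Per-class spike probability in each time bin (responses are 0/1 arrays)."""
    return {c: np.mean(r, axis=0) for c, r in responses_by_class.items()}

def log_joint_probability(test, model, eps=1e-6):
    """Log probability of observing exactly this spike/no-spike sequence."""
    p = np.clip(model, eps, 1 - eps)
    return float(np.sum(test * np.log(p) + (1 - test) * np.log(1 - p)))

def classify_response(test, models):
    """Assign the test response to the model giving the highest joint probability."""
    return max(models, key=lambda c: log_joint_probability(test, models[c]))
```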

  18. Agent-Based Simulations for Project Management

    Science.gov (United States)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.

  19. An Immunity-Based Anomaly Detection System with Sensor Agents

    Directory of Open Access Journals (Sweden)

    Yoshiteru Ishida

    2009-11-01

    Full Text Available This paper proposes an immunity-based anomaly detection system with sensor agents based on the specificity and diversity of the immune system. Each agent is specialized to react to the behavior of a specific user. Multiple diverse agents decide whether the behavior is normal or abnormal. Conventional systems have used only a single sensor to detect anomalies, while the immunity-based system makes use of multiple sensors, which leads to improvements in detection accuracy. In addition, we propose an evaluation framework for the anomaly detection system, which is capable of evaluating the differences in detection accuracy between internal and external anomalies. This paper focuses on anomaly detection in user’s command sequences on UNIX-like systems. In experiments, the immunity-based system outperformed some of the best conventional systems.
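A toy sketch of the multi-sensor idea: several agents, each specialized to a user's behavior profile, independently judge a command sequence, and a majority vote decides. The command-bigram profile representation and the threshold are invented stand-ins for the paper's immune-inspired sensors:

```python
from collections import Counter

class SensorAgent:
    """Agent specialised to one user's command profile (hypothetical design)."""
    def __init__(self, profile_sequences, threshold=0.5):
        self.known = Counter()
        for seq in profile_sequences:
            self.known.update(zip(seq, seq[1:]))  # command bigrams seen in training
        self.threshold = threshold

    def is_normal(self, seq):
        """A sequence is normal if enough of its bigrams match the profile."""
        bigrams = list(zip(seq, seq[1:]))
        if not bigrams:
            return True
        hits = sum(1 for b in bigrams if b in self.known)
        return hits / len(bigrams) >= self.threshold

def detect(agents, seq):
    """Majority vote across diverse sensor agents."""
    votes = [agent.is_normal(seq) for agent in agents]
    return "normal" if sum(votes) > len(votes) / 2 else "anomalous"
```

Diversity would come from training each agent on a different user or feature view; combining their votes is what the paper credits for the accuracy improvement over single-sensor systems.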

  20. A distributed agent-based architecture for dynamic services

    International Nuclear Information System (INIS)

    A prototype system for agent-based distributed dynamic services that will be applied to the development of Data Grids for high-energy physics is presented. The agent-based systems we are designing and developing gather, disseminate and coordinate configuration, time-dependent state and other information in the Grid system as a whole. These systems are being developed as an enabling technology for workflow-management and other forms of end-to-end Grid system monitoring and management. This prototype is being developed in Java and is based on the JINI support for distributed applications

  1. Analyzing the ENRON Communication Network Using Agent-Based Simulation

    Directory of Open Access Journals (Sweden)

    Shinako Matsuyama

    2008-07-01

    Full Text Available Agent-based modeling, simulation, and network analysis approaches are among the emergent techniques in the soft-computing literature. This paper presents an agent-based model for analyzing the characteristics of peer-to-peer human communication networks. We focus on the process of the collapse of Enron Corporation, an interesting topic in the business-management domain, for which the Enron email dataset is available for analysis. Our approach consists of four steps. First, macro-level characteristics of the Enron email dataset are analyzed from the viewpoint of social network theory: (i) the degrees of the communication networks and content information, and (ii) the changes in network structure around the major events. Second, for the micro-level analysis, an agent-based simulator is implemented using the Enron email dataset. Third, both micro- and macro-level characteristics are calculated on the simulator to ground the model to the dataset. Finally, an artificial society different from the Enron email dataset is developed on the simulator, and we compare its communication patterns with the results of the agent-based simulation using the Enron email dataset. The investigation suggests that the agent-based model is beneficial for uncovering the characteristics of the firm's implicit communication mechanisms.

  2. Nanochemistry of Protein-Based Delivery Agents.

    Science.gov (United States)

    Rajendran, Subin R C K; Udenigwe, Chibuike C; Yada, Rickey Y

    2016-01-01

    The past decade has seen an increased interest in the conversion of food proteins into functional biomaterials, including their use for loading and delivery of physiologically active compounds such as nutraceuticals and pharmaceuticals. Proteins possess a competitive advantage over other platforms for the development of nanodelivery systems since they are biocompatible, amphipathic, and widely available. Proteins also have unique molecular structures and diverse functional groups that can be selectively modified to alter encapsulation and release properties. A number of physical and chemical methods have been used for preparing protein nanoformulations, each based on different underlying protein chemistry. This review focuses on the chemistry of the reorganization and/or modification of proteins into functional nanostructures for delivery, from the perspective of their preparation, functionality, stability and physiological behavior. PMID:27489854

  3. A New Classification Analysis of Customer Requirement Information Based on Quantitative Standardization for Product Configuration

    OpenAIRE

    Zheng Xiao; Zude Zhou; Buyun Sheng

    2016-01-01

    Traditional methods used for the classification of customer requirement information are typically based on specific indicators, hierarchical structures, and data formats and involve a qualitative analysis in terms of stationary patterns. Because these methods neither consider the scalability of classification results nor do they regard subsequent application to product configuration, their classification becomes an isolated operation. However, the transformation of customer requirement inform...

  4. Style-based classification of Chinese ink and wash paintings

    Science.gov (United States)

    Sheng, Jiachuan; Jiang, Jianmin

    2013-09-01

    As large collections of ink and wash paintings (IWPs) are digitized and made available on the Internet, their automated content description, analysis, and management are attracting attention across research communities. While existing research in relevant areas is primarily focused on image-processing approaches, a style-based algorithm is proposed to classify IWPs automatically by their authors. As IWPs have neither colors nor even tones, the proposed algorithm applies edge detection to locate local regions and detect painting strokes, enabling histogram-based feature extraction that captures important cues reflecting the styles of different artists. These features then drive a number of neural networks in parallel to complete the classification, and an information-entropy-balanced fusion is proposed to make an integrated decision from the multiple neural-network results, in which the entropy serves as a pointer for combining the global and local features. Evaluations via experiments support that the proposed algorithm achieves good performance, providing excellent potential for computerized analysis and management of IWPs.
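The entropy-balanced fusion step can be illustrated as follows: each network's class-probability output is weighted by its inverse entropy, so confident (low-entropy) networks dominate the integrated decision. This particular weighting formula is an assumption of the sketch, not the paper's exact scheme:

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy of a probability vector (natural log)."""
    p = np.clip(p, eps, 1.0)
    return float(-np.sum(p * np.log(p)))

def entropy_balanced_fusion(prob_list):
    """Fuse several classifiers' probability outputs, weighting by confidence."""
    weights = np.array([1.0 / (entropy(p) + 1e-6) for p in prob_list])
    weights /= weights.sum()
    fused = sum(w * p for w, p in zip(weights, prob_list))
    return int(np.argmax(fused))
```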

  5. ECG-based heartbeat classification for arrhythmia detection: A survey.

    Science.gov (United States)

    Luz, Eduardo José da S; Schwartz, William Robson; Cámara-Chávez, Guillermo; Menotti, David

    2016-04-01

    An electrocardiogram (ECG) measures the electric activity of the heart and has been widely used for detecting heart diseases due to its simplicity and non-invasive nature. By analyzing the electrical signal of each heartbeat, i.e., the combination of action impulse waveforms produced by different specialized cardiac tissues found in the heart, it is possible to detect some of its abnormalities. In recent decades, several works have been developed to produce automatic ECG-based heartbeat classification methods. In this work, we survey the current state-of-the-art methods of ECG-based automated heartbeat classification for abnormality detection by presenting the ECG signal preprocessing, the heartbeat segmentation techniques, the feature description methods, and the learning algorithms used. In addition, we describe some of the databases used for evaluation of methods indicated by a well-known standard developed by the Association for the Advancement of Medical Instrumentation (AAMI) and described in ANSI/AAMI EC57:1998/(R)2008 (ANSI/AAMI, 2008). Finally, we discuss limitations and drawbacks of the methods in the literature, presenting concluding remarks and future challenges, and we also propose an evaluation process workflow to guide authors in future works. PMID:26775139

  6. Proposed classification of medial maxillary labial frenum based on morphology

    Directory of Open Access Journals (Sweden)

    Ranjana Mohan

    2014-01-01

    Full Text Available Objectives: To propose a new classification of the median maxillary labial frenum (MMLF) based on its morphology in the permanent dentition, through a cross-sectional survey. Materials and Methods: A unicentric study was conducted on 2,400 adults (1,414 males, 986 females) aged 18 to 76 years (mean age 38.62 years, SD 12.53; males: mean 38.53, SD 12.50; females: mean 38.71, SD 12.58) over a period of 6 months at Teerthanker Mahaveer University, Moradabad, Northern India. The frenum morphology was determined by the direct visual method under natural light and categorized. Results: Diverse frenum morphologies were observed. Several variations found in the study have not been documented in the past literature and were named and classified according to their morphology. Discussion: The MMLF presents a diverse array of morphological variations. Several previously undocumented types of frena were observed, and a revised, detailed classification is proposed based on the cross-sectional survey.

  7. A Cluster Based Approach for Classification of Web Results

    Directory of Open Access Journals (Sweden)

    Apeksha Khabia

    2014-12-01

    Full Text Available Nowadays a significant amount of information on the web is present in the form of text, e.g., reviews, forum postings, blogs, news articles, email messages, and web pages. It becomes difficult to classify documents into predefined categories as the number of documents grows. Clustering is the partitioning of data into clusters, so that the data in each cluster share some common trait, often proximity according to some defined measure. The underlying distribution of a data set can be depicted, to some extent, by the clusters learned from the initial data set. Thus, clusters of documents can be employed to train a classifier using defined features of those clusters. An important issue is also to classify text data from the web into different clusters by mining the knowledge it contains. Accordingly, this paper presents a review of most document clustering techniques and cluster-based classification techniques used so far. Pre-processing of text datasets and a document clustering method are also explained in brief.
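The cluster-then-classify idea, where clusters learned from the data are labeled and then used to classify new documents, can be sketched with plain k-means and nearest-centroid assignment. Both are generic stand-ins for whatever clustering method and classifier a given system uses, and the 2-D vectors below replace real bag-of-words features:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means; stands in for the document clustering step."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

def classify_document(x, centroids, cluster_topics):
    """Assign a new document vector to the topic of its nearest cluster centroid."""
    j = int(((centroids - x) ** 2).sum(axis=1).argmin())
    return cluster_topics[j]
```

In practice the cluster-to-topic mapping would come from labeled seed documents inside each cluster.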

  8. Understanding Acupuncture Based on ZHENG Classification from System Perspective

    Directory of Open Access Journals (Sweden)

    Junwei Fang

    2013-01-01

    Full Text Available Acupuncture is an efficient therapy method that originated in ancient China; its study based on ZHENG classification is a systematic approach to understanding its complexity. The system perspective contributes to understanding the essence of phenomena, and, with the coming of the systems biology era, broader technology platforms such as omics technologies have been established for the objective study of Traditional Chinese Medicine (TCM). Omics technologies can dynamically determine molecular components at various levels, achieving a systematic understanding of acupuncture by uncovering the relationships among the various responding parts. After reviewing the literature on acupuncture studied by omics approaches, the following points emerge. First, with the help of omics approaches, acupuncture was found to treat diseases by regulating the neuroendocrine-immune (NEI) network, whose changes can reflect the global effect of acupuncture. Second, the global effect of acupuncture can reflect ZHENG information at certain structural and functional levels, which might reveal the mechanism of meridian and acupoint specificity. Furthermore, based on comprehensive ZHENG classification, omics research can help us understand the action characteristics of acupoints and the molecular mechanisms of their synergistic effect.

  9. Robust Pedestrian Classification Based on Hierarchical Kernel Sparse Representation.

    Science.gov (United States)

    Sun, Rui; Zhang, Guanghai; Yan, Xiaoxing; Gao, Jun

    2016-01-01

    Vision-based pedestrian detection has become an active topic in computer vision and autonomous vehicles. It aims at detecting pedestrians appearing ahead of the vehicle using a camera so that autonomous vehicles can assess the danger and take action. Due to varied illumination and appearance, complex backgrounds, and occlusion, pedestrian detection in outdoor environments is a difficult problem. In this paper, we propose a novel hierarchical feature extraction and weighted kernel sparse representation model for pedestrian classification. Initially, hierarchical feature extraction based on a CENTRIST descriptor is used to capture discriminative structures, and a max-pooling operation is used to enhance invariance to varying appearance. Then, a kernel sparse representation model is proposed to fully exploit the discriminative information embedded in the hierarchical local features, with a Gaussian weight function as the measure to effectively handle occlusion in pedestrian images. Extensive experiments are conducted on benchmark databases, including INRIA, Daimler, an artificially generated dataset and a real occluded dataset, demonstrating the more robust performance of the proposed method compared to state-of-the-art pedestrian classification methods. PMID:27537888

  10. Pixel classification based color image segmentation using quaternion exponent moments.

    Science.gov (United States)

    Wang, Xiang-Yang; Wu, Zhi-Fang; Chen, Liang; Zheng, Hong-Liang; Yang, Hong-Ying

    2016-02-01

    Image segmentation remains an important, but hard-to-solve, problem since it appears to be application dependent, with usually no a priori information available regarding the image structure. In recent years, many image segmentation algorithms have been developed, but they are often very complex and some undesired results occur frequently. In this paper, we propose a pixel-classification-based color image segmentation method using quaternion exponent moments. Firstly, the pixel-level image feature is extracted based on quaternion exponent moments (QEMs), which can effectively capture the image pixel content by considering the correlation between different color channels. Then, the pixel-level image feature is used as the input of a twin support vector machine (TSVM) classifier, and the TSVM model is trained by selecting the training samples with Arimoto entropy thresholding. Finally, the color image is segmented with the trained TSVM model. The proposed scheme has the following advantages: (1) the effective QEMs are introduced to describe color-image pixel content, considering the correlation between different color channels; (2) the excellent TSVM classifier is utilized, which has lower computation time and higher classification accuracy. Experimental results show that our proposed method has very promising segmentation performance compared with the state-of-the-art segmentation approaches recently proposed in the literature. PMID:26618250
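A heavily simplified stand-in for this pipeline illustrates the pixel-classification view of segmentation: confident seed pixels are picked by a threshold (in place of Arimoto entropy thresholding), and every pixel is then assigned by a nearest-centroid rule in color space (in place of QEM features and a trained TSVM). All of the substitutions are assumptions of the sketch:

```python
import numpy as np

def segment(image, seed_threshold):
    """Two-class pixel classification: threshold picks confident seed pixels,
    then every pixel goes to the nearest class centroid in color space."""
    flat = image.reshape(-1, image.shape[-1]).astype(float)
    intensity = flat.mean(axis=1)
    fg_seed = flat[intensity > seed_threshold]   # bright seed pixels
    bg_seed = flat[intensity <= seed_threshold]  # dark seed pixels
    centroids = np.stack([bg_seed.mean(axis=0), fg_seed.mean(axis=0)])
    d = ((flat[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1).reshape(image.shape[:2])
```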

  11. Target Image Classification through Encryption Algorithm Based on the Biological Features

    OpenAIRE

    Zhiwu Chen; Qing E. Wu; Weidong Yang

    2014-01-01

    In order to effectively make biological image classification and identification, this paper studies the biological owned characteristics, gives an encryption algorithm, and presents a biological classification algorithm based on the encryption process. Through studying the composition characteristics of palm, this paper uses the biological classification algorithm to carry out the classification or recognition of palm, improves the accuracy and efficiency of the existing biological classifica...

  12. Rainfall Prediction using Data-Core Based Fuzzy Min-Max Neural Network for Classification

    OpenAIRE

    Rajendra Palange,; Nishikant Pachpute

    2015-01-01

    This paper proposes a rainfall prediction system using a classification technique. An advanced and modified neural network, the Data-Core-Based Fuzzy Min-Max Neural Network (DCFMNN), is used for pattern classification and applied to predict rainfall. The fuzzy min-max neural network (FMNN), which creates hyperboxes for classification and prediction, has a problem of overlapping neurons that is resolved in DCFMNN to give greater accu...
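The hyperboxes at the core of fuzzy min-max networks can be illustrated with a simplified membership function: full membership inside the box defined by min point v and max point w, ramping down outside at a rate set by a sensitivity parameter gamma. This is a simplification in the spirit of Simpson's original FMNN membership, not the DCFMNN formula:

```python
def hyperbox_membership(x, v, w, gamma=4.0):
    """Membership of point x in the hyperbox [v, w]: 1 inside,
    decreasing with distance outside each face at rate gamma."""
    total = 0.0
    for xi, vi, wi in zip(x, v, w):
        below = max(0.0, min(1.0, gamma * (vi - xi)))  # penalty for xi < vi
        above = max(0.0, min(1.0, gamma * (xi - wi)))  # penalty for xi > wi
        total += 1.0 - below - above
    return total / len(x)
```

Classification then assigns a pattern to the class of the hyperbox with the highest membership; the overlap problem the abstract mentions arises when hyperboxes of different classes intersect.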

  13. An Assessment of Case Base Reasoning for Short Text Message Classification

    OpenAIRE

    Healy, Matt, (Thesis); Delany, Sarah Jane; Zamolotskikh, Anton

    2004-01-01

    Message classification is a text classification task that has provoked much interest in machine learning. One aspect of message classification that presents a particular challenge is the classification of short text messages. This paper presents an assessment of applying a case based approach that was developed for long text messages (specifically spam filtering) to short text messages. The evaluation involves determining the most appropriate feature types and feature representation for short...

  14. Agent-based modelling of socio-technical systems

    CERN Document Server

    van Dam, Koen H; Lukszo, Zofia

    2012-01-01

    Here is a practical introduction to agent-based modelling of socio-technical systems, based on methodology developed at TU Delft, which has been deployed in a number of case studies. Offers theory, methods and practical steps for creating real-world models.

  15. Agent-based analysis of organizations : formalization and simulation

    OpenAIRE

    Dignum, M.V.; Tick, C.

    2008-01-01

    Organizational effectiveness depends on many factors, including individual excellence, efficient structures, effective planning and capability to understand and match context requirements. We propose a way to model organizational performance based on a combination of formal models and agent-based simulation that supports the analysis of the congruence of different organizational structures to changing environments

  16. An Agent Communication Framework Based on XML and SOAP Technique

    Institute of Scientific and Technical Information of China (English)

    李晓瑜

    2009-01-01

    This thesis introduces XML and SOAP technology, presents an agent communication framework based on XML and SOAP, and analyzes its principle, architecture, function, and benefits. Finally, the framework builds on KQML communication primitive languages.

  17. A role based coordination model in agent systems

    Institute of Scientific and Technical Information of China (English)

    ZHANG Ya-ying; YOU Jin-yuan

    2005-01-01

    Coordination technology addresses the construction of open, flexible systems from active and independent software agents in concurrent and distributed systems. In most open distributed applications, multiple agents need interaction and communication to achieve their overall goal. Coordination technologies for the Internet are typically concerned with enabling interaction among agents and helping them cooperate with each other. At the same time, access control should also be considered to constrain interaction and make it harmless. Access control should be regarded as the security counterpart of coordination. At present, the combination of coordination and access control remains an open problem. Thus, we propose a role-based coordination model with policy enforcement in agent application systems. In this model, coordination is combined with access control so as to fully characterize the interactions in agent systems. A set of agents interacting with each other for a common global system task constitutes a coordination group. Role-based access control is applied in this model to prevent unauthorized accesses. Coordination policy is enforced in a distributed manner so that the model can be applied to open distributed systems such as the Internet. An Internet online auction system is presented as a case study to illustrate the proposed coordination model, and finally a performance analysis of the model is introduced.

  18. Study on the agile supply chain management based on agent

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The most important task of agile supply chain management (ASCM) is to reconfigure the supply chain based on customers' requirements. Without sophisticated cooperation and dynamic formation in an agile supply chain, mass customization, rapid response, and high-quality services cannot be achieved. Because of its great potential for supporting cooperation in supply chain management, agent technology can carry out cooperative work through inter-operation across networked humans, organizations, and machines at an abstract level in a computational system. A major challenge in building such a system is to coordinate the behavior of individual agents or groups of agents to achieve the individual and shared goals of the participants. In this paper, agent technology is used to support the modeling and coordination of supply chain management.

  19. Web Crawler Based on Mobile Agent and Java Aglets

    Directory of Open Access Journals (Sweden)

    Md. Abu Kausar

    2013-09-01

    Full Text Available With the huge growth of the Internet, many web pages are available online. Search engines use web crawlers to collect these web pages from the World Wide Web for the purpose of storage and indexing. Basically, a web crawler is a program which finds information from the World Wide Web in a systematic and automated manner. The network load can be further reduced by using mobile agents, and the proposed approach uses mobile agents to crawl the pages. A mobile agent is not bound to the system in which it starts execution; it has the unique ability to transfer itself from one system in a network to another. The main advantage of a web crawler based on mobile agents is that the analysis part of the crawling process is done locally at the data source rather than at the remote search engine side. This drastically reduces network load and traffic, which can improve the performance and efficiency of the whole crawling process.
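The local-analysis idea, i.e. that the agent parses a page where it resides and ships back only a compact summary instead of the full page, can be sketched with the standard-library HTML parser. The summary fields chosen here are illustrative, and the migration machinery itself is out of scope:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets; in a mobile-agent crawler this analysis would
    run at the remote site, so only the compact summary travels back."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def summarize_page(html):
    """What the agent would transmit home: link list and page size, not the page."""
    parser = LinkExtractor()
    parser.feed(html)
    return {"n_bytes": len(html), "links": parser.links}
```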

  20. The architectural foundations for agent-based shop floor control

    DEFF Research Database (Denmark)

    Langer, Gilad; Bilberg, Arne

    1998-01-01

    The emerging theory regarding Holonic Manufacturing Systems (HMS) presents an advantageous theoretical foundation for the control system of the manufacturing system of the future. Previous research at the Department has demonstrated how company-tailored shop floor control can be developed by applying… the HoMuCS architecture can be realised by using multi-agent technology, and that it is also the required foundation for implementing agent technology in manufacturing system control. The work is based on a theoretical study of new manufacturing system theories, research on agent and multi… simulation and cell control enabling technologies. In order to continue this research effort, new concepts and theories for shop floor control are investigated. This paper reviews the multi-agent concept with the aim of investigating its potential use in shop floor control systems. The paper also includes a survey of…

  1. AGENT based structural static and dynamic collaborative optimization

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A static and dynamic collaborative optimization mode for a complex machine system and its ontology project relationship are put forward, on which an agent-based structural static and dynamic collaborative optimization system is constructed as two agent colonies: an optimization agent colony and a finite element analysis colony. A two-level solving strategy, as well as the necessity and possibility of handling the finite element analysis model in multi-level mode, is discussed. Furthermore, the cooperation of all FEA agents for the optimal design of complicated structures is studied in detail. Structural static and dynamic collaborative optimization of hydraulic excavator working equipment is taken as an example to show that the system is reliable.

  2. Utilizing ECG-Based Heartbeat Classification for Hypertrophic Cardiomyopathy Identification.

    Science.gov (United States)

    Rahman, Quazi Abidur; Tereshchenko, Larisa G; Kongkatong, Matthew; Abraham, Theodore; Abraham, M Roselle; Shatkay, Hagit

    2015-07-01

    Hypertrophic cardiomyopathy (HCM) is a cardiovascular disease where the heart muscle is partially thickened and blood flow is (potentially fatally) obstructed. A test based on electrocardiograms (ECG) that record the heart electrical activity can help in early detection of HCM patients. This paper presents a cardiovascular-patient classifier we developed to identify HCM patients using standard 10-second, 12-lead ECG signals. Patients are classified as having HCM if the majority of their recorded heartbeats are recognized as characteristic of HCM. Thus, the classifier's underlying task is to recognize individual heartbeats segmented from 12-lead ECG signals as HCM beats, where heartbeats from non-HCM cardiovascular patients are used as controls. We extracted 504 morphological and temporal features—both commonly used and newly-developed ones—from ECG signals for heartbeat classification. To assess classification performance, we trained and tested a random forest classifier and a support vector machine classifier using 5-fold cross validation. We also compared the performance of these two classifiers to that obtained by a logistic regression classifier, and the first two methods performed better than logistic regression. The patient-classification precision of random forests and of support vector machine classifiers is close to 0.85. Recall (sensitivity) and specificity are approximately 0.90. We also conducted feature selection experiments by gradually removing the least informative features; the results show that a relatively small subset of 264 highly informative features can achieve performance measures comparable to those achieved by using the complete set of features. PMID:25915962
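The majority-of-beats patient rule described above can be sketched as follows, with a toy nearest-centroid classifier standing in for the paper's random forest over 504 morphological and temporal features; the two-feature beats are invented for illustration:

```python
import numpy as np

def make_centroid_beat_classifier(hcm_beats, control_beats):
    """Toy stand-in for the paper's per-beat classifier: nearest class
    centroid over the beat feature vector (1 = HCM beat, 0 = control beat)."""
    hcm_c = np.mean(hcm_beats, axis=0)
    ctl_c = np.mean(control_beats, axis=0)
    def classify_beat(beat):
        beat = np.asarray(beat, dtype=float)
        return int(np.linalg.norm(beat - hcm_c) < np.linalg.norm(beat - ctl_c))
    return classify_beat

def classify_patient(beats, beat_classifier):
    """Aggregate per-beat decisions: flag HCM if most beats look like HCM."""
    labels = [beat_classifier(b) for b in beats]
    return "HCM" if sum(labels) > len(labels) / 2 else "control"
```

The two-level structure, beat classifier plus majority aggregation, is the part carried over from the paper; everything inside the beat classifier here is a placeholder.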

  3. Agent-based simulation of electricity markets : a literature review

    International Nuclear Information System (INIS)

    The electricity sector in Europe and North America is undergoing considerable changes as a result of deregulation, issues related to climate change, and the integration of renewable resources within the electricity grid. This article reviewed agent-based simulation methods of analyzing electricity markets. The paper provided an analysis of research currently being conducted on electricity market designs and examined methods of modelling agent decisions. Methods of coupling long term and short term decisions were also reviewed. Issues related to single and multiple market analysis methods were discussed, as well as different approaches to integrating agent-based models with models of other commodities. The integration of transmission constraints within agent-based models was also discussed, and methods of measuring market efficiency were evaluated. Other topics examined in the paper included approaches to integrating investment decisions, carbon dioxide (CO2) trading, and renewable support schemes. It was concluded that agent-based models serve as a test bed for the electricity sector, and will help to provide insights for future policy decisions. 74 refs., 6 figs

  4. Macromolecular and dendrimer-based magnetic resonance contrast agents

    International Nuclear Information System (INIS)

    Magnetic resonance imaging (MRI) is a powerful imaging modality that can provide an assessment of function or molecular expression in tandem with anatomic detail. Over the last 20-25 years, a number of gadolinium-based MR contrast agents have been developed to enhance signal by altering proton relaxation properties. This review explores a range of these agents from small molecule chelates, such as Gd-DTPA and Gd-DOTA, to macromolecular structures composed of albumin, polylysine, polysaccharides (dextran, inulin, starch), poly(ethylene glycol), copolymers of cystamine and cystine with Gd-DTPA, and various dendritic structures based on polyamidoamine and polylysine (Gadomers). The synthesis, structure, biodistribution, and targeting of dendrimer-based MR contrast agents are also discussed.

  5. Macromolecular and dendrimer-based magnetic resonance contrast agents

    Energy Technology Data Exchange (ETDEWEB)

    Bumb, Ambika; Brechbiel, Martin W. (Radiation Oncology Branch, National Cancer Inst., National Inst. of Health, Bethesda, MD (United States)), e-mail: pchoyke@mail.nih.gov; Choyke, Peter (Molecular Imaging Program, National Cancer Inst., National Inst. of Health, Bethesda, MD (United States))

    2010-09-15

    Magnetic resonance imaging (MRI) is a powerful imaging modality that can provide an assessment of function or molecular expression in tandem with anatomic detail. Over the last 20-25 years, a number of gadolinium-based MR contrast agents have been developed to enhance signal by altering proton relaxation properties. This review explores a range of these agents from small molecule chelates, such as Gd-DTPA and Gd-DOTA, to macromolecular structures composed of albumin, polylysine, polysaccharides (dextran, inulin, starch), poly(ethylene glycol), copolymers of cystamine and cystine with Gd-DTPA, and various dendritic structures based on polyamidoamine and polylysine (Gadomers). The synthesis, structure, biodistribution, and targeting of dendrimer-based MR contrast agents are also discussed.

  6. Hyperspectral image classification based on spatial and spectral features and sparse representation

    Institute of Scientific and Technical Information of China (English)

    Yang Jing-Hui; Wang Li-Guo; Qian Jin-Xi

    2014-01-01

    To address the low classification accuracy and poor use of spatial information in traditional hyperspectral image classification methods, we propose a new hyperspectral image classification method based on Gabor spatial texture features, nonparametric weighted spectral features, and the sparse representation classification method (Gabor–NWSF and SRC), abbreviated GNWSF–SRC. The proposed GNWSF–SRC method first combines the Gabor spatial features and nonparametric weighted spectral features to describe the hyperspectral image, and then applies the sparse representation method; the classification is finally obtained by analyzing the reconstruction error. We use the proposed method to process two typical hyperspectral data sets with different percentages of training samples. Theoretical analysis and simulation demonstrate that the proposed method improves the classification accuracy and Kappa coefficient compared with traditional classification methods and achieves better classification performance.
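The decision rule at the heart of sparse representation classification, assigning a sample to the class whose training dictionary reconstructs it with the smallest residual, can be sketched as follows. The dictionaries here are random toy data, not hyperspectral imagery, and the use of orthogonal matching pursuit for the sparse coding step is an assumption for illustration, not necessarily the solver used in the paper.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)

# Toy per-class dictionaries: columns act as training samples for two classes.
D = {0: rng.normal(size=(50, 20)), 1: rng.normal(size=(50, 20))}

def src_predict(x, dictionaries, n_nonzero=5):
    """Code the test sample over each class dictionary and return the class
    whose sparse reconstruction leaves the smallest residual."""
    residuals = {}
    for label, Dc in dictionaries.items():
        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero,
                                        fit_intercept=False)
        omp.fit(Dc, x)
        residuals[label] = np.linalg.norm(x - Dc @ omp.coef_)
    return min(residuals, key=residuals.get)

# A sample that is an exact 3-sparse combination of class-1 atoms.
x = D[1][:, :3] @ np.array([1.0, 0.5, -0.3])
print(src_predict(x, D))  # → 1
```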

  7. Smart Agent Learning based Hotel Search System- Android Environment

    Directory of Open Access Journals (Sweden)

    Wayne Lawrence

    2012-08-01

    Full Text Available The process of finding the best hotel in a central location is time-consuming, overwhelming, and subject to information overload, and in some cases it poses a security risk to the client. Over time, with competition among travel agents and hotels, the process of hotel search and booking has improved with advances in technology. Various web sites allow a user to select a destination from a pull-down list along with several categories to suit one's preference. Some of the more advanced web sites allow the destination to be searched via a map, for example hotelguidge.com and jamaica.hotels.hu. Recently, a good amount of work has been carried out on the use of intelligent agents for hotel search on J2ME-based mobile handsets, which still has some weaknesses. The proposed system uses smart software agents that overcome the weaknesses of the previous system by collaborating among themselves and searching Google Maps based on criteria selected by the user, returning precise results that best suit the user's requirements. In addition, the agents possess a learning capability for hotel search based on past search experience. Hotel booking involving cryptography has not been incorporated in this paper and has been published elsewhere. The system is facilitated on an Android 2.2-enabled mobile phone using the JADE-LEAP agent development kit.

  8. Classification of EMG Signal Based on Human Percentile using SOM

    Directory of Open Access Journals (Sweden)

    M.H. Jali

    2014-07-01

    Full Text Available Electromyography (EMG) is a bio-signal formed by physiological variations in the state of muscle fibre membranes. Pattern recognition is one of the fields in bio-signal processing that classifies signals into desired categories subject to their area of application. This study describes the classification of EMG signals by human body percentile using the Self-Organizing Map (SOM) technique. Different human percentiles imply different arm circumference sizes. Variation in arm circumference is due to fatty tissue that lies between the active muscle and the skin; generally, this fatty tissue decreases the overall amplitude of the EMG signal. Data were collected from fifteen randomly selected subjects of various percentiles using a non-invasive technique at the biceps brachii muscle. The signals then pass through a filtering process to prepare them for the next stage, and five well-known time-domain feature extraction methods are applied to the signal before classification. The SOM technique is used as a classifier to discriminate between human percentiles. Results show that SOM is capable of clustering the EMG signals into the desired human percentile categories by optimizing the neurons of the technique.
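The abstract does not name its five time-domain features, so the following NumPy sketch uses a common conventional set (MAV, RMS, WL, ZC, SSC) as an assumption; it computes one feature vector per filtered EMG window, ready to be passed to a SOM or any other classifier.

```python
import numpy as np

def time_domain_features(emg, threshold=0.01):
    """Five common time-domain EMG features: mean absolute value (MAV),
    root mean square (RMS), waveform length (WL), zero crossings (ZC),
    and slope sign changes (SSC)."""
    d = np.diff(emg)
    mav = np.mean(np.abs(emg))
    rms = np.sqrt(np.mean(emg ** 2))
    wl = np.sum(np.abs(d))
    zc = np.sum((emg[:-1] * emg[1:] < 0) & (np.abs(d) > threshold))
    ssc = np.sum((d[:-1] * d[1:] < 0) &
                 ((np.abs(d[:-1]) > threshold) | (np.abs(d[1:]) > threshold)))
    return np.array([mav, rms, wl, zc, ssc], dtype=float)

# 1 s of a 50 Hz test tone sampled at 1 kHz stands in for a filtered EMG burst.
t = np.linspace(0.0, 1.0, 1000)
feats = time_domain_features(np.sin(2 * np.pi * 50 * t))
print(feats)
```

The small amplitude threshold suppresses spurious zero crossings and slope changes caused by noise, a standard precaution for these two features.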

  9. Neighborhood Hypergraph Based Classification Algorithm for Incomplete Information System

    Directory of Open Access Journals (Sweden)

    Feng Hu

    2015-01-01

    Full Text Available The problem of classification in incomplete information systems is a hot issue in intelligent information processing. The hypergraph is a new intelligent method for machine learning. However, it is hard to process an incomplete information system with the traditional hypergraph, for two reasons: (1) the hyperedges are generated randomly in the traditional hypergraph model; (2) the existing methods are unsuited to incomplete information systems because of their missing values. In this paper, we propose a novel classification algorithm for incomplete information systems based on the hypergraph model and rough set theory. First, we initialize the hypergraph. Second, we classify the training set with a neighborhood hypergraph. Third, under the guidance of rough sets, we replace the poor hyperedges. After that, we obtain a good classifier. The proposed approach is tested on 15 data sets from the UCI machine learning repository and compared with existing methods such as C4.5, SVM, Naive Bayes, and KNN. The experimental results show that the proposed algorithm performs better in terms of Precision, Recall, AUC, and F-measure.

  10. Computational hepatocellular carcinoma tumor grading based on cell nuclei classification.

    Science.gov (United States)

    Atupelage, Chamidu; Nagahashi, Hiroshi; Kimura, Fumikazu; Yamaguchi, Masahiro; Tokiya, Abe; Hashiguchi, Akinori; Sakamoto, Michiie

    2014-10-01

    Hepatocellular carcinoma (HCC) is the most common histological type of primary liver cancer. HCC is graded according to the malignancy of the tissues, and it is important to diagnose low-grade HCC tumors because these tissues have a good prognosis. Image interpretation-based computer-aided diagnosis (CAD) systems have been developed to automate the HCC grading process. Generally, the HCC grade is determined by the characteristics of liver cell nuclei; therefore, it is preferable that CAD systems utilize only liver cell nuclei for HCC grading. This paper proposes an automated HCC diagnosis method. In particular, it defines a pipeline-path that excludes nonliver cell nuclei in two consecutive pipeline modules and utilizes the liver cell nuclear features for HCC grading. The significance of excluding the nonliver cell nuclei for HCC grading is experimentally evaluated. Four categories of liver cell nuclear features were utilized for classifying the HCC tumors. Results indicated that nuclear texture is the dominant feature for HCC grading and the others contribute to increasing the classification accuracy. The proposed method was employed to classify a set of regions of interest selected from HCC whole slide images into five classes and resulted in a 95.97% correct classification rate. PMID:26158066
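Nuclear texture of the kind the paper identifies as the dominant feature is commonly quantified with gray-level co-occurrence statistics. Below is a minimal NumPy sketch of a horizontal co-occurrence matrix reduced to the contrast statistic; the paper's actual feature set is not specified in the abstract, so this particular feature and the toy patch are illustrative assumptions only.

```python
import numpy as np

def glcm_contrast(img, levels=8):
    """Gray-level co-occurrence matrix for the horizontal (0 degree,
    distance 1) offset, reduced to the contrast statistic, a basic
    texture feature."""
    # Quantize intensities into `levels` bins.
    q = (img.astype(float) / (img.max() + 1) * levels).astype(int)
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()  # normalize to joint probabilities
    idx = np.arange(levels)
    return float(np.sum(glcm * (idx[:, None] - idx[None, :]) ** 2))

patch = np.tile(np.arange(8, dtype=np.uint8), (32, 4))  # smooth horizontal ramp
print(glcm_contrast(patch))  # ≈ 5.65
```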

  11. Texture-Based Automated Lithological Classification Using Aeromagenetic Anomaly Images

    Science.gov (United States)

    Shankar, Vivek

    2009-01-01

    This report consists of a thesis submitted to the faculty of the Department of Electrical and Computer Engineering, in partial fulfillment of the requirements for the degree of Master of Science, Graduate College, The University of Arizona, 2004. Aeromagnetic anomaly images are geophysical prospecting tools frequently used in the exploration of metalliferous minerals and hydrocarbons. The amplitude and texture content of these images provide a wealth of information to geophysicists who attempt to delineate the nature of the Earth's upper crust. These images prove to be extremely useful in remote areas and locations where the minerals of interest are concealed by basin fill. Typically, geophysicists compile a suite of aeromagnetic anomaly images, derived from amplitude and texture measurement operations, in order to obtain a qualitative interpretation of the lithological (rock) structure. Texture measures have proven to be especially capable of capturing the magnetic anomaly signature of unique lithological units. We performed a quantitative study to explore the possibility of using texture measures as input to a machine vision system in order to achieve automated classification of lithological units. This work demonstrated a significant improvement in classification accuracy over random guessing based on a priori probabilities. Additionally, a quantitative comparison of the performance of five classes of texture measures in their ability to discriminate lithological units was achieved.

  12. Classification of chronic obstructive pulmonary disease based on chest radiography

    Directory of Open Access Journals (Sweden)

    Leilane Marcos

    2013-12-01

    Full Text Available Objective: Quantitative analysis of chest radiographs of patients with and without chronic obstructive pulmonary disease (COPD), determining whether the data obtained from such radiographic images can classify individuals according to the presence or absence of disease. Materials and Methods: Three groups of chest radiographic images were utilized: group 1, 25 individuals with COPD; group 2, 27 individuals without COPD; and group 3 (used for reclassification/validation of the analysis), 15 individuals with COPD. The COPD classification was based on spirometry. The variables, normalized by retrosternal height, were the following: pulmonary width (LARGP); levels of right (ALBDIR) and left (ALBESQ) diaphragmatic eventration; costophrenic angle (ANGCF); and right (DISDIR) and left (DISESQ) intercostal distances. Results: When the radiographic images of patients with and without COPD were compared, statistically significant differences were observed between the two groups for the variables related to the diaphragm. In the COPD reclassification, the following variables presented the highest rates of correct classification: ANGCF (80%), ALBDIR (73.3%), ALBESQ (86.7%). Conclusion: The radiographic assessment of the chest demonstrated that the variables related to the diaphragm allow a better differentiation between individuals with and without COPD.

  13. Classification of knee arthropathy with accelerometer-based vibroarthrography.

    Science.gov (United States)

    Moreira, Dinis; Silva, Joana; Correia, Miguel V; Massada, Marta

    2016-01-01

    One of the most common knee joint disorders is osteoarthritis, which results from the progressive degeneration of cartilage and subchondral bone over time, affecting mostly elderly adults. Current evaluation techniques are either complex, expensive, or invasive, or simply fail to detect the small, progressive changes that occur within the knee. Vibroarthrography appeared as a new solution: the mechanical vibratory signals arising from the knee are recorded using only an accelerometer and subsequently analyzed, enabling differentiation between a healthy and an arthritic joint. In this study, a vibration-based classification system was created using a dataset of 92 healthy and 120 arthritic segments of knee joint signals collected from 19 healthy and 20 arthritic volunteers, evaluated with k-nearest neighbors and support vector machine classifiers. The best classification was obtained using the k-nearest neighbors classifier with only 6 time-frequency features, with an overall accuracy of 89.8% and a precision, recall, and f-measure of 88.3%, 92.4%, and 90.1%, respectively. Preliminary results showed that vibroarthrography can be a promising, non-invasive, and low-cost tool for screening purposes. Despite these encouraging results, several upgrades to the data collection process and analysis can still be implemented. PMID:27225550
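The evaluation described (k-nearest neighbors on 6 time-frequency features, reported via precision, recall, and f-measure) can be sketched with scikit-learn. The segment counts mirror the abstract, but the feature values are synthetic stand-ins, not real vibroarthrographic signals, and the train/test split and k value are assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score, f1_score

rng = np.random.default_rng(2)

# Synthetic stand-in: 212 knee-signal segments x 6 time-frequency features,
# 0 = healthy (92 segments), 1 = arthritic (120 segments), as in the abstract.
X = rng.normal(size=(212, 6))
y = np.array([0] * 92 + [1] * 120)
X[y == 1] += 1.5  # separate the classes so the toy problem is learnable

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
pred = knn.predict(X_te)
print("precision %.2f recall %.2f f1 %.2f" % (
    precision_score(y_te, pred), recall_score(y_te, pred),
    f1_score(y_te, pred)))
```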

  14. Product Image Classification Based on Fusion Features

    Institute of Scientific and Technical Information of China (English)

    YANG Xiao-hui; LIU Jing-jing; YANG Li-jun

    2015-01-01

    Two key challenges for a product image classification system are classification precision and classification time. For some categories, the classification precision of the latest techniques is still low. In this paper, we propose a local texture descriptor termed the fan refined local binary pattern, which captures more detailed information by integrating the spatial distribution into the local binary pattern feature. We compare our approach with different methods on a subset of product images from Amazon/eBay and parts of PI100, and experimental results demonstrate that our proposed approach is superior to the existing methods: the highest classification precision is increased by 21% and the average classification time is reduced by 2/3.
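The basic local binary pattern that the proposed fan refined descriptor builds on can be sketched in a few lines of NumPy. This is the plain 8-neighbor LBP, not the authors' refined variant; the neighbor ordering and the toy image are illustrative choices.

```python
import numpy as np

def lbp_image(img):
    """Basic 8-neighbor local binary pattern: each interior pixel gets a
    byte whose bits record which neighbors are >= the center pixel."""
    c = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        neighbor = img[1 + dy: img.shape[0] - 1 + dy,
                       1 + dx: img.shape[1] - 1 + dx]
        codes |= (neighbor >= c).astype(np.uint8) << bit
    return codes

img = np.arange(25, dtype=np.uint8).reshape(5, 5)  # toy gradient patch
codes = lbp_image(img)
hist = np.bincount(codes.ravel(), minlength=256)  # LBP histogram feature
print(hist.sum())  # → 9 (one code per interior pixel)
```

In a classifier, the 256-bin histogram (or a rotation-invariant reduction of it) is the texture feature vector for the patch.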

  15. Agent-Based Urban Land Markets: Agent's Pricing Behavior, Land Prices and Urban Land Use Change

    OpenAIRE

    Filatova, Tatiana; Parker, Dawn; Veen, van der, J.T.

    2009-01-01

    We present a new bilateral agent-based land market model, which moves beyond previous work by explicitly modeling behavioral drivers of land-market transactions on both the buyer and seller sides; formation of bid prices (of buyers) and ask prices (of sellers); and the relative division of the gains from trade from the market transactions. We analyze model output using a series of macro-scale economic and landscape pattern measures, including land rent gradients estimated using simple regress...

  16. A Method of Soil Salinization Information Extraction with SVM Classification Based on ICA and Texture Features

    Institute of Scientific and Technical Information of China (English)

    ZHANG Fei; TASHPOLAT Tiyip; KUNG Hsiang-te; DING Jian-li; MAMAT.Sawut; VERNER Johnson; HAN Gui-hong; GUI Dong-wei

    2011-01-01

    Classifying salt-affected soils from remotely sensed images is one of the most common applications in remote sensing, and many algorithms have been developed and applied for this purpose in the literature. This study takes the Delta Oasis of the Weigan and Kuqa Rivers as a study area and discusses the prediction of soil salinization from ETM+ Landsat data. It reports a Support Vector Machine (SVM) classification method based on Independent Component Analysis (ICA) and texture features. The paper introduces the fundamental theory of the SVM algorithm and ICA, and then incorporates ICA and texture features. The classification result is compared qualitatively and quantitatively with ICA-SVM classification, single-data-source SVM classification, maximum likelihood classification (MLC), and neural network classification. The results show that this method can effectively solve the problems of low accuracy and fragmented classification results in single-data-source classification, and that it scales well to higher-dimensional input. The overall accuracy is 98.64%, an increase of 10.2% over maximum likelihood classification and 12.94% over neural network classification. Therefore, the classification method based on SVM incorporating ICA and texture features is well suited to remote sensing image classification and the monitoring of soil salinization.
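The ICA-plus-SVM pipeline can be sketched with scikit-learn. The feature matrix here is synthetic, standing in for per-pixel ETM+ spectral bands combined with texture measures, and the component count, kernel choice, and class structure are assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Synthetic stand-in for per-pixel feature vectors (spectral bands plus
# texture measures), labeled by salinization class (0, 1, 2).
X = rng.normal(size=(300, 12))
y = rng.integers(0, 3, size=300)
X += y[:, None] * 0.8  # inject class-dependent structure

# ICA unmixes the features, then an RBF-kernel SVM classifies each pixel.
pipe = make_pipeline(FastICA(n_components=6, max_iter=1000, random_state=0),
                     SVC(kernel="rbf"))
acc = cross_val_score(pipe, X, y, cv=5).mean()
print("mean accuracy: %.2f" % acc)
```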

  17. TOWARDS AN ONTOLOGY-BASED MULTI-AGENT MEDICAL INFORMATION SYSTEM BASED ON THE WEB

    Institute of Scientific and Technical Information of China (English)

    张全海; 施鹏飞

    2002-01-01

    This paper describes an ontology-based multi-agent knowledge process model (MAKM), a type of multi-agent system (MAS) that uses a semantic network to describe agents and helps locate related agents distributed across a workgroup. In MAKM, an agent is the entity that implements distributed task processing and accesses information or knowledge. The Knowledge Query and Manipulation Language (KQML) is adopted for communication among agents. Using the MAKM model, different knowledge and information in the medical domain can be organized and utilized efficiently when a collaborative task is carried out on the web.

  18. Towards Designing Multi Agent Mobile and Internet Based Voting System

    Directory of Open Access Journals (Sweden)

    Mohamed Khlaif

    2013-04-01

    Full Text Available Voting systems are essential in most democratic societies. The voting process is difficult and consumes time and effort, and one of its major problems is security. The proposed e-voting system uses mobile multi-agents, which can make voting less time-consuming and more accurate: through encryption and decryption, the agents reduce the risk of casting a vote in a hostile environment. A vote is received by an agent, encrypted, and sent to the mobile database, and a similar process is carried out in a similar manner by the Internet agent. Voting data are collected from three different sources and eventually gathered in a master database after all votes are decrypted. A counting agent then counts the votes and classifies them by their respective owners.

  19. Return Migration After Brain Drain: An Agent Based Simulation Approach

    CERN Document Server

    Biondo, A E; Rapisarda, A

    2012-01-01

    The brain drain phenomenon is particularly heterogeneous and is characterized by peculiar specifications. It influences the economic fundamentals of both the country of origin and the host country in terms of human capital accumulation. Here, brain drain is considered from a microeconomic perspective: more precisely, we focus on the individual's rational decision to return, relating it to the social capital owned by the worker. The presented model, restricted to the case of academic personnel, compares utility levels to explain agents' migration decisions and simulates several scenarios with a NetLogo agent-based model. In particular, we developed a simulation framework based on two fundamental individual features, risk aversion and initial expectation, which characterize the dynamics of different agents according to the random evolution of their personal social networks. Our main result is that, depending on the values of risk aversion and initial expectation, the probability of return migration depends on...

  20. Radiological classification of renal angiomyolipomas based on 127 tumors

    Directory of Open Access Journals (Sweden)

    Prando Adilson

    2003-01-01

    Full Text Available PURPOSE: Demonstrate radiological findings of 127 angiomyolipomas (AMLs) and propose a classification based on the radiological evidence of fat. MATERIALS AND METHODS: The imaging findings of 85 consecutive patients with AMLs: isolated (n = 73), multiple without tuberous sclerosis (TS) (n = 4) and multiple with TS (n = 8), were retrospectively reviewed. Eighteen AMLs (14%) presented with hemorrhage. All patients were submitted to a dedicated helical CT or magnetic resonance studies. All hemorrhagic and non-hemorrhagic lesions were grouped together since our objective was to analyze the presence of detectable fat. Out of 85 patients, 53 were monitored and 32 were treated surgically due to large perirenal component (n = 13), hemorrhage (n = 11) and impossibility of an adequate preoperative characterization (n = 8). There was not a case of renal cell carcinoma (RCC) with fat component in this group of patients. RESULTS: Based on the presence and amount of detectable fat within the lesion, AMLs were classified in 4 distinct radiological patterns: Pattern-I, predominantly fatty (usually less than 2 cm in diameter and intrarenal): 54%; Pattern-II, partially fatty (intrarenal or exophytic): 29%; Pattern-III, minimally fatty (most exophytic and perirenal): 11%; and Pattern-IV, without fat (most exophytic and perirenal): 6%. CONCLUSIONS: This proposed classification might be useful to understand the imaging manifestations of AMLs, their differential diagnosis and determine when further radiological evaluation would be necessary. Small (< 1.5 cm), pattern-I AMLs tend to be intra-renal, homogeneous and predominantly fatty. As they grow they tend to be partially or completely exophytic and heterogeneous (patterns II and III). The rare pattern-IV AMLs, however, can be small or large, intra-renal or exophytic but are always homogeneous and hyperdense mass. 
Since no renal cell carcinoma was found in our series, from an evidence-based practice, all renal mass with detectable

  1. Radiological classification of renal angiomyolipomas based on 127 tumors

    Energy Technology Data Exchange (ETDEWEB)

    Prando, Adilson [Hospital Vera Cruz, Campinas, SP (Brazil). Dept. de Radiologia]. E-mail: aprando@mpc.com.br

    2003-05-15

    Purpose: Demonstrate radiological findings of 127 angiomyolipomas (AMLs) and propose a classification based on the radiological evidence of fat. Materials And Methods: The imaging findings of 85 consecutive patients with AMLs: isolated (n = 73), multiple without tuberous sclerosis (TS) (n = 4) and multiple with TS (n = 8), were retrospectively reviewed. Eighteen AMLs (14%) presented with hemorrhage. All patients were submitted to a dedicated helical CT or magnetic resonance studies. All hemorrhagic and non-hemorrhagic lesions were grouped together since our objective was to analyze the presence of detectable fat. Out of 85 patients, 53 were monitored and 32 were treated surgically due to large perirenal component (n = 13), hemorrhage (n = 11) and impossibility of an adequate preoperative characterization (n = 8). There was not a case of renal cell carcinoma (RCC) with fat component in this group of patients. Results: Based on the presence and amount of detectable fat within the lesion, AMLs were classified in 4 distinct radiological patterns: Pattern-I, predominantly fatty (usually less than 2 cm in diameter and intrarenal): 54%; Pattern-II, partially fatty (intrarenal or exophytic): 29%; Pattern-III, minimally fatty (most exophytic and perirenal): 11%; and Pattern-IV, without fat (most exophytic and perirenal): 6%. Conclusions: This proposed classification might be useful to understand the imaging manifestations of AMLs, their differential diagnosis and determine when further radiological evaluation would be necessary. Small (< 1.5 cm), pattern-I AMLs tend to be intra-renal, homogeneous and predominantly fatty. As they grow they tend to be partially or completely exophytic and heterogeneous (patterns II and III). The rare pattern-IV AMLs, however, can be small or large, intra-renal or exophytic but are always homogeneous and hyperdense mass. Since no renal cell carcinoma was found in our series, from an evidence-based practice, all renal mass with

  2. Radiological classification of renal angiomyolipomas based on 127 tumors

    International Nuclear Information System (INIS)

    Purpose: Demonstrate radiological findings of 127 angiomyolipomas (AMLs) and propose a classification based on the radiological evidence of fat. Materials And Methods: The imaging findings of 85 consecutive patients with AMLs: isolated (n = 73), multiple without tuberous sclerosis (TS) (n = 4) and multiple with TS (n = 8), were retrospectively reviewed. Eighteen AMLs (14%) presented with hemorrhage. All patients were submitted to a dedicated helical CT or magnetic resonance studies. All hemorrhagic and non-hemorrhagic lesions were grouped together since our objective was to analyze the presence of detectable fat. Out of 85 patients, 53 were monitored and 32 were treated surgically due to large perirenal component (n = 13), hemorrhage (n = 11) and impossibility of an adequate preoperative characterization (n = 8). There was not a case of renal cell carcinoma (RCC) with fat component in this group of patients. Results: Based on the presence and amount of detectable fat within the lesion, AMLs were classified in 4 distinct radiological patterns: Pattern-I, predominantly fatty (usually less than 2 cm in diameter and intrarenal): 54%; Pattern-II, partially fatty (intrarenal or exophytic): 29%; Pattern-III, minimally fatty (most exophytic and perirenal): 11%; and Pattern-IV, without fat (most exophytic and perirenal): 6%. Conclusions: This proposed classification might be useful to understand the imaging manifestations of AMLs, their differential diagnosis and determine when further radiological evaluation would be necessary. Small (< 1.5 cm), pattern-I AMLs tend to be intra-renal, homogeneous and predominantly fatty. As they grow they tend to be partially or completely exophytic and heterogeneous (patterns II and III). The rare pattern-IV AMLs, however, can be small or large, intra-renal or exophytic but are always homogeneous and hyperdense mass. Since no renal cell carcinoma was found in our series, from an evidence-based practice, all renal mass with

  3. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    Science.gov (United States)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.

  4. Role Oriented Test Case Generation for Agent Based System

    Directory of Open Access Journals (Sweden)

    N.Sivakumar

    2013-02-01

    Full Text Available Agent-Oriented Software Engineering (AOSE) is a rapidly developing area of research. Current research and development primarily focus on the analysis, design, and implementation of agent-based software, whereas testing is less prioritized. Software testing is an important and indispensable part of the software development process. Test case generation is the first step of any testing process, followed by test execution and test evaluation. Test case generation is not an easy task, but automating it offers many advantages: it saves time and effort and, more importantly, reduces the number of errors and faults. This paper investigates the generation of test cases for testing agent-based software. We propose a novel approach that takes advantage of an agent's role as the basis for generating test cases. A role is an important mental attribute of an agent, defined simply as the set of capabilities that an agent can perform. The main objective of this paper is to generate test cases from a role diagram by converting it to an activity diagram.

  5. Classification and thermal history of petroleum based on light hydrocarbons

    Science.gov (United States)

    Thompson, K. F. M.

    1983-02-01

    Classifications of oils and kerogens are described. Two indices are employed, termed the Heptane and Isoheptane Values, based on analyses of gasoline-range hydrocarbons. The indices assess the degree of paraffinicity and allow the definition of four types of oil: normal, mature, supermature, and biodegraded. The values of these indices measured in sediment extracts are a function of the maximum attained temperature and of kerogen type. Aliphatic and aromatic kerogens are definable. Only the extracts of sediments bearing aliphatic kerogens with a specific thermal history are identical to the normal oils, which form the largest group (41%) in the sample set. This group was evidently generated at subsurface temperatures on the order of 138°-149°C (280°-300°F) under specific conditions of burial history. It is suggested that all other petroleums are transformation products of normal oils.

  6. MICROWAVE BASED CLASSIFICATION OF MATERIAL USING NEURAL NETWORK

    Directory of Open Access Journals (Sweden)

    Anil H. Soni

    2011-07-01

    Full Text Available Microwave radar has emerged as a useful tool in many remote sensing applications, including material classification, target detection and shape extraction. In this paper, we present a method to classify materials based on their dielectric characteristics. Microwave radar in the X-band range is used for scanning targets made of various materials, like acrylic, metal and wood, in free space. Depending on their respective electromagnetic properties, reflections from each target are measured and a radar image is obtained. Further, various features such as energy, entropy, normalized sum of image intensity and standard deviation are extracted and fed to a feedforward multilayer perceptron classifier, which determines whether the material is dielectric or non-dielectric (metallic). Results show good performance.

  7. About Classification Methods Based on Tensor Modelling for Hyperspectral Images

    Directory of Open Access Journals (Sweden)

    Salah Bourennane

    2010-03-01

    Full Text Available Denoising and dimensionality reduction (DR) are key issues in improving classifier efficiency for hyperspectral images (HSI). The recently developed multi-way Wiener filtering is used; principal and independent component analysis (PCA, ICA) and projection pursuit (PP) approaches to DR have been investigated. These matrix algebra methods are applied to vectorized images, whereby the spatial arrangement is lost. To jointly take advantage of spatial and spectral information, HSI have recently been represented as tensors. Using the multiple ways tensors offer to decompose data orthogonally, we introduce filtering and DR methods based on multilinear algebra tools. The DR is performed on the spectral way using PCA or PP, jointly with an orthogonal projection onto a lower-dimensional subspace of the spatial ways. We show the classification improvement of the introduced methods relative to existing methods. The experiment is exemplified using real-world HYDICE data. Keywords: multi-way filtering, dimensionality reduction, matrix and multilinear algebra tools, tensor processing.

  8. Engineering Agent-Based Social Simulations: An Introduction

    OpenAIRE

    Peer-Olaf Siebers; Paul Davidsson

    2015-01-01

    This special section on "Engineering Agent-Based Social Simulations" aims to represent the current state of the art in using Software Engineering (SE) methods in ABSS. It includes a mixture of theoretically oriented papers that describe frameworks, notations and methods adapted from SE and practice-oriented papers that demonstrate the application of SE methods in real world ABSS projects.

  9. Multi-level agent-based modeling - Bibliography

    CERN Document Server

    Morvan, Gildas

    2012-01-01

    This very short article aims to bring together the available bibliography on multi-level (or multi-layer, multi-perspective, multi-view, multi-scale, multi-resolution) agent-based modeling so that it is accessible to interested researchers.

  10. Mobile Agent-Based Directed Diffusion in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Kwon Taekyoung

    2007-01-01

    Full Text Available In the environments where the source nodes are close to one another and generate a lot of sensory data traffic with redundancy, transmitting all sensory data by individual nodes not only wastes the scarce wireless bandwidth, but also consumes a lot of battery energy. Instead of each source node sending sensory data to its sink for aggregation (the so-called client/server computing, Qi et al. in 2003 proposed a mobile agent (MA-based distributed sensor network (MADSN for collaborative signal and information processing, which considerably reduces the sensory data traffic and query latency as well. However, MADSN is based on the assumption that the operation of mobile agent is only carried out within one hop in a clustering-based architecture. This paper considers MA in multihop environments and adopts directed diffusion (DD to dispatch MA. The gradient in DD gives a hint to efficiently forward the MA among target sensors. The mobile agent paradigm in combination with the DD framework is dubbed mobile agent-based directed diffusion (MADD. With appropriate parameters set, extensive simulation shows that MADD exhibits better performance than original DD (in the client/server paradigm in terms of packet delivery ratio, energy consumption, and end-to-end delivery latency.

  11. Complete agent based simulation of mini-grids

    OpenAIRE

    González de Durana García, José María; Barambones Caramazana, Oscar; Kremers, Enrique; Viejo, Pablo

    2009-01-01

    EuroPES 2009 With eyes focused on simulation we review some of the main topics of Hybrid Renewable Energy Systems (HRES). Then we describe an Agent Based model of a simple example of one of such systems, a micro-grid, oriented to designing a decentralized Supervisor Control. The model has been implemented using AnyLogic.

  12. Agent-based Personal Network (PN) service architecture

    DEFF Research Database (Denmark)

    Jiang, Bo; Olesen, Henning

    2004-01-01

    In this paper we propose a new concept for a centralized agent system as the solution for the PN service architecture, which aims to efficiently control and manage PN resources and enable PN-based services to run seamlessly over different networks and devices. The working principle...

  13. [Galaxy/quasar classification based on nearest neighbor method].

    Science.gov (United States)

    Li, Xiang-Ru; Lu, Yu; Zhou, Jian-Ming; Wang, Yong-Jun

    2011-09-01

    With the wide application of high-quality CCDs in celestial spectrum imagery and the implementation of many large sky survey programs (e.g., the Sloan Digital Sky Survey (SDSS), the Two-degree-Field Galaxy Redshift Survey (2dF), the Spectroscopic Survey Telescope (SST), the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) program and the Large Synoptic Survey Telescope (LSST) program, etc.), celestial observational data are pouring in like torrential rain. Therefore, to utilize them effectively and fully, research on automated processing methods for celestial data is imperative. In the present work, we investigated how to recognize galaxies and quasars from spectra based on the nearest neighbor method. Galaxies and quasars are extragalactic objects; they are far from Earth, and their spectra are usually contaminated by various noise. Therefore, recognizing these two types of spectra is a typical problem in automatic spectra classification. Furthermore, the utilized method, nearest neighbor, is one of the most typical, classic and mature algorithms in pattern recognition and data mining, and is often used as a benchmark when developing novel algorithms. Regarding applicability in practice, it is shown that the recognition ratio of the nearest neighbor method (NN) is comparable to the best results reported in the literature based on more complicated methods, and the superiority of NN is that the method does not need to be trained, which is useful for incremental learning and parallel computation in mass spectral data processing. In conclusion, the results in this work are helpful for studying galaxy and quasar spectra classification. PMID:22097877
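The nearest neighbor rule the record describes needs no training phase: a query spectrum simply takes the label of its closest labelled example. A minimal numpy sketch, with toy "spectra" that are purely illustrative:

```python
import numpy as np

# Minimal 1-nearest-neighbour classifier sketch for labelled spectra,
# using Euclidean distance. The two "template spectra" are synthetic
# illustrations, not real survey data.

def nn_classify(train_X, train_y, query):
    """Return the label of the training spectrum closest to the query."""
    distances = np.linalg.norm(train_X - query, axis=1)
    return train_y[int(np.argmin(distances))]

train_X = np.array([[1.0, 0.2, 0.1],   # galaxy-like template
                    [0.1, 0.9, 0.8]])  # quasar-like template
train_y = np.array(["galaxy", "quasar"])
label = nn_classify(train_X, train_y, np.array([0.9, 0.3, 0.2]))
```

Because there is no model to fit, new labelled spectra can be appended to `train_X` at any time, which is the incremental-learning property the abstract highlights.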

  14. New classification system-based visual outcome in Eales′ disease

    Directory of Open Access Journals (Sweden)

    Saxena Sandeep

    2007-01-01

    Full Text Available Purpose: A retrospective tertiary care center-based study was undertaken to evaluate the visual outcome in Eales′ disease, based on a new classification system, for the first time. Materials and Methods: One hundred and fifty-nine consecutive cases of Eales′ disease were included. All the eyes were staged according to the new classification: Stage 1: periphlebitis of small (1a and large (1b caliber vessels with superficial retinal hemorrhages; Stage 2a: capillary non-perfusion, 2b: neovascularization elsewhere/of the disc; Stage 3a: fibrovascular proliferation, 3b: vitreous hemorrhage; Stage 4a: traction/combined rhegmatogenous retinal detachment and 4b: rubeosis iridis, neovascular glaucoma, complicated cataract and optic atrophy. Visual acuity was graded as: Grade I 20/20 or better; Grade II 20/30 to 20/40; Grade III 20/60 to 20/120 and Grade IV 20/200 or worse. All the cases were managed by medical therapy, photocoagulation and/or vitreoretinal surgery. Visual acuity was converted into decimal scale, denoting 20/20=1 and 20/800=0.01. Paired t-test / Wilcoxon signed-rank tests were used for statistical analysis. Results: Vitreous hemorrhage was the commonest presenting feature (49.32%. Cases with Stages 1 to 3, 4a and 4b achieved final visual acuity ranging from 20/15 to 20/40; 20/80 to 20/400 and 20/200 to 20/400, respectively. Statistically significant improvement in visual acuities was observed in all the stages of the disease except Stages 1a and 4b. Conclusion: Significant improvement in visual acuities was observed in the majority of stages of Eales′ disease following treatment. This study adds to the little available evidence of treatment effects in the literature and may have an effect on patient care and health policy in Eales′ disease.

  15. Quality-Oriented Classification of Aircraft Material Based on SVM

    Directory of Open Access Journals (Sweden)

    Hongxia Cai

    2014-01-01

    Full Text Available Existing material classifications are designed to improve inventory management. However, different materials have different quality-related attributes, especially in the aircraft industry. In order to reduce cost without sacrificing quality, we propose a quality-oriented material classification system considering material quality characteristics, quality cost, and quality influence. The Analytic Hierarchy Process helps with feature selection and classification decisions. We use the improved Kraljic Portfolio Matrix to establish a three-dimensional classification model. The aircraft materials can be divided into eight types, including general type, key type, risk type, and leveraged type. Aiming to improve the classification accuracy for various materials, the Support Vector Machine (SVM) algorithm is introduced. Finally, we compare the SVM and a BP neural network in the application. The results prove that the SVM algorithm is more efficient and accurate and that the quality-oriented material classification is valuable.
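The SVM classifier at the core of the record can be sketched without any ML library as a linear soft-margin SVM trained by subgradient descent on the hinge loss. This is a generic stand-in, not the paper's model; the data, learning rate and regularisation constant are all illustrative:

```python
import numpy as np

# Hedged sketch: a linear soft-margin SVM trained by hinge-loss subgradient
# descent. Hyperparameters and the tiny synthetic dataset are illustrative.

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """y must be in {-1, +1}; returns weight vector w and bias b."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:        # margin violated: hinge gradient step
                w += lr * (yi * xi - lam * w)
                b += lr * yi
            else:                            # margin satisfied: only regularise
                w -= lr * lam * w
    return w, b

X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -1.0], [-3.0, -2.0]])
y = np.array([1, 1, -1, -1])
w, b = train_linear_svm(X, y)
preds = np.sign(X @ w + b)
```

In practice a library SVM with kernels would replace this loop, but the hinge-loss objective being minimised is the same.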

  16. Cancer Data Clustering and Classification Based using Efnn_Pcamethod

    OpenAIRE

    J. Saranya; Hemalatha, R.

    2014-01-01

    One challenging area in the study of gene expression data is the classification of expression datasets into correct classes. The high dimensionality of available gene expression datasets is the foremost challenge: a massive number of extraneous attributes (genes) complicates the application domain of cancer classification. Though accuracy plays a major role in cancer classification, biological relevance is another key criterion, as any biological data...

  17. BRAIN TUMOR CLASSIFICATION USING NEURAL NETWORK BASED METHODS

    OpenAIRE

    Kalyani A. Bhawar*, Prof. Nitin K. Bhil

    2016-01-01

    MRI (Magnetic Resonance Imaging) brain tumor image classification is a difficult task due to the variance and complexity of tumors. This paper presents two neural network techniques for the classification of magnetic resonance human brain images. The proposed neural network technique consists of three stages, namely, feature extraction, dimensionality reduction, and classification. In the first stage, we obtain the features related to MRI images using d...

  18. Investigating text message classification using case-based reasoning

    OpenAIRE

    Healy, Matt, (Thesis)

    2007-01-01

    Text classification is the categorization of text into a predefined set of categories. Text classification is becoming increasingly important given the large volume of text stored electronically e.g. email, digital libraries and the World Wide Web (WWW). These documents represent a massive amount of information that can be accessed easily. To gain benefit from using this information requires organisation. One way of organising it automatically is to use text classification. A number of well k...

  19. The Geographic Information Grid System Based on Mobile Agent

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    We analyze the deficiencies of current application systems and discuss the key requirements of distributed Geographic Information Services (GIS). We construct the distributed GIS on a grid platform. Considering flexibility and efficiency, we integrate mobile agent technology into the system. We propose a new prototype system, the Geographic Information Grid System (GIGS), based on mobile agents. This system has flexible services and high performance, and improves the sharing of distributed resources. The service strategy of the system and examples are also presented.

  20. An Agent Based Model for Social Class Emergence

    Science.gov (United States)

    Yang, Xiaoxiang; Rodriguez Segura, Daniel; Lin, Fei; Mazilu, Irina

    We present an open system agent-based model to analyze the effects of education and the society-specific wealth transactions on the emergence of social classes. Building on previous studies, we use realistic functions to model how years of education affect the income level. Numerical simulations show that the fraction of an individual's total transactions that is invested rather than consumed can cause wealth gaps between different income brackets in the long run. In an attempt to incorporate the network effects, we also explore how the probability of interactions among agents depending on the spread of their income brackets affects wealth distribution.
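The wealth-transaction mechanism described above can be illustrated with a toy open-system simulation. This is a hedged sketch of the general idea only: agents trade a fixed amount, and a fraction of each transaction is "invested" (retained by the receiver) while the rest is consumed and leaves the system. The parameter names and values are invented, not the authors':

```python
import random

# Toy open-system sketch of agent-based wealth transactions. Of each unit
# traded, only `invest_frac` is retained by the receiver; the consumed
# remainder leaves the system. All parameters are illustrative.

def simulate(n_agents=100, steps=20000, amount=1.0, invest_frac=0.5, seed=1):
    random.seed(seed)
    wealth = [100.0] * n_agents
    for _ in range(steps):
        i, j = random.randrange(n_agents), random.randrange(n_agents)
        if i != j and wealth[i] >= amount:
            wealth[i] -= amount                  # payer spends the full amount
            wealth[j] += amount * invest_frac    # receiver keeps the invested share
    return wealth

wealth = simulate()
```

Sorting `wealth` after a long run shows the dispersion that repeated random transactions produce; sweeping `invest_frac` is the kind of experiment the abstract describes for probing wealth gaps between brackets.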

  1. Agent-based Simulation of Processes in Medicine

    Czech Academy of Sciences Publication Activity Database

    Bošanský, Branislav

    Praha : Ústav informatiky AV ČR, v. v. i. & MATFYZPRESS, 2008 - (Hakl, F.), s. 19-27 ISBN 978-80-7378-054-8. [Doktorandské dny 2008 Ústavu informatiky AV ČR, v. v. i., Jizerka (CZ), 29.09.2008-01.10.2008] R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: multi-agent systems * agent-based simulation * process modelling * medicine Subject RIV: IN - Informatics, Computer Science

  2. SPY AGENT BASED SECURE DATA AGGREGATION IN WSN

    Directory of Open Access Journals (Sweden)

    T. Lathies Bhasker

    2014-12-01

    Full Text Available Wireless sensor networks consist of many sensor devices powered by batteries. These sensor devices are mostly used in hostile environments, military applications, etc., where it is highly difficult to collect and transmit data to the sink without any data loss. In this paper we propose a SPY-agent-based secure data aggregation scheme, in which one SPY agent moves around the network and monitors the aggregator nodes, i.e., the cluster heads, for secure data collection. In the simulation section we analyze the proposed architecture for both proactive and reactive protocols.
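The monitoring role of the SPY agent can be sketched as a simple consistency check: the agent visits each cluster head and flags any head whose reported aggregate strays too far from its members' raw readings. This is a hypothetical illustration of the idea only; the node names, data layout and tolerance are invented:

```python
# Toy sketch of the SPY-agent idea: a monitoring agent visits each cluster
# head and flags heads whose reported aggregate deviates from the mean of
# the members' raw readings. Names and the tolerance are illustrative.

def inspect_cluster_heads(clusters, tolerance=0.5):
    """clusters maps head-id -> (reported_aggregate, member_readings)."""
    suspicious = []
    for head, (reported, readings) in clusters.items():
        expected = sum(readings) / len(readings)
        if abs(reported - expected) > tolerance:
            suspicious.append(head)
    return suspicious

clusters = {
    "CH1": (20.1, [20.0, 20.2, 20.1]),   # honest aggregate
    "CH2": (35.0, [20.3, 19.9, 20.4]),   # tampered aggregate
}
flagged = inspect_cluster_heads(clusters)
```

A real scheme would of course obtain the member readings securely rather than in the clear; the sketch only shows the aggregate-vs-members comparison the roving agent performs.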

  3. Comparison and Analysis of Biological Agent Category Lists Based On Biosafety and Biodefense

    OpenAIRE

    Tian, Deqiao; Zheng, Tao

    2014-01-01

    Biological agents pose a serious threat to human health, economic development, social stability and even national security. The classification of biological agents is a basic requirement for both biosafety and biodefense. We compared and analyzed the Biological Agent Laboratory Biosafety Category list and the defining criteria according to the World Health Organization (WHO), the National Institutes of Health (NIH), the European Union (EU) and China. We also compared and analyzed the Biologic...

  4. Using Agents in Web-Based Constructivist Collaborative Learning System

    Institute of Scientific and Technical Information of China (English)

    刘莹; 林福宗; 王雪

    2004-01-01

    Web-based learning systems are one of the most interesting topics in the area of the application of computers to education. Collaborative learning, an important principle in constructivist learning theory, is an important instruction mode for open and distance learning systems. Through collaborative learning, students can greatly improve their creativity, exploration capability, and social cooperation. This paper uses an agent-based coordination mechanism to respond to the requirements of an efficient and motivating learning process. The coordination mechanism is built on a Web-based constructivist collaborative learning system, in which students learn in groups and interact with each other through several kinds of communication modes to achieve their learning objectives efficiently and actively. In this learning system, artificial agents play an active part in the collaborative learning process; they can partially replace human instructors during the multi-mode interaction of the students.

  5. Technology of structure damage monitoring based on multi-agent

    Institute of Scientific and Technical Information of China (English)

    Hongbing Sun; Shenfang Yuan; Xia Zhao; Hengbao Zhou; Dong Liang

    2010-01-01

    Health monitoring for large-scale structures must resolve a large number of difficulties, such as data transmission and distributed information handling. To solve these problems, multi-agent technology is a good candidate for use in the field of structural health monitoring. A structural health monitoring system architecture based on multi-agent technology is proposed. The measurement system for an aircraft airfoil is designed with FBG, strain gauges, and corresponding signal processing circuits. An experiment to determine the location of a concentrated load on the structure is carried out with the system, combined with pattern recognition and multi-agent technologies. The results show that the system can locate the concentrated load on the aircraft airfoil with an accuracy of 91.2%.

  6. Differential Protection for Distributed Micro-Grid Based on Agent

    Directory of Open Access Journals (Sweden)

    ZHOU Bin

    2013-05-01

    Full Text Available The micro-grid, even though not a replacement for the conventional centralized power transmission grid, plays a very important role in the successful rapid development of renewable energy technologies. Due to the decentralization, independence, and dynamics of sources within a micro-grid, a high level of protection automation is a must. Multi-agent systems (MAS) have been developed as an approach to handling distributed system issues. This paper presents an MAS-based differential protection method for distributed micro-grids. The nodes within a micro-grid are divided into primary and backup protection zones. The agents follow predefined rules to protect the system and isolate a fault when it happens. Furthermore, an algorithm is proposed to achieve high availability in case an agent itself malfunctions. The method is simulated in Matlab and shown to satisfy relay protection requirements in terms of selectivity, sensitivity, rapidity, and reliability.
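The differential principle the agents apply can be stated in one line: the signed currents measured at a protection zone's boundary should sum to approximately zero; a significant residual indicates current is leaking into an internal fault, and the zone is tripped. A minimal sketch with illustrative values:

```python
# Minimal sketch of the differential protection rule: trip the zone when the
# sum of boundary currents exceeds a threshold. Values are illustrative.

def differential_trip(currents_in, threshold=0.1):
    """currents_in: signed currents measured at the zone boundary (amps).
    They should sum to ~0 unless current is lost to an internal fault."""
    return abs(sum(currents_in)) > threshold

healthy = differential_trip([10.0, -10.02])   # load current passes through
faulted = differential_trip([10.0, -4.0])     # 6 A lost inside the zone
```

The per-zone agents in the paper evaluate essentially this comparison locally and then coordinate (primary vs. backup zones) on the trip decision.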

  7. Multi-agent Based Hierarchy Simulation Models of Carrier-based Aircraft Catapult Launch

    Institute of Scientific and Technical Information of China (English)

    Wang Weijun; Qu Xiangju; Guo Linliang

    2008-01-01

    With the aid of a multi-agent based modeling approach to complex systems, hierarchy simulation models of carrier-based aircraft catapult launch are developed. Ocean, carrier, aircraft, and atmosphere are treated as aggregation agents; detailed components like the catapult, landing gears, and disturbances are considered meta-agents, which belong to their aggregation agent. Thus, a model with two layers is formed, i.e., the aggregation agent layer and the meta-agent layer. The information communication among all agents is described. The meta-agents within one aggregation agent communicate with each other directly by information sharing, but meta-agents belonging to different aggregation agents exchange their information through the aggregation layer first, and then perceive it from the sharing environment, that is, the aggregation agent. Thus, not only is the hierarchy model built, but the environment perceived by each agent is also specified. Meanwhile, the problem of balancing the independence of agents against the resource consumption brought by real-time communication within a multi-agent system (MAS) is resolved. Each agent involved in carrier-based aircraft catapult launch is depicted, considering the interaction within the disturbed atmospheric environment and multiple motion bodies including the carrier, aircraft, and landing gears. The models of the reactive agents among them are derived based on tensors, and the perceived messages and inner frameworks of each agent are characterized. Finally, some results of a simulation instance are given. Simulation and modeling of dynamic systems based on a multi-agent system benefits the clear and precise expression of physical concepts and logical hierarchy. The system model can easily incorporate other kinds of agents to achieve a precise simulation of a more complex system. This modeling technique decomposes the complex integral dynamic equations of multibodies into parallel operations of single agents, and it is convenient to expand, maintain, and reuse.

  8. Review of Image Classification Techniques Based on LDA, PCA and Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Mukul Yadav

    2014-02-01

    Full Text Available Image classification plays an important role in security surveillance, given today's huge image databases. Rapid changes in the feature content of images are a major issue in classification. Image classification has been improved by various authors using different classifier models. The efficiency of a classifier model depends on the feature extraction process for traffic images, for which various authors have used techniques such as Gabor feature extraction, histograms, and many other extraction methods. We apply FLDA-GA to improve the classification rate of content-based image classification. The improved method uses the heuristic genetic algorithm: GA serves as a feature optimizer for FLDA classification. Standard FLDA suffers from core and outlier problems. A two-sided kernel technique improves the classification process of the support vector machine; FLDA performs better classification in comparison with other binary multi-class classifiers. A directed acyclic graph applies a graph-partition technique for mapping feature data; correctly mapped feature data automatically improves the voting process of classification.
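The Fisher LDA step at the core of FLDA can be sketched in numpy: project two classes onto the direction w = Sw⁻¹(m1 − m2), which maximises between-class scatter relative to within-class scatter. The GA feature-selection wrapper and kernel variants are omitted, and the two-class Gaussian data below is synthetic:

```python
import numpy as np

# Sketch of the Fisher LDA projection only (no GA wrapper, no kernels):
# w = Sw^{-1}(m1 - m2), then classify by thresholding the 1-D projection.
# Data is synthetic; all names are illustrative.

def fisher_direction(X1, X2):
    """Return the Fisher discriminant direction for two sample matrices."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum of per-class scatter matrices.
    Sw = np.cov(X1.T, bias=True) * len(X1) + np.cov(X2.T, bias=True) * len(X2)
    return np.linalg.solve(Sw, m1 - m2)

rng = np.random.default_rng(0)
X1 = rng.normal([0, 0], 0.5, size=(50, 2))
X2 = rng.normal([3, 3], 0.5, size=(50, 2))
w = fisher_direction(X1, X2)
threshold = (X1.mean(axis=0) + X2.mean(axis=0)) @ w / 2
pred1 = (X1 @ w > threshold)   # class 1 should project above the midpoint
```

A GA wrapper, as in the record, would search over feature subsets and evaluate each subset by the separation this projection achieves.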

  9. Gd-HOPO Based High Relaxivity MRI Contrast Agents

    Energy Technology Data Exchange (ETDEWEB)

    Datta, Ankona; Raymond, Kenneth

    2008-11-06

    Tris-bidentate HOPO-based ligands developed in our laboratory were designed to complement the coordination preferences of Gd{sup 3+}, especially its oxophilicity. The HOPO ligands provide a hexadentate coordination environment for Gd{sup 3+} in which all the donor atoms are oxygen. Because Gd{sup 3+} favors eight or nine coordination, this design provides two to three open sites for inner-sphere water molecules. These water molecules rapidly exchange with bulk solution, hence affecting the relaxation rates of bulk water molecules. The parameters affecting the efficiency of these contrast agents have been tuned to improve contrast while still maintaining a high thermodynamic stability for Gd{sup 3+} binding. The Gd-HOPO-based contrast agents surpass current commercially available agents because of a higher number of inner-sphere water molecules, rapid exchange of inner-sphere water molecules via an associative mechanism, and a long electronic relaxation time. The contrast enhancement provided by these agents is at least twice that of commercial contrast agents, which are based on polyaminocarboxylate ligands.

  10. Agent-Based Framework for Implementing and Deploying of SOA

    OpenAIRE

    Alexandru Butoi; Gabriela Andreea Morar; Andreea Ilea

    2012-01-01

    In distributed organizational and business information systems' contexts, Service-Oriented Architectures (SOA) provide standards-based and protocol-independent solutions. Despite the advances in SOA models and design methodologies, the implementation and deployment of service choreographies are still done in a non-unified manner using existing tools. We present a three-layered framework model based on deployment agents, which allows designing and implementing service choreographies in a un...

  11. An Agent Based Simulation Of Smart Metering Technology Adoption

    OpenAIRE

    Zhang, Tao; Nuttall, William J.

    2007-01-01

    Based on the classic behavioural theory, the Theory of Planned Behaviour, we develop an agent-based model to simulate the diffusion of smart metering technology in the electricity market. We simulate the emergent adoption of smart metering technology under different management strategies and economic regulations. Our research results show that in terms of boosting the take-off of smart meters in the electricity market, choosing the initial users on a random and geographically dispersed basis...

  12. Multiview Sample Classification Algorithm Based on L1-Graph Domain Adaptation Learning

    OpenAIRE

    Huibin Lu; Zhengping Hu; Hongxiao Gao

    2015-01-01

    In the case of multiview sample classification with different distributions, training and testing samples are from different domains. In order to improve the classification performance, a multiview sample classification algorithm based on L1-Graph domain adaptation learning is presented. First of all, a framework of nonnegative matrix tri-factorization based on domain adaptation learning is formed, in which the unchanged information is regarded as the bridge of knowledge transformation from the...

  13. Three-Phase Tournament-Based Method for Better Email Classification

    OpenAIRE

    Sabah Sayed; Samir AbdelRahman; Ibrahim Farag

    2012-01-01

    Email classification performance has attracted much attention in the last decades. This paper proposes a tournament-based method to improve email classification performance, utilizing World Cup Final rules as a solution heuristic. Our proposed classification method passes through three phases: 1) clustering (grouping) email folders (topics or classes) based on their token and field similarities, 2) training binary classifiers on each class pair and 3) applying a 2-layer tournament me...
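Phase 3 of the method above pits pairwise binary classifiers against each other and lets the class with the most wins label the email. The sketch below is a simplified round-robin illustration (the paper's scheme is a 2-layer tournament); the pairwise classifier is stubbed out, and all class names and the sample email are invented:

```python
from itertools import combinations

# Illustrative round-robin sketch of tournament-based classification: each
# pairwise binary classifier returns a winning class, and the class with the
# most wins labels the email. The classifier here is a stub.

def tournament_classify(classes, pairwise_winner, email):
    wins = {c: 0 for c in classes}
    for a, b in combinations(classes, 2):
        wins[pairwise_winner(a, b, email)] += 1
    return max(wins, key=wins.get)

# Stub: pretend every classifier involving "spam" picks "spam" for this email.
label = tournament_classify(
    ["spam", "work", "social"],
    lambda a, b, email: a if a == "spam" else (b if b == "spam" else a),
    email="win a free prize now",
)
```

With n classes this evaluates n(n-1)/2 pairwise classifiers, which is the cost the clustering phase (phase 1) helps keep manageable.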

  14. Classification of types of stuttering symptoms based on brain activity.

    Directory of Open Access Journals (Sweden)

    Jing Jiang

    Full Text Available Among the non-fluencies seen in speech, some are more typical (MT of stuttering speakers, whereas others are less typical (LT and are common to both stuttering and fluent speakers. No neuroimaging work has evaluated the neural basis for grouping these symptom types. Another long-debated issue is which type (LT or MT whole-word repetitions (WWR should be placed in. In this study, a sentence completion task was performed by twenty stuttering patients who were scanned using an event-related design. This task elicited stuttering in these patients. Each stuttered trial from each patient was sorted into the MT or LT types, with WWR put aside. Pattern classification was employed to train a patient-specific single-trial model to automatically classify each trial as MT or LT using the corresponding fMRI data. This model was then validated using test data independent of the training data. In a subsequent analysis, the classification model just established was used to determine which type the WWR should be placed in. The results showed that the LT and the MT could be separated with high accuracy based on their brain activity. The brain regions that made the greatest contribution to the separation of the types were: the left inferior frontal cortex and bilateral precuneus, both of which showed higher activity in the MT than in the LT; and the left putamen and right cerebellum, which showed the opposite activity pattern. The results also showed that the brain activity for WWR was more similar to that of the LT and fluent speech than to that of the MT. These findings provide a neurological basis for separating the MT and the LT types, and support the widely-used MT/LT symptom grouping scheme. In addition, WWR play a similar role to the LT, and thus should be placed in the LT type.
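The validation scheme the record describes, fit a patient-specific model on training trials and check it on held-out trials, can be sketched with a simple nearest-centroid classifier. This is a generic stand-in (the paper's actual classifier is not specified here), and the synthetic "trials" merely mimic two separable activity patterns:

```python
import numpy as np

# Hedged sketch of train/held-out validation with a nearest-centroid model.
# Synthetic feature vectors stand in for per-trial fMRI patterns.

def nearest_centroid_fit(X, y):
    """Return one mean pattern (centroid) per class label."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(centroids, X):
    classes = list(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

rng = np.random.default_rng(42)
train_X = np.vstack([rng.normal(0, 1, (20, 10)), rng.normal(2, 1, (20, 10))])
train_y = np.array(["LT"] * 20 + ["MT"] * 20)
test_X = np.vstack([rng.normal(0, 1, (5, 10)), rng.normal(2, 1, (5, 10))])
test_y = np.array(["LT"] * 5 + ["MT"] * 5)

model = nearest_centroid_fit(train_X, train_y)
accuracy = (nearest_centroid_predict(model, test_X) == test_y).mean()
```

The WWR analysis in the paper then amounts to feeding unlabelled WWR trials through the fitted model and seeing which class they fall nearer to.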

  15. Simulating Interactive Learning Scenarios with Intelligent Pedagogical Agents in a Virtual World through BDI-Based Agents

    Directory of Open Access Journals (Sweden)

    Mohamed Soliman

    2013-04-01

    Full Text Available Intelligent Pedagogical Agents (IPAs are designed for pedagogical purposes to support learning in 3D virtual learning environments. Several benefits of IPAs have been found to support learning effectiveness. Pedagogical agents can be thought of as a central point of interaction between the learner and the learning environment, and hence the intelligent behavior and functional richness of pedagogical agents have the potential to translate into increased engagement and learning effectiveness. However, the realization of those agents as intelligent agents in virtual worlds remains a challenge. This paper reports the reasons for these challenges and, most importantly, an approach to simplification. A simulation based on BDI agents is introduced, opening the road for extension and experimentation before implementation of IPAs in a virtual world can take place. The simulation provides a proof-of-concept based on three intelligent agents representing an IPA, a learner, and a learning object, implemented in the JACK and Jadex intelligent agent platforms. To that end, the paper exhibits the difficulties, resolutions, and decisions made when designing and implementing the learning scenario in both domains, the virtual world and the agent-based simulation, while comparing the two agent platforms.

  16. Research and Application of Human Capital Strategic Classification Tool: Human Capital Classification Matrix Based on Biological Natural Attribute

    Directory of Open Access Journals (Sweden)

    Yong Liu

    2014-12-01

    Full Text Available In order to study the causes of weak strategic classification management of the human capital structure in China, we analyze the increasing difficulty that enterprises around the world face in human capital management. To provide strategically sound answers, HR managers need critical information provided by the right technology and analytical tools. In this study, there are different types and levels of human capital in formal organization management, which do not contribute equally to a formal organization. An important guarantee for the sustained and healthy development of a formal or informal organization is lower human capital risk. Resisting this risk depends primarily on the hedging and appreciation forces of human capital value, which in turn depend largely on the strategic value performance of senior managers. From the perspective of high-level managers, we also discuss the values, configuration principles, and methods to be followed in strategic human capital classification based on the Boston Consulting Group (BCG) matrix, and build a Human Capital Classification (HCC) matrix based on biological natural attributes to effectively realize strategic classification of the human capital structure.

  17. [ECoG classification based on wavelet variance].

    Science.gov (United States)

    Yan, Shiyu; Liu, Chong; Wang, Hong; Zhao, Haibin

    2013-06-01

    For a typical electrocorticogram (ECoG)-based brain-computer interface (BCI) system, in which the subject's task is to imagine movements of either the left small finger or the tongue, we proposed a feature extraction algorithm using wavelet variance. First, the definition and significance of wavelet variance were presented and taken as the feature, based on a discussion of the wavelet transform. Six channels with the most distinctive features were selected from 64 channels for analysis, and the EEG data were decomposed using the db4 wavelet. The wavelet coefficient variances containing the Mu and Beta rhythms were taken as features based on the ERD/ERS phenomenon. The features were classified linearly with a cross-validation algorithm. Off-line analysis showed that high classification accuracies of 90.24% and 93.77% were achieved for the training and test data sets, respectively; wavelet variance is simple and effective, and is suitable for feature extraction in BCI research. PMID:23865300
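The feature pipeline in this abstract (sub-band decomposition, then variance of the coefficients) can be sketched in a few lines. The illustration below implements a Haar filter by hand as a stand-in for the db4 wavelet used in the paper, and runs it on a synthetic signal; it is a sketch of the idea, not the authors' code.

```python
import numpy as np

def haar_dwt(signal):
    """One level of a Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients. A hand-rolled Haar
    filter is used here as a stand-in for the paper's db4 wavelet.
    """
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:                       # pad to even length
        x = np.append(x, x[-1])
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def wavelet_variance_features(signal, levels=4):
    """Variance of the detail coefficients at each decomposition level."""
    feats = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats.append(np.var(detail))
    return np.array(feats)

rng = np.random.default_rng(0)
ecog = np.sin(np.linspace(0, 20 * np.pi, 1024)) + 0.1 * rng.standard_normal(1024)
print(wavelet_variance_features(ecog))
```

In the paper these per-band variances (for the channels and rhythms of interest) form the feature vector handed to a linear classifier.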

  18. Hyperspectral remote sensing image classification based on decision level fusion

    Institute of Scientific and Technical Information of China (English)

    Peijun Du; Wei Zhang; Junshi Xia

    2011-01-01

    To apply decision level fusion to hyperspectral remote sensing (HRS) image classification, three decision level fusion strategies are experimented on and compared, namely, linear consensus algorithm, improved evidence theory, and the proposed support vector machine (SVM) combiner. To evaluate the effects of the input features on classification performance, four schemes are used to organize input features for member classifiers. In the experiment, using the operational modular imaging spectrometer (OMIS) II HRS image, decision level fusion is shown to be an effective way to improve the classification accuracy of the HRS image, and the proposed SVM combiner is especially suitable for decision level fusion. The results also indicate that optimization of the input features can improve classification performance.
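As a minimal illustration of decision-level fusion, the sketch below combines the hard label outputs of several member classifiers by majority vote. The paper's consensus and SVM-combiner strategies are more elaborate, but they operate on the same kind of per-classifier decisions; the data here are made up.

```python
import numpy as np

def majority_vote_fusion(member_labels):
    """Decision-level fusion by majority vote.

    member_labels: (n_classifiers, n_samples) array of integer class
    labels. Returns the fused label per sample.
    """
    labels = np.asarray(member_labels)
    n_classes = labels.max() + 1
    # Count votes per class for every sample, then pick the winner.
    votes = np.apply_along_axis(np.bincount, 0, labels, minlength=n_classes)
    return votes.argmax(axis=0)

# Three member classifiers disagree on some of five pixels.
members = np.array([
    [0, 1, 2, 1, 0],
    [0, 1, 1, 1, 0],
    [1, 1, 2, 0, 0],
])
print(majority_vote_fusion(members))  # -> [0 1 2 1 0]
```

An SVM combiner replaces the vote counting with an SVM trained on the member classifiers' outputs, which lets the fusion step learn which members to trust for which classes.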

  19. Text Classification Retrieval Based on Complex Network and ICA Algorithm

    Directory of Open Access Journals (Sweden)

    Hongxia Li

    2013-08-01

    Full Text Available With the development of computer science and information technology, libraries are becoming increasingly information-based and networked. The library digitization process converts books into digital information, whose high-quality preservation and management are achieved by computer technology together with text classification techniques, realizing knowledge appreciation. This paper introduces complex network theory into the text classification process and puts forward an ICA semantic clustering algorithm, realizing independent component analysis for complex-network text classification. Through the ICA clustering algorithm, characteristic words of text classes are extracted by clustering, improving the visualization of text retrieval. Finally, we compare a collocation algorithm with the ICA clustering algorithm through text classification and keyword search experiments, and report the clustering degree and accuracy of each algorithm. Simulation analysis shows that the ICA clustering algorithm improves the text classification clustering degree by 1.2% and accuracy by up to 11.1%, improving the efficiency and accuracy of text classification retrieval and providing a theoretical reference for the text retrieval classification of e-books.

  20. Empirical agent-based modelling challenges and solutions

    CERN Document Server

    Barreteau, Olivier

    2014-01-01

    This instructional book showcases techniques to parameterise human agents in empirical agent-based models (ABM). In doing so, it provides a timely overview of key ABM methodologies and the most innovative approaches through a variety of empirical applications.  It features cutting-edge research from leading academics and practitioners, and will provide a guide for characterising and parameterising human agents in empirical ABM.  In order to facilitate learning, this text shares the valuable experiences of other modellers in particular modelling situations. Very little has been published in the area of empirical ABM, and this contributed volume will appeal to graduate-level students and researchers studying simulation modeling in economics, sociology, ecology, and trans-disciplinary studies, such as topics related to sustainability. In a similar vein to the instruction found in a cookbook, this text provides the empirical modeller with a set of 'recipes'  ready to be implemented. Agent-based modeling (AB...

  1. Simulation of convoy of unmanned vehicles using agent based modeling

    Science.gov (United States)

    Sharma, Sharad; Singh, Harpreet; Gerhart, G. R.

    2007-10-01

    There has been increasing interest in unmanned vehicles, given their importance to defense and security. A few models for a convoy of unmanned vehicles exist in the literature. The objective of this paper is to exploit agent-based modeling for a convoy of unmanned vehicles in which each vehicle is an agent. Using this approach, the convoy of vehicles reaches a specified goal from a starting point. Each agent is associated with a number of sensors, and the agents make intelligent decisions based on sensor inputs while maintaining their group capability and behavior. The simulation is done for a battlefield environment from a single starting point to a single goal; the approach can be extended to multiple starting points and multiple goals. The simulation gives the time taken by the convoy to reach a goal from its initial position. In the battlefield environment, commanders make various tactical decisions depending on the location of enemy outposts, minefields, the number of soldiers in platoons, and barriers. The simulation can help the commander make effective decisions, depending on the battlefield, convoy, and obstacles, to reach a particular goal. The paper describes the proposed approach, gives simulation results, and identifies problems for future research in this area.
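The convoy behaviour described above, agents moving from a start point to a goal while maintaining group cohesion, can be sketched with a simple agent class. This is not the paper's model; the movement and cohesion rules below are illustrative assumptions.

```python
import math

class VehicleAgent:
    """One convoy vehicle: steps toward a goal while staying near the group."""

    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, goal, convoy, speed=1.0, cohesion=0.1):
        gx, gy = goal[0] - self.x, goal[1] - self.y
        dist = math.hypot(gx, gy)
        if dist > 0:
            move = min(speed, dist)          # do not overshoot the goal
            self.x += move * gx / dist
            self.y += move * gy / dist
        # Drift toward the convoy centroid to maintain group behaviour.
        cx = sum(v.x for v in convoy) / len(convoy)
        cy = sum(v.y for v in convoy) / len(convoy)
        self.x += cohesion * (cx - self.x)
        self.y += cohesion * (cy - self.y)

def simulate(start_positions, goal, tol=1.0, max_steps=500):
    """Number of steps until every vehicle is within `tol` of the goal."""
    convoy = [VehicleAgent(x, y) for x, y in start_positions]
    for step_count in range(1, max_steps + 1):
        for v in convoy:
            v.step(goal, convoy)
        if all(math.hypot(v.x - goal[0], v.y - goal[1]) <= tol
               for v in convoy):
            return step_count
    return None

result = simulate([(0, 0), (0, 2), (2, 0)], goal=(30, 30))
print(result)
```

The returned step count is the sketch's analogue of the paper's "time taken by the convoy to reach a goal"; obstacles, minefields, and sensor-driven decisions would be added as extra terms in `step`.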

  2. Mobile Arabchat: An Arabic Mobile-Based Conversational Agent

    Directory of Open Access Journals (Sweden)

    Mohammad Hijjawi

    2015-10-01

    Full Text Available The automation and simulation of conversation between a user and a machine has evolved in recent years. A number of research systems known as conversational agents have been developed to address this challenge. A conversational agent is a program that attempts to simulate conversation between human and machine. Few of these programs target mobile users to handle conversations between them and a mobile device through an embodied spoken character. Wireless communication has expanded rapidly with the growth of mobile services. This paper therefore proposes and develops a framework for a mobile-based conversational agent, called Mobile ArabChat, to handle Arabic conversations between Arab users and a mobile device. To the best of our knowledge, no existing applications address this challenge for Arab mobile users. An Android-based application was developed in this paper, and it has been tested and evaluated in a large real environment. Evaluation results show that Mobile ArabChat works properly and that there is a need for such a system for Arab users.

  3. Dynamic Agent Classification and Tracking Using an Ad Hoc Mobile Acoustic Sensor Network

    Science.gov (United States)

    Friedlander, David; Griffin, Christopher; Jacobson, Noah; Phoha, Shashi; Brooks, Richard R.

    2003-12-01

    Autonomous networks of sensor platforms can be designed to interact in dynamic and noisy environments to determine the occurrence of specified transient events that define the dynamic process of interest. For example, a sensor network may be used for battlefield surveillance with the purpose of detecting, identifying, and tracking enemy activity. When the number of nodes is large, human oversight and control of low-level operations is not feasible. Coordination and self-organization of multiple autonomous nodes is necessary to maintain connectivity and sensor coverage and to combine information for better understanding the dynamics of the environment. Resource conservation requires adaptive clustering in the vicinity of the event. This paper presents methods for dynamic distributed signal processing using an ad hoc mobile network of microsensors to detect, identify, and track targets in noisy environments. They seamlessly integrate data from fixed and mobile platforms and dynamically organize platforms into clusters to process local data along the trajectory of the targets. Local analysis of sensor data is used to determine a set of target attribute values and classify the target. Sensor data from a field test in the Marine base at Twentynine Palms, Calif, was analyzed using the techniques described in this paper. The results were compared to "ground truth" data obtained from GPS receivers on the vehicles.

  4. Dynamic Agent Classification and Tracking Using an Ad Hoc Mobile Acoustic Sensor Network

    Directory of Open Access Journals (Sweden)

    Friedlander David

    2003-01-01

    Full Text Available Autonomous networks of sensor platforms can be designed to interact in dynamic and noisy environments to determine the occurrence of specified transient events that define the dynamic process of interest. For example, a sensor network may be used for battlefield surveillance with the purpose of detecting, identifying, and tracking enemy activity. When the number of nodes is large, human oversight and control of low-level operations is not feasible. Coordination and self-organization of multiple autonomous nodes is necessary to maintain connectivity and sensor coverage and to combine information for better understanding the dynamics of the environment. Resource conservation requires adaptive clustering in the vicinity of the event. This paper presents methods for dynamic distributed signal processing using an ad hoc mobile network of microsensors to detect, identify, and track targets in noisy environments. They seamlessly integrate data from fixed and mobile platforms and dynamically organize platforms into clusters to process local data along the trajectory of the targets. Local analysis of sensor data is used to determine a set of target attribute values and classify the target. Sensor data from a field test in the Marine base at Twentynine Palms, Calif, was analyzed using the techniques described in this paper. The results were compared to "ground truth" data obtained from GPS receivers on the vehicles.
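The final step described in both records, mapping a set of locally derived target attribute values to a class, can be illustrated with a nearest-centroid classifier. The attribute names and values below are hypothetical, not from the Twentynine Palms field data.

```python
import numpy as np

def fit_centroids(features, labels):
    """Mean attribute vector per target class (e.g. wheeled vs tracked)."""
    classes = np.unique(labels)
    return classes, np.array([features[labels == c].mean(axis=0)
                              for c in classes])

def classify(sample, classes, centroids):
    """Assign the sample to the class with the nearest centroid."""
    d = np.linalg.norm(centroids - sample, axis=1)
    return classes[d.argmin()]

# Hypothetical acoustic attributes: [peak frequency (Hz), RMS amplitude].
train_x = np.array([[60.0, 0.8], [65.0, 0.9], [200.0, 0.3], [210.0, 0.2]])
train_y = np.array(["tracked", "tracked", "wheeled", "wheeled"])
classes, centroids = fit_centroids(train_x, train_y)
print(classify(np.array([70.0, 0.7]), classes, centroids))  # -> tracked
```

In the networked setting, each cluster of nodes would compute the attribute vector locally along the target's trajectory and only exchange the resulting class label, which is what keeps communication cheap.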

  5. Comprehensive Study on Lexicon-based Ensemble Classification Sentiment Analysis

    Directory of Open Access Journals (Sweden)

    Łukasz Augustyniak

    2015-12-01

    Full Text Available We propose a novel method for counting sentiment orientation that outperforms supervised learning approaches in time and memory complexity and is not statistically significantly different from them in accuracy. Our method consists of a novel approach to generating unigram, bigram and trigram lexicons. The proposed method, called frequentiment, is based on calculating the frequency of features (words in the document and averaging their impact on the sentiment score as opposed to documents that do not contain these features. Afterwards, we use ensemble classification to improve the overall accuracy of the method. What is important is that the frequentiment-based lexicons with sentiment threshold selection outperform other popular lexicons and some supervised learners, while being 3–5 times faster than the supervised approach. We compare 37 methods (lexicons, ensembles with lexicon’s predictions as input and supervised learners applied to 10 Amazon review data sets and provide the first statistical comparison of the sentiment annotation methods that include ensemble approaches. It is one of the most comprehensive comparisons of domain sentiment analysis in the literature.
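A minimal sketch of the lexicon side of this approach: score each unigram by how much more often it appears in positive than in negative documents, then score a new document by averaging its words' lexicon values. This is a simplified stand-in for the paper's frequentiment measure, run on toy data.

```python
from collections import Counter

def build_lexicon(docs, labels):
    """Per-word sentiment score: difference between the word's document
    frequency in positive vs negative documents (a rough, frequency-based
    stand-in for the paper's frequentiment measure)."""
    pos, neg = Counter(), Counter()
    n_pos = sum(1 for y in labels if y > 0) or 1
    n_neg = sum(1 for y in labels if y <= 0) or 1
    for doc, label in zip(docs, labels):
        (pos if label > 0 else neg).update(set(doc.lower().split()))
    vocab = set(pos) | set(neg)
    return {w: pos[w] / n_pos - neg[w] / n_neg for w in vocab}

def score(doc, lexicon):
    """Average lexicon value over the document's words."""
    words = doc.lower().split()
    return sum(lexicon.get(w, 0.0) for w in words) / max(len(words), 1)

docs = ["great battery great screen", "terrible battery awful support",
        "great support", "awful screen"]
labels = [1, -1, 1, -1]
lex = build_lexicon(docs, labels)
print(score("great screen and battery", lex) > score("awful support", lex))
```

The paper then thresholds such scores to assign a sentiment class and feeds the lexicon predictions, alongside others, into an ensemble classifier.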

  6. Classification of Histological Images Based on the Stationary Wavelet Transform

    International Nuclear Information System (INIS)

    Non-Hodgkin lymphomas are of many distinct types, and different classification systems make it difficult to diagnose them correctly. Many of these systems classify lymphomas only based on what they look like under a microscope. In 2008 the World Health Organisation (WHO) introduced the most recent system, which also considers the chromosome features of the lymphoma cells and the presence of certain proteins on their surface. The WHO system is the one that we apply in this work. Herewith we present an automatic method to classify histological images of three types of non-Hodgkin lymphoma. Our method is based on the Stationary Wavelet Transform (SWT), and it consists of three steps: 1) extracting sub-bands from the histological image through SWT, 2) applying Analysis of Variance (ANOVA) to clean noise and select the most relevant information, 3) classifying it by the Support Vector Machine (SVM) algorithm. The kernel types Linear, RBF and Polynomial were evaluated with our method applied to 210 images of lymphoma from the National Institute on Aging. We concluded that the following combination led to the most relevant results: detail sub-band, ANOVA and SVM with Linear and RBF kernels

  7. Ovarian Cancer Classification based on Mass Spectrometry Analysis of Sera

    Directory of Open Access Journals (Sweden)

    Baolin Wu

    2006-01-01

    Full Text Available In our previous study [1], we have compared the performance of a number of widely used discrimination methods for classifying ovarian cancer using Matrix Assisted Laser Desorption Ionization (MALDI mass spectrometry data on serum samples obtained from Reflectron mode. Our results demonstrate good performance with a random forest classifier. In this follow-up study, to improve the molecular classification power of the MALDI platform for ovarian cancer disease, we expanded the mass range of the MS data by adding data acquired in Linear mode and evaluated the resultant decrease in classification error. A general statistical framework is proposed to obtain unbiased classification error estimates and to analyze the effects of sample size and number of selected m/z features on classification errors. We also emphasize the importance of combining biological knowledge and statistical analysis to obtain both biologically and statistically sound results. Our study shows improvement in classification accuracy upon expanding the mass range of the analysis. In order to obtain the best classification accuracies possible, we found that a relatively large training sample size is needed to obviate the sample variations. For the ovarian MS dataset that is the focus of the current study, our results show that approximately 20-40 m/z features are needed to achieve the best classification accuracy from MALDI-MS analysis of sera. Supplementary information can be found at http://bioinformatics.med.yale.edu/proteomics/BioSupp2.html.

  8. An approach for mechanical fault classification based on generalized discriminant analysis

    Institute of Scientific and Technical Information of China (English)

    LI Wei-hua; SHI Tie-lin; YANG Shu-zi

    2006-01-01

    To deal with pattern classification of complicated mechanical faults, an approach to multi-fault classification based on generalized discriminant analysis is presented. Compared with linear discriminant analysis (LDA), generalized discriminant analysis (GDA), a nonlinear discriminant analysis method, is more suitable for linearly non-separable classification problems. The connection and difference between Kernel Principal Component Analysis (KPCA) and GDA are discussed: KPCA is good at detecting machine abnormality, while GDA performs well in multi-fault classification based on collections of historical fault symptoms. When the proposed method is applied to air compressor condition classification and gear fault classification, excellent performance in complicated multi-fault classification is obtained.
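GDA builds on the same kernel machinery as KPCA: form a kernel matrix, centre it, and extract leading eigenvectors. The sketch below shows that shared first step as plain kernel PCA in NumPy; GDA additionally brings class labels into the eigenproblem, which is omitted here.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def kernel_pca(X, n_components=2, gamma=0.5):
    """Project X onto the leading components of the centred RBF kernel."""
    K = rbf_kernel(X, X, gamma)
    n = len(X)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # double centring
    vals, vecs = np.linalg.eigh(Kc)              # eigenvalues ascending
    idx = np.argsort(vals)[::-1][:n_components]  # keep the largest
    alphas = vecs[:, idx] / np.sqrt(np.abs(vals[idx]) + 1e-12)
    return Kc @ alphas

rng = np.random.default_rng(1)
# Two well-separated clusters, e.g. normal vs abnormal machine states.
X = np.vstack([rng.normal(0, 0.3, size=(10, 2)),
               rng.normal(3, 0.3, size=(10, 2))])
Z = kernel_pca(X)
print(Z.shape)  # -> (20, 2)
```

For abnormality detection, a large reconstruction residual in this kernel feature space flags a sample as unusual; GDA instead maximizes between-class versus within-class scatter of the projected points.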

  9. Model-Driven Architecture for Agent-Based Systems

    Science.gov (United States)

    Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.

    2004-01-01

    The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar) is selected for the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared and the relationships of the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain level model specifications in order to generate software artifacts. The software artifacts generation is based on a metamodel. Each component maps to a UML structured component which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.

  10. Web-based supplier relationship framework using agent systems

    Institute of Scientific and Technical Information of China (English)

    Oboulhas Conrad Tsahat Onesime; XU Xiao-fei(徐晓飞); ZHAN De-chen(战德臣)

    2004-01-01

    In order for both manufacturers and suppliers to be profitable in today's highly competitive markets, they must be quick in selecting the best partners, establishing strategic relationships, and collaborating with each other so that they can satisfy changing competitive manufacturing requirements. A web-based supplier relationships (SR) framework is therefore proposed, using multi-agent systems and linear programming techniques to reduce supply cost, increase flexibility, and shorten response time. The web-based SR approach is an ideal platform for information exchange that helps buyers and suppliers maintain the availability of materials in the right quantity, at the right place, and at the right time, and keeps the customer-supplier relationship transparent. A multi-agent system prototype was implemented by simulation, which shows the feasibility of the proposed architecture.

  11. Agent-based modelling of shifting cultivation field patterns, Vietnam

    DEFF Research Database (Denmark)

    Jepsen, Martin Rudbeck; Leisz, S.; Rasmussen, K.;

    2006-01-01

    Shifting cultivation in the Nghe An Province of Vietnam's Northern Mountain Region produces a characteristic land-cover pattern of small and larger fields. The pattern is the result of farmers cultivating either individually or in spatially clustered groups. Using spatially explicit agent-based modelling, and relying on empirical data from fieldwork and observations for parameterization of variables, the level of clustering in agricultural fields observed around a study village is reproduced. Agents in the model act to maximize labour productivity, which is based on potential yield and labour costs associated with fencing of fields, and are faced with physical constraints. The simulation results are compared with land-cover data obtained from remote sensing. Comparisons are made on patterns as detected visually and using the mean nearest-neighbour ratio. Baseline simulation outputs show high...

  12. CORBA-Based Analysis of Multi Agent Behavior

    Institute of Scientific and Technical Information of China (English)

    Swapan Bhattacharya; Anirban Banerjee; Shibdas Bandyopadhyay

    2005-01-01

    An agent is a piece of computer software capable of taking independent action on behalf of its user or owner: an entity with goals, actions, and domain knowledge, situated in an environment. Multiagent systems comprise multiple autonomous, interacting software agents, and such systems can successfully emulate the entities active in a distributed environment. This paper studies the analysis of multiagent behavior based on a specific board game problem similar to the famous game of GO. A framework is developed to define the states of the multiagent entities and to measure convergence metrics for this problem, and the changes of states leading to the goal state are analyzed. We support our study of multiagent behavior with simulations based on a CORBA framework in order to substantiate our findings.

  13. Mobile Agent Based Framework for Integrating Digital Library System

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Few of the current approaches to integrating digital library systems have considered the influence of network factors on the quality of service of the integrated system. For this reason, a mobile agent based framework for integrating digital library systems is proposed. Based on this framework, a prototype system is implemented and its key techniques are described. Compared with current approaches, using mobile agent techniques to integrate digital library systems not only avoids transmitting large amounts of data over the network and lowers the system's dependence on network bandwidth, but also improves the quality of service of the integrated system in intermittent or unreliable network settings.

  14. Propagation Modeling of Food Safety Crisis Information Update Based on the Multi-agent System

    OpenAIRE

    Meihong Wu; Jingfei Yang; Zhiling Hong

    2015-01-01

    This study proposes a new multi-agent system framework based on epistemic default complex adaptive theory and uses agent-based simulation, modeling the information updating process, to study the dissemination of food safety crisis information. We then explore the interaction effects between agents in food safety crisis information dissemination in the current environment, revealing in particular how the government agent, food company agent, and network media agent influence users' confidence in food safety.

  15. DAIDS: a Distributed, Agent-based Information Dissemination System

    Directory of Open Access Journals (Sweden)

    Pete Haglich

    2007-10-01

    Full Text Available The Distributed Agent-Based Information Dissemination System (DAIDS) concept was motivated by the need to share information among the members of a military tactical team under extremely limited or intermittent bandwidth. The DAIDS approach recognizes that in many cases communications limitations will preclude the complete sharing of all tactical information between the members of the tactical team. Communications may be limited by obstructions to the line of sight between platforms, electronic warfare, environmental conditions, or simply contention from other users of the bandwidth. Since a complete information exchange may not be possible, it is important to prioritize transmissions so that the information most critical from the standpoint of the recipient is disseminated first. The challenge is to determine which elements of information are the most important to each teammate. The key innovation of the DAIDS concept is the use of software proxy agents to represent the information needs of the recipient. The DAIDS approach uses these proxy agents to evaluate the content of a message according to the context and information needs of the recipient platform (the agent's principal) and to prioritize the message for dissemination. In our research we implemented this approach and demonstrated that it reduces transmission times for critical tactical reports by up to a factor of 30 under severe bandwidth limitations.
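The proxy-agent prioritization idea can be sketched with a priority queue: a proxy agent scores each message by the recipient's interest profile, and messages are transmitted in score order. The message fields and interest weights below are illustrative assumptions, not DAIDS internals.

```python
import heapq

class RecipientProxyAgent:
    """Stands in for a teammate and scores messages by that teammate's
    information needs (here: a hypothetical weight per report type)."""

    def __init__(self, interest_weights):
        self.interest_weights = interest_weights

    def priority(self, message):
        # Higher interest and fresher reports go first; heapq is a
        # min-heap, so negate the score.
        return -(self.interest_weights.get(message["type"], 0.0)
                 * message["freshness"])

def dissemination_order(messages, proxy):
    """Order messages by the recipient proxy's priority score."""
    heap = [(proxy.priority(m), i, m) for i, m in enumerate(messages)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2]["type"] for _ in range(len(heap))]

proxy = RecipientProxyAgent({"enemy_contact": 1.0,
                             "logistics": 0.2,
                             "weather": 0.1})
messages = [
    {"type": "weather", "freshness": 1.0},
    {"type": "enemy_contact", "freshness": 0.9},
    {"type": "logistics", "freshness": 1.0},
]
print(dissemination_order(messages, proxy))
# -> ['enemy_contact', 'logistics', 'weather']
```

Under bandwidth limits, transmission simply stops partway down this ordering, so the recipient still receives its most critical reports first.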

  16. Endogenizing geopolitical boundaries with agent-based modeling

    Science.gov (United States)

    Cederman, Lars-Erik

    2002-01-01

    Agent-based modeling promises to overcome the reification of actors. Whereas this common, but limiting, assumption makes a lot of sense during periods characterized by stable actor boundaries, other historical junctures, such as the end of the Cold War, exhibit far-reaching and swift transformations of actors' spatial and organizational existence. Moreover, because actors cannot be assumed to remain constant in the long run, analysis of macrohistorical processes virtually always requires “sociational” endogenization. This paper presents a series of computational models, implemented with the software package REPAST, which trace complex macrohistorical transformations of actors, be they hierarchically organized as relational networks or as collections of symbolic categories. With respect to the former, dynamic networks featuring emergent compound actors with agent compartments represented in a spatial grid capture organizational domination of the territorial state. In addition, models of “tagged” social processes allow the analyst to show how democratic states predicate their behavior on categorical traits. Finally, categorical schemata that select out politically relevant cultural traits in ethnic landscapes formalize a constructivist notion of national identity in conformance with the qualitative literature on nationalism. This “finite-agent method”, representing both states and nations as higher-level structures superimposed on a lower-level grid of primitive agents or cultural traits, avoids reification of agency. Furthermore, it opens the door to explicit analysis of entity processes, such as the integration and disintegration of actors as well as boundary transformations. PMID:12011409

  17. Endogenizing geopolitical boundaries with agent-based modeling.

    Science.gov (United States)

    Cederman, Lars-Erik

    2002-05-14

    Agent-based modeling promises to overcome the reification of actors. Whereas this common, but limiting, assumption makes a lot of sense during periods characterized by stable actor boundaries, other historical junctures, such as the end of the Cold War, exhibit far-reaching and swift transformations of actors' spatial and organizational existence. Moreover, because actors cannot be assumed to remain constant in the long run, analysis of macrohistorical processes virtually always requires "sociational" endogenization. This paper presents a series of computational models, implemented with the software package REPAST, which trace complex macrohistorical transformations of actors, be they hierarchically organized as relational networks or as collections of symbolic categories. With respect to the former, dynamic networks featuring emergent compound actors with agent compartments represented in a spatial grid capture organizational domination of the territorial state. In addition, models of "tagged" social processes allow the analyst to show how democratic states predicate their behavior on categorical traits. Finally, categorical schemata that select out politically relevant cultural traits in ethnic landscapes formalize a constructivist notion of national identity in conformance with the qualitative literature on nationalism. This "finite-agent method", representing both states and nations as higher-level structures superimposed on a lower-level grid of primitive agents or cultural traits, avoids reification of agency. Furthermore, it opens the door to explicit analysis of entity processes, such as the integration and disintegration of actors as well as boundary transformations. PMID:12011409

  18. A NEW SVM BASED EMOTIONAL CLASSIFICATION OF IMAGE

    Institute of Scientific and Technical Information of China (English)

    Wang Weining; Yu Yinglin; Zhang Jianchao

    2005-01-01

    We present how high-level emotional representations of art paintings can be inferred from perceptual-level features suited to the particular classes (dynamic vs. static classification). The key points are feature selection and classification. Based on the strong relationship between the notable lines of an image and human sensations, a novel feature vector, WLDLV (Weighted Line Direction-Length Vector), is proposed, which includes both orientation and length information of the lines in an image. Classification is performed by SVM (Support Vector Machine), and images can be classified into dynamic and static. Experimental results demonstrate the effectiveness and superiority of the algorithm.

  19. Different Classification Algorithms Based on Arabic Text Classification: Feature Selection Comparative Study

    Directory of Open Access Journals (Sweden)

    Ghazi Raho

    2015-02-01

    Full Text Available Feature selection is necessary for effective text classification, and dataset preprocessing is essential for sound results and effective performance. This paper investigates the effectiveness of feature selection by comparing the performance of different classifiers in different situations, using feature selection with and without stemming. Evaluation used a BBC Arabic dataset and several classification algorithms: decision tree (DT), K-nearest neighbors (KNN), Naïve Bayes (NB), and Naïve Bayes Multinomial (NBM). The experimental results are presented in terms of precision, recall, F-measure, accuracy, and time to build the model.
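A common filter used before comparing such classifiers is ranking terms by their association with the class label. The sketch below computes a simplified, presence-cells-only chi-square score per term on toy data; it illustrates the idea rather than the paper's exact procedure.

```python
from collections import Counter

def chi2_term_scores(docs, labels):
    """Simplified chi-square score per term (presence cells only): how far
    a term's per-class document frequency deviates from what a
    class-independent term would show."""
    classes = sorted(set(labels))
    n = len(docs)
    df = {c: Counter() for c in classes}      # document frequency per class
    for doc, y in zip(docs, labels):
        df[y].update(set(doc.split()))
    n_c = Counter(labels)
    vocab = set().union(*(set(d.split()) for d in docs))
    scores = {}
    for term in vocab:
        total_df = sum(df[c][term] for c in classes)
        score = 0.0
        for c in classes:
            expected = n_c[c] * total_df / n
            if expected:
                score += (df[c][term] - expected) ** 2 / expected
        scores[term] = score
    return scores

docs = ["goal match striker", "match referee goal",
        "bank loan rate", "loan credit bank"]
labels = ["sport", "sport", "finance", "finance"]
scores = chi2_term_scores(docs, labels)
print(scores["goal"], scores["striker"])  # -> 2.0 1.0
```

Keeping only the top-scored terms shrinks the feature space handed to DT, KNN, or NB; stemming changes the scores by merging term variants before counting.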

  20. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops from different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate its features.

  1. Opinion transmission in organizations: an agent-based modeling approach

    OpenAIRE

    Rouchier, Juliette; Tubaro, Paola; Emery, Cécile

    2014-01-01

    This paper builds a theoretical framework to detect the conditions under which social influence enables persistence of a shared opinion among members of an organization over time, despite membership turnover. It develops agent-based simulations of opinion evolution in an advice network, whereby opinion is defined in the broad sense of shared understandings on a matter that is relevant for an organization’s activities, and on which members have some degree of discretion. We combine a micro-lev...
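A minimal sketch of the kind of dynamics studied here: members repeatedly average opinions with a randomly chosen advisor, while turnover occasionally replaces a member with a newcomer holding a fresh random opinion. The interaction and turnover rules are illustrative assumptions, not the paper's model.

```python
import random

def simulate_opinions(n_members=20, turnover=0.05, steps=200, seed=42):
    """Each step, one member averages its opinion with a randomly chosen
    advisor; with probability `turnover`, a random member is replaced by
    a newcomer holding a fresh random opinion in [0, 1)."""
    rng = random.Random(seed)
    opinions = [rng.random() for _ in range(n_members)]
    for _ in range(steps):
        i, j = rng.sample(range(n_members), 2)   # advisee and advisor
        opinions[i] = (opinions[i] + opinions[j]) / 2
        if rng.random() < turnover:              # membership turnover
            opinions[rng.randrange(n_members)] = rng.random()
    return opinions

ops = simulate_opinions()
spread = max(ops) - min(ops)
print(round(spread, 3))
```

Sweeping `turnover` against the interaction rate is the sketch's analogue of the paper's question: under what conditions does a shared opinion persist despite membership churn?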

  2. From agent-based models to artificial economies

    OpenAIRE

    Teglio, Andrea

    2011-01-01

    The aim of this thesis is to propose and illustrate an alternative approach to economic modeling and policy design that is grounded in the innovative field of agent-based computational economics (ACE). The recent crisis pointed out the fundamental role played by macroeconomic policy design in order to preserve social welfare, and the consequent necessity of understanding the effects of coordinated policy measures on the economic system. Classic approaches to macroeconomic modeling, mainly rep...

  3. Operational-level naval planning using agent-based simulation

    OpenAIRE

    Ercetin, Askin.

    2001-01-01

    This thesis uses agent-based modeling techniques to develop a simulation of the operational-level naval planning process. The simulation serves as an initial exploratory laboratory for analyzing the consequences of the force allocation, force deployment, and force movement decisions made by operational-level naval commanders during times of conflict or crisis. This model will hopefully help decision-makers in gaining insight into the naval planning process and enable them to make more informe...

  4. Agent-based distributed hierarchical control of dc microgrid systems

    DEFF Research Database (Denmark)

    Meng, Lexuan; Vasquez, Juan Carlos; Guerrero, Josep M.;

    2014-01-01

    In order to enable distributed control and management for microgrids, this paper explores the application of information consensus and local decision-making methods, formulating an agent-based distributed hierarchical control system. A droop-controlled paralleled DC/DC converter system is taken as ....... A standard genetic algorithm is applied in each local control system in order to search for a global optimum. Hardware-in-Loop simulation results are shown to demonstrate the effectiveness of the method....
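The information-consensus step mentioned in this record can be sketched as a discrete-time averaging update in which each agent nudges its local estimate toward its neighbours' values. The ring topology, step size, and initial values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Four agents on a ring; each holds a local estimate (e.g., a measured bus quantity).
neighbours = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
x = np.array([48.0, 50.0, 52.0, 46.0])  # initial local estimates (made-up values)
eps = 0.2                                # consensus step size (assumption)

for _ in range(200):
    # Each agent moves toward the sum of differences with its neighbours
    x = x + eps * np.array([sum(x[j] - x[i] for j in neighbours[i]) for i in range(4)])

print(x)  # all entries approach the average, 49.0
```

Because the update matrix is doubly stochastic on a connected graph, the estimates converge to the average of the initial values, which is the property distributed secondary control schemes typically rely on.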

  5. Multispace Behavioral Model for Face-Based Affective Social Agents

    OpenAIRE

    Ali Arya; Steve DiPaola

    2007-01-01

    This paper describes a behavioral model for affective social agents based on three independent but interacting parameter spaces: knowledge, personality, and mood. These spaces control a lower-level geometry space that provides parameters at the facial feature level. Personality and mood use findings in behavioral psychology to relate the perception of personality types and emotional states to the facial actions and expressions through two-dimensional models for personality and emotion. Knowl...

  6. An Agent-based Framework for Speech Investigation

    OpenAIRE

    Walsh, Michael; O'Hare, G.M.P.; Carson-Berndsen, Julie

    2005-01-01

    This paper presents a novel agent-based framework for investigating speech recognition which combines statistical data and explicit phonological knowledge in order to explore strategies aimed at augmenting the performance of automatic speech recognition (ASR) systems. This line of research is motivated by a desire to provide solutions to some of the more notable problems encountered, including in particular the problematic phenomena of coarticulation, underspecified input...

  7. An Agent-Based Dialogical Model with Fuzzy Attitudes

    OpenAIRE

    Piter Dykstra; Wander Jager; Corinna Elsenbroich; Rineke Verbrugge; Gerard Renardel de Lavalette

    2015-01-01

    This paper presents an extension to an agent-based model of opinion dynamics built on dialogical logic DIAL. The extended model tackles a pervasive problem in argumentation logics: the difference between linguistic and logical inconsistency. Using fuzzy logic, the linear ordering of opinions, used in DIAL, is replaced by a set of partial orderings leading to a new, nonstandard notion of consistency as convexity of sets of statements. DIAL allows the modelling of the interplay of social struct...

  8. Online analysis and visualization of agent based models

    OpenAIRE

    Grignard, Arnaud; Drogoul, Alexis; Zucker, Jean-Daniel

    2013-01-01

    Agent-based modeling is used to study many kinds of complex systems in different fields such as biology, ecology, or sociology. Visualization of the execution of such complex systems is crucial to apprehending their dynamics. The ever-increasing complexity of modellers' requirements has highlighted the need for tools more powerful than the existing ones to represent, visualize and interact with a simulation and extract data online to disc...

  9. Agent Based Model of Young Researchers in Higher Education Institutions

    OpenAIRE

    Josip Stepanic; Mirjana Pejic Bach; Josip Kasac

    2013-01-01

    Groups of young researchers in higher education institutions generally perform demanding tasks with a relatively high contribution to their institutions' and societies' innovation production. In order to analyse in more detail the interaction among young researchers and diverse institutions in society, we aim to develop a numerical, agent-based simulation model. This article presents the foundations of the model and preliminary simulation results, along with perspectives for its further deve...

  10. AGENT-BASED NEGOTIATION PLATFORM IN COLLABORATIVE NETWORKED ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Adina-Georgeta CREȚAN

    2014-05-01

    Full Text Available This paper proposes an agent-based platform to model and support parallel and concurrent negotiations among organizations acting in the same industrial market. The underlying complexity is to model the dynamic environment in which multi-attribute and multi-participant negotiations are racing over a set of heterogeneous resources. The metaphor of Interaction Abstract Machines (IAMs) is used to model the parallelism and the non-deterministic aspects of the negotiation processes that occur in a Collaborative Networked Environment.

  11. Deep Learning in Agent-Based Models: A Prospectus

    OpenAIRE

    Hoog, Sander van der

    2016-01-01

    A very timely issue for economic agent-based models (ABMs) is their empirical estimation. This paper describes a line of research that could resolve the issue by using machine learning techniques based on multi-layer artificial neural networks (ANNs), so-called Deep Nets. The seminal contribution by Hinton et al. (2006) introduced a fast and efficient training algorithm called Deep Learning, and there have been major breakthroughs in machine learning ever since. Economics has not yet benefit...

  12. Investigating the feasibility of a BCI-driven robot-based writing agent for handicapped individuals

    Science.gov (United States)

    Syan, Chanan S.; Harnarinesingh, Randy E. S.; Beharry, Rishi

    2014-07-01

    Brain-Computer Interfaces (BCIs) predominantly employ output actuators such as virtual keyboards and wheelchair controllers to enable handicapped individuals to interact and communicate with their environment. However, BCI-based assistive technologies are limited in their application. There is minimal research geared towards granting disabled individuals the ability to communicate using written words. This is a drawback because involving a human attendant in writing tasks can entail a breach of personal privacy where the task involves sensitive and private information such as banking matters. BCI-driven robot-based writing, however, can provide a safeguard for user privacy where it is required. This study investigated the feasibility of a BCI-driven writing agent using the 3-degree-of-freedom Phantom Omnibot. A full alphanumerical English character set was developed and validated using a teach pendant program in MATLAB. The Omnibot was subsequently interfaced to a P300-based BCI. Three subjects utilised the BCI in the online context to communicate words to the writing robot over a Local Area Network (LAN). The average online letter-wise classification accuracy was 91.43%. The writing agent legibly constructed the communicated letters with minor errors in trajectory execution. The developed system therefore provided a feasible platform for BCI-based writing.

  13. Improving Sparse Representation-Based Classification Using Local Principal Component Analysis

    OpenAIRE

    Weaver, Chelsea; Saito, Naoki

    2016-01-01

    Sparse representation-based classification (SRC), proposed by Wright et al., seeks the sparsest decomposition of a test sample over the dictionary of training samples, with classification to the most-contributing class. Because it assumes test samples can be written as linear combinations of their same-class training samples, the success of SRC depends on the size and representativeness of the training set. Our proposed classification algorithm enlarges the training set by using local princip...
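The decomposition idea behind SRC can be sketched in a simplified form: instead of one global sparse (l1-regularised) decomposition over all training samples, the toy below solves an ordinary least-squares fit per class and assigns the test sample to the class with the smallest reconstruction residual. The synthetic subspace data is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def class_residual_classify(y, dictionaries):
    """Assign y to the class whose training-sample span reconstructs it best.

    A simplification of SRC: per-class least squares replaces the global
    sparse decomposition, but the residual-based decision rule is the same.
    """
    best_class, best_res = None, np.inf
    for label, A in dictionaries.items():
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        res = np.linalg.norm(y - A @ coef)
        if res < best_res:
            best_class, best_res = label, res
    return best_class

# Two classes living near different 2-D subspaces of R^10 (synthetic data)
basis0 = rng.normal(size=(10, 2))
basis1 = rng.normal(size=(10, 2))
dicts = {0: basis0 @ rng.normal(size=(2, 5)),   # 5 training samples per class
         1: basis1 @ rng.normal(size=(2, 5))}

test_sample = basis1 @ np.array([1.0, -0.5])    # lies in class 1's subspace
print(class_residual_classify(test_sample, dicts))  # → 1
```

The record's point about training-set size is visible here too: if a class's dictionary does not span the subspace its test samples come from, the residual rule fails, which motivates the proposed local-PCA enlargement.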

  14. Analysis on Design of Kohonen-network System Based on Classification of Complex Signals

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The key methods of detection and classification of the electroencephalogram (EEG) used in recent years are introduced. Taking EEG as an example, a design plan for a Kohonen neural network system based on the detection and classification of complex signals is proposed, and both the network design and the signal processing are analyzed, including pre-processing of signals, extraction of signal features, classification of signals, and network topology.
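The Kohonen (self-organising map) classification stage described above can be sketched as follows. This is a generic 1-D SOM on made-up two-dimensional "signal feature" vectors, not the paper's EEG pipeline; grid size, learning-rate schedule and data are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, n_units=10, epochs=50, lr0=0.5, sigma0=3.0):
    """Train a 1-D Kohonen map: move the best-matching unit (and, weighted by a
    Gaussian neighbourhood on the grid, its neighbours) toward each sample."""
    w = rng.normal(size=(n_units, data.shape[1]))
    grid = np.arange(n_units)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5  # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(w - x, axis=1))
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            w += lr * h[:, None] * (x - w)
    return w

# Two well-separated clusters of feature vectors (stand-ins for two signal classes)
data = np.vstack([rng.normal(0, 0.1, size=(30, 2)),
                  rng.normal(5, 0.1, size=(30, 2))])
w = train_som(data)
bmu_a = np.argmin(np.linalg.norm(w - np.array([0.0, 0.0]), axis=1))
bmu_b = np.argmin(np.linalg.norm(w - np.array([5.0, 5.0]), axis=1))
print(bmu_a, bmu_b)  # the two classes should map to different map units
```

After training, labelling each map unit by the majority class of the samples it wins turns the unsupervised map into a classifier, which is the usual way a Kohonen network is used for signal classification.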

  15. Assessing the Performance of a Classification-Based Vulnerability Analysis Model

    OpenAIRE

    Wang, Tai-Ran; Mousseau, Vincent; Pedroni, Nicola; Zio, Enrico

    2015-01-01

    In this article, a classification model based on the majority rule sorting (MR-Sort) method is employed to evaluate the vulnerability of safety-critical systems with respect to malevolent intentional acts. The model is built on the basis of a (limited-size) set of data representing (a priori known) vulnerability classification examples. The empirical construction of the classification model introduces a source of uncertainty into the vulnerability analysis process: a quantitative assessment ...

  16. Human Cancer Classification: A Systems Biology- Based Model Integrating Morphology, Cancer Stem Cells, Proteomics, and Genomics

    OpenAIRE

    Halliday A Idikio

    2011-01-01

    Human cancer classification is currently based on the idea of the cell of origin and the light and electron microscopic attributes of the cancer. What is not yet integrated into cancer classification are the functional attributes of these cancer cells. Recent innovative techniques in biology have provided a wealth of information on the genomic, transcriptomic and proteomic changes in cancer cells. The emergence of the concept of cancer stem cells needs to be included in a classification model to capture...

  17. Comparing Machine Learning Classifiers for Object-Based Land Cover Classification Using Very High Resolution Imagery

    OpenAIRE

    Yuguo Qian; Weiqi Zhou; Jingli Yan; Weifeng Li; Lijian Han

    2014-01-01

    This study evaluates and compares the performance of four machine learning classifiers—support vector machine (SVM), normal Bayes (NB), classification and regression tree (CART) and K nearest neighbor (KNN)—to classify very high resolution images, using an object-based classification procedure. In particular, we investigated how tuning parameters affect the classification accuracy with different training sample sizes. We found that: (1) SVM and NB were superior to CART and KNN, and both could...

  18. Trace elements based classification on clinkers. Application to Spanish clinkers

    Directory of Open Access Journals (Sweden)

    Tamás, F. D.

    2001-12-01

    Full Text Available The qualitative identification procedure used to determine the origin (i.e., manufacturing factory) of Spanish clinkers is described. The classification of clinkers produced in different factories can be based on their trace element content. Approximately fifteen clinker types, collected from 11 Spanish cement factories, were analysed to determine their Mg, Sr, Ba, Mn, Ti, Zr, Zn and V content. An expert system formulated as a binary decision tree was designed based on the collected data. The performance of the obtained classifier was measured by ten-fold cross-validation. The results show that the proposed method yields an easy-to-use expert system that is able to determine the origin of a clinker based on its trace element content.

    This paper describes the qualitative identification procedure for determining the origin (factory) of Spanish clinkers. The classification of the clinkers is based on their trace element content. Fifteen different clinkers from 11 Spanish cement factories were analysed, determining their Mg, Sr, Ba, Mn, Ti, Zr, Zn and V contents. An expert system based on a binary decision tree was designed from the collected data. The resulting classification was evaluated by ten-fold cross-validation. The results show that the proposed model serves to identify an easy-to-use expert system capable of determining the origin of a clinker from its trace element content.
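The decision-tree-plus-ten-fold-cross-validation protocol in this record can be sketched with a depth-1 tree (a decision stump) on synthetic two-factory data. The element concentrations below are invented for illustration; the paper's tree is deeper and uses eight elements.

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_stump(X, y):
    """Depth-1 decision tree: pick the (feature, threshold, labels) split that
    minimises training misclassifications."""
    best = (0, 0.0, 0, 1, np.inf)
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            left, right = y[X[:, f] <= thr], y[X[:, f] > thr]
            for la, lb in ((0, 1), (1, 0)):
                err = np.sum(left != la) + np.sum(right != lb)
                if err < best[4]:
                    best = (f, thr, la, lb, err)
    return best[:4]

def predict_stump(model, X):
    f, thr, la, lb = model
    return np.where(X[:, f] <= thr, la, lb)

# Synthetic two-factory data: factory 1 clinkers carry more of "element 0"
# (element identities and concentration values are made up)
X = np.vstack([rng.normal([10, 50], [1, 10], size=(50, 2)),
               rng.normal([20, 50], [1, 10], size=(50, 2))])
y = np.repeat([0, 1], 50)

# Ten-fold cross-validation, as in the record
idx = rng.permutation(100)
accs = []
for k in range(10):
    test = idx[k * 10:(k + 1) * 10]
    train = np.setdiff1d(idx, test)
    model = fit_stump(X[train], y[train])
    accs.append(float(np.mean(predict_stump(model, X[test]) == y[test])))
print(round(float(np.mean(accs)), 2))  # high accuracy on well-separated factories
```

Averaging the ten held-out accuracies is exactly the performance measure the record reports for the clinker classifier.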

  19. China's Classification-Based Forest Management: Procedures, Problems, and Prospects

    Science.gov (United States)

    Dai, Limin; Zhao, Fuqiang; Shao, Guofan; Zhou, Li; Tang, Lina

    2009-06-01

    China’s new Classification-Based Forest Management (CFM) is a two-class system, including Commodity Forest (CoF) and Ecological Welfare Forest (EWF) lands, so named according to differences in their distinct functions and services. The purposes of CFM are to improve forestry economic systems, strengthen resource management in a market economy, ease the conflicts between wood demands and public welfare, and meet the diversified needs for forest services in China. The formative process of China’s CFM has involved a series of trials and revisions. China’s central government accelerated the reform of CFM in the year 2000 and completed the final version in 2003. CFM was implemented at the provincial level with the aid of subsidies from the central government. About a quarter of the forestland in China was approved as National EWF lands by the State Forestry Administration in 2006 and 2007. Logging is prohibited on National EWF lands, and their landowners or managers receive subsidies of about 70 RMB (US$10) per hectare from the central government. CFM represents a new forestry strategy in China and its implementation inevitably faces challenges in promoting the understanding of forest ecological services, generalizing nationwide criteria for identifying EWF and CoF lands, setting up forest-specific compensation mechanisms for ecological benefits, enhancing the knowledge of administrators and the general public about CFM, and sustaining EWF lands under China’s current forestland tenure system. CFM does, however, offer a viable pathway toward sustainable forest management in China.

  20. Neural Network based Vehicle Classification for Intelligent Traffic Control

    Directory of Open Access Journals (Sweden)

    Saeid Fazli

    2012-06-01

    Full Text Available Nowadays, the number of vehicles has increased and traditional traffic control systems cannot meet current needs, which has led to the emergence of Intelligent Traffic Control Systems. These improve urban management and control and increase the confidence index on roads and highways. The goal of this article is vehicle classification based on neural networks. In this research, a stationary camera located fairly close to the road surface is used to detect and classify the vehicles. The algorithm used comprises two general phases: in the first, moving vehicles are obtained from the traffic scene using image processing techniques, including background removal, edge detection and morphology operations. In the second phase, vehicles near the camera are selected and their specific features are processed and extracted. These features are applied to the neural network as a vector, and the outputs determine the type of vehicle. The presented model is able to classify vehicles into three classes: heavy vehicles, light vehicles and motorcycles. Results demonstrate the accuracy of the algorithm and its high functional level.
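The two-phase pipeline above can be sketched in miniature: background subtraction isolates the moving blob, and a simple size feature is fed to a classifier. Everything here is an illustrative assumption: the frames are synthetic, blob pixel count stands in for the paper's feature vector, and a nearest-mean rule stands in for the neural network.

```python
import numpy as np

def extract_blob_area(frame, background, thresh=30):
    """Phase 1 (sketch): background subtraction + thresholding; the blob's
    pixel count serves as a crude size feature."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    return int(np.sum(diff > thresh))

# Nearest-mean rule stands in for the paper's neural network; the class
# mean areas are made-up values.
CLASS_MEANS = {"motorcycle": 40, "light": 200, "heavy": 600}

def classify(area):
    return min(CLASS_MEANS, key=lambda c: abs(CLASS_MEANS[c] - area))

background = np.zeros((40, 40), dtype=np.uint8)
frame = background.copy()
frame[10:30, 5:15] = 255          # a 20x10 = 200-pixel synthetic "vehicle"
area = extract_blob_area(frame, background)
print(classify(area))  # → light
```

In a real system the feature vector would also include shape and edge descriptors, and the nearest-mean rule would be replaced by the trained network described in the record.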

  1. Brazilian Cardiorespiratory Fitness Classification Based on Maximum Oxygen Consumption

    Science.gov (United States)

    Herdy, Artur Haddad; Caixeta, Ananda

    2016-01-01

    Background Cardiopulmonary exercise testing (CPET) is the most complete tool available to assess functional aerobic capacity (FAC). Maximum oxygen consumption (VO2 max), an important biomarker, reflects the real FAC. Objective To develop a cardiorespiratory fitness (CRF) classification based on VO2 max in a Brazilian sample of healthy and physically active individuals of both sexes. Methods We selected 2837 CPETs from 2837 individuals aged 15 to 74 years, distributed as follows: G1 (15 to 24); G2 (25 to 34); G3 (35 to 44); G4 (45 to 54); G5 (55 to 64) and G6 (65 to 74). Good CRF was defined as the mean VO2 max obtained for each group, generating a subclassification ranging from Very Low (VL) up to values above 105% of the group mean. Results Mean VO2 max (mL/kg/min) by group: Men: G1 53.13; G2 49.77; G3 47.67; G4 42.52; G5 37.06; G6 31.50. Women: G1 40.85; G2 40.01; G3 34.09; G4 32.66; G5 30.04; G6 26.36. Conclusions This chart stratifies VO2 max measured on a treadmill in a robust Brazilian sample and can be used as an alternative for the real functional evaluation of physically active and healthy individuals stratified by age and sex. PMID:27305285
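Using the chart amounts to expressing a measured VO2 max as a percentage of the mean for the subject's age/sex group. The sketch below encodes the group means from the results above; the intermediate subcategory cutoffs are not reproduced in this record (only the 105% upper bound is), so the function returns the percentage rather than a category label.

```python
# Mean ("Good CRF") VO2 max in mL/kg/min per age group, from the record's results
MEAN_VO2 = {
    "men":   {"G1": 53.13, "G2": 49.77, "G3": 47.67, "G4": 42.52, "G5": 37.06, "G6": 31.50},
    "women": {"G1": 40.85, "G2": 40.01, "G3": 34.09, "G4": 32.66, "G5": 30.04, "G6": 26.36},
}

def age_group(age):
    # G1 covers 15-24, G2 25-34, ... G6 65-74
    if not 15 <= age <= 74:
        raise ValueError("chart covers ages 15-74")
    return f"G{min((age - 15) // 10 + 1, 6)}"

def percent_of_group_mean(vo2max, age, sex):
    """Express a measured VO2 max as a percentage of the subject's group mean;
    values above 105% fall in the chart's top category."""
    return 100.0 * vo2max / MEAN_VO2[sex][age_group(age)]

print(round(percent_of_group_mean(53.13, 20, "men"), 1))  # → 100.0
```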

  2. Hadoop-based Multi-classification Fusion for Intrusion Detection

    OpenAIRE

    Xun-Yi Ren; Yu-Zhu Qi

    2013-01-01

    Intrusion detection systems are among the most important security technologies in computer networks; currently, the clustering and classification techniques of data mining are often used to build detection models. However, each classifier and clustering method has its own advantages and disadvantages, and the detection results of any single model are often not ideal. Cloud computing, which can integrate multiple inexpensive computing nodes into a distributed system with a stro...
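The fusion idea in this record, combining several imperfect detectors so their errors cancel, can be sketched as a majority vote over base-classifier labels. The tie-breaking rule toward "anomalous" is a design choice made here for illustration, not something stated in the record.

```python
from collections import Counter

def majority_vote(predictions):
    """Fuse the labels emitted by several base detectors; ties go to
    'anomalous' to keep the fused detector conservative (a design choice)."""
    top = Counter(predictions).most_common()
    if len(top) > 1 and top[0][1] == top[1][1]:
        return "anomalous"
    return top[0][0]

# Three hypothetical base detectors disagreeing on one connection record
votes = ["normal", "anomalous", "anomalous"]
print(majority_vote(votes))  # → anomalous
```

In the Hadoop setting the record describes, each base model would run on a different node and the vote would be the reduce step.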

  3. Egocentric visual event classification with location-based priors

    OpenAIRE

    Sundaram, Sudeep; Mayol-Cuevas, Walterio

    2010-01-01

    We present a method for visual classification of actions and events captured from an egocentric point of view. The method tackles the challenge of a moving camera by creating deformable graph models for classification of actions. Action models are learned from low resolution, roughly stabilized difference images acquired using a single monocular camera. In parallel, raw images from the camera are used to estimate the user's location using a visual Simultaneous Localization and Mapping (SLAM) ...

  4. Basic Hand Gestures Classification Based on Surface Electromyography

    OpenAIRE

    Aleksander Palkowski; Grzegorz Redlarski

    2016-01-01

    This paper presents an innovative classification system for hand gestures using 2-channel surface electromyography analysis. The system developed uses the Support Vector Machine classifier, for which the kernel function and parameter optimisation are conducted additionally by the Cuckoo Search swarm algorithm. The system developed is compared with standard Support Vector Machine classifiers with various kernel functions. The average classification rate of 98.12% has been achieved for the prop...
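The kernel-and-parameter optimisation described above can be sketched with scikit-learn, with two hedges: the features are synthetic stand-ins for 2-channel sEMG descriptors (e.g., mean absolute value per channel), and an exhaustive grid search replaces the paper's Cuckoo Search swarm optimisation.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for 2-channel sEMG features of two gesture classes
X = np.vstack([rng.normal([0.2, 0.8], 0.1, size=(40, 2)),   # gesture A
               rng.normal([0.8, 0.2], 0.1, size=(40, 2))])  # gesture B
y = np.repeat([0, 1], 40)

# Grid search over kernel and parameters stands in for Cuckoo Search
grid = GridSearchCV(SVC(),
                    {"kernel": ["linear", "rbf"],
                     "C": [0.1, 1, 10],
                     "gamma": ["scale", 1.0]},
                    cv=5)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```

The cross-validated `best_score_` plays the role of the average classification rate the record reports; a swarm optimiser would simply explore the same parameter space more cleverly than the exhaustive grid.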

  5. Consistent image-based measurement and classification of skin color

    OpenAIRE

    Harville, Michael; Baker, Harlyn; Bhatti, Nina; Süsstrunk, Sabine

    2005-01-01

    Little prior image processing work has addressed estimation and classification of skin color in a manner that is independent of camera and illuminant. To this end, we first present new methods for 1) fast, easy-to-use image color correction, with specialization toward skin tones, and 2) fully automated estimation of facial skin color, with robustness to shadows, specularities, and blemishes. Each of these is validated independently against ground truth, and then combined with a classification...

  6. Text Classification Retrieval Based on Complex Network and ICA Algorithm

    OpenAIRE

    Hongxia Li

    2013-01-01

    With the development of computer science and information technology, libraries are becoming increasingly digital and networked. The library digitization process converts books into digital information, whose high-quality preservation and management are achieved by computer technology together with text classification techniques, realizing knowledge appreciation. This paper introduces complex network theory into the text classification process and puts forward an ICA semantic clustering algorithm. It...

  7. Texture Features based Blur Classification in Barcode Images

    OpenAIRE

    Shamik Tiwari; Vidya Prasad Shukla; Sangappa Birada; Ajay Singh

    2013-01-01

    Blur is an undesirable phenomenon which appears as image degradation. Blur classification is extremely desirable before applying any blur-parameter estimation approach in the blind restoration of barcode images. A novel approach to classifying blur as motion, defocus, or a co-existence of both categories is presented in this paper. The key idea involves statistical feature extraction of the blur pattern in the frequency domain and the design of a blur classification system with a feed-forward ...
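The motion-vs-defocus distinction exploited above can be sketched with a simplified proxy: horizontal motion blur suppresses gradients only along the blur direction, while defocus blur suppresses them isotropically. The gradient-anisotropy feature and fixed threshold below are illustrative stand-ins for the paper's frequency-domain statistics and feed-forward network.

```python
import numpy as np

rng = np.random.default_rng(3)

def motion_blur(img, k=9):
    """Horizontal 1-D averaging (motion blur along x)."""
    kernel = np.ones(k) / k
    return np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)

def defocus_blur(img, k=9):
    """Separable box blur along both axes (a crude defocus stand-in)."""
    return motion_blur(motion_blur(img, k).T, k).T

def anisotropy(img):
    """Mean |horizontal gradient| over mean |vertical gradient|: near 1 for
    isotropic (defocus) blur, well below 1 for horizontal motion blur."""
    gx = np.abs(np.diff(img, axis=1)).mean()
    gy = np.abs(np.diff(img, axis=0)).mean()
    return gx / gy

def classify_blur(img, low=0.7):
    # Threshold rule stands in for the paper's feed-forward classifier
    return "motion" if anisotropy(img) < low else "defocus"

img = rng.random((64, 64))  # synthetic texture standing in for a barcode image
print(classify_blur(motion_blur(img)), classify_blur(defocus_blur(img)))  # → motion defocus
```

A fuller implementation would compute the features from the FFT magnitude spectrum, where motion blur leaves characteristic directional ripples, and train a network on them as the record describes.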

  8. IMPROVEMENT OF TCAM-BASED PACKET CLASSIFICATION ALGORITHM

    Institute of Scientific and Technical Information of China (English)

    Xu Zhen; Zhang Jun; Rui Liyang; Sun Jun

    2008-01-01

    The features of Ternary Content Addressable Memories (TCAMs) make them particularly attractive for IP address lookup and packet classification applications in router systems. However, the limitations of TCAMs impede their utilization. In this paper, solutions for decreasing power consumption and avoiding entry expansion in range matching are addressed. Experimental results demonstrate that the proposed techniques can considerably improve the performance of TCAMs in IP address lookup and packet classification.

  9. Basic Hand Gestures Classification Based on Surface Electromyography.

    Science.gov (United States)

    Palkowski, Aleksander; Redlarski, Grzegorz

    2016-01-01

    This paper presents an innovative classification system for hand gestures using 2-channel surface electromyography analysis. The system developed uses the Support Vector Machine classifier, for which the kernel function and parameter optimisation are conducted additionally by the Cuckoo Search swarm algorithm. The system developed is compared with standard Support Vector Machine classifiers with various kernel functions. The average classification rate of 98.12% has been achieved for the proposed method. PMID:27298630

  10. Basic Hand Gestures Classification Based on Surface Electromyography

    Directory of Open Access Journals (Sweden)

    Aleksander Palkowski

    2016-01-01

    Full Text Available This paper presents an innovative classification system for hand gestures using 2-channel surface electromyography analysis. The system developed uses the Support Vector Machine classifier, for which the kernel function and parameter optimisation are conducted additionally by the Cuckoo Search swarm algorithm. The system developed is compared with standard Support Vector Machine classifiers with various kernel functions. The average classification rate of 98.12% has been achieved for the proposed method.

  11. Basic Hand Gestures Classification Based on Surface Electromyography

    Science.gov (United States)

    Palkowski, Aleksander; Redlarski, Grzegorz

    2016-01-01

    This paper presents an innovative classification system for hand gestures using 2-channel surface electromyography analysis. The system developed uses the Support Vector Machine classifier, for which the kernel function and parameter optimisation are conducted additionally by the Cuckoo Search swarm algorithm. The system developed is compared with standard Support Vector Machine classifiers with various kernel functions. The average classification rate of 98.12% has been achieved for the proposed method. PMID:27298630

  12. Graphene-based nanomaterials as molecular imaging agents.

    Science.gov (United States)

    Garg, Bhaskar; Sung, Chu-Hsun; Ling, Yong-Chien

    2015-01-01

    Molecular imaging (MI) is the noninvasive, real-time visualization of biochemical events at the cellular and molecular level within tissues, living cells, and/or intact objects, which can be advantageously applied in diagnostics, therapeutics, and drug discovery and development, and in understanding nanoscale reactions including enzymatic conversions and protein-protein interactions. Consequently, over the years, great advances have been made in the development of a variety of MI agents such as peptides, aptamers, antibodies, and various nanomaterials (NMs) including single-walled carbon nanotubes. Recently, graphene, a material popularized by Geim & Novoselov, has ignited considerable research efforts to rationally design and execute a wide range of graphene-based NMs, making them an attractive platform for developing highly sensitive MI agents. Owing to their exceptional physicochemical and biological properties combined with desirable surface engineering, graphene-based NMs offer stable and tunable visible emission, small hydrodynamic size, low toxicity, and high biocompatibility, and thus have been explored for in vitro and in vivo imaging applications as a promising alternative to traditional imaging agents. This review begins by describing the intrinsic properties of graphene and the key MI modalities. We then provide an overview of the recent advances in the design and development, as well as the physicochemical properties, of the different classes of graphene-based NMs (graphene-dye conjugates, graphene-antibody conjugates, graphene-nanoparticle composites, and graphene quantum dots) being used as MI agents for potential applications including theranostics. Finally, the major challenges and future directions in the field will be discussed. PMID:25857851

  13. Formalizing argument-based agent interaction in electronic institutions

    OpenAIRE

    Chesñevar, Carlos Iván

    2001-01-01

    During the last decade the notion of an agent has gained acceptance within the AI community, mainly due to its adequacy for formalizing complex environments. Agents can be thought of as active software objects, which may be autonomous and able to perceive, reason, act, and interact with other agents. When agents interact with each other, a multi-agent system (MAS) arises.

  14. Markov chain aggregation for agent-based models

    CERN Document Server

    Banisch, Sven

    2016-01-01

    This self-contained text develops a Markov chain approach that makes possible the rigorous analysis of a class of microscopic models specifying the dynamics of complex systems at the individual level. It presents a general framework of aggregation in agent-based and related computational models, one which makes use of lumpability and information theory in order to link the micro and macro levels of observation. The starting point is a microscopic Markov chain description of the dynamical process in complete correspondence with the dynamical behavior of the agent-based model (ABM), obtained by considering the set of all possible agent configurations as the state space of a huge Markov chain. An explicit formal representation of the resulting “micro-chain”, including microscopic transition rates, is derived for a class of models by using the random mapping representation of a Markov process. The type of probability distribution used to implement the stochastic part of the model, which defines the upd...

  15. A Multi Agent Based Model for Airport Service Planning

    Directory of Open Access Journals (Sweden)

    W.H. Ip

    2010-09-01

    Full Text Available The aviation industry is highly dynamic and demanding in nature: time and safety are the two most important factors, while one of the major sources of delay is aircraft on the ground, owing to the complexity of the operations, the large amount of machinery such as vehicles, and the extensive communication involved. As one of the aircraft ground service providers at Hong Kong International Airport (HKIA), China Aircraft Services Limited (CASL) aims to increase competitiveness by improving the services it provides while also minimizing cost. One way is to optimize the number of maintenance vehicles allocated in order to minimize both the chance of delay and operating costs. In this paper, an agent-based model is proposed to support decision making in vehicle allocation. An overview of aircraft ground service procedures is first given, together with different optimization methods suggested by researchers. The agent-based approach is then introduced and, in the latter part of the paper, a multi-agent system is built and proposed to support CASL's decisions in optimizing the allocation of maintenance vehicles. The application provides flexibility for inputting the numbers of different kinds of vehicles, the simulation duration and the aircraft arrival rate in order to simulate different scenarios that occur at HKIA.

  16. Analysis of phase transition points for a two-color agent-based model

    Science.gov (United States)

    Shin, J. K.; Jung, P. S.

    2013-04-01

    The agent-based model treated in the present study describes the dynamics of two types of population in a gravity-like potential field. In previous studies, the model was known to exhibit various spatiotemporal patterns on two-dimensional lattice systems. However, the patterns were classified purely on the basis of eye observation, and the underlying dynamics of these patterns were not fully explored. It remained an open question whether these eye-observation-based classifications could be confirmed by any analytical means. To pursue the question, we first suggest several analytic quantities, such as convergence time steps and reaction speed, to replace the eye observations. As a result, we show that a phase diagram can be reasonably drawn on the contour diagram of the time steps. In addition, we find a power-law scaling in the reaction speed, confirming that a phase transition really is involved. Next, as the main part of the present study, we apply analytical methods to calculate two important phase transition points of the system. The results from the analytical approach agree well with the numerically obtained phase transition points from the agent-based model. In general, the paper serves as an example study of estimating global phenomena of complex systems in terms of local parameters of the system.

  17. Statistical Agent Based Modelization of the Phenomenon of Drug Abuse

    Science.gov (United States)

    di Clemente, Riccardo; Pietronero, Luciano

    2012-07-01

    We introduce a statistical agent-based model to describe the phenomenon of drug abuse and its dynamical evolution at the individual and global level. The agents are heterogeneous with respect to their intrinsic inclination to drugs, their budget attitude and their social environment. The various levels of drug use were inspired by the professional description of the phenomenon, and this permits a direct comparison with all available data. We show that certain elements have great importance in starting drug use, for example rare events in personal experience which permit one to occasionally overcome the barrier to drug use. The analysis of how the system reacts to perturbations is very important to understand its key elements, and it provides strategies for effective policy making. The present model represents the first step of a realistic description of this phenomenon and can be easily generalized in various directions.

  18. Statistical Agent Based Modelization of the Phenomenon of Drug Abuse

    CERN Document Server

    Di Clemente, Riccardo; 10.1038/srep00532

    2012-01-01

    We introduce a statistical agent-based model to describe the phenomenon of drug abuse and its dynamical evolution at the individual and global level. The agents are heterogeneous with respect to their intrinsic inclination to drugs, their budget attitude and their social environment. The various levels of drug use were inspired by the professional description of the phenomenon, which permits a direct comparison with all available data. We show that certain elements are very important in starting drug use, for example rare events in personal experience that allow the barrier to occasional drug use to be overcome. Analysing how the system reacts to perturbations is very important for understanding its key elements, and it provides strategies for effective policy making. The present model represents the first step towards a realistic description of this phenomenon and can easily be generalized in various directions.

  19. Building Distributed Web GIS: A Mobile-Agent Based Approach

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The diversity of GISs and the wide-spread availability of the WWW have led to an increasing amount of research on integrating a variety of heterogeneous and autonomous GISs in a cooperative environment, to construct a new generation of GIS characterized by open architecture, distributed computation, interoperability, and extensibility. Our ongoing research project MADGIS (Mobile Agent based Distributed Geographic Information System) is reported, in which we propose the architecture of MADGIS to meet the requirements of integrating distributed GIS applications in an Internet environment. We first describe the architecture of MADGIS, with detailed discussions focusing on the structure of the client site, server site and mobile agent in MADGIS. Then we explore key techniques for MADGIS implementation.

  20. Small Antimicrobial Agents Based on Acylated Reduced Amide Scaffold.

    Science.gov (United States)

    Teng, Peng; Huo, Da; Nimmagadda, Alekhya; Wu, Jianfeng; She, Fengyu; Su, Ma; Lin, Xiaoyang; Yan, Jiyu; Cao, Annie; Xi, Chuanwu; Hu, Yong; Cai, Jianfeng

    2016-09-01

    Prevalence of drug-resistant bacteria has emerged as one of the greatest threats of the 21st century. Herein, we report the development of a series of small-molecule antibacterial agents based on the acylated reduced amide scaffold. These molecules display good potency against a panel of multidrug-resistant Gram-positive and Gram-negative bacterial strains. Meanwhile, they also effectively inhibit biofilm formation. Mechanistic studies suggest that these compounds kill bacteria by compromising bacterial membranes, a mechanism analogous to that of host-defense peptides (HDPs). The mechanism is further supported by the fact that the lead compounds do not induce resistance in MRSA even after 14 passages. Lastly, we also demonstrate that these molecules have therapeutic potential by preventing inflammation caused by MRSA-induced pneumonia in a rat model. This class of compounds could lead to an appealing class of antibiotic agents for combating drug-resistant bacterial strains. PMID:27526720

  1. Design of an Agent Based Traffic-Information Collection System

    Directory of Open Access Journals (Sweden)

    Kenedy Aliila Greyson

    2012-11-01

    Full Text Available Currently, traffic congestion is a common event in the road networks of the main cities of developing countries, and it has been observed that the size of congestion increases year after year. For traffic congestion management to work efficiently, sufficient and accurate information is needed. In this research we present an alternative method that uses agent technology to collect and manipulate data to be used in optimizing vehicle flow within road networks. The objective is to design an agent-based system that provides sufficient and accurate information for traffic flow management and vehicle traffic congestion mitigation. The implementation approach is presented. The case study is a portion of the road network of the city of Dar es Salaam.

  2. A Novel Architecture of Agent based Crawling for OAI Resources

    Directory of Open Access Journals (Sweden)

    J.P.Gupta

    2010-07-01

    Full Text Available Nowadays, most search engines are competing to index as much of the surface web as possible while leaving OAI content (e.g. PDF documents) in the lurch, even though it holds a larger amount of information than the surface web. In this paper, a novel framework for an OAI-PMH based crawler is proposed that uses agents to extract metadata about OAI resources and store it in a repository, which is later queried through the OAI-PMH layer to generate XML pages containing the metadata. These pages are then added to the search engine's repository for indexing, which in turn increases the relevancy of the search engine. Agents are used to parallelize the whole process so that metadata extraction from multiple resources can be carried out simultaneously.
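
    The OAI-PMH layer the abstract mentions is queried with standard protocol verbs. A small sketch of building a ListRecords request URL (the base URL is a placeholder; the verb and argument names follow the OAI-PMH 2.0 protocol):

    ```python
    from urllib.parse import urlencode

    def list_records_url(base_url, metadata_prefix="oai_dc", resumption_token=None):
        """Build an OAI-PMH ListRecords request URL."""
        params = {"verb": "ListRecords"}
        if resumption_token:
            # When resuming a harvest, the token must be the only
            # argument besides the verb.
            params["resumptionToken"] = resumption_token
        else:
            params["metadataPrefix"] = metadata_prefix
        return base_url + "?" + urlencode(params)

    url = list_records_url("http://example.org/oai")
    ```

    A harvesting agent would fetch this URL, parse the returned XML, and follow `resumptionToken` elements until the record list is exhausted.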

  3. Capacity Analysis for Parallel Runway through Agent-Based Simulation

    Directory of Open Access Journals (Sweden)

    Yang Peng

    2013-01-01

    Full Text Available The parallel runway is the mainstream structure of Chinese hub airports; the runway is often the bottleneck of an airport, and the evaluation of its capacity is of great importance to airport management. This study outlines a model, a multi-agent architecture, an implementation approach, and a software prototype of a simulation system for evaluating runway capacity. Agent Unified Modeling Language (AUML) is applied to illustrate the inbound and departure procedures of planes and to design the agent-based model. The model is evaluated experimentally, and its quality is studied in comparison with models created in SIMMOD and Arena. The results appear highly efficient, so the method can be applied to parallel runway capacity evaluation, and the model offers favorable flexibility and extensibility.

  4. Review of Remotely Sensed Imagery Classification Patterns Based on Object-oriented Image Analysis

    Institute of Scientific and Technical Information of China (English)

    LIU Yongxue; LI Manchun; MAO Liang; XU Feifei; HUANG Shuo

    2006-01-01

    With the wide use of high-resolution remotely sensed imagery, the object-oriented remotely sensed information classification pattern has been intensively studied. Starting with a definition of the object-oriented remotely sensed information classification pattern and a literature review of related research progress, this paper summarizes four development phases of the object-oriented classification pattern during the past 20 years. Then, we discuss three aspects of the methodology in detail, namely remotely sensed imagery segmentation, feature analysis and feature selection, and classification rule generation, comparing them with per-pixel remotely sensed information classification methods. Finally, this paper presents several points that need attention in future studies on the object-oriented RS information classification pattern: 1) developing robust and highly effective image segmentation algorithms for multi-spectral RS imagery; 2) improving the feature set to include edge, spatial-adjacency and temporal characteristics; 3) discussing classification rule generation with decision-tree-based classifiers; 4) presenting evaluation methods for classification results produced by the object-oriented classification pattern.

  5. CLASSIFICATION OF LiDAR DATA WITH POINT BASED CLASSIFICATION METHODS

    OpenAIRE

    N. Yastikli; Cetin, Z.

    2016-01-01

    LiDAR is one of the most effective systems for three-dimensional (3D) data collection over wide areas. Nowadays, airborne LiDAR data are used frequently in various applications such as object extraction, 3D modelling, change detection and map revision, with increasing point density and accuracy. The classification of LiDAR points is the first step of the LiDAR data processing chain and should be handled properly, since the 3D city modelling, building extraction, DEM generation, etc. applicati...

  6. INDUS - a composition-based approach for rapid and accurate taxonomic classification of metagenomic sequences

    OpenAIRE

    Mohammed, Monzoorul Haque; Ghosh, Tarini Shankar; Reddy, Rachamalla Maheedhar; Reddy, Chennareddy Venkata Siva Kumar; Singh, Nitin Kumar; Sharmila S Mande

    2011-01-01

    Background Taxonomic classification of metagenomic sequences is the first step in metagenomic analysis. Existing taxonomic classification approaches are of two types: similarity-based and composition-based. Similarity-based approaches, though accurate and specific, are extremely slow. Since metagenomic projects generate millions of sequences, adopting similarity-based approaches becomes virtually infeasible for research groups with modest computational resources. In this study, we present ...

  7. Classification and Identification of Over-voltage Based on HHT and SVM

    Institute of Scientific and Technical Information of China (English)

    WANG Jing; YANG Qing; CHEN Lin; SIMA Wenxia

    2012-01-01

    This paper proposes an effective method for over-voltage classification based on the Hilbert-Huang transform (HHT). The Hilbert-Huang transform is composed of empirical mode decomposition (EMD) and the Hilbert transform. Nine kinds of common power system over-voltages are calculated and analyzed by HHT. Based on the instantaneous amplitude spectrum, the Hilbert marginal spectrum and the Hilbert time-frequency spectrum, three kinds of over-voltage characteristic quantities are obtained. A hierarchical classification system is built based on HHT and a support vector machine (SVM). The classification system was tested on 106 field over-voltage signals, and the average classification rate is 94.3%. This research shows that HHT is an effective time-frequency analysis algorithm for over-voltage classification and identification.
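
    The Hilbert-transform step underlying the instantaneous amplitude spectrum can be sketched with numpy alone, using the standard FFT construction of the analytic signal (equivalent to `scipy.signal.hilbert`); the 50 Hz test tone and its amplitude are illustrative, not signals from the paper:

    ```python
    import numpy as np

    def analytic_signal(x):
        """Analytic signal of a real sequence via the FFT method:
        zero the negative frequencies, double the positive ones."""
        N = len(x)
        X = np.fft.fft(x)
        h = np.zeros(N)
        h[0] = 1
        if N % 2 == 0:
            h[N // 2] = 1
            h[1:N // 2] = 2
        else:
            h[1:(N + 1) // 2] = 2
        return np.fft.ifft(X * h)

    fs = 1000                                 # sampling rate, Hz
    t = np.arange(0, 1, 1 / fs)
    sig = 3.0 * np.cos(2 * np.pi * 50 * t)    # 50 Hz tone of amplitude 3
    env = np.abs(analytic_signal(sig))        # instantaneous amplitude
    ```

    For a pure tone the recovered envelope is the constant amplitude 3; in the HHT pipeline this step is applied to each intrinsic mode function produced by EMD before the characteristic quantities are extracted.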

  8. 78 FR 18252 - Prevailing Rate Systems; North American Industry Classification System Based Federal Wage System...

    Science.gov (United States)

    2013-03-26

    ... Industry Classification System Based Federal Wage System Wage Surveys AGENCY: U. S. Office of Personnel... is issuing a proposed rule that would update the 2007 North American Industry Classification System..., the U.S. Office of Personnel Management (OPM) issued a final rule (73 FR 45853) to update the...

  9. 78 FR 58153 - Prevailing Rate Systems; North American Industry Classification System Based Federal Wage System...

    Science.gov (United States)

    2013-09-23

    ... RIN 3206-AM78 Prevailing Rate Systems; North American Industry Classification System Based Federal... Industry Classification System (NAICS) codes currently used in Federal Wage System wage survey industry..., 2013, the U.S. Office of Personnel Management (OPM) issued a proposed rule (78 FR 18252) to update...

  10. Renoprotection and the Bardoxolone Methyl Story - Is This the Right Way Forward A Novel View of Renoprotection in CKD Trials: A New Classification Scheme for Renoprotective Agents

    Directory of Open Access Journals (Sweden)

    Macaulay Onuigbo

    2013-04-01

    Full Text Available In the June 2011 issue of the New England Journal of Medicine, the BEAM (Bardoxolone Methyl Treatment: Renal Function in CKD/Type 2 Diabetes) trial investigators rekindled interest, and also some controversy, regarding the concept of renoprotection and the role of renoprotective agents, when they reported significant increases in the mean estimated glomerular filtration rate (eGFR) in diabetic chronic kidney disease (CKD) patients with an eGFR of 20-45 ml/min/1.73 m2 of body surface area at enrollment who received the trial drug bardoxolone methyl versus placebo. Unfortunately, subsequent phase IIIb trials failed to show that the drug is a safe alternative renoprotective agent. Current renoprotection paradigms depend entirely on angiotensin blockade; however, these agents [angiotensin-converting enzyme (ACE) inhibitors and angiotensin receptor blockers (ARBs)] have proved to be imperfect renoprotective agents. In this review, we examine the mechanistic limitations of previous randomized controlled trials on CKD renoprotection, including the paucity of veritable, elaborate and systematic assessment methods for documenting and reporting individual patient-level, drug-related adverse events. We review the evidence base for the presence of multiple, independent and unrelated pathogenetic mechanisms that drive (diabetic and non-diabetic) CKD progression. Furthermore, we examine the validity, or lack thereof, of the notion that blockade of a single molecule (angiotensin II), which can only antagonize the angiotensin cascade, would consistently and unfailingly deliver adequate renoprotection in (diabetic and non-diabetic) CKD patients.
We posit that there is an overarching impetus to arrive at the inference that multiple, disparately diverse and independent pathways, including any veritable combination of the mechanisms that we examine in this review

  11. Multi-label literature classification based on the Gene Ontology graph

    Directory of Open Access Journals (Sweden)

    Lu Xinghua

    2008-12-01

    Full Text Available Abstract Background The Gene Ontology is a controlled vocabulary for representing knowledge related to genes and proteins in a computable form. The current effort of manually annotating proteins with the Gene Ontology is outpaced by the rate of accumulation of biomedical knowledge in literature, which urges the development of text mining approaches to facilitate the process by automatically extracting the Gene Ontology annotation from literature. The task is usually cast as a text classification problem, and contemporary methods are confronted with unbalanced training data and the difficulties associated with multi-label classification. Results In this research, we investigated the methods of enhancing automatic multi-label classification of biomedical literature by utilizing the structure of the Gene Ontology graph. We have studied three graph-based multi-label classification algorithms, including a novel stochastic algorithm and two top-down hierarchical classification methods for multi-label literature classification. We systematically evaluated and compared these graph-based classification algorithms to a conventional flat multi-label algorithm. The results indicate that, through utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods can significantly improve predictions of the Gene Ontology terms implied by the analyzed text. Furthermore, the graph-based multi-label classifiers are capable of suggesting Gene Ontology annotations (to curators that are closely related to the true annotations even if they fail to predict the true ones directly. A software package implementing the studied algorithms is available for the research community. 
Conclusion Through utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods have better potential than the conventional flat multi-label classification approach to facilitate

  12. Agent Based Trust Management in Distributed E-Business Environment

    Directory of Open Access Journals (Sweden)

    E.Sathiyamoorthy

    2010-02-01

    Full Text Available In an e-business environment, trust management is an important factor that is necessary for all transactions. Basic e-business requirements such as non-repudiation by both trustee and trustier are found to be problems arising from a lack of trust information. Many environments use relatively simple mechanisms to calculate trust; for example, e-bay is a typical reputation-based system built on a centralized model of trust. Several frameworks have been designed by researchers based on reputation models, but all these mechanisms fail to prevent users from producing false information while building a reputation. Also, sufficient information regarding new users who have just started doing business online is not available. To overcome the drawbacks of existing systems, a new trust management framework is proposed in this paper for a distributed e-business environment. The model offers the merits of previous trust management systems based on trusted third parties, namely policy-based and reputation-based models. It also ensures trustworthy business transactions between business entities and provides a more appropriate trust rating value, calculated on the basis of a mathematical model taking into account peer feedback, direct experience and access policy. The implementation using agents is found to be more efficient with respect to time and trust calculation when compared to a model that works in a non-agent environment.
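
    A trust rating that blends direct experience with peer feedback, as the abstract describes, can be sketched as a weighted average. This is a hypothetical illustration with an assumed weight `alpha`; the paper's actual mathematical model (and its treatment of access policy) is not reproduced here:

    ```python
    def trust_rating(direct_score, peer_feedback, alpha=0.7):
        """Composite trust value in [0, 1]: a weighted blend of an entity's
        own experience and the mean of peer feedback scores.

        alpha is the weight placed on direct experience; 1 - alpha is
        placed on the peer-feedback average.
        """
        peer_avg = sum(peer_feedback) / len(peer_feedback) if peer_feedback else 0.0
        return alpha * direct_score + (1 - alpha) * peer_avg

    # Direct experience 0.9; three peers report 0.6, 0.8 and 1.0.
    score = trust_rating(0.9, [0.6, 0.8, 1.0])
    ```

    Weighting direct experience above peer reports is one common way to limit the impact of false feedback from colluding peers.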

  13. Agent-based modelling of consumer energy choices

    Science.gov (United States)

    Rai, Varun; Henry, Adam Douglas

    2016-06-01

    Strategies to mitigate global climate change should be grounded in a rigorous understanding of energy systems, particularly the factors that drive energy demand. Agent-based modelling (ABM) is a powerful tool for representing the complexities of energy demand, such as social interactions and spatial constraints. Unlike other approaches for modelling energy demand, ABM is not limited to studying perfectly rational agents or to abstracting micro details into system-level equations. Instead, ABM provides the ability to represent behaviours of energy consumers -- such as individual households -- using a range of theories, and to examine how the interaction of heterogeneous agents at the micro-level produces macro outcomes of importance to the global climate, such as the adoption of low-carbon behaviours and technologies over space and time. We provide an overview of ABM work in the area of consumer energy choices, with a focus on identifying specific ways in which ABM can improve understanding of both fundamental scientific and applied aspects of the demand side of energy to aid the design of better policies and programmes. Future research needs for improving the practice of ABM to better understand energy demand are also discussed.
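 
    The micro-to-macro mechanism the abstract describes — heterogeneous agents whose interactions produce an aggregate adoption curve — can be illustrated with a toy simulation. All parameters below are illustrative, not calibrated to any study:

    ```python
    import random

    def simulate_adoption(n_agents=100, steps=50, p_spontaneous=0.01,
                          social_weight=0.3, seed=42):
        """Toy ABM of technology adoption: each step, a non-adopter adopts
        with a probability that grows with the current adoption fraction
        (a crude stand-in for social interaction). Returns the macro-level
        adoption curve."""
        random.seed(seed)
        adopted = [False] * n_agents
        curve = []
        for _ in range(steps):
            frac = sum(adopted) / n_agents
            for i in range(n_agents):
                if not adopted[i]:
                    p = p_spontaneous + social_weight * frac
                    if random.random() < p:
                        adopted[i] = True
            curve.append(sum(adopted) / n_agents)
        return curve

    curve = simulate_adoption()
    ```

    Even this minimal rule set produces the S-shaped diffusion curves typical of low-carbon technology adoption; richer ABMs replace the single adoption probability with household-level budgets, spatial constraints and network structure.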

  14. Using the Agent-Based Modeling in Economic Field

    Directory of Open Access Journals (Sweden)

    Nora Mihail

    2006-12-01

    Full Text Available The last ten years of the XX century witnessed the appearance of a new scientific field, usually defined as the study of "complex adaptive systems". This field, generically named the Complexity Sciences, shares its subject, the general properties of complex systems across traditional disciplinary boundaries, with cybernetics and general systems theory. But the development of Complexity Sciences approaches is driven by the extensive use of Agent-Based Models (ABM) as a research tool and by an emphasis on systems, such as markets, populations or ecologies, which are less integrated or "organized" than those, such as companies and economies, intensively studied by the traditional disciplines. For ABM, a complex system is a system of individual agents who have the freedom to act in ways that are not always totally predictable, and whose actions are interconnected such that one agent's actions change the context (environment) for other agents. There are many examples of such complex systems: the stock market, the human body's immune system, a business organization, an institution, a work team, a family, etc.

  15. Secure Mobile Agent based Information Gathering in Wireless Network

    Directory of Open Access Journals (Sweden)

    Ashish Kumar Srivastava

    2010-08-01

    Full Text Available Nowadays, everything is moving towards the wireless environment to bring smartness to society, so it is necessary to bring smart technologies into the wireless environment. With this in mind, we concentrated on incorporating mobile agents into the wireless environment to gather information. The problem with the (multi-hop) mobile agent is the security issue in gathering information from a number of remote hosts. To overcome this security issue, a 3-ID algorithm is available which verifies the integrity of the data as well as providing confidentiality. However, this algorithm requires considerable time to verify the integrity of all previously collected information. To optimize the verification time complexity, the 3-ID algorithm [9][10] is modified to verify only the previous N, N/2, N/3 or N/4 hosts' information, depending on the requirements. The experimental results in the wireless environment prove that the integrity verification time is clearly lower when compared to the original model.
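
    The idea of verifying only the last N hosts' data can be illustrated with a simple hash chain: each host's collected item is hashed together with the previous digest, so a verifier can recompute just the tail of the chain. This is a sketch of the general principle, not the paper's 3-ID algorithm; the host data strings are made up:

    ```python
    import hashlib

    def chain_digest(items, prev=b""):
        """Hash-chain a list of collected data items: each digest covers
        the item plus the previous digest. Returns all intermediate digests."""
        digests = []
        for item in items:
            prev = hashlib.sha256(prev + item.encode()).digest()
            digests.append(prev)
        return digests

    def verify_last_n(items, digests, n):
        """Recompute only the last n links, seeded from the stored digest
        immediately before them, and compare against the stored chain."""
        start = len(items) - n
        prev = digests[start - 1] if start > 0 else b""
        return chain_digest(items[start:], prev) == digests[start:]

    collected = ["host1:temp=20", "host2:temp=22", "host3:temp=19"]
    chain = chain_digest(collected)

    ok = verify_last_n(collected, chain, 2)
    tampered = verify_last_n(["host1:temp=20", "host2:temp=99", "host3:temp=19"],
                             chain, 2)
    ```

    Verifying only the tail trades coverage for speed, which mirrors the N, N/2, N/3, N/4 options described in the abstract.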

  16. SPAM CLASSIFICATION BASED ON SUPERVISED LEARNING USING MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Hamsapriya

    2011-12-01

    Full Text Available E-mail is one of the most popular and frequently used means of communication due to its worldwide accessibility, relatively fast message transfer, and low sending cost. Flaws in the e-mail protocols and the increasing volume of electronic business and financial transactions directly contribute to the increase in e-mail-based threats. Email spam is one of the major problems of today's Internet, bringing financial damage to companies and annoying individual users. Spam emails invade users' mailboxes without their consent, consuming network capacity as well as the time spent checking and deleting spam mail. The vast majority of Internet users are outspoken in their disdain for spam, although enough of them respond to commercial offers that spam remains a viable source of income to spammers. While most users want to do the right thing to avoid and get rid of spam, they need clear and simple guidelines on how to behave. In spite of all the measures taken to eliminate spam, it has not yet been eradicated, and when countermeasures are over-sensitive, even legitimate emails are eliminated. Among the approaches developed to stop spam, filtering is one of the most important techniques. Much research in spam filtering has centered on the more sophisticated classifier-related issues. In recent years, machine learning for spam classification has become an important research issue. This work explores and identifies the use of different learning algorithms for classifying spam messages from e-mail. A comparative analysis of the algorithms is also presented.
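
    A classic baseline among the learning algorithms used for spam filtering is multinomial naive Bayes over word counts. The tiny stdlib-only classifier below (with made-up training messages) illustrates the supervised-learning setup, not the paper's specific experiments:

    ```python
    import math
    from collections import Counter

    class NaiveBayes:
        """Minimal multinomial naive Bayes with Laplace smoothing."""

        def fit(self, docs, labels):
            self.classes = set(labels)
            self.class_counts = Counter(labels)
            self.word_counts = {c: Counter() for c in self.classes}
            for doc, lab in zip(docs, labels):
                self.word_counts[lab].update(doc.split())
            self.vocab = {w for c in self.classes for w in self.word_counts[c]}

        def predict(self, doc):
            def log_post(c):
                total = sum(self.word_counts[c].values())
                lp = math.log(self.class_counts[c] / sum(self.class_counts.values()))
                for w in doc.split():
                    # Laplace (+1) smoothing avoids zero probabilities
                    lp += math.log((self.word_counts[c][w] + 1)
                                   / (total + len(self.vocab)))
                return lp
            return max(self.classes, key=log_post)

    train = ["win money now", "free prize win", "meeting at noon", "lunch at noon today"]
    y = ["spam", "spam", "ham", "ham"]
    nb = NaiveBayes()
    nb.fit(train, y)
    pred = nb.predict("win free money")
    ```

    Real spam filters add tokenization, feature selection and cost-sensitive thresholds on top of this, precisely because over-sensitive filtering discards legitimate mail.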

  17. AGENT-BASED DISTRIBUTION GRID OPERATION BASED ON A TRAFFIC LIGHT CONCEPT

    OpenAIRE

    Drayer, Elisabeth; Hegemann, Jan; Lazarus, Marc; Caire, Raphael; Braun, Martin

    2015-01-01

    Compared to centralised grid operation management for the distribution grid, a distributed and decentralised agent-based operation has many advantages, such as scalability, modularity and robustness. We propose a concept for agent-based distribution grid operation management based on a traffic light concept. Depending on the situation in the grid, the operation management can be in different modes, which define how the grid is operated.

  18. Algebraic classification of higher dimensional spacetimes based on null alignment

    CERN Document Server

    Ortaggio, Marcello; Pravdova, Alena

    2012-01-01

    We review recent developments and applications of the classification of the Weyl tensor in higher dimensional Lorentzian geometries. First, we discuss the general setup, i.e. main definitions and methods for the classification, some refinements and the generalized Newman-Penrose and Geroch-Held-Penrose formalisms. Next, we summarize general results, such as a partial extension of the Goldberg-Sachs theorem, characterization of spacetimes with vanishing (or constant) curvature invariants and the peeling behaviour in asymptotically flat spacetimes. Finally, we discuss certain invariantly defined families of metrics and their relation with the Weyl tensor classification, including: Kundt and Robinson-Trautman spacetimes; the Kerr-Schild ansatz in a constant-curvature background; purely electric and purely magnetic spacetimes; direct and (some) warped products; and geometries with certain symmetries. To conclude, some applications to quadratic gravity are also overviewed.

  19. A method for cloud detection and opacity classification based on ground based sky imagery

    Directory of Open Access Journals (Sweden)

    M. S. Ghonima

    2012-11-01

    Full Text Available Digital images of the sky obtained using a total sky imager (TSI) are classified pixel by pixel into clear sky, optically thin and optically thick clouds. A new classification algorithm was developed that compares the pixel red-blue ratio (RBR) to the RBR of a clear sky library (CSL) generated from images captured on clear days. The difference, rather than the ratio, between pixel RBR and CSL RBR resulted in more accurate cloud classification. High correlation between TSI image RBR and aerosol optical depth (AOD) measured by an AERONET photometer was observed and motivated the addition of a haze correction factor (HCF) to the classification model to account for variations in AOD. Thresholds for clear and thick clouds were chosen based on a training image set and validated with a set of manually annotated images. Misclassifications of clear and thick clouds into the opposite category were less than 1%. Thin clouds were classified with an accuracy of 60%. Accurate cloud detection and opacity classification techniques will improve the accuracy of short-term solar power forecasting.
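
    The RBR-difference thresholding the abstract describes can be sketched per pixel with numpy. The thresholds and pixel values below are illustrative placeholders; the paper derives its thresholds from a training image set:

    ```python
    import numpy as np

    # Hypothetical thresholds on the RBR difference (pixel RBR minus CSL RBR)
    THIN_THRESH = 0.05   # above this: at least optically thin cloud
    THICK_THRESH = 0.20  # above this: optically thick cloud

    def classify_pixels(red, blue, csl_rbr, hcf=0.0):
        """Classify each pixel as 0 = clear, 1 = thin cloud, 2 = thick cloud,
        using the difference between pixel red-blue ratio and the clear-sky
        library (CSL) RBR, optionally shifted by a haze correction factor."""
        rbr = red / blue
        diff = rbr - csl_rbr - hcf
        labels = np.zeros(rbr.shape, dtype=int)
        labels[diff > THIN_THRESH] = 1
        labels[diff > THICK_THRESH] = 2
        return labels

    red = np.array([[0.30, 0.50], [0.80, 0.35]])
    blue = np.array([[1.0, 1.0], [1.0, 1.0]])
    csl = np.full((2, 2), 0.32)   # clear-sky RBR for these pixel positions
    labels = classify_pixels(red, blue, csl)
    ```

    Working with the difference rather than the ratio keeps the decision boundary stable when both the pixel RBR and the CSL RBR shift together, e.g. under haze.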

  20. A method for cloud detection and opacity classification based on ground based sky imagery

    Directory of Open Access Journals (Sweden)

    M. S. Ghonima

    2012-07-01

    Full Text Available Digital images of the sky obtained using a total sky imager (TSI) are classified pixel by pixel into clear sky, optically thin and optically thick clouds. A new classification algorithm was developed that compares the pixel red-blue ratio (RBR) to the RBR of a clear sky library (CSL) generated from images captured on clear days. The difference, rather than the ratio, between pixel RBR and CSL RBR resulted in more accurate cloud classification. High correlation between TSI image RBR and aerosol optical depth (AOD) measured by an AERONET photometer was observed and motivated the addition of a haze correction factor (HCF) to the classification model to account for variations in AOD. Thresholds for clear and thick clouds were chosen based on a training image set and validated with a set of manually annotated images. Misclassifications of clear and thick clouds into the opposite category were less than 1%. Thin clouds were classified with an accuracy of 60%. Accurate cloud detection and opacity classification techniques will improve the accuracy of short-term solar power forecasting.

  1. Gender Classification Method Based on Gait Energy Motion Derived from Silhouette Through Wavelet Analysis of Human Gait Moving Pictures

    OpenAIRE

    Kohei Arai; Rosa Andrie Asmara

    2014-01-01

    A gender classification method based on Gait Energy Motion (GEM), derived through wavelet analysis of human gait moving pictures, is proposed. Through experiments with human gait moving pictures, it is found that the features extracted from wavelet coefficients of silhouette images are useful for improving gender classification accuracy. It is also found that the proposed gender classification method shows the best classification performance, with a correct classification ratio of 97.63%.

  2. Gender Classification Method Based on Gait Energy Motion Derived from Silhouette Through Wavelet Analysis of Human Gait Moving Pictures

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2014-02-01

    Full Text Available A gender classification method based on Gait Energy Motion (GEM), derived through wavelet analysis of human gait moving pictures, is proposed. Through experiments with human gait moving pictures, it is found that the features extracted from wavelet coefficients of silhouette images are useful for improving gender classification accuracy. It is also found that the proposed gender classification method shows the best classification performance, with a correct classification ratio of 97.63%.

  3. A Multi-Label Classification Approach Based on Correlations Among Labels

    Directory of Open Access Journals (Sweden)

    Raed Alazaidah

    2015-02-01

    Full Text Available Multi-label classification is concerned with learning from a set of instances that are associated with a set of labels; that is, an instance can be associated with multiple labels at the same time. This task occurs frequently in application areas like text categorization, multimedia classification, bioinformatics, protein function classification and semantic scene classification. Current multi-label classification methods can be divided into two categories. The first is called problem transformation methods, which transform the multi-label classification problem into a single-label classification problem and then apply any single-label classifier to solve it. The second category is called algorithm adaptation methods, which adapt an existing single-label classification algorithm to handle multi-label data. In this paper, we propose a multi-label classification approach based on correlations among labels that uses both problem transformation methods and algorithm adaptation methods. The approach begins by transforming the multi-label dataset into a single-label dataset using a least-frequent-label criterion, and then applies the PART algorithm to the transformed dataset. The output of the approach is a set of multi-label rules. The approach also tries to benefit from positive correlations among labels using the predictive Apriori algorithm. The proposed approach has been evaluated using two multi-label datasets (Emotions and Yeast) and three evaluation measures (Accuracy, Hamming Loss, and Harmonic Mean). The experiments showed that the proposed approach has fair accuracy in comparison to other related methods.
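
    The least-frequent-label transformation step can be sketched in a few lines: each instance keeps the one label that is rarest across the whole dataset (ties broken alphabetically here, which is an assumption of this sketch). The label names are invented for illustration:

    ```python
    from collections import Counter

    def to_single_label(multilabel_rows):
        """Problem-transformation step: for each instance, keep its least
        frequent label, with frequency measured over the whole dataset."""
        freq = Counter(l for labels in multilabel_rows for l in labels)
        return [min(labels, key=lambda l: (freq[l], l))
                for labels in multilabel_rows]

    rows = [{"happy", "calm"}, {"happy"}, {"sad", "calm"}, {"happy", "sad"}]
    single = to_single_label(rows)
    ```

    The resulting single-label dataset can then be fed to any rule learner (PART in the paper), with label correlations recovered afterwards from association rules.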

  4. Seafloor Sediment Classification Based on Multibeam Sonar Data

    Institute of Scientific and Technical Information of China (English)

    ZHOU Xinghua; CHEN Yongqi

    2004-01-01

    Multibeam sonars can provide hydrographic-quality depth data as well as calibrated measurements of seafloor acoustic backscattering strength. There has been much interest in utilizing backscatter and images from multibeam sonar for seabed type identification, and promising results have been obtained. This paper presents a focused review of several main methods and recent developments in seafloor classification utilizing multibeam sonar data and/or images, including power spectral analysis methods, texture analysis, traditional Bayesian classification theory and the very active neural network approaches.

  5. Woven fabric defects detection based on texture classification algorithm

    International Nuclear Information System (INIS)

    In this paper we compare two well-known texture classification methods applied to the problem of recognizing and classifying defects occurring in textile manufacture: local binary patterns (LBP) and the co-occurrence matrix. The classifier used is the support vector machine (SVM). The system has been tested using the TILDA database. The results obtained are interesting and show that LBP is a good method for defect recognition and classification problems; it also gives a good running time, especially for real-time applications.
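
    The basic 8-neighbour LBP operator compared in the abstract can be sketched with numpy: each interior pixel receives an 8-bit code from threshold comparisons with its neighbours, and the histogram of codes over a patch becomes the texture feature vector fed to the SVM. The tiny test image is made up; bit ordering conventions vary between implementations:

    ```python
    import numpy as np

    def lbp_image(gray):
        """Basic local binary pattern codes for the interior pixels of an
        integer grayscale image. Each code is in [0, 255], built from
        comparisons with the 8 neighbours (clockwise from top-left)."""
        g = gray.astype(int)
        c = g[1:-1, 1:-1]
        offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                (1, 1), (1, 0), (1, -1), (0, -1)]
        code = np.zeros_like(c)
        for w, (dy, dx) in enumerate(offs):
            nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
            code += (nb >= c) * (1 << w)  # bit w set if neighbour >= centre
        return code

    img = np.array([[9, 9, 9],
                    [0, 5, 9],
                    [0, 0, 0]])
    codes = lbp_image(img)
    ```

    For defect detection, a 256-bin histogram of these codes per window would be the input to the SVM classifier.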

  6. Topic Modelling for Object-Based Classification of Vhr Satellite Images Based on Multiscale Segmentations

    Science.gov (United States)

    Shen, Li; Wu, Linmei; Li, Zhipeng

    2016-06-01

    Multiscale segmentation is a key prerequisite step for object-based classification methods. However, it is often not possible to determine a single optimal scale for the image to be classified, because in many cases different geo-objects, and even an identical geo-object, may appear at different scales in one image. In this paper, an object-based classification method based on multiscale segmentation results in the framework of topic modelling is proposed to classify VHR satellite images in an entirely unsupervised fashion. In the topic modelling stage, grayscale histogram distributions for each geo-object class and each segment are learned in an unsupervised manner from multiscale segments. In the classification stage, each segment is allocated a geo-object class label by comparing the grayscale histogram distributions of that segment and each geo-object class. Experimental results show that the proposed method can perform better than traditional methods based on topic modelling.
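
The classification stage described above, labelling each segment by comparing its grayscale histogram with each learned class distribution, can be sketched as follows. Histogram intersection is used here as one plausible similarity measure (the abstract does not name a specific one), and all identifiers are illustrative:

```python
def normalise(hist):
    """Scale a histogram so its bins sum to one."""
    total = sum(hist)
    return [h / total for h in hist]

def histogram_intersection(p, q):
    """Similarity of two normalised histograms: 1 = identical, 0 = disjoint."""
    return sum(min(a, b) for a, b in zip(p, q))

def classify_segment(segment_hist, class_hists):
    """Assign a segment the geo-object class whose learned grayscale
    histogram distribution it most resembles."""
    p = normalise(segment_hist)
    return max(class_hists,
               key=lambda cls: histogram_intersection(p, normalise(class_hists[cls])))
```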

  7. Segmentation-Based PolSAR Image Classification Using Visual Features: RHLBP and Color Features

    Directory of Open Access Journals (Sweden)

    Jian Cheng

    2015-05-01

    Full Text Available A segmentation-based fully-polarimetric synthetic aperture radar (PolSAR) image classification method that incorporates texture features and color features is designed and implemented. This method is based on a framework that conjunctively uses statistical region merging (SRM) for segmentation and a support vector machine (SVM) for classification. In the segmentation step, we propose an improved local binary pattern (LBP) operator named the regional homogeneity local binary pattern (RHLBP) to guarantee regional homogeneity in PolSAR images. In the classification step, color features extracted from false color images are applied to improve the classification accuracy. The RHLBP operator and color features provide discriminative information to separate pixels and regions that have similar polarimetric features but belong to different classes. Extensive experimental comparisons with conventional methods on L-band PolSAR data demonstrate the effectiveness of our proposed method for PolSAR image classification.

  8. On infrastructure network design with agent-based modelling

    OpenAIRE

    Chappin, E.J.L.; Heijnen, P.W.

    2014-01-01

    We have developed an agent-based model to optimize green-field network design in an industrial area. We aim to capture some of the deep uncertainties surrounding infrastructure design by modelling it and developing specific ant colony optimizations. Hence, we propose a variety of extensions to our existing work, first ideas on how to realize them, and three cases to explicate our ideas. One case is the design of a CO2 pipeline network in the Rotterdam industrial area. First simulation results have sho...

  9. Ontology-based, multi-agent support of production management

    Science.gov (United States)

    Meridou, Despina T.; Inden, Udo; Rückemann, Claus-Peter; Patrikakis, Charalampos Z.; Kaklamani, Dimitra-Theodora I.; Venieris, Iakovos S.

    2016-06-01

    Over recent years, reported incidents of failed aircraft ramp-ups or delayed small-lot production have increased substantially. In this paper, we present a production management platform that combines agent-based techniques with the Service Oriented Architecture paradigm. The platform takes advantage of the functionality offered by the semantic web language OWL, which allows the users and services of the platform to speak a common language and, at the same time, facilitates risk management and decision making.

  10. Metathesis access to monocyclic iminocyclitol-based therapeutic agents

    Directory of Open Access Journals (Sweden)

    Albert Demonceau

    2011-05-01

    Full Text Available By focusing on recent developments on natural and non-natural azasugars (iminocyclitols, this review bolsters the case for the role of olefin metathesis reactions (RCM, CM as key transformations in the multistep syntheses of pyrrolidine-, piperidine- and azepane-based iminocyclitols, as important therapeutic agents against a range of common diseases and as tools for studying metabolic disorders. Considerable improvements brought about by introduction of one or more metathesis steps are outlined, with emphasis on the exquisite steric control and atom-economical outcome of the overall process. The comparative performance of several established metathesis catalysts is also highlighted.

  11. Agent-based simulation of electricity markets. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Sensfuss, F.; Ragwitz, M. [Fraunhofer-Institut fuer Systemtechnik und Innovationsforschung (ISI), Karlsruhe (Germany); Genoese, M.; Moest, D. [Karlsruhe Univ. (T.H.) (Germany). Inst. fuer Industriebetriebslehre und Industrielle Produktion

    2007-07-01

    Liberalisation, climate policy and promotion of renewable energy are challenges to players of the electricity sector in many countries. Policy makers have to consider issues like market power, bounded rationality of players and the appearance of fluctuating energy sources in order to provide adequate legislation. Furthermore the interactions between markets and environmental policy instruments become an issue of increasing importance. A promising approach for the scientific analysis of these developments is the field of agent-based simulation. The goal of this article is to provide an overview of the current work applying this methodology to the analysis of electricity markets. (orig.)

  12. A Novel Architecture of Agent based Crawling for OAI Resources

    OpenAIRE

    J. P. Gupta; Shruti Sharma

    2010-01-01

    Nowadays, most search engines compete to index as much of the surface Web as possible, while leaving in the lurch the OAI content (PDF documents), which holds a larger amount of information than the surface Web. In this paper, a novel framework for an OAI-PMH-based crawler is proposed that uses agents to extract metadata about OAI resources and store it in a repository, which is later queried through the OAI-PMH layer to generate XML pages containing the metadata. These ...
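
To make the OAI-PMH layer concrete, here is a minimal sketch of building a `ListRecords` request and extracting Dublin Core titles from a response. The endpoint and the sample response are hypothetical, but the verb, parameter names and namespaces follow the OAI-PMH 2.0 protocol:

```python
import urllib.parse
import xml.etree.ElementTree as ET

DC = "{http://purl.org/dc/elements/1.1/}"

def list_records_url(base_url, metadata_prefix="oai_dc", resumption_token=None):
    """Build an OAI-PMH ListRecords request URL."""
    params = {"verb": "ListRecords"}
    if resumption_token:
        params["resumptionToken"] = resumption_token
    else:
        params["metadataPrefix"] = metadata_prefix
    return base_url + "?" + urllib.parse.urlencode(params)

def extract_titles(xml_text):
    """Pull Dublin Core titles out of a ListRecords response."""
    return [el.text for el in ET.fromstring(xml_text).iter(DC + "title")]

# a tiny, hypothetical ListRecords response for illustration
sample = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords><record><metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>An example record</dc:title>
    </oai_dc:dc>
  </metadata></record></ListRecords>
</OAI-PMH>"""
```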

  13. Agent-based Algorithm for Spatial Distribution of Objects

    KAUST Repository

    Collier, Nathan

    2012-06-02

    In this paper we present an agent-based algorithm for the spatial distribution of objects. The algorithm is a generalization of the bubble mesh algorithm, initially created for the point insertion stage of the meshing process of the finite element method. The bubble mesh algorithm treats objects in space as bubbles, which repel and attract each other. The dynamics of each bubble are approximated by solving a series of ordinary differential equations. We present numerical results for a meshing application as well as a graph visualization application.
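
A one-dimensional toy version of the bubble dynamics is easy to sketch: each pair of "bubbles" repels when closer than a rest distance and attracts when farther, and positions are advanced with damped forward-Euler steps. This is a deliberate simplification of the ODE systems solved in the paper, with illustrative parameter names:

```python
def bubble_relax(points, rest=1.0, k=1.0, dt=0.1, steps=200):
    """Relax 1-D 'bubbles': pairwise spring-like forces push every
    pair toward the rest spacing (repel when too close, attract when
    too far). Forward-Euler integration stands in for the ODE solves
    of the bubble mesh algorithm."""
    pts = list(points)
    for _ in range(steps):
        forces = [0.0] * len(pts)
        for i, p in enumerate(pts):
            for j, q in enumerate(pts):
                if i == j or p == q:
                    continue  # skip self and coincident bubbles
                d = p - q
                dist = abs(d)
                # force proportional to deviation from the rest spacing
                forces[i] -= k * (dist - rest) * (d / dist)
        pts = [p + dt * f for p, f in zip(pts, forces)]
    return sorted(pts)
```

Two bubbles started close together drift apart until their spacing settles at the rest distance.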

  14. Open source, web-based machine-learning assisted classification system

    OpenAIRE

    Consarnau Pallarés, Mireia Roser

    2016-01-01

    The aim of this article is to provide a design overview of a web-based, machine-learning-assisted multi-user classification system. The design is based on open-source standards, both for the multi-user environment, written in PHP using the Laravel framework, and for the Python-based machine learning toolkit, Scikit-Learn. The advantage of the proposed system is that it requires no domain-specific knowledge or programming skills. Machine learning classification tasks are done on the background...
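
On the Python side, such a backend reduces to fit/predict calls against a classifier object. As a dependency-free illustration of that interface, here is a tiny nearest-centroid classifier mimicking the Scikit-Learn `fit`/`predict` convention; it is a stand-in, not the estimator the system actually wires up:

```python
class NearestCentroid:
    """Minimal classifier with a Scikit-Learn-style interface:
    fit() stores one mean vector per class, predict() returns the
    class with the closest centroid (squared Euclidean distance)."""

    def fit(self, X, y):
        sums, counts = {}, {}
        for row, label in zip(X, y):
            acc = sums.setdefault(label, [0.0] * len(row))
            for i, v in enumerate(row):
                acc[i] += v
            counts[label] = counts.get(label, 0) + 1
        self.centroids_ = {lab: [v / counts[lab] for v in acc]
                           for lab, acc in sums.items()}
        return self

    def predict(self, X):
        def dist2(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return [min(self.centroids_, key=lambda lab: dist2(row, self.centroids_[lab]))
                for row in X]
```

A web layer would simply serialise user-uploaded rows into `X` and return the predicted labels.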

  15. Distributed Model Predictive Control Based on Multi-agent Model for Electric Multiple Units

    Institute of Scientific and Technical Information of China (English)

    LI Zhong-Qi; YANG Hui; ZHANG Kun-Peng; FU Ya-Ting

    2014-01-01

    The distributed-power electric multiple units (EMUs) are widely used in high-speed railways. Due to the structural characteristic of mutually coupled power units in EMUs, each power unit is set as an agent. Combining the traction/brake characteristic curve and running data of EMUs, a subtractive clustering method and a pattern classification algorithm are adopted to set up a multi-model set for every agent. Then, the multi-agent model is established according to the multi-agent network topology and the mutually coupled constraint relations. Finally, we adopt a smooth start switching control strategy and a multi-agent distributed coordination control algorithm to ensure synchronous speed tracking control of each agent. Simulation results on actual CRH380A running data show the effectiveness of the proposed approach.

  16. Complexity and agent-based modelling in urban research

    DEFF Research Database (Denmark)

    Fertner, Christian

    Urbanisation processes are results of a broad variety of actors or actor groups and their behaviour and decisions based on different experiences, knowledge, resources, values etc. The decisions done are often on a micro/individual level but resulting in macro/collective behaviour. In urban research ... influence on the bigger system. Traditional scientific methods or theories often tried to simplify, not accounting complex relations of actors and decision-making. The introduction of computers in simulation made new approaches in modelling, as for example agent-based modelling (ABM), possible, dealing with issues of complexity. Also in urban research, computer simulation is becoming popular for more and more issues, aiming at a new understanding of urban systems. The essay is based on some recent articles as well as some relevant websites. Due to the use of ABM in many scientific fields and the relevance ...

  17. Analysis of uncertainty in multi-temporal object-based classification

    Science.gov (United States)

    Löw, Fabian; Knöfel, Patrick; Conrad, Christopher

    2015-07-01

    Agricultural management increasingly uses crop maps based on classification of remotely sensed data. However, classification errors can translate to errors in model outputs, for instance agricultural production monitoring (yield, water demand) or crop acreage calculation. Hence, knowledge of the spatial variability of the classifier performance is important information for the user, but it is not provided by traditional accuracy assessments, which are based on the confusion matrix. In this study, classification uncertainty was analyzed based on the support vector machine (SVM) algorithm. SVM was applied to multi-spectral time series data of RapidEye from different agricultural landscapes and years. Entropy was calculated as a measure of classification uncertainty, based on the per-object class membership estimations from the SVM algorithm. Permuting all possible combinations of available images allowed investigating the impact of image acquisition frequency and timing on the classification uncertainty. Results show that multi-temporal datasets decrease classification uncertainty for different crops compared to single data sets, but there was no "one-image-combination-fits-all" solution. The number and acquisition timing of the images for which a decrease in uncertainty could be realized proved to be specific to a given landscape, and for each crop they differed across landscapes. For some crops, an increase in uncertainty was observed when increasing the quantity of images, even if classification accuracy was improved. Random forest regression was employed to investigate the impact of different explanatory variables on the observed spatial pattern of classification uncertainty. It was strongly influenced by factors related to agricultural management and training sample density. Lower uncertainties were revealed for fields close to rivers or irrigation canals.
This study demonstrates that classification uncertainty estimates
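
The entropy measure used here is straightforward to compute from the per-object class membership probabilities. A normalised version (0 = the classifier is certain, 1 = all classes equally likely) might look like the following; the exact normalisation used in the paper is not specified, so treat this as one plausible convention:

```python
import math

def classification_entropy(probs):
    """Shannon entropy of an object's class-membership probabilities,
    normalised by log(n_classes) so the result lies in [0, 1]."""
    h = -sum(p * math.log(p) for p in probs if p > 0)
    return h / math.log(len(probs))
```

Mapping this value per segment yields the spatial uncertainty pattern the study analyses.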

  18. Hydrologic-Process-Based Soil Texture Classifications for Improved Visualization of Landscape Function.

    Directory of Open Access Journals (Sweden)

    Derek G Groenendyk

    Full Text Available Soils lie at the interface between the atmosphere and the subsurface and are a key component that controls ecosystem services, food production, and many other processes at the Earth's surface. There is a long-established convention for identifying and mapping soils by texture. These readily available, georeferenced soil maps and databases are used widely in environmental sciences. Here, we show that these traditional soil classifications can be inappropriate, contributing to bias and uncertainty in applications from slope stability to water resource management. We suggest a new approach to soil classification, with a detailed example from the science of hydrology. Hydrologic simulations based on common meteorological conditions were performed using HYDRUS-1D, spanning textures identified by the United States Department of Agriculture soil texture triangle. We consider these common conditions to be: drainage from saturation, infiltration onto a drained soil, and combined infiltration and drainage events. Using a k-means clustering algorithm, we created soil classifications based on the modeled hydrologic responses of these soils. The hydrologic-process-based classifications were compared to those based on soil texture and a single hydraulic property, Ks. Differences in classifications based on hydrologic response versus soil texture demonstrate that traditional soil texture classification is a poor predictor of hydrologic response. We then developed a QGIS plugin to construct soil maps combining a classification with georeferenced soil data from the Natural Resource Conservation Service. The spatial patterns of hydrologic response were more immediately informative, much simpler, and less ambiguous, for use in applications ranging from trafficability to irrigation management to flood control.
The ease with which hydrologic-process-based classifications can be made, along with the improved quantitative predictions of soil responses and visualization
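
The clustering step above, grouping soils by their simulated hydrologic-response vectors rather than by texture, is plain k-means. A self-contained sketch follows (not the authors' implementation, and with illustrative toy data):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: group points (e.g. simulated hydrologic-response
    vectors, one per soil) into k clusters by squared Euclidean distance."""
    rng = random.Random(seed)

    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))

    centroids = rng.sample(points, k)
    for _ in range(iters):
        # assignment step: nearest centroid for every point
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: d2(p, centroids[i]))].append(p)
        # update step: move each centroid to its cluster mean
        centroids = [[sum(col) / len(c) for col in zip(*c)] if c else centroids[i]
                     for i, c in enumerate(clusters)]
    assignments = [min(range(k), key=lambda i: d2(p, centroids[i])) for p in points]
    return assignments, centroids
```

Each resulting cluster label plays the role of a hydrologic-process-based soil class on the map.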

  19. Hydrologic-Process-Based Soil Texture Classifications for Improved Visualization of Landscape Function.

    Science.gov (United States)

    Groenendyk, Derek G; Ferré, Ty P A; Thorp, Kelly R; Rice, Amy K

    2015-01-01

    Soils lie at the interface between the atmosphere and the subsurface and are a key component that controls ecosystem services, food production, and many other processes at the Earth's surface. There is a long-established convention for identifying and mapping soils by texture. These readily available, georeferenced soil maps and databases are used widely in environmental sciences. Here, we show that these traditional soil classifications can be inappropriate, contributing to bias and uncertainty in applications from slope stability to water resource management. We suggest a new approach to soil classification, with a detailed example from the science of hydrology. Hydrologic simulations based on common meteorological conditions were performed using HYDRUS-1D, spanning textures identified by the United States Department of Agriculture soil texture triangle. We consider these common conditions to be: drainage from saturation, infiltration onto a drained soil, and combined infiltration and drainage events. Using a k-means clustering algorithm, we created soil classifications based on the modeled hydrologic responses of these soils. The hydrologic-process-based classifications were compared to those based on soil texture and a single hydraulic property, Ks. Differences in classifications based on hydrologic response versus soil texture demonstrate that traditional soil texture classification is a poor predictor of hydrologic response. We then developed a QGIS plugin to construct soil maps combining a classification with georeferenced soil data from the Natural Resource Conservation Service. The spatial patterns of hydrologic response were more immediately informative, much simpler, and less ambiguous, for use in applications ranging from trafficability to irrigation management to flood control. The ease with which hydrologic-process-based classifications can be made, along with the improved quantitative predictions of soil responses and visualization of landscape

  20. Emotion of Physiological Signals Classification Based on TS Feature Selection

    Institute of Scientific and Technical Information of China (English)

    Wang Yujing; Mo Jianlin

    2015-01-01

    This paper proposes TS-MLP, a method for emotion recognition from physiological signals. It recognizes emotions by using Tabu search to select features of the emotional physiological signals and a multilayer perceptron to classify the emotions. Simulations show that it achieves good emotion classification performance.
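
The feature selection component can be illustrated with a generic tabu search over feature subsets. In the paper, the subset score would wrap a multilayer perceptron's classification accuracy; here the objective, names and parameters are all illustrative:

```python
import random

def tabu_feature_search(n_features, score, iters=30, tabu_len=5, seed=0):
    """Tabu search over feature subsets: each step flips one feature
    in or out of the current subset, recently flipped features are
    tabu, and the best subset ever seen is returned. `score` rates a
    subset (e.g. classifier accuracy using just those features)."""
    rng = random.Random(seed)
    current = frozenset(rng.sample(range(n_features), n_features // 2))
    best, best_score = current, score(current)
    tabu = []  # most recently flipped features, oldest first
    for _ in range(iters):
        def flipped(f):
            return current - {f} if f in current else current | {f}
        moves = [f for f in range(n_features) if f not in tabu]
        f = max(moves, key=lambda f: score(flipped(f)))
        current = flipped(f)
        tabu.append(f)
        if len(tabu) > tabu_len:
            tabu.pop(0)
        if score(current) > best_score:
            best, best_score = current, score(current)
    return best
```

With a toy objective that rewards matching a known target subset, the search recovers that subset.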